
Interview with Brett Gaylor

Brett Gaylor answers questions about his new project “Do Not Track”: How did it all start? How does the personalized documentary format work? What are the implications of online tracking? What is the impact of the documentary so far?

Deniz Tortum interviewed Brett Gaylor, director of Do Not Track, a web documentary about online tracking produced by Upian, NFB, ArteTV, and BR. Below are selections from their conversation.

For a detailed study of the production process of Do Not Track, read our case study here!

Beginnings

Deniz: Let’s begin with why you started this project: when did you first learn that you were being tracked? When did it actually click that this has big implications?

Brett: I don’t know if I had that penny-drop moment. What really solidified my [interest in tracking] was Facebook. I think all of us are like, “Yeah, it’s creepy what Facebook is doing,” and I left Facebook three or four years ago because I found it a bit crass: they take everything about your life and then sell it to people. It was very blatant; it was crass capitalism. I don’t want to support this service, in the same way that I don’t want to feed my kids McDonald’s food. It’s not like I’m out there to protest, but I just choose not to.

I think what really made me want to make a film about this was when I started to think about the economic issues of web tracking, and how this constant surveillance was shaping the potential web that we would see; that probably had a lot to do with my actually working in tech.

After I made RIP!: A Remix Manifesto, I worked at Mozilla for about four and a half years. I was a product manager there, and eventually I had a VP position. There you’re competing with other products on the web for people’s attention, and you realize you’ve been immersed in this culture of the Valley, but in a good way. They’re trying to use the web to effect positive change, but it’s within the culture of the web, within the culture of these products.

I saw how much the web was becoming based on collecting data about people. If you look at how venture capital is raised, it’s almost all based on this notion of gathering a huge audience and then selling that to somebody else. There’s almost no other business model.

Then I saw how that affected the sort of content that would be on these platforms. You would see it in the listicle culture of “Five things about cheese that you really need to …” You saw the emergence of this click-bait. It was more of a ‘frog in a pot of water’ than an ‘a-ha, oh my gosh’ moment where I saw a certain ad. It was more about being really steeped in the culture.

The other thing about Mozilla that I still love today is this notion of thinking big and having a sense of scale, so that you can say, “Oh, I don’t really like this thing about the internet, what if we changed it? If we ship this feature, will that change the web?”

I think that’s still really the cool thing about the web, and you still see it all the time: Uber will release an app and transportation is totally changed, for good or for bad. That was also the thing that came out of working in that culture: how do we make a documentary that’s born of that impulse, where you think of yourself as creating media that’s going to be competing with Buzzfeed? It’s understanding the environment that you’re in and then trying to do something that is going to be effective in that environment.

Format & Personalized Documentary

Deniz: How would you define the format, more specifically than an i-doc or web doc?

Brett: It’s funny because in the end it’s pretty straightforward. It’s a series, it’s linear in a way. The hook is meant to be that it’s a personalized documentary. We started getting into the idea of an engagement campaign.

Deniz: Let’s talk a bit about personalized documentary. It’s been around for a while, but it’s been getting attention in the i-doc scene this last year, and its use is increasing; I’m thinking of In Limbo and Karen here. What are your thoughts on the personalized documentary? Have you been thinking about the implications of personalization? For Do Not Track, how are you designing this personalization?

Brett: Personalized documentary is borrowing some of these techniques that you’ve seen in advertising and in brand approaches. Did you ever see Take This Lollipop or The Wilderness Downtown?

Deniz: Yes.

Brett: My buddy Ben Moskowitz used to call it ‘powered by profile.’ You log in and you give some aspect of your social media, and that’s going to inform the narrative. I always found that these approaches were really difficult, because you have to suspend your disbelief. So you’re like, “Okay, right, there’s a pedophile inside my window and he’s going to bust in here any minute…”

Or they’d be really crude; you log into Facebook and then at the end of the credits of the movie, IT’S YOUR NAME! I know that you have my name because I logged in with my profile. It’s not a great reveal. I know you have my Facebook. I gave it to you.

Now people are trying to do more subtle approaches. Karen’s idea is that she’s learning from you based on your decisions, and that uses some of the same stuff that we use in episode three, where it’s basically judging one of the Big Five personality traits, like a Myers-Briggs personality assessment. They’re ways that you can generalize people by asking sets of questions.

My thoughts on it are that for a documentary it makes sense if it helps to situate you in a context, or uses your nonfiction information. So it’s not fictional that I use a Mac, or that I went to these websites, or that I’m at this location. It’s helpful because it situates the viewer, and we hope that it will create a stronger emotional connection to what we’re talking about, that you find yourself placed within that narrative universe.

If used properly it can be a shortcut that can be, “Okay I get it, I get it.” That’s what we’re trying to do, and it’s fun.
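To make this concrete, here is a minimal, purely illustrative sketch of the kind of non-fictional context Brett describes: the viewer’s browser, platform, language, and local time, pulled from standard browser APIs and dropped into a line of narration. The element ID and wording are invented for the example; this is not the film’s actual code.

```typescript
// Illustrative sketch only, not the production code of Do Not Track.
// A personalized web documentary can address the viewer with real,
// non-fictional context gathered client-side from standard browser APIs.

interface ViewerContext {
  browser: string;
  platform: string;
  language: string;
  localHour: number;
}

function detectViewerContext(): ViewerContext {
  const ua = navigator.userAgent;
  // Very rough user-agent sniffing: enough for a personalized caption.
  const browser =
    ua.includes("Firefox") ? "Firefox" :
    ua.includes("Chrome")  ? "Chrome"  :
    ua.includes("Safari")  ? "Safari"  : "your browser";
  const platform =
    ua.includes("Mac")     ? "a Mac"   :
    ua.includes("Windows") ? "a Windows machine" : "your device";
  return {
    browser,
    platform,
    language: navigator.language,
    localHour: new Date().getHours(),
  };
}

function personalizedOpening(ctx: ViewerContext): string {
  const timeOfDay =
    ctx.localHour < 12 ? "morning" :
    ctx.localHour < 18 ? "afternoon" : "evening";
  return `Good ${timeOfDay}. You're watching this in ${ctx.browser} ` +
         `on ${ctx.platform}, with your language set to ${ctx.language}.`;
}

// Render the line into a (hypothetical) narration element on the page.
document.querySelector("#narration")!.textContent =
  personalizedOpening(detectViewerContext());
```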

Deniz: Can there be an algorithmic remix culture with personalization, where computers are mixing all your information?

Brett: I would say one of the things that documentary is behind on is data visualization, and using real-time data to tell stories. I think a lot of the i-docs that we’ve seen are based on branched narrative or on trying to find ways to remix the static database for the viewer.

One thing that was in our style guide is to lean back; this is a movie, and movies are linear experiences in time, but that linear experience should be informed by the viewer. We will have a bit of that: based on your data you see things you wouldn’t have otherwise seen, but it’s chosen for you. I don’t know how much audiences like being their own editor.

Deniz: I have a similar feeling. Being an editor is not that much fun, or at least not fun unless you’re spending tons of hours on it. You start liking it after you’ve suffered through it.

Brett: Sometimes edits are very complex pieces of logic; you make the audience work through the way you’ve put scenes together. Once you make an audience work like that, reward them, give them something they can comprehend so that the next time you ask them to do it they’re ready for it; there’s only so much cognitive load. Asking people to take their own branch in the story is an incredibly high bar, right?

Back to what we were talking about with product thinking. When designing software, you would never ask somebody, “Where do you want the shopping cart button, here or there? Where do you want the donate-to-Barack-Obama button to be? You decide, because you’re the boss.” You wouldn’t do that; you would measure and say, “Nobody’s understanding this thing, they’re bouncing because we’re forcing them to create their own interface each time.”

Deniz: I just watched the third episode by the way. It is pretty great. I think it’s my favorite so far, and I’m assuming it’s just going to become more interesting with each episode.

Brett: Did it work for you?

Deniz: The Facebook API? I think it did.

Brett: That one was really touch-and-go because it uses Facebook, so we had to wait to get it approved by them. It’s also very hard to test, because it’s based on how active you are on Facebook; it doesn’t work for some people because they don’t like anything on Facebook. You obviously have liked things on Facebook before.

Deniz: That’s true. It also has almost this horoscope quality. It works but it is too general.

Brett: Yeah. But it’s real… I mean, is it real? They did a fairly rich peer-reviewed paper on this idea, and now they can license that technology.

Deniz: I think it works well. The only thing is that it’s still not transparent. Even though you know your data is being tracked and that they can tell everything about you, the way they do it is still a mystery.

Brett: Yeah, it is totally a mystery. You’ll never know, and I don’t know. Even when I was interviewing that scientist, I asked him how it’s done and he said they analyze people. I asked what it actually does, and he said they take likes and analyze them.

I think it’s hard for them to talk about that stuff. It’s like asking me or you, “How do you send an email?” We’re like, “I just do it, I send an email to you.”
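As a rough sketch of what “take likes and analyze them” can look like in practice, here is a toy example. The page names and weights are invented, and this is not the researchers’ actual model or any Facebook API; it only shows the general shape of estimating a Big Five trait from likes.

```typescript
// Toy illustration only: invented weights, not the researchers' model or a Facebook API.
// The general shape of "take likes and analyze them": each liked page carries a
// weight for a trait, and a person's score is the average over the pages they like.

// Hypothetical weights, e.g. learned from people who reported their own
// Big Five scores alongside their page likes.
const extraversionWeights: Record<string, number> = {
  "Salsa Dancing": 0.8,
  "House Parties": 0.7,
  "Chess": -0.3,
  "Poetry": -0.2,
};

// Average the weights of the liked pages we know about; unknown pages are skipped.
function estimateExtraversion(likedPages: string[]): number | null {
  const known = likedPages
    .map((page) => extraversionWeights[page])
    .filter((w): w is number => w !== undefined);
  // Like the viewers Brett mentions who don't like anything: no likes, no estimate.
  if (known.length === 0) return null;
  return known.reduce((sum, w) => sum + w, 0) / known.length;
}

// A rough, horoscope-ish estimate from three likes.
console.log(estimateExtraversion(["Salsa Dancing", "Chess", "Poetry"]));
```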

Deniz: So they’re not keeping quiet about it because of secrecy, but rather because they don’t have any easy way of explaining it?

Brett: In Facebook’s case it certainly is secrecy. But we actually do talk to different data scientists over the course of the series who are just everyday people working on this kind of stuff, and I find that it is hard for them to talk about their work in lay terms. That’s been difficult, so we try to hold their hand through the storytelling process.

Deniz: In the third episode I hesitated: do I log in with my Facebook, or is this a trick?

Brett: We wanted to have a little bit of that element of play at the core of it. There’s a playful, slightly dangerous feeling that we wanted it to have.

Deniz: I think that really helps, that borderline creepy factor…

Brett: It’s funny, because I’ve been resistant to “creepy,” and now I’m okay with it. When it came out, the poster was too dark and we would get all these reviews, such as Vice Magazine saying this is a totally creepy documentary. “No, no it’s not,” and then you’re like, “Okay, whatever.” We did a CBC Radio interview with millions of listeners, and some said it was the creepiest thing they’d ever done. No, it’s not, man, it’s got gifs and music. We can’t control that; at least it’s an authentic experience, an emotion that people are having.

Most people don’t really have words to describe when their expectations of privacy are upset, other than ‘creepy.’ Our sensitivity around privacy is not developed because everything we’ve been talking about is pretty new.

That moment of “Wait, that’s me? That’s happening to me every single day?” is why we do the personalization, so that people can have some sort of emotional experience around privacy.

Algorithmic Discrimination & Changing Social and Institutional Norms

Deniz: One question about content: this occurred to me in the third episode of Do Not Track, and I actually hadn’t thought about it this way before. Throughout the episode you get profiled according to your personality and behaviors, and then within the episode someone says these data systems are good tools to fight against institutional, racial, or class-based discrimination, yet they’re also creating this personality-based or behavior-based discrimination. Can you talk a bit about that?

Brett: I haven’t really figured that one out yet, but that’s why I had to really leave that in there, because it was totally fascinating.

The interview with Michael really showed that he thinks he’s doing a great thing for the world; he totally believes in what he’s doing. In that world, if it comes to pass, there will be people who, based on a very arbitrary profile, are going to be discriminated against. It highlighted for me that oftentimes this world is not being created with nefarious intentions; it’s just that our institutions are not up to it right now. We have to re-think our institutions: are they going to do checks on these algorithms? How are we going to make sure that they’re fair?

Facebook sends out its year-in-review photo montage, and it’s great for 99% of people, except for somebody who had a terrible year. If you have a great job and you live in San Francisco and you’re making six figures a year, it’s like, yeah, why wouldn’t I want to look at my life last year?

Deniz: I agree that institutions are not up to that, but it is a small group of people creating these systems and it’s hard for them to see outside their circles. Other conversations around social norms, for instance that it’s rude to Google someone before meeting them, are much needed, and I’m glad that you’re getting at that.

Brett: That’s why we did the Illuminus thing [a fictional tracking company appearing in the film] in Do Not Track. We wondered how to film this; we couldn’t, really, so we made a satirical company that’s doing things other companies are starting to do. There’s Lenddo, for example; they’re dipping their toes into this sort of financial lending.

What if there’s real science around your personality that can be judged from this, and they’re using these algorithms to decide whether you’re trustworthy or not? Where does it end? We talked to Emily Bell, and she said what they’re doing now might be okay, but what if they decide to do something that isn’t okay? Who would hold them accountable? What would be the repercussions? I don’t know; nobody knows.

Impact

Deniz: Regarding impact, what are some of the major design decisions you’ve made to influence people to change their minds or take action?

Brett: What people want to know at the end is what to do about it: okay, tell me something to install right now. We created a sort of how-to for that, and that was a huge part of how we got press pick-up.

A lot of the articles in the press were about five things you can do to protect your privacy. I know a lot of filmmakers get annoyed by this, that sometimes the coverage of their project will be almost entirely about the subject and not about the work, but in our case that’s what we wanted. Having a certain number of visitors to the interactive documentary is one thing, but the impressions and cultural impact within two weeks have been huge; it’s been like nothing I’ve worked on before.

We were on national radio in Canada, Australia, the UK, France, and Switzerland; huge numbers of people were talking about privacy. I don’t really care whether it’s associated with this particular documentary work. Overwhelmingly, we’re realizing that this is the first time most people are having this conversation. That’s pretty exciting as a social documentary maker: to be able to set the frame of that conversation.

I did this interview on CBC Radio and we did a really good take-down of the nothing-to-hide argument, which, if you’re a privacy advocate, is 101. When I said, “Okay, if you think you have nothing to hide, then why do you put your letters in an envelope, or why do you have shades on your windows?” they were like, “Oh yeah, I never thought about it like that.” That’s impact.

Think of it like a funnel, think of the steps that you’re going to go through: you have to create stuff that’s going to work for most people at the top, and then at the bottom there’s going to be a much smaller audience, so you have to have stuff that will meet people along the way. At the bottom we want people to install add-ons, buy a separate phone, do cryptography, all that kind of stuff. At the top it’s to have that penny drop of, “Oh I never thought about privacy in this way.”

I think that the other thing right now is that you realize the audience for interactive work is smaller than it should be. How do we get in front of people that have never seen this kind of work before?
