Tavis McGinn applied for a job at Facebook last year hoping to work in market research. He had previously spent three years at Google, where he helped large advertisers refine their marketing campaigns across the company’s family of products. But partway through the interview process at Facebook, the recruiter told McGinn the company had something else in mind for him. How would he like to track the public perception of Mark Zuckerberg?
It was April, and Facebook was caught up in the fallout of the 2016 US presidential election. After initially discounting the possibility that fake news had contributed to Donald Trump’s victory, Facebook acknowledged that Russia-linked groups had spent more than $100,000 on political advertising. Zuckerberg undertook a nationwide listening tour modeled after a modern political campaign. McGinn would fill another role common to political campaigns: leading an ongoing poll operation dedicated to tracking minute changes in Zuckerberg’s public perception.
“It was a very unusual role,” McGinn says. “It was my job to do surveys and focus groups globally to understand why people like Mark Zuckerberg, whether they think they can trust him, and whether they’ve even heard of him. That’s especially important outside of the United States.”
McGinn tracked a wide range of questions related to Zuckerberg’s public perception. “Not just him in the abstract, but do people like Mark’s speeches? Do they like his interviews with the press? Do people like his posts on Facebook? It’s a bit like a political campaign, in the sense that you’re constantly measuring how every piece of communication lands. If Mark’s doing a barbecue in his backyard and he hops on Facebook Live, how do people respond to that?”
Facebook worked to develop an understanding of Zuckerberg’s perception that went beyond simple “thumbs-up” or “thumbs-down” metrics, McGinn says. “If Mark gives a speech and he’s talking about immigration and universal health care and access to equal education, it’s looking at all the different topics that Mark mentions and seeing what resonates with different audiences in the United States,” he says. “It’s very advanced research.”
Facebook also conducted similar research on behalf of the company’s chief operating officer, Sheryl Sandberg. Surveys measured awareness about Sandberg, whether people liked and trusted her, and how they felt about her speeches, interviews, and Facebook posts. McGinn also surveyed people on whether they associated Sandberg with Facebook or her personal initiatives, such as Lean In and Option B.
The company further measured how Sandberg’s public image compared with Zuckerberg’s. The results were shared directly with Zuckerberg and Sandberg, along with their lieutenants, their communications teams, and external public relations agencies. Research included both surveys and focus groups and was conducted around the world, McGinn says. (He declined to share the budget for the project, but described it as “very, very expensive.”)
Facebook is not unique among tech companies in conducting surveys to gauge perceptions about its brand. Sometimes, those surveys include questions about founders and CEOs. Amid its own crisis last year, Uber surveyed customers on their opinions about the brand and about its former CEO Travis Kalanick. (Perceptions of Kalanick were so negative that the board used the data in an effort to persuade him to quit, according to a report in Bloomberg last month.)
But it is unusual for a company to have a staff person charged exclusively with monitoring perceptions of its CEO full time. Facebook began monitoring Zuckerberg’s perception about two years ago, a spokesman says. The move reflects his close association with Facebook’s brand and his role as the company’s chief spokesman. Zuckerberg regularly posts announcements on his personal Facebook profile, which has more than 102 million followers. Understanding how Zuckerberg’s posts and speeches resonate globally could help the company navigate a difficult period in which it has faced stern criticism from lawmakers, regulators, journalists, and average users.
The company declined to comment on McGinn’s role, but the polling was not designed to influence Facebook products or policies, a spokesman said, and no specific changes have resulted from it.
“Facebook is Mark, and Mark is Facebook,” McGinn says. “Mark has 60 percent voting rights for Facebook. So you have one individual, 33 years old, who has basically full control of the experience of 2 billion people around the world. That’s unprecedented. Even the president of the United States has checks and balances. At Facebook, it’s really this one person.”
McGinn declined to discuss the results of his polling at Facebook, saying nondisclosure agreements prevented him from doing so. But he said he decided to leave the company after only six months, having come to believe that Facebook had a negative effect on the world.
“I joined Facebook hoping to have an impact from the inside,” he says. “I thought, here’s this huge machine that has a tremendous influence on society, and there’s nothing I can do as an outsider. But if I join the company, and I’m regularly taking the pulse of Americans to Mark, maybe, just maybe that could change the way the company does business. I worked there for six months and I realized that even on the inside, I was not going to be able to change the way that the company does business. I couldn’t change the values. I couldn’t change the culture. I was probably far too optimistic.”
After McGinn left Facebook, he founded a new market research firm named Honest Data. On January 27th, he posted the results of a poll he had conducted regarding opinions of Facebook. The poll, which surveyed 2,000 Americans using Google Consumer Surveys, asked respondents to evaluate a list of companies and mark which ones “are having a negative impact on society.” When Facebook was listed alongside other tech companies, 32 percent of respondents said it is harmful. A separate survey, which placed Facebook among other large brands including Walmart, McDonald’s, and Marlboro, found that 27 percent said the same.
The results largely matched McGinn’s own perception. “I think research can be very powerful, if people are willing to listen,” McGinn says. “But I decided after six months that it was a waste of my time to be there. I didn’t feel great about the product. I didn’t feel proud to tell people I worked at Facebook. I didn’t feel I was helping the world.”
McGinn sees further opportunities for research into the company. “It would be interesting to dig in and see where the breakdown in trust was happening,” he says. “Is it because of fake news? Is it because Facebook isn’t taking accountability? Is it because they’re addicted to Facebook? I’m interested in digging deeper, and seeing if that trust can be rebuilt. Everyone makes mistakes, but if you break trust and someone says, ‘I’m confident you will make this mistake again because you don’t share my values,’ that’s a harder thing for a company to overcome.”
McGinn says there are “plenty of good people at Facebook trying to make a difference.” He doesn’t believe the company has acted with bad intentions. But he does believe the company’s priorities have had negative consequences.
“I think Facebook could have a really good impact on society,” McGinn says. “I think a lot of what this comes down to is how a company chooses to measure success.” The company’s historic focus on acquiring the maximum number of users, and occupying the maximum share of their time, distorted its perspective, he says.
“Facebook has never had on their report card, in my opinion, true social outcomes,” McGinn says. “From a business perspective, Facebook has done phenomenally well. Facebook is a cash cow. But from a social perspective, those metrics could be inversely related. The more Facebook builds profit, the more it’s at the expense of the American people.”