Behind Facebook's baby step fixes: Defending its ad business
By BARBARA ORTUTAY and ANICK JESDANUN, Associated Press
Mar 22, 2018 7:00 PM CDT
FILE - This Jan. 17, 2017, file photo shows a Facebook logo displayed at a gathering of startup companies at Station F in Paris. (Associated Press)

NEW YORK (AP) — Wondering why Facebook seems to be taking baby steps to address the biggest scandal in its history? Stronger safeguards on user data might damage Facebook's core business of using what it knows about you to sell ads that target your interests.

Facebook is proposing only narrow countermeasures that address the specifics of the furor over Cambridge Analytica. That's the data mining firm that worked for Donald Trump's campaign and now stands accused of lifting data from some 50 million Facebook users to influence voters.

Those measures, announced Wednesday by CEO Mark Zuckerberg, mostly involve new limits on what Facebook apps can do with the user data they collect. One such errant app was central to the Cambridge Analytica debacle.

But those steps don't get at what many outsiders see as bigger problems at Facebook: its rampant data collection from users, its embrace of political ads that target individuals and small demographic groups with precision, and its apparent inability to end malicious use of its service by governments, shady corporations and criminal elements.

"They're being very deft and creating the illusion of trust," said Scott Galloway, a New York University professor of marketing. But by focusing on the mechanics of how apps work on its service, he said, Facebook is failing to take meaningful action to ensure it's not "weaponized" by scammers, manipulators and other nefarious types.

Ultimately, Facebook is a data-collection company, and without user data it would wither and die. But how much data it sucks in, and what it does with it, is a question of major public importance — one that touches on the health of democracy itself, privacy advocates say.

It's just not a question that Facebook seems to want to address.

Facebook made $40 billion in advertising revenue last year, and that's expected to rise 22 percent this year to $49 billion, according to research firm eMarketer. Wall Street analysts who follow Facebook don't seem worried yet, despite the sharp drop in the company's stock this week. That's because analysts don't expect the company to have to change the way it does business.

Like its closest rival Google, Facebook offers companies an unparalleled way to target people for advertising, right down to their most granular details. It can, for instance, single out users who live in Kansas and have listed Bernie Sanders and same-sex marriage as their interests — which is exactly what some Russian-linked ads did as part of a propaganda campaign during the 2016 U.S. presidential election.

Apps — everything from dumb personality quizzes to games (remember "FarmVille"?) — have been able to harvest user information since 2007, when Facebook opened its service to outside developers. Facebook has since restricted what types of data apps can access, notably in 2014, but as the Cambridge Analytica debacle shows, loopholes remained.

In the wake of the Cambridge Analytica outcry, Facebook is once again cracking down — but solely on apps. Its new restrictions limit what data apps can access and will cut them off from your information if you don't use them for three months. Facebook will also conduct its own audits of apps that appear to suck in large quantities of data, although it has said nothing about allowing independent audits, leaving users no alternative but to trust that Facebook itself has their interests at heart.

A better solution, says Marc Rotenberg of the Electronic Privacy Information Center: Make apps alert users whenever data transfers are taking place. That would let people decide whether to keep using them.

Zuckerberg apologized for the debacle during a CNN interview Wednesday, but stopped short of endorsing broad privacy legislation and instead said only that he would support the "right" kind of rules. For instance, he said, it might be OK to require online political ads to disclose who paid for them — but then talked up Facebook's own voluntary efforts instead of endorsing an existing bill that would do just that.

Jeffrey Chester, executive director for the consumer-privacy group Center for Digital Democracy, said Facebook is blaming app developers and outside firms like Cambridge Analytica "instead of saying, 'we need to really look at cleaning house.'"

Many users have had an uneasy relationship with the company, largely because of a series of privacy scandals over the years that seem to be growing more serious. Back in 2007, for example, Facebook launched a service called Beacon, which tracked what users did on other websites and published it on people's news feeds, often without the users' knowledge. Facebook was sued over Beacon and shut it down in 2009.

The current scandal has fueled a nascent #deletefacebook movement. While there's no sign of a Facebook exodus so far, the possibility presents yet another threat to the company.

"Until this debacle, I had no idea," said Pat Hager of Bismarck, North Dakota, who is considering deactivating Facebook. "I wouldn't have a problem with them putting ads on the side or in the posts like they do, but the data sharing that was going on behind our backs feels sneaky and devious."

___

AP Technology Writer Mae Anderson contributed to this story.