When its Cambridge Analytica data-mining scandal broke, tech giant Facebook found out there really is such a thing as bad publicity.
The public has been so turned off that the once-indispensable company has mounted a paid advertising campaign to lure users back. Its founder and CEO, Mark Zuckerberg, is scheduled to testify before Congress next week, facing probing lawmakers from both parties who know that grilling Zuckerberg on privacy issues will play well with voters back home.
But after a new report this week, Facebook’s bad publicity just got worse.
According to CNBC, Facebook had been courting major American medical systems to join it in a research project that would match data Facebook collected about its users with medical information about hospital patients.
It was led by Dr. Freddy Abnousi, a cardiologist employed by Facebook who, according to CNBC, used his LinkedIn profile to describe his role with the company as "leading top secret projects." (That description has apparently changed since the CNBC story was published. On Friday morning, Abnousi's LinkedIn description was a more anodyne "runs confidential projects.")
Here’s how CNBC described the project:
“Facebook’s pitch, according to two people who heard it and one who is familiar with the project, was to combine what a health system knows about its patients (such as: person has heart disease, is age 50, takes 2 medications and made 3 trips to the hospital this year) with what Facebook knows (such as: user is age 50, married with 3 kids, English isn’t a primary language, actively engages with the community by sending a lot of messages).
“The project would then figure out if this combined information could improve patient care, initially with a focus on cardiovascular health. For instance, if Facebook could determine that an elderly patient doesn’t have many nearby close friends or much community support, the health system might decide to send over a nurse to check in after a major surgery.”
That might sound harmless enough, but how much do Americans really want health care systems or private companies to know about their personal lives?
And considering that in the Obamacare era, health care in the United States is largely a province of the government, how much do Americans want their government intruding in places it has no business at all?
Fortunately, the project was aborted as Facebook came under an intense glare of publicity after word broke about its connection to Cambridge Analytica, a political consulting firm founded by Republican mega-donor Robert Mercer and former chief White House strategist Steve Bannon.
But still, the fact that it was even in the works, and could be revived, is more than a little unsettling.
“It’s kind of terrifying stuff, bordering on creepy, but it could have been interesting, had they gone about it the right way,” Chrissy Farr, the CNBC reporter who broke the story, told the network during an interview Thursday.
“The idea would be that this data set could be used for research purposes, at least initially, that’s what Facebook has reassured us. But who knows, you know? If this had been allowed to go ahead, we don’t know how ambitious it would have gotten.”
That’s anybody’s guess — and the answers likely aren’t good.
Considering Facebook’s phenomenal growth in the 14 years since it was founded, the potential of the project is fearsome. And considering the company’s consistently liberal bent, and the liberal obsession with government controlling American health care, the ambitions would undoubtedly run in one direction only — taking more control of health care away from the individual and putting it in government hands.
So, the fact that it’s been aborted — for now — is an unexpected benefit from the fallout of the Cambridge Analytica scandal.
For Facebook, this revelation might be just another round of very bad publicity.
For the American public, whose personal health information was actually at issue, it could have gotten a lot worse.