The Facebook engineer was itching to know why his date hadn’t responded to his messages. Perhaps there was a simple explanation—maybe she was sick or on vacation.
So at 10 p.m. one night in the company’s Menlo Park headquarters, he brought up her Facebook profile on the company’s internal systems and began looking at her personal data. Her politics, her lifestyle, her interests—even her real-time location.
The engineer would be fired for his behavior, along with 51 other employees who had abused their access to company data, a privilege that was then available to everyone who worked at Facebook, regardless of job function or seniority. The vast majority of the 51 were just like him: men looking up information about women they were interested in.
In September 2015, after Alex Stamos, the new chief security officer, brought the issue to Mark Zuckerberg’s attention, the CEO ordered a system overhaul to restrict employee access to user data. It was a rare victory for Stamos, one in which he convinced Zuckerberg that Facebook’s design was to blame, rather than individual behavior.
So begins An Ugly Truth, a new book about Facebook written by veteran New York Times reporters Sheera Frenkel and Cecilia Kang. With Frenkel’s expertise in cybersecurity, Kang’s expertise in technology and regulatory policy, and their deep well of sources, the duo provide a compelling account of Facebook’s years spanning the 2016 and 2020 elections.
Stamos would not be so lucky again. The problems stemming from Facebook's business model only escalated in the years that followed, and as Stamos unearthed more egregious ones, including Russian interference in US elections, he was pushed out for making Zuckerberg and Sheryl Sandberg face inconvenient truths. Once he left, the leadership continued to refuse to address a whole host of profoundly disturbing problems, including the Cambridge Analytica scandal, the genocide in Myanmar, and rampant covid misinformation.
The authors, Cecilia Kang and Sheera Frenkel (photo: Beowulf Sheehan)
Frenkel and Kang argue that Facebook’s problems today are not the product of a company that lost its way. Instead they are part of its very design, built atop Zuckerberg’s narrow worldview, the careless privacy culture he cultivated, and the staggering ambitions he chased with Sandberg.
When the company was still small, perhaps such a lack of foresight and imagination could be excused. But since then, Zuckerberg’s and Sandberg’s decisions have shown that growth and revenue trump everything else.
In a chapter titled “Company Over Country,” for example, the authors chronicle how the leadership tried to conceal the extent of Russian election interference on the platform from the US intelligence community, Congress, and the American public. They blocked the Facebook security team’s multiple attempts to publish details of what it had found, and cherry-picked the data to downplay the severity and partisan nature of the problem. When Stamos proposed a redesign of the company’s organization to prevent a repeat of the issue, other leaders dismissed the idea as “alarmist” and focused their resources on controlling the public narrative and keeping regulators at bay.
In 2014, a similar pattern began to play out in Facebook’s response to the escalating violence in Myanmar, detailed in the chapter “Think Before You Share.” A year prior, Myanmar-based activists had already begun to warn the company about the concerning levels of hate speech and misinformation on the platform being directed at the country’s Rohingya Muslim minority. But driven by Zuckerberg’s desire to expand globally, Facebook didn’t take the warnings seriously.
When riots erupted in the country, the company further underscored its priorities. It remained silent in the face of two deaths and fourteen people injured, but jumped in the moment the Burmese government cut off Facebook access for the country. Leadership then continued to delay investments and platform changes that could have prevented the violence from getting worse, because they risked reducing user engagement. By 2017, ethnic tensions had devolved into a full-blown genocide, which the UN later found had been “substantively contributed to” by Facebook.
By: Karen Hao
Title: Review: Why Facebook can never fix itself
Sourced From: www.technologyreview.com/2021/07/21/1029818/facebook-ugly-truth-frenkel-kang-nyt/
Published Date: Wed, 21 Jul 2021 09:00:00 +0000