How a story about ivermectin and hospital beds went wrong

Note: This was originally published as the daily newsletter for the Columbia Journalism Review, where I am the chief digital writer

Over the long weekend, one of the trending topics on a number of social-media platforms was a news item from Oklahoma with some terrifying information: it said that so many people were in the hospital due to overdoses of ivermectin—an anti-parasitic drug widely used as a horse dewormer, which some anti-vaccine sources have been promoting (incorrectly) as a defense against COVID—that there was no room in intensive-care units for other patients, including those with gunshot wounds. The story, from a local news outlet called KFOR, contained quotes from an interview with Dr. Jason McElyea, a local physician, and it quickly got picked up by a number of national outlets, including Rolling Stone, the Guardian in the UK, Newsweek, and Business Insider. A producer for MSNBC repeated the claim on Twitter (although she later deleted it), and so did the account of the Rachel Maddow show.

Not long afterward, major holes started to appear in the story. As detailed on Twitter by Drew Holden—a public-affairs consultant in Washington, DC, and a former assistant to a Republican congressman—and by Scott Alexander on his popular blog, Astral Codex Ten, the first sign that all was not right came with a statement from a large Oklahoma hospital, which said that there was no bed shortage due to ivermectin overdoses, and that the doctor quoted in the KFOR report didn’t work there. Others pointed out that in his original interview with the Oklahoma news outlet, McElyea hadn’t said anything about ivermectin cases crowding out other patients. He mentioned that there had been some ivermectin overdoses, and he said that beds were scarce, but the connection between the two seemed to be a leap that the news outlet and subsequent reports had added.

The story’s collapse was all it took for it to catch fire with right-wing Twitter trolls and other conservative groups, as yet another example of the mainstream media’s tendency to make up news stories, either to make residents of rural areas look stupid or to overstate the risk of non-mainstream COVID treatments. Many latched on to the tweet from the Maddow account, and used it as evidence that no one fact-checks their statements anymore, especially when those statements serve the purpose of making right-wing anti-vaxxers and COVID denialists look bad. Others used the Rolling Stone story as an excuse to revisit the magazine’s infamous investigative story from 2014 about an alleged rape at the University of Virginia, which collapsed after the single source it was based on retracted some of her statements.


Facebook plans to show users even less political news

Note: This was originally published as the daily newsletter for the Columbia Journalism Review, where I am the chief digital writer

In February, Facebook announced an experiment designed to test how much political news users wanted in their News Feeds, a test that would remove that kind of content for a small group of users in the US, Canada, Brazil, and Indonesia, and then survey those users for their reactions to the removal. According to an update published on Tuesday, the company saw what it called “positive results” from the experiment, and as a result is now expanding the test to cover users in Ireland, Sweden, Spain, and Costa Rica. In addition, Facebook said it is tweaking the way it measures user behavior when interacting with political content: “We’ve learned that some engagement signals can better indicate what posts people find more valuable than others,” the company said. Instead of looking only at whether someone is likely to comment on or share a political post, Facebook said it will now put more emphasis on newer signals, such as how likely a person is to provide negative feedback about a political post or topic that happens to show up in their News Feed.
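Facebook didn’t spell out how those signals are combined, but the general shape of the change can be illustrated with a short sketch. Everything below (the function, parameter names, and weights) is hypothetical and invented for illustration; the company has not published its actual ranking formula.

```python
# Hypothetical sketch of the shift Facebook describes: instead of
# ranking a political post purely on predicted engagement, the score
# also penalizes the predicted chance of negative feedback.
# All weights and names here are invented for illustration.

def political_post_score(p_comment: float, p_share: float,
                         p_negative_feedback: float) -> float:
    """Return a ranking score for a political post.

    p_comment / p_share: predicted probability the user comments on
    or shares the post (the older, engagement-only signals).
    p_negative_feedback: predicted probability the user gives
    negative feedback (the newer signal Facebook says it will now
    weight more heavily).
    """
    engagement = 0.5 * p_comment + 0.5 * p_share
    return engagement - 2.0 * p_negative_feedback
```

Under a scheme like this, a political post that provokes lots of comments but also lots of complaints can score lower than a blander post that provokes neither, which matches the behavior Facebook says it wants.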

Facebook’s announcement is just the latest in a long series of algorithm changes aimed at de-emphasizing not just political news but professional news sources in general. Mark Zuckerberg, Facebook’s chief executive, said in 2018 that the social network would be changing the News Feed to prioritize content shared by a user’s friends and family, rather than content from professional publishers and brands. “I’m changing the goal I give our product teams from focusing on helping you find relevant content to helping you have more meaningful social interactions,” he said. Some observers hoped the changes would spur media companies to stop relying on Facebook for their traffic, but it’s unclear whether that has happened to any significant extent. Meanwhile, some disinformation researchers have pointed out that Facebook’s prioritization of content from friends and family—including a focus on promoting the use of private groups—may actually have made the problem worse.

Whenever Facebook tweaks its news algorithm, media outlets and publishers around the world tend to hold their breath, because even a small change on such a large and influential platform can affect a publisher’s traffic significantly. When the company made a similar tweak to its algorithms, designed to down-rank professional news content in favor of personal posts, some publishers saw traffic declines of as much as 30 percent. According to Facebook’s note, the latest changes “will affect public affairs content more broadly [and] publishers may see an impact on their traffic.” The company didn’t say how big an impact they might see, but did add that it is planning a “gradual and methodical rollout” for its experiment, and expects to announce further expansions in the coming months.


Facebook “transparency report” turns out to be anything but

Note: This was originally published as the daily newsletter for the Columbia Journalism Review, where I am the chief digital writer

Last week, Facebook released a report detailing some of the most popular content shared on the site in the second quarter of this year. The report is a first for the social network, and part of what the company has said is an attempt to be more transparent about its operations: Guy Rosen, Facebook’s vice president of integrity, described the content report as part of “a long journey” to be “by far the most transparent platform on the internet.” If that is the case, however, the story behind the creation of the report shows the company still has a long way to go to reach that goal.

To take just one example, Facebook’s new content report appears to be, at least in part, a coordinated response to critical reporting from Kevin Roose, a technology columnist at the New York Times, who has been tracking the posts that get the most engagement on Facebook for some time, using the company’s own CrowdTangle tool, and has consistently found that right-wing pages get the most interaction from users.

This isn’t something Facebook likes to hear, apparently, so the content report tries to do two things to counter that impression. The first is to argue that engagement — the number of times users interact with a post, which Roose uses as the metric for his Top 10 lists — isn’t the most important way of looking at content, and to focus instead on “reach,” or how many people saw a certain post. The second is to show that even the most popular content amounts to only a tiny fraction of what gets seen on the platform (less than 0.1 percent, according to the report). As Robyn Caplan, a researcher with Data & Society, has pointed out, this seems to be an attempt to show that disinformation on the platform isn’t a big deal because so few people see it.
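To make the distinction concrete, here is a minimal sketch of the two metrics in Python. The Post fields and the example numbers are hypothetical; they are not CrowdTangle’s or Facebook’s actual schema.

```python
from dataclasses import dataclass

@dataclass
class Post:
    # Hypothetical per-post metrics, for illustration only.
    likes: int
    comments: int
    shares: int
    unique_viewers: int

def engagement(post: Post) -> int:
    # The kind of metric Roose's Top 10 lists rank by: total
    # interactions with a post, regardless of how many people saw it.
    return post.likes + post.comments + post.shares

def reach(post: Post) -> int:
    # The metric Facebook's report prefers: how many distinct people
    # saw the post, regardless of whether they interacted with it.
    return post.unique_viewers

# A post can rank near the top on one metric and far down on the other.
partisan_post = Post(likes=90_000, comments=40_000, shares=25_000,
                     unique_viewers=400_000)
viral_meme = Post(likes=5_000, comments=1_000, shares=2_000,
                  unique_viewers=30_000_000)
assert engagement(partisan_post) > engagement(viral_meme)
assert reach(viral_meme) > reach(partisan_post)
```

The point of the contrast is that ranking by engagement surfaces the posts people react to most, while ranking by reach surfaces the posts most people merely see, and the two lists can look very different.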

In another blow to Facebook’s self-professed commitment to transparency, just after the report was published, the Times revealed that a previous version had been shelved because it showed that one of the top links on the site was a Chicago Tribune story suggesting a doctor may have died as a result of the COVID-19 vaccine, a report that was circulated widely by anti-vaccination groups and pages. Facebook said this piece of data was left out because it was still tweaking the way in which its systems reported views of content — housecleaning, a Facebook spokesman called it — but the fact that the shelved version featured a story hugely popular with anti-vax disinformation groups seemed more than a little suspicious, especially since Facebook has been the subject of some harsh criticism from President Joe Biden about exactly this kind of content.

In a series of interviews this week on CJR’s Galley discussion platform, I spoke with a number of experts in technology and social media, including Roose. “Facebook was, initially, pretty excited that I was getting so much use out of CrowdTangle,” he said. “But once people started using the top 10 lists to accuse them of being a right-wing echo chamber, they lost their minds. Reporting I’ve done since then has revealed that executives were worried that they’d get blamed for Trump’s re-election in 2020, and they made pushing back on my Twitter lists an Urgent Company Priority.” Roose said it was an odd situation “when a trillion-dollar company spends months of time and effort to create a report whose primary purpose is making you look bad.”

Facebook may have spent a lot of time and effort putting it together, but Ethan Zuckerman, who runs the Initiative for Digital Public Infrastructure at the University of Massachusetts Amherst and formerly directed the Center for Civic Media at the MIT Media Lab, said that the main conclusion he came to after reading the report was how flawed the methodology was, since most of the top posts appeared to be spam. “Had a student turned in this list of URLs for me in a class on big data, I would have told them to redo the analysis,” he said. “That a trillion dollar company released this as a transparency report… is unbelievable.”

Alice Marwick, a researcher at the University of North Carolina, also noted that the report left out any content shared by private accounts, or in groups. “So even if five million people shared a URL on their private Facebook account, it’s excluded from the report,” she said. “This seems like a PR effort, not a serious research report.” As if to reinforce that point, Facebook posted a job opening on Wednesday for someone in the marketing department to “help shape the consumer facing Integrity and Transparency narrative.” Our interview series continues on Galley all this week, and wraps up with an all-day roundtable Friday.

Here’s more on Facebook and transparency:

Policy piñata: Juan Ortiz Freuler, an affiliate at the Berkman Klein Center at Harvard, said the problem is “we are overly focused on products and outputs. We are trying to solve much of this as if it were a mathematical riddle that is waiting for the right set of variables to be punched in. But the problem we are facing is a political one. No number of graphs or disaggregated data will solve it.” The broader question is around how society chooses to balance freedom of expression with privacy and other considerations, he said, and whether we let private corporations continue to control that process. “Right now these debates are a bit like playing whack-a-mole…blindfolded. I call it the policy piñata. In this game, big tech gets to move the target around, and can always creep in from behind with a gotcha.”

Shifting focus: Robyn Caplan, a researcher with Data & Society, said that Facebook wants to shift the focus from engagement — which it used to push as the most important metric — to reach. “Focusing on engagement does run counter to the story Facebook wants to tell about itself – that its base is getting older and more right-wing,” she said. “Roose has run those numbers, with the help of CrowdTangle, and we already know who rises to the top. But I think through focusing on ‘reach’ Facebook is trying to do something else – they are trying to push back against a narrative that they are a ‘mass’ media that could have broad impacts on political life or public health. You see this messaging throughout their report. Facebook is pushing back against the impact and power that is being ascribed to their platform.”

CrowdTangled: NBC News reporter Brandy Zadrozny says one of the broader issues beyond the content report is that — as Roose has reported — Facebook appears to be phasing out CrowdTangle, the data tool that has allowed reporters and researchers to track the popularity of disinformation and other phenomena on the social network. “I basically use the tool every day,” she says. “It is the way we were able to tell stories about militia organizing, violence at reopen rallies, inauthentic activity by groups like the Epoch Times, health misinformation, and much much more.” Facebook has been pushing researchers to use other tools, and has broken up the internal team that used to run CrowdTangle, she says. “So Facebook is likely to close the tool any day now and we’ll lose that window into how bad guys use Facebook to trick us, lie to us, make us sick, etc.”

Other notable stories:

Facebook is considering setting up another arm’s-length body similar to its Oversight Board to advise the company on how to make decisions around election-related issues such as political ads, according to a report in the New York Times on Wednesday. “The social network has contacted academics to create a group to advise it on thorny election-related decisions,” the paper reported, based on interviews with sources close to the company. “The proposed commission could decide on matters such as the viability of political ads and what to do about election-related misinformation.” Facebook is expected to announce the commission this fall, in preparation for the 2022 midterm elections.

An ABC News staffer has filed a lawsuit against Michael Corn, the former top producer at “Good Morning America,” alleging he sexually assaulted her and fostered a toxic work environment, the Wall Street Journal reports. Kirstyn Crawford, a producer on the morning show, alleges that Corn assaulted her in 2015 during a business trip to Los Angeles. The suit also alleges that Jill McClain, another former ABC News producer, was sexually assaulted by Corn when the two worked at ABC’s “World News Tonight.” The suit also names ABC as a defendant, alleging the company received complaints about Corn’s conduct from several women, but failed to take disciplinary action against him.

Rana Cash was named the new executive editor of The Charlotte Observer, the first Black editor in the newspaper’s 135-year history. Cash, 50, is currently executive editor of the Savannah Morning News in Georgia, and has worked in sports and news journalism for three decades at outlets such as The Miami Herald, The Dallas Morning News, The Atlanta Journal-Constitution, The Minneapolis Star-Tribune and The Louisville Courier Journal. The Observer reports that Cash promised to “ask tough questions of leadership and to cover the city’s communities from the ground up.”

The Washingtonian has a profile of Sally Buzbee, the new executive editor of the Washington Post, and her plans to expand the newspaper’s reach and become “the everything newspaper.” Publisher Fred Ryan tells the magazine: “We want to grow. We want to grow domestically in terms of our readership across the country, and we want to grow globally with international readers. A lot of our strategy revolves around that.” The Post’s audience is currently between 80 million and 100 million readers per month, the magazine says, and the newsroom has grown from fewer than 600 staff to roughly 1,000.

Evette Dionne, editor-in-chief of Bitch Media, said on Twitter that she is stepping down from her position after three years because she is “burned out” and “needs to rejuvenate.” Dionne said that it was “the honor of my career to helm Bitch and I am leaving it in the best hands. The staff is beyond capable of keeping the mission going, so please continue supporting them.” She added: “I have done everything I came to do and now it’s time for me to get out of the way and make space for the next generation of independent media leaders.”

ESPN said it is removing host Rachel Nichols from all of its NBA programming, which includes canceling her daytime show “The Jump.” David Roberts, senior vice president of production at the network, said: “We mutually agreed that this approach regarding our NBA coverage was best for all concerned. Rachel is an excellent reporter, host and journalist, and we thank her for her many contributions to our NBA content.” Nichols is still under contract with ESPN for another year, but has been seen as on thin ice at the network since she made comments about Maria Taylor, a former colleague, in 2020. In a leaked phone conversation, Nichols said Taylor was picked to cover the NBA Finals because of her race.

MSNBC host Rachel Maddow has agreed to stay with the network under a new contract that the Daily Beast reports will pay her $30 million per year and keep her there until after the 2024 election, according to anonymous sources who spoke with the news site. As part of the deal, Maddow’s long-running nightly show will end next year; she will host a weekly program instead, and will also have opportunities to develop “podcasts, documentaries, and other types of multimedia projects” across the various news and entertainment divisions of NBCUniversal.

The Atlantic’s subscriber base has grown by nearly 50 percent over the past year to more than 830,000, according to Digiday, based on the latest circulation statement filed with the nonprofit media auditing firm, the Alliance for Audited Media. That increase was fueled by the magazine’s coverage of the pandemic and the US election, but at the same time, the number of unique visitors to its website fell to just 18 million in July of this year, down from nearly 30 million in the same month last year, according to Comscore data.

Apple’s plan to scan images on users’ phones sparks backlash

Note: This was originally published as the daily newsletter for the Columbia Journalism Review, where I am the chief digital writer

Earlier this month, Apple announced a series of steps it is taking to help keep children safe online. One of those new additions is a feature for Siri, its intelligent assistant, that will automatically suggest a help-line number if someone asks about child-exploitation material, and another is a new feature that scans images shared through iMessage, to warn children before they share explicit pictures of themselves in a chat window. Neither of these new features sparked much controversy, since virtually everyone agrees that online sharing of child sexual-abuse material is a significant problem that needs to be solved, and that technology companies need to be part of that solution. The third plank in Apple’s new approach to dealing with this kind of content, however, triggered a huge backlash: rather than simply scanning photos that are uploaded to Apple’s servers in the cloud, the company said it will start scanning the photos that users have on their phones to see whether they match an international database of child-abuse content.

As Alex Stamos, former Facebook security chief, pointed out in an interview with Julia Angwin, founder and editor of The Markup, scanning uploaded photos to see if they include pre-identified examples of child sexual-abuse material has been going on for a decade or more, ever since companies like Google, Microsoft, and Facebook started offering cloud-based image storage. The process relies on a database of photos maintained by the National Center for Missing and Exploited Children, each of which is represented by a unique code known as a “hash.” Cloud companies compute the same kind of code for images uploaded to their servers, compare it against the database, and then flag and report the ones that match. Federal law doesn’t require companies to search for such images — and until now, Apple has not done so — but it does require them to report such content if they find it.
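In code, the basic idea looks something like the sketch below, with two loudly labeled simplifications: the hash set is a placeholder rather than NCMEC’s actual database, and it uses an ordinary SHA-256 digest, which matches only byte-identical files, whereas production systems (Microsoft’s PhotoDNA, or Apple’s NeuralHash) use perceptual hashes that still match after an image is resized or re-encoded.

```python
import hashlib
from pathlib import Path

# Placeholder stand-in for the NCMEC-maintained database, which
# providers receive as hashes rather than as the images themselves.
KNOWN_HASHES: set[str] = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def image_hash(path: Path) -> str:
    # SHA-256 of the raw bytes: a simplification. Real scanning
    # systems use perceptual hashes so that resized or re-encoded
    # copies of a known image still match.
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan_upload(path: Path) -> bool:
    # Flag the file if its hash appears in the known database;
    # matches are what providers are then required to report.
    return image_hash(path) in KNOWN_HASHES
```

What Apple proposed, and what drew the backlash, was running this kind of comparison on the device itself before upload, rather than on the company’s servers afterward.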

What Apple plans to do is to implement this process on a user’s phone, before anything is uploaded to the cloud. The company says this is a better way of cracking down on this kind of material, but its critics say it is not only a significant breach of privacy, but also opens a door to other potential invasions, by governments and other state actors, that can’t easily be closed. The Electronic Frontier Foundation called the new feature a “backdoor to your private life,” and Mallory Knodel, chief technology officer at the Center for Democracy and Technology, told me in an interview on CJR’s Galley discussion platform that this ability could easily be expanded to other forms of content “by Apple internal policy as well as US government policy, or any government orders around the world.” Although Apple often maintains that it cares more about user privacy than any other technology company, Knodel and other critics note that the company still gave the Chinese government virtually unlimited access to user data for citizens in that country.
