Brandy Zadrozny’s appearance on Friday’s The 11th Hour on MSNBC with host Stephanie Ruhle raised the question of whether the duo read the same Twitter thread as everybody else, because Zadrozny claimed Matt Taibbi’s thread detailing how Twitter suppressed the New York Post article on Hunter Biden’s laptop actually makes Twitter look good.
Ruhle clearly tried to make the thread conform to her pre-determined conclusion as she asked Zadrozny to explain the matter, “I want to ask you about what he did today. Elon Musk. He spent the majority of his day promoting a heavily criticized thread, Twitter thread, about the laptop of Hunter Biden, a private citizen, someone who's obviously the son of President Biden, but who is not a government official. Can you explain this whole thing to us?”
Heavily criticized by whom? That glaring bit of partisanship aside, Zadrozny proceeded to explain the gist of the Post’s story, but then made the hilarious claim that:
it was really helpful thing, actually, the Twitter files today, we got to see how content moderation works. We got to see how when a group of people with different political ideas and ideologies and views, gets together in the spirit of making a platform safe and healthy, right before an election, right when we knew we had just learned of the hack and leaks and Wikileaks and all the stuff that that did to hurt and to affect the 2020--the 2016 election. In 2020, they were heightened. People were trying to do the right thing and inside Twitter.
In the actual thread, Taibbi does concede that content removal requests were available to both sides, but in practice, Democrats had a better chance of success because Twitter is an ultra-liberal company. The idea that there were conservatives inside Twitter who agreed to suppress the Post article is not anywhere in Taibbi’s thread.
What is in the thread are Twitter employees struggling to accept the hack explanation, yet unwilling to do anything about it. Taibbi also reports on the extraordinary measure of resorting to suppression, noting that such methods were usually reserved for things such as child pornography.
Yet, Zadrozny still claimed, “it looked like, this weird story of a laptop left in some, like, repair shop, it looked like a hack and leak. That's what people inside Twitter thought it was. And so, they acted a little fast.”
After claiming censoring the article made Twitter look good, Zadrozny unwittingly contradicted herself by reporting that Twitter officials admitted they acted too quickly:
And when we heard that from Yoel Roth, the former head of trust and safety two days ago on Kara Swisher’s podcast, he said that. He didn't agree with this idea. The day after it happened, Jack came out and said I don't agree with what happened, that happened too quickly. Because content moderation is a human activity. It's stuff that we do as humans. To, you know, fulfill the north star of whatever organization we have.
It is interesting that Zadrozny cites Jack Dorsey when Taibbi’s thread says the decision was made without his input. Perhaps she should re-read the thread.
This segment was sponsored by Fidelity.
Here is a transcript for the December 2 show:
MSNBC The 11th Hour with Stephanie Ruhle
12/2/2022
11:23 PM ET
STEPHANIE RUHLE: I want to ask you about what he did today. Elon Musk. He spent the majority of his day promoting a heavily criticized thread, Twitter thread, about the laptop of Hunter Biden, a private citizen, someone who's obviously the son of President Biden, but who is not a government official. Can you explain this whole thing to us?
BRANDY ZADROZNY: Sure. So, background in 2020, there was a New York Post story about Hunter Biden's laptop. When that came out, it was suggesting that somehow Hunter Biden, the president's son, was trading his closeness with the president to gain money and a position in Ukraine.
When that came out, we now know, it was really helpful thing, actually, the Twitter files today, we got to see how content moderation works. We got to see how when a group of people with different political ideas and ideologies and views, gets together in the spirit of making a platform safe and healthy, right before an election, right when we knew we had just learned of the hack and leaks and Wikileaks and all the stuff that that did to hurt and to affect the 2020--the 2016 election. In 2020, they were heightened. People were trying to do the right thing and inside Twitter.
And it looked like, this weird story of a laptop left in some, like, repair shop, it looked like a hack and leak. That's what people inside Twitter thought it was. And so, they acted a little fast. And when we heard that from Yoel Roth, the former head of trust and safety two days ago on Kara Swisher’s podcast, he said that. He didn't agree with this idea. The day after it happened, Jack came out and said I don't agree with what happened, that happened too quickly. Because content moderation is a human activity. It's stuff that we do as humans. To, you know, fulfill the north star of whatever organization we have.