Twitter fact-checks Trump, but will it do any good?

Note: I originally wrote this for the daily newsletter at the Columbia Journalism Review, where I am the chief digital writer

When Twitter said earlier this month that it was making changes to “limit the spread of potentially harmful or misleading content” by adding warning labels to tweets, one of the most obvious questions was whether the company would apply those labels to Trump’s many misleading tweets. On Tuesday, we got the answer, when labels were added to two tweets posted to Trump’s account on the topic of mail-in ballots. The labels appeared just below the text of each tweet, with a hyperlink that said “Get the facts about mail-in ballots.” The link took users to a collection of tweets with facts about the topic, curated by Twitter staff into what the company calls a Moment. The move was greeted with cheers in some quarters, while Trump and his supporters — including his son Donald Jr. — angrily tweeted that the company was clearly trying to manipulate public opinion in advance of the presidential election. Trump even promised to “strongly regulate, or close them down,” despite having no obvious federal authority to do so.

On the one hand, it’s a tiny victory in the ongoing battle to stamp out misinformation online. After years of being accused of spreading lies, propaganda, and other noxious substances through its network, and of doing little to stop it, Twitter finally seems to be taking some baby steps towards responsible curation (Facebook said it would not take similar action because it believes that “people should be able to have a robust debate about the electoral process”). But at the same time, Twitter’s move is like taking a tiny drop of poison from a very large ocean and putting a label on it saying “for more information about poison, click here.” In fact, Twitter’s decision to add this kind of warning label raises as many questions as it answers. To take just one example, social researchers have found that fact-checking can cause what is known as the “implied truth effect,” where users assume that because one specific statement has been fact-checked and found to be false, others that haven’t been fact-checked must be true.

As a number of observers pointed out after Twitter added the label, the way that the company chose to phrase it was also imperfect. Saying “Get the facts about mail-in ballots” could be interpreted by some as adding weight to the misinformation rather than debunking it. Researchers such as Whitney Phillips of Syracuse University and Joan Donovan of Harvard’s Shorenstein Center often talk about the risks inherent in calling attention to misinformation, in part because doing so can amplify the incorrect info and cause it to travel much farther than it would have otherwise. Many wondered whether Twitter users would even bother clicking on the link, let alone read the facts in the Twitter Moment. Activist Charlotte Clymer said that Twitter’s label “is the most mild form of accountability” imaginable. The warning doesn’t say Trump is wrong or misleading people, she noted, and most people would probably scroll by without even noticing. “It’s weak and cowardly.”


One reason why there are so few mummies: People ate them

Egyptians embalming a corpse

Every now and then I come across something incredible that I had never heard of before, and recently it happened with an article in Smithsonian magazine entitled “The Gruesome History of Eating Corpses as Medicine.” The what? Yes, apparently not that long ago, people — kings and queens, priests and scientists — routinely ingested ground-up bones, blood, and fat as medicine for everything from headaches to epilepsy. The practice is discussed at length in two books: Louise Noble’s Medicinal Cannibalism in Early Modern English Literature and Culture, and Richard Sugg’s Mummies, Cannibals and Vampires: The History of Corpse Medicine from the Renaissance to the Victorians. It went on for several hundred years, peaking in the 16th and 17th centuries, during which time mummies were stolen from Egyptian tombs and gravediggers robbed graves and sold the body parts.

“The question was not, ‘Should you eat human flesh?’ but, ‘What sort of flesh should you eat?’ ” says Sugg. The answer, at first, was Egyptian mummy, which was crumbled into tinctures to stanch internal bleeding. But other parts of the body soon followed. Skull was one common ingredient, taken in powdered form to cure head ailments. Thomas Willis, a 17th-century pioneer of brain science, brewed a drink for apoplexy, or bleeding, that mingled powdered human skull and chocolate. And King Charles II of England sipped “The King’s Drops,” his personal tincture, containing human skull in alcohol. 

The little-known Canadian author behind The Hardy Boys series

Tens of thousands of young boys growing up in the 1920s and 1930s got a secret thrill whenever they picked up a new book about The Hardy Boys, those intrepid young lads who solved mysteries and generally lived lives filled with tremendous adventure. The books were written by a man named Franklin W. Dixon. But who was Dixon? How did he come to write The Hardy Boys? And why was his writing so terrible, as those young boys often realized when they grew up and learned more about literature? Gene Weingarten wrote about the real story in a 1998 piece for The Washington Post. As he explains, Franklin Dixon was what the book business called a “house name,” or pseudonym, which allowed multiple people to write under one byline. And from 1927 to 1946, the books were written by a Canadian man named Leslie McFarlane — and he hated The Hardy Boys with a passion.

It turns out the story of the Hardy Boys isn’t about the worst writer who ever lived, not by a long shot. It is about a good writer who wrote some bad books, and if you wonder why that happened, as I did, then you are likely not very old and not very wise. Sometimes homely things are done for the best reasons in the world, and thus achieve a beauty of their own. Leslie McFarlane kept voluminous diaries. His family has them. He wrote in fountain pen, in elegant strokes that squirrelled up a little when he was touched by despair or drink. In these diaries, “The Hardy Boys” is seldom mentioned by name, as though he cannot bear to speak it aloud. He calls the books “the juveniles.” At the time McFarlane was living in northern Ontario with a wife and infant children, attempting to make a living as a freelance fiction writer.

McFarlane’s diary entries make for sad reading. Nov. 12, 1932: “Not a nickel in the world and nothing in sight. Am simply desperate with anxiety. . . . What’s to become of us this winter? I don’t know. It looks black.” Jan. 23, 1933: “Worked at the juvenile book. The plot is so ridiculous that I am constantly held up trying to work a little logic into it. Even fairy tales should be logical.” Jan. 26, 1933: “Whacked away at the accursed book.” June 9, 1933: “Tried to get at the juvenile again today but the ghastly job appalls me.” Jan. 26, 1934: “Stratemeyer sent along the advance so I was able to pay part of the grocery bill and get a load of dry wood.” Finally: “Stratemeyer wants me to do another book. . . . I always said I would never do another of the cursed things but the offer always comes when we need cash. I said I would do it but asked for more than $85, a disgraceful price for 45,000 words.”

As Gene Weingarten explains, McFarlane was a 5-foot-4-inch Irishman who got his start writing for newspapers in northern Ontario towns like Cobalt and Sudbury, and who turned to writing the Hardy Boys books out of financial desperation, a need that only deepened during the Great Depression. Afterwards, he was fiction editor of Maclean’s magazine for several years, produced a number of acclaimed documentary films, wrote an excellent hockey novel, and wrote TV scripts for “Bonanza” and “The U.S. Steel Hour.” He died of complications from diabetes in 1977.

The worst corporate contest ever

Lots of companies run sweepstakes or contests that offer prizes and rewards, and some of them backfire — but nothing like Pepsi’s potentially multibillion-dollar gaffe in the 1990s. The soft-drink company wanted to expand its share of the Philippine market, so executives launched a promotional campaign that promised lots of payouts: each bottle cap would carry a number that corresponded to a prize, with lots of small winners and two huge winners of $40,000 each, for a planned total of $2 million in prizes. But a software glitch at one of Pepsi’s vendors produced about 800,000 bottle caps bearing the winning number “349,” each of which theoretically entitled its holder to $40,000. If the company had paid out all of those winnings, it would have cost $32 billion.
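To make the scale of the error concrete, here is a quick back-of-the-envelope calculation using the figures cited above (the cap count and prize values are the rounded numbers reported in the story, not official Pepsi accounting):

```python
# Back-of-the-envelope math for Pepsi's "349" gaffe, using the rounded
# figures cited in the piece above (not official Pepsi numbers).

planned_prize_pool = 2_000_000   # USD budgeted for the entire promotion
grand_prize = 40_000             # USD value promised for a winning cap
misprinted_caps = 800_000        # caps mistakenly printed with the number "349"

# Cost of honoring every misprinted cap at face value
full_liability = misprinted_caps * grand_prize
print(f"Full liability: ${full_liability:,}")  # $32,000,000,000

# How many times the planned prize budget that represents
print(f"{full_liability / planned_prize_pool:,.0f}x the planned prize pool")  # 16,000x
```

Even the $18 consolation payment described below, if claimed for all 800,000 caps, would have come to more than $14 million, several times the original budget.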

That obviously wasn’t a realistic possibility, so Pepsi said an error had been made and offered to pay each of the false winners $18 instead — but those who thought they had won a fortune were not pleased. They staged demonstrations calling on Pepsi to pay out the full amount; police and the army were called in to quiet them down, and the protests turned into full-scale riots. Pepsi records show that at least 32 delivery trucks were stoned, torched, or overturned, and armed men threw homemade bombs at Pepsi plants and offices. In the worst incident, police said, a fragmentation grenade tossed at a parked Pepsi truck in a Manila suburb on Feb. 13, 1993, bounced off and killed a schoolteacher and a five-year-old girl, wounding six other people.

Pepsi faced thousands of lawsuits and claims that it had to work through in the following years. After a lengthy trial, the Philippine courts eventually ruled that Pepsi’s mistake wasn’t malicious and that the company had not committed a crime. In the end, Pepsi’s combined physical, legal, and brand-equity losses topped $20 million, and its market share plummeted, taking years to recover.

Harry Allen, notorious thief and bootlegger, born Nellie Pickerell

From 1900 to 1922, Harry Allen was one of the most notorious men in the Pacific Northwest, says an article from Atlas Obscura. “The West was still wide and wild then, a place where people went to find their fortunes, escape the law, or start a new life. Allen did all three. Starting in the 1890s, he became known as a rabble-rouser, in and out of jail for theft, vagrancy, bootlegging, or worse. Whatever the crime, Allen always seemed to be a suspect.” During his short life (he died at the age of 40), Allen was a bronco-buster, a longshoreman, and a second in boxing matches. He was also the subject of lurid stories in tabloid newspapers because he was born female and his given name was Nell Pickerell. “This Girl Refuses to Wear Skirts; Nellie Pickerell Acts, Talks and Dresses Like a Man, and says She Ought to Have Been One,” said one story in the Boston Post from 1900.

In a 1908 interview with The Seattle Sunday Times, Allen described his discomfort with his assigned sex. “I did not like to be a girl; did not feel like a girl, and never did look like a girl,” he said. “So it seemed impossible to make myself a girl and, sick at heart over the thought that I would be an outcast of the feminine gender, I conceived the idea of making myself a man.” Allen’s identity fascinated local papers, which cast it as part of the zeitgeist of the American frontier. One publication placed him among “the scum of the West” for his active career of saloon brawling, bootlegging, bronco busting, and horse stealing.

And Allen wasn’t the only one who found new possibilities for reinvention in the American West, as the Atlas Obscura piece notes. When 80-year-old lumberjack Sammy Williams died in Montana in 1908, the undertaker discovered his assigned sex, dumbfounding a community that had only ever known him as a man. Joseph Lobdell, born and assigned female in Albany, New York, surfaced in Meeker County, Minnesota, and became known as “The Slayer of Hundreds of Bears and Wild-Cats.” Historian Peter Boag, who has researched some of these early trans pioneers, says, “If people thought you were a man, you wouldn’t be bothered or molested; there’s good evidence that some women dressed as men to get better-paying employment.” The best job most women could hope for was cooking or housekeeping.

The rise and fall of the pineapple as a sign of social status

It may seem strange to picture the humble pineapple as a signal of social standing, but for about two hundred years, in the 1600s and 1700s, it was exactly that. The trend seems to have begun after explorers brought pineapples back from the tropics; rich folk throughout Britain then took to growing them and displaying the results on their tables, to the point where the pineapple became a symbol of wealth. A hothouse-grown pineapple cost about £60, or roughly £11,000 in today’s terms, and plates decorated with leaves and pineapple motifs were created to hold them so they could be displayed as a centerpiece. King Charles II even commissioned a painting of himself being presented with a pineapple at court, reportedly the first such fruit grown on British soil (although other accounts say it was imported while still unripe and merely ripened in Britain).

Reluctant to waste such high-value fruit by actually eating it, owners displayed pineapples as dinnertime ornaments on special plates that allowed the pineapple to be seen and admired while surrounded by other, cheaper fruit that was there to be eaten. The pineapples were valuable enough to warrant security guards, and the maids who transported them were considered to be at great risk of being targeted by thieves. The 1807 Proceedings of the Old Bailey record several cases of pineapple theft, Dr O’Hagan points out, including that of a Mr Godding, who was sentenced to seven years’ transportation to Australia for stealing seven pineapples.

Because the ever-aspiring middle classes were anxious to get their mitts on the fruit but could not afford to cultivate or buy one, canny businessmen opened pineapple rental shops across Britain, according to a history from the BBC. Companies began to cash in on the fruit’s popularity and, as with many crazes, the market for pineapple-themed goods exploded. Porcelain-makers Minton and Wedgwood started producing pineapple-shaped teapots, ewers and jelly moulds, while ornately carved clock cases, bookends and paintings extended the trend from the dining table to other rooms of the house. But eventually steamships started bringing tropical fruit to Europe in large enough numbers that pineapples were no longer scarce, and prices dropped to the point where anyone could have one — even poor people.

Should Google and Facebook be forced to pay for content?

Note: I originally wrote this for the daily newsletter at the Columbia Journalism Review, where I am the chief digital writer

In a recent column for the New York Times, media writer Ben Smith wrote about how regulators in Australia and France are moving to force digital platforms like Google and Facebook to pay media companies directly for the content they carry from publishers, in the wake of new copyright rules set by the European Union last year. A number of other countries have also tried to do this, with varying degrees of success — Germany passed a law in 2013, but has had difficulty enforcing it, while Spain passed a similar law in 2014, at which point Google responded by shutting down its Google News service completely in that country. The rationale for these kinds of moves against the digital platforms, as Smith laid out in his column, is relatively simple: Google and Facebook took control of the advertising industry, and thereby destroyed the media’s ability to earn a living, which in turn has led to a decline in journalism.

But is this true? Or did Google and Facebook just take advantage of the internet to offer a better product, something media companies could also have done? Why should they be forced to support companies that were in decline long before the internet came along? To discuss these and other questions, we used CJR’s Galley discussion platform to hold a virtual discussion with a group of experts, including Ben Smith of the Times; Jeff Jarvis, director of the Tow-Knight Center for Entrepreneurial Journalism at CUNY; Monica Attard, head of journalism at the University of Technology Sydney in Australia; Ben Thompson, a technology and media analyst who writes a subscription newsletter called Stratechery; Emily Bell, director of the Tow Center for Digital Journalism at Columbia University; and Rasmus Kleis Nielsen, who runs the Reuters Institute for the Study of Journalism at Oxford University.

Jarvis, a longtime defender of Google and Facebook and author of a book entitled “What Would Google Do?”, argued that the digital platforms are a huge benefit to publishers, because they send them traffic (he also noted that Facebook has donated money to support his work at CUNY). “God did not give newspaper publishers the revenue they had. It is not their eternal entitlement,” Jarvis said. “In the new reality of the internet, new competitors came to offer news companies’ customers — advertisers — a better deal, while publishers insisted on clinging to their old, mass-media business model with all its inefficiencies.” Smith, however, countered that Google and Facebook benefited from laws that helped them grow. “Copyright has been interpreted not to include headlines. Platforms don’t have liability for what is published on them,” he said. “Those aren’t natural laws, they’re just regulations written by legislators.”
