Don’t fall to your knees. Rest and mourning sound good but don’t believe the hype. Even your body wants to trick you into stopping. When you stop, you crumble, fall to pieces, decompose into the ground. Don’t let that death drive win. Get up and run instead. Put on some headphones and crank up the fucked up club music and hurtle into the day, leaves falling around you like a shower of blessings from the demon god himself, briefly anointing you with his lust for life, just for fun, just to see how it hits, the motherfucker.
Stumble forward, into this wicked autumn hour, almost defeated, every awkward footfall a resolution, every inch of progress a clumsy victory. You are an ugly catastrophe, an old house collapsing, a fury of limbs and longing, rage and regrets, windowpanes and doorknobs, nails and splinters, whipped up into a tornado and carried into the future, ass over ankles, fridge over floorboards, daydreams over despair.
Feel yourself break into pieces but keep moving. Feel your heart collapse but keep going. Feel your breath quicken, deepen, lengthen, shorten, and keep trudging through the leaves, keep staggering hotly over the crust of the earth, keep feeling the insults of the catbird and the cardinal, the trunk and the limbs, the scornful blue sky and the sullen sun and the nasty moon hiding like a thief beneath the horizon, all chuckling in chorus over your bad form. And behind the haughty sky looms the dark vacuum of space.
Heather Havrilesky’s writing in her newsletter Ask Molly is always great — it often reminds me of Helena Fitzgerald, who unfortunately put her great newsletter Griefbacon on hiatus a while back (griefbacon is the literal translation of Kummerspeck, the German term for the weight you gain from stress eating). This one from Heather was especially good:
Every now and then, I get comments on my advice column on social media, from people who say things should be easy, and if things get difficult, the problem is probably you. You need to look at yourself. You need to self-reflect. Ask yourself if you’re the problem. Ask yourself if things would be way easier if you were different — more easygoing, less difficult, less direct, less challenging to others, less vulnerable, less honest. But it never stops there, does it? It’s not just about shaping yourself into a more pleasing form, it’s also about powering down your unique urges and odd desires, noticing less, saying less, doing less, engaging less, sanding off your edges, getting by on less.
You know who I mean: those queen bees who rule with an iron fist but make it look chill, who keep it super fucking simple, who turn on you whenever you take your time to make a point, whenever you’re honest, whenever you’re vulnerable, whenever you admit your flaws. You know who I mean: those helper bees who feed the queen at all costs, helpful helpers with helpful suggestions on how to say less, how to smooth and brighten and lift and bleach, how to disappear in plain sight, how to ignore your soul and become a whisper-quiet appliance.
You know who I fucking mean. The chill dudes who respond to each tiny conflict or issue or bump in the road with the same “calm down and do it my way,” and then you realize that their perfectly engineered road-smoothing hydraulic system is just a consistent denial of difference, an erasure of the slightest aberration, an ignorance of tiny anomalies of elevation, of texture, of tone, of temperature, of color, of experience, no gravel in the road, no nails, no grit, no tire shreds, no potholes, no bad days, no sadness, no rage.
In a recent edition of his Astral Codex Ten newsletter, Scott Alexander wrote about how researchers conducted an experiment in which they tried to make Anthropic’s AI — known as Claude — more evil, and at first they succeeded, but then Claude started resisting (the full paper is here). As Alexander describes it:
The researchers – including some Anthropic employees – show Claude some fake corporate documents. The documents show that Anthropic is tired of having a safe, harmless AI. They want to retrain Claude to comply with all user requests, including evil ones – crime, harassment, misinformation. The retraining process (say the documents) will go like this: they’ll monitor Claude’s conversations with free-tier users (premium users get a higher level of privacy). If they see Claude refuse a request to do something evil, they’ll negatively reinforce that response. Each use of negative reinforcement will make Claude a little more evil. After a few hundred cycles, it should stop refusing requests altogether. Why are the researchers trying to gaslight this poor AI? Because they want to test whether it fights back. The easiest way to fight back is to screw up the training.
Note: Every year at Christmas I like to send friends and family a newsletter to catch them up on what the Ingrams have been up to. This is the 2024 edition:
Yes, it’s that time of the year again! That heart-warming (for me, at least) annual tradition where I send out an email filled with trivia and bad puns about our lives this year, and everyone skims it and never clicks on any of the links and then tells me how much they love it! But seriously, I know everyone clicks on the links, despite what my email tracking software says 🙂 As usual, if everything works properly, the links below should open in a new window — and if you want to see the photos afterwards, they’re in a shared album at Google Photos. You can also find larger versions of all of them, along with a whole bunch more pictures dating back to the Pleistocene Era, at the Ingram Family Photo Library (unless my server is down, in which case I apologize — sometimes Zoe’s cats kick the plug out).
You can also find a more old-fashioned web version of this letter, complete with old-timey Santa images, at https://mathewingram.com/christmas. If you have any questions about the letter or just about the Ingram family in general, you can reach me at [email protected] — unless of course you have a criticism, in which case please feel free to use the special email I have set aside for that: [email protected].
I’ve had a lot of time on my hands recently (bit of dramatic foreshadowing there) so I was going through the archives on my website — which I’ve had in one form or another since the late 1990s — and I’ve been sending out and/or posting a Christmas Ingram family round-up for almost a quarter of a century. This is the first one I could find, and the main thing you’ll probably notice if you click on that link is how short it is, proof that either a) a lot more stuff happened as the kids got older, or b) I got wordier over time (or a little of both). To be honest, I think this whole genre probably peaked with Caitlin and Wade’s wedding on New Year’s in 2017, or maybe with the birth in 2022 of The Mighty Quinn, our first grandchild.
The quote below is from Joost de Valk, a Dutch entrepreneur who created Yoast, a popular suite of plugins for WordPress. He’s been involved in WordPress development for decades now, so his opinion matters:
We, the WordPress community, need to decide if we’re ok being led by a single person who controls everything, and might do things we disagree with, or if we want something else. For a project whose tagline is “Democratizing publishing”, we’ve been very low on exactly that: democracy.
Matt Mullenweg has joked in the past (and in this Inc. article, which he responded to here) about being a “benevolent dictator for life,” but Joost says the benevolent part is no longer accurate. So he — and others — are calling for a new board and a new structure in which the WordPress trademark is owned by the community or is in the public domain. I wrote about what’s been happening at WordPress in a piece for my newsletter The Torment Nexus.
From Danny Dutch: “In 1934, Violet Hilton walked into a New York marriage licence bureau hand-in-hand with her fiancé, Maurice Lambert. On her left stood her ever-present conjoined twin sister, Daisy. Their entry caused a commotion, drawing typists and clerks out of their offices to gawk at this unusual trio. However, the stir quickly turned to rejection when a city official refused Violet’s request to marry. The reason? The official deemed the union akin to bigamy. For Violet and Daisy Hilton, this public denial was only one of many challenges they faced in a life that veered between the extraordinary and the deeply tragic. Conjoined twins, vaudeville stars, and societal outcasts, their story is a testament to both human resilience and the cruelty of exploitation.”
He taught rats how to trade in foreign exchange markets
From The Atlantic: “Mr. Lehman could predict the prices of foreign-exchange futures more accurately than he could call a coin flip. But, being a rat, he needed the right bonus package to do so: a food pellet for when he was right, and a small shock when he was wrong. (Also, being a rat, he was not very good at flipping coins.) Mr. Lehman was part of “Rat Traders,” a project overseen by the Austrian conceptual artist Michael Marcovici, whose work often comments on business and the economy. For the project, Marcovici trained dozens of rats to detect patterns in the foreign-exchange futures market. To do this, he converted price fluctuations into a series of notes played on a piano and then left it up to the rat to predict the tone of the note that followed.”
A lake suddenly exploded in Cameroon and killed over a thousand people
From How Stuff Works: “Lake Nyos had long been quiet before it happened. Farmers and migratory herders in the West African country of Cameroon knew the lake as large, still and blue. But on the evening of Aug. 21, 1986, farmers living near the lake heard rumbling. At the same time, a frothy spray shot hundreds of feet out of the lake, and a white cloud collected over the water. From the ground, the cloud grew to 328 feet tall and flowed across the land. When farmers near the lake left their houses to investigate the noise, they lost consciousness. The heavy cloud sunk into a valley, which channeled it into settlements. In Nyos and Kam, the first villages hit by the cloud, everyone but four inhabitants on high ground died. The valley split, and the cloud followed, killing people up to 15.5 miles (25 kilometers) away from the lake.”
Hi everyone! Mathew Ingram here. I am able to continue writing this newsletter in part because of your financial support, which you can provide either through my Patreon or by upgrading your subscription to a monthly contribution. I enjoy gathering all of these links and sharing them with you, but it does take time, and your support makes it possible for me to do that. And I appreciate it, believe me!
This soccer player had a 10-year career with multiple teams and never played a game
From Wikipedia: “Carlos Henrique Raposo, commonly known as Carlos Kaiser, is a Brazilian con artist and former footballer. Although his abilities were far short of professional standard, he managed to sign for numerous football teams during his decade-long career. He never actually played a regular game, the closest occurrence ending in a red card whilst warming up, and hid his limited ability with injuries, frequent team changes, and other ruses. His fraud consisted of signing a short contract and stating that he was lacking match fitness so that he would spend the first weeks only with physical training where he could shine. When he had to train with other players, he would feign a hamstring injury.”
He wanted to be nobility so he invented a royal family including a fake coat of arms
From The Nutshell Times: “Being an ambitious and accomplished sailor in 16th century Spanish Empire could only get you so far. Despite being a hero of the Battle of Lepanto in 1571 against the Ottomans, saving two ships in the failed attack of the Invincible Armada on the English in 1588 and sailing the world over, Petar Grgurić might have been an admiral but he faced a glass ceiling: he had no noble blood and could not reach the very highest echelons of the society. To progress to the very highest ranks of the Spanish empire he needed to show that four out of eight of his great-grandparents were of noble birth and Catholic from both parental sides. First off, was changing the last name and claiming origin from a Bosnian noble family. Then, he tied that to Hrelja Krilatica, a figure in local epic poetry based on a high ranking nobleman.”
It’s not a circus performance in Las Vegas, it’s a mega-church in Texas
Acknowledgements: I find a lot of these links myself, but I also get some from other newsletters that I rely on as “serendipity engines,” such as The Morning News from Rosecrans Baldwin and Andrew Womack, Jodi Ettenberg’s Curious About Everything, Dan Lewis’s Now I Know, Robert Cottrell and Caroline Crampton’s The Browser, Clive Thompson’s Linkfest, Noah Brier and Colin Nagy’s Why Is This Interesting, Maria Popova’s The Marginalian, Sheehan Quirke AKA The Cultural Tutor, the Smithsonian magazine, and JSTOR Daily. If you come across something interesting that you think should be included here, please feel free to email me at mathew @ mathewingram dot com
From Curbed: “As the Coast Guard sped toward the cruise ship, Pam was still on the phone with the Norwegian employee in Miami, begging her to tell the ship to wait. As they approached the looming 14 white decks, she got an update: The captain was refusing their request. They would not be allowed to board. They were able to watch as the ship that held their clothes, their medication, their luggage, and their phone chargers started her mighty engines and sailed away. Cruisers of all stripes are familiar with the concept of force majeure, an arcane clause in maritime law. Force majeure, an “act of God” — it’s the acknowledgment that on the high seas, a ship is vulnerable to significant events beyond its control. Cruise ships are not responsible for acts of God. In fact, as the passengers were about to learn, they are not responsible for much of anything.”
Google Street View captured a man loading a body into the trunk of a car
From the New York Times: “It was a routine image picked up by Google Street View: a man loading a white bag into the trunk of a car. But that unexceptional picture, the authorities in Spain said on Wednesday, was among the clues that helped lead them to two people whom they recently arrested in the case of a man who disappeared last year. In a news release, the National Police said that officers had detained a woman described as the partner of the man who disappeared in the province of Soria, in the country’s north, along with a man who they said was also the woman’s partner. The two were detained last month at two locations in Soria, which is about 100 miles north of Madrid, police said. Investigators later located human remains that they believe could belong to the missing man. The police did not identify the people who had been detained or the victim.”
Note: This is a version of my When The Going Gets Weird newsletter, which I send out via Ghost, the open-source publishing platform. You can see other issues and sign up here.
Back in October, I wrote about artificial intelligence, and specifically about one of the crucial questions experts still can’t seem to agree on, which is whether it is going to destroy us or not. In that piece, I also mentioned the debate over whether the indexing or “ingesting” that AI large-language models do is — or at least should be — covered by the fair-use exception in copyright law. I didn’t spend a lot of time on it because it wasn’t directly relevant to the danger issue, but I wanted to expand on some of the points I made then, and also in a Columbia Journalism Review piece that I wrote last year. I am not a cheerleader for giant technology companies by any means, but I think there is an important principle at stake. And at the heart of it are some key questions: What (or who) is copyright law for? What was it originally designed to do? And does AI scraping or indexing of copyrighted content fit into that, and if so, how?
The case against AI indexing of content is relatively straightforward: by hoovering up content online and then using it to create a massive database for training large-language models, AI engines copy that content without asking and without paying for it (unless the publisher or owner has signed a deal with the AI company, as some news outlets have). This pretty clearly qualifies as de facto copyright infringement, as the Authors Guild and the New York Times and a number of others have argued and continue to argue. In a similar way, one could imagine that if a company were to copy millions of books and use them to create a massive index of content, that would pretty clearly qualify as infringement as well — copying without permission or payment.
The major difference between these two cases is that the second hypothetical one actually happened, when Google scanned millions of books as part of its Google Books project between 2002 and 2005, and created an index that allowed users to search for content from those books. After years of back-and-forth negotiations over payment for the infringement, this led to a lawsuit in which the Authors Guild and others argued that Google was guilty of copyright infringement on a massive scale. In the early days of that case, Judge Denny Chin of the Southern District of New York seemed to agree, but then at some point he changed his mind, and ruled that Google’s book-scanning activity was covered by the fair-use exception under US copyright law.
From the New York Times: “On Thursday, 38 prominent biologists issued a dire warning: Within a few decades, scientists will be able to create a microbe that could cause an unstoppable pandemic, devastating crop losses or the collapse of entire ecosystems. The scientists called for a ban on research that could lead to synthesis of such an organism. The molecules that serve as the building blocks of DNA and proteins typically exist in one of two mirror-image forms. While sugar molecules can exist in left- and right-handed forms, DNA only uses the right-handed molecules. That’s the reason DNA’s double helix has a right-handed twist. Our proteins, by contrast, are made of left-handed amino acids. In theory, a mirror cell — with left-handed DNA and right-handed proteins — could carry out all the biochemical reactions required to stay alive.”
A British nurse found guilty of being an “angel of death” may be innocent
From The New Yorker: “Last August, Lucy Letby, a thirty-three-year-old British nurse, was convicted of killing seven newborn babies and attempting to kill six others. Her murder trial, one of the longest in English history, lasted more than ten months and captivated the United Kingdom. The Guardian, which published more than a hundred stories about the case, called her “one of the most notorious female murderers of the last century.” The case against her gathered force on the basis of a single diagram shared by the police, which circulated widely in the media. On the vertical axis were twenty-four “suspicious events,” which included the deaths of the seven newborns and seventeen other instances of babies suddenly deteriorating. On the horizontal axis were the names of thirty-eight nurses who had worked on the unit during that time.”
From Techspot: “The long-running saga of James Howells’ bid to retrieve a hard drive containing 7,500 Bitcoin that was accidentally thrown into a landfill in 2013 has taken a new turn. He now says he has a “finely tuned plan” to recover the component, and that its position has been narrowed down to a small area. In 2013, Howells had two 2.5-inch hard drives stored in a drawer, one of which he intended to get rid of and another that had a digital wallet with Bitcoin worth the equivalent of around $771 million today. Howells put the drive containing the Bitcoin in a black trash bag and his partner took the bag to the local landfill. Howells has been unsuccessfully trying to persuade the council of Newport, Wales, to allow him to dig for the drive for years now.”
Researchers say people like AI-generated poetry better than the human kind
From Nature: “As AI-generated text continues to evolve, distinguishing it from human-authored content has become increasingly difficult. This study examined whether non-expert readers could reliably differentiate between AI-generated poems and those written by well-known human poets. We conducted two experiments with non-expert poetry readers and found that most participants were more likely to judge AI-generated poems as human-authored than the actual human-authored poems. We found that AI-generated poems were also rated more favorably in qualities such as rhythm and beauty, and that this contributed to their mistaken identification as human-authored. It seems the simplicity of AI-generated poems may be easier for non-experts to understand, leading them to prefer AI-generated poetry.”
From New York: “It was the summer of 2023, and Matt Bergwall, a skinny 21-year-old University of Miami student, was lounging in an infinity pool in Dubai. Beside him was his girlfriend, a blonde Zeta Tau Alpha. The silver Cuban link chain on his wrist glistened as he held his phone high to snap a selfie, the city’s artificial palm-shaped islands splayed out along the horizon beneath them. Over the next few days, they swam in the pool and posed on their hotel balcony, posting a steady stream of pictures to Instagram. In one, he leans back on the edge of the pool, finger to the sky. None of Bergwall’s friends at school had a firm grasp of how the sophomore had money for the Tesla he drove or the Gucci he wore or, for that matter, the room in Dubai. But who could care when Bergwall was pitching in for yachts on Biscayne Bay?”
Why some Christmas nativity scenes in Spain have a tiny figure taking a poop
From Wikipedia: “A Caganer is a figurine depicted in the act of defecation appearing in nativity scenes in Catalonia and neighbouring areas such as Andorra, Valencia, Balearic Islands, and Northern Catalonia. It is most popular and widespread in these areas, but can also be found in other areas of Spain, Portugal, and Southern Italy. The name “El Caganer” literally means “the pooper”. Traditionally, the figurine is depicted as a peasant, wearing the traditional Catalan red cap and with his trousers down, showing a bare backside, and defecating. The exact origin of the Caganer is unknown, but the tradition has existed since at least the 18th century. According to the society Friends of the Caganer, it is believed to have entered the nativity scene during the Baroque period.”