“What is the civil war in ‘Civil War’ about?” – Read Max

“Ultimately the movie seems much less concerned with making a particular political or moral statement (or even exploring the politics or morals of its fictional scenario) than it does with efficiently and energetically moving its truck of adrenaline junkies from one suspenseful action set-piece to the next. It’s like finding a 1967 alternate-history novel published by Del Rey with the tagline “They Crossed a War Zone Between New York and D.C.–to Photograph the President’s Murder!””

https://maxread.substack.com/p/what-is-the-civil-war-in-civil-war

“the cybernetic history of the crisis” – Dan Davies (back of mind)

“A real intelligence function, though, is explicitly concentrated on those parts of the environment that aren’t yet relevant to what it’s doing. This capability was weak in the central banks; they were not looking for things which might have upset their policymaking framework. The information was there, but it hadn’t been organised into the decision-making process and didn’t shape the view at the management or operational levels. It remained as ‘other data’ or was attenuated away by simply ignoring it; the ‘information processing system of last resort’”

https://backofmind.substack.com/p/the-cybernetic-history-of-the-crisis

“Large language models, explained with a minimum of math and jargon” – Understanding AI

“At the moment, we don’t have any real insight into how LLMs accomplish feats like this. Some people argue that examples like this demonstrate that the models are starting to truly understand the meanings of the words in their training set. Others insist that language models are “stochastic parrots” that merely repeat increasingly complex word sequences without truly understanding them.

This debate points to a deep philosophical tension that may be impossible to resolve. Nonetheless, we think it is important to focus on the empirical performance of models like GPT-3. If a language model is able to consistently get the right answer for a particular type of question, and if researchers are confident that they have controlled for confounds (e.g., ensuring that the language model was not exposed to those questions during training), then that is an interesting and important result whether or not it understands language in exactly the same sense that people do”

https://www.understandingai.org/p/large-language-models-explained-with
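
As a rough illustration of the evaluation idea in that excerpt (my own sketch, not something from the article), here is a minimal Python example: it asks a model freshly generated questions that are unlikely to appear verbatim in any training set, then checks for consistent correctness. The ask_model function is a hypothetical placeholder for whichever model API you actually use.

import random

def ask_model(prompt: str) -> str:
    # Placeholder: swap in a real LLM API call here.
    raise NotImplementedError

def make_novel_question() -> tuple[str, str]:
    # Freshly generated numbers make it unlikely this exact question
    # was in the training data, which is the confound the excerpt mentions.
    a, b = random.randint(100, 999), random.randint(100, 999)
    return f"What is {a} plus {b}? Answer with just the number.", str(a + b)

def evaluate(n_trials: int = 50) -> float:
    # "Consistently get the right answer" here just means accuracy over many trials.
    correct = 0
    for _ in range(n_trials):
        question, expected = make_novel_question()
        try:
            answer = ask_model(question).strip()
        except NotImplementedError:
            return float("nan")  # no model wired up yet
        correct += (answer == expected)
    return correct / n_trials

if __name__ == "__main__":
    print(f"accuracy on held-out questions: {evaluate():.2%}")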

“Crashing the S and S party” – Shush

“Two days before the first of its centenary events, the New York Times reported that Simon & Schuster—this great American company, this remarkable piece of cultural history, this repository of some of the world’s most popular books as well as its finest literature, including sixty-one Pulitzer prizewinners and eighteen National Book Award winners—had been sized up as AI fodder.

The large language models that drive new artificial intelligence services such as ChatGPT are trained on mountains of text. The higher the mountains, the better the training. Trouble is, there’s a shortage of readily available text not covered by copyright. According to the Times, Facebook owner Meta, which is competing with Google, OpenAI, and a lot of other firms to develop large language models, last year seriously considered buying S&S not because it admired its business or its books, but simply to feed its AI machine”

https://shush.substack.com/p/crashing-the-s-and-s-party

“02024Q2” – optional.is

“In the UK, British Sugar dominates around two-thirds of all the country’s sugar production. Operating within a highly regulated and controlled market, pricing structures leave little room for profit margins, emphasizing the importance of operational efficiencies. At some point those processes were so streamlined there was no room to squeeze profits.

Then someone suggested that all the hot air from the sugar making process be collected and pumped into a neighboring greenhouse.

And that’s how in 02006, British Sugar’s flagship beet factory in West Norfolk became the country’s tomato capital!”

https://optional.is/newsletter/02024Q2/

“How Narendra Modi became India’s influencer-in-chief” – Barkha Dutt (restofworld)

“Till 2012, India accounted for only 2% of the world’s mobile data traffic. Then came the dramatic fall in prices, triggered by the launch of Mukesh Ambani’s Jio 4G mobile network. With a tantalizing offer of 4GB free data per day, along with cheap handsets, Ambani ensured that Indians were consuming 1 billion GB of data every month within six months of Jio’s launch, compared to 200 million GB earlier. Today, Indians use more data than the U.S. and China combined. And India accounts for more than 21% of the world’s mobile data traffic”

https://restofworld.org/2024/narendra-modi-youtube-influencer/

“Issue 54 – Cases Continue” – Citation Needed

“Regulators and prosecutors will continue their slow game of whack-a-mole while more people are taken for all they’re worth by thieves who use the same old playbooks. As Bankman-Fried becomes a more and more distant memory, the industry will try to convince people that it’s cleaned itself up since its chaotic Wild West days. Without intervention, it may succeed, and the cycle will begin anew.

But I will be here, shining a light on it to the best of my abilities.”

https://www.citationneeded.news/issue-54/

“Drop-shipping is a lifeline for unemployed graduates in South Africa” – Kimberly Mutandiro (restofworld)

“China ships over $100 billion worth of goods to Africa, of which countries like South Africa, Congo, and Zambia account for the biggest portion. In 2022, online retail in South Africa crossed 50 billion rand ($2.6 billion), fueling the popularity of drop-shipping. But there are concerns about Chinese counterfeits — which currently account for up to 10% of the South African economy, according to the Consumer Goods Council — and growing worries over Chinese brands making South Africa their dumping ground”

https://restofworld.org/2024/dropshipping-south-africa/