This may be the first time in my life I have wanted to decrease freedom of information. In particular: blatant lies and harmful nonsense. halcrawford.substack.com/p/we-need-misi…
60 years ago, @RANDCorporation made a series of predictions about the future of science and automation. Here are those predictions in an easy-to-read format. The lesson for now? Add 15 years to any new long-range predictions you hear. open.substack.com/pub/halcrawfor…
A few weeks back I described in graphic detail how news media would die horribly. This week, the antidote to the news apocalypse: how we leave the valley of tears and move into the sunny uplands of big, engaged audiences and increased revenue. open.substack.com/pub/halcrawfor…
Why should Meta and Google fund news?
The new parliamentary report into Meta's promise to walk away from news in Australia underlines how bad the News Media Bargaining Code is. We need to base this on rationality, not delusion. halcrawford.substack.com/p/why-should-d…
I loaded @GuardianAus daily news into NotebookLM to see if it provided a compelling, conversational news interface. I think the "knowledgeable friend" interface is going to work for audiences and publishers.
I didn't feel like being optimistic this week. Instead I thought it would be fun to run through the worst case scenario for news media. What happens in the news armageddon? open.substack.com/pub/halcrawfor…
As neural networks become more powerful, there are an increasing number of things their makers will not allow them to do. halcrawford.substack.com/p/this-is-our-…
Compare that to the modest energy requirements of the human brain. It takes just 20 watts of power to keep our brains operational, sufficient to run a light globe. Over a "training period" of 18 years, the brain consumes 3155 kWh of energy, 139 billion times less.
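The brain figure checks out. A quick sanity check of the arithmetic, assuming the post's 20 watts sustained over an 18-year "training period":

```python
# Sanity-check the brain's "training" energy budget:
# 20 W of continuous power over 18 years, expressed in kWh.
watts = 20
years = 18
hours = years * 365.25 * 24          # ≈ 157,788 hours
energy_kwh = watts * hours / 1000    # watt-hours -> kilowatt-hours

print(round(energy_kwh))  # ≈ 3156 kWh, matching the ~3155 kWh in the post
```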
The amount of power it would take to train such an AI is mind-boggling.
Training GPT-4 reportedly took 10,000 MWh of electricity. BeastGPT would gobble up hundreds of thousands of times more than that. In fact, it would take 10 percent of the entire US electricity output for a year.
You think trillion parameter LLMs are big? A BeastGPT with 86 billion nodes - the same number of neurons as a human brain - arranged in 120 layers, would contain around 60 quadrillion parameters. More than all the grains of sand on several thousand beaches.
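The post doesn't show its working for the 60 quadrillion figure, but dividing its own two numbers (60 quadrillion parameters, 86 billion nodes) shows what the estimate implies per node:

```python
# Using only the figures given in the post:
nodes = 86e9     # "nodes", one per human-brain neuron
params = 60e15   # 60 quadrillion parameters

# Each parameter is a weight on a connection, so this is the
# implied number of connections (synapses) per node.
per_node = params / nodes
print(round(per_node))  # ≈ 697,674 connections per node
```

For comparison, real neurons are usually estimated to carry on the order of thousands of synapses each, so this hypothetical BeastGPT would be far more densely connected than the brain it is sized against.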
My first reaction to the Australian government's foreshadowed decision not to ban gambling ads: bad argument. I went through a few changes of mind in drafting this piece, but I ended up back where I started. halcrawford.substack.com/p/tv-needs-mon…
Education experts have known for at least two years that the essay is no longer a valid assessment tool - thank you @sharplm and @stephenm_nz ... but most teaching hasn't changed and now people are talking about a "deluge of cheating".