Artificial Intelligence
Last updated: April 15, 2025
I think AI is probably the most important development of our time. It is moving very fast, and we are not prepared for how it will affect our society and culture. As Stephen Fry put it, when Benz first demonstrated a gasoline-powered carriage, ‘not one single person would declaim — “Yes! I foresee interstate highways three or four lanes wide crisscrossing the nations, I foresee flyovers, bypasses, Grand Prix motor racing, traffic lights, roundabouts, parking structures ten, twenty storeys high, traffic wardens, whole towns and cities entirely shaped by these contrivances.” No one would have seen a thousandth part of such a future.’
I recommend starting with the article that quote is from: a great introduction, accessible and non-technical.
Introductory Explainers
- Why AGI could be here by 2030 - A good next step after the Fry article, covering what has been happening recently.
- The Most Important Time in History Is Now - Short overview of recent developments and how fast things are moving.
- The AI Revolution: The Road to Superintelligence - From 2015, but with some great graphics. Has an even better part 2.
And if you want to dive deeper:
- Future of AI Course - Excellent short course by BlueDot Impact explaining the societal impacts of Artificial General Intelligence.
- aisafety.dance - Visual introduction to technical AI Safety by Nicky Case.
- Robert Miles AI Safety - Videos explaining the technical concepts behind modern AI systems, and why they might be dangerous.
- The AI Safety Atlas - A comprehensive textbook covering all aspects of AI Safety.
- Or if you just want to look at AI Safety memes, be my guest.
Want more resources? Check out aisafety.com and AI Safety Support’s Lots of Links page.
Scenarios
- AI 2027 - A very detailed scenario for the coming ~5 years, released in April 2025 and written by people with a strong forecasting track record.
- How AI Takeover Might Happen in 2 Years by Joshua Clymer
- A History of the Future by Rudolf Laine
Reading Tips
- Capital, AGI, and Human Ambition, or “By default, capital will matter more than ever after AGI”
- Your A.I. Lover Will Change You
Staying Informed in AI Safety
For more, check aisafety.com’s Stay Informed page.