Artificial Intelligence
Last updated: November 1, 2025
I think AI is probably the most important development of our time. AI is moving very fast, and we are not prepared for how it will affect our society and culture. To quote Stephen Fry: when Benz first demonstrated a gasoline-powered carriage, ‘not one single person would declaim — “Yes! I foresee interstate highways three or four lanes wide crisscrossing the nations, I foresee flyovers, bypasses, Grand Prix motor racing, traffic lights, roundabouts, parking structures ten, twenty storeys high, traffic wardens, whole towns and cities entirely shaped by these contrivances.” No one would have seen a thousandth part of such a future.’
I recommend starting with Stephen Fry’s post that the quote is from. It is an excellent, accessible, non-technical introduction to the topic.
Doing Something - The TL;DR
- Understand the problem and the risks by reading the resources on this page.
- Subscribe to newsletters and podcasts lower on this page to stay connected and informed.
- Find a local community, or join the global community by attending conferences such as EA Global.
- Figure out how you can contribute: Will you take action alongside your current commitments, or change your career? Read the sections below for resources for both paths.
- Speak to an advisor to get feedback on your plan and support in executing it. I recommend 80,000 Hours and AI Safety Quest, or email me at navigation@mickzijdel.com.
Understanding AI Risk
To learn more about the problem of AI safety, I recommend starting with the following articles:
- Why AGI could be here by 2030: After reading the Fry article, I would recommend starting here to see what has been going on recently.
- The Most Important Time in History Is Now: Short overview of recent developments and how fast things are moving.
- The AI Revolution: The Road to Superintelligence: From 2015, but with some great graphics. It has an even better part 2.
- The Problem: An explanation by the Machine Intelligence Research Institute (MIRI), one of the first organisations focused on AI Risk, of why they think building Artificial Superintelligence is dangerous.
- Future of AI Course: BlueDot’s 2-hour course explaining the societal impacts of Artificial General Intelligence. Completing it is also a way to join their community.
Books
- Human Compatible by Stuart Russell - A leading AI researcher’s perspective on building beneficial AI
- The Alignment Problem by Brian Christian - Accessible overview of AI safety challenges
- Superintelligence by Nick Bostrom - Classic text on potential risks from advanced AI
- Life 3.0 by Max Tegmark - Explores how AI might reshape life and society
- If Anyone Builds It, Everyone Dies by Eliezer Yudkowsky and Nate Soares - Direct arguments for risks of extinction due to superintelligence
Deeper Dives
- The AI Safety Atlas - Comprehensive textbook about all aspects of AI Safety
- aisafety.dance - Visual introduction to technical AI Safety by Nicky Case
- Robert Miles AI Safety - Videos explaining technical concepts behind modern AI systems, and why they might be dangerous
- Or if you just want to look at AI Safety memes, be my guest
Scenarios
- AI 2027 - A very detailed scenario for the coming ~5 years, released in April 2025 and written by people who have previously made good predictions
- How AI Takeover Might Happen in 2 Years by Joshua Clymer
- A History of the Future by Rudolf Laine
Other Reading Tips
- Capital, AGI, and Human Ambition, or “By default, capital will matter more than ever after AGI”
- Your A.I. Lover Will Change You
- Your Campus Already Has AI — And That’s the Problem
Ways To Take Action
- Contact your representatives and ask them to support AI regulation.
- Talk to your friends and acquaintances about AI Safety, and how they can help.
- Donate to AI Safety causes
- See if your skills can be applied to AI Safety, and change career if you think it’s a good fit.
- Join MicroCommit for a few weekly tasks of less than 5 minutes each to help AI Safety and spread awareness.
Build Skills
- Complete BlueDot’s 2-hour Future of AI Course to understand the basics
- Explore AI Safety training programs and events, from weekend hackathons to multi-month bootcamps
- Self-study with curated resources from AISafety.com/self-study
- Participate in Apart Research hackathons: monthly weekend research sprints open to all levels, with mentorship and publication support
Focus on projects over courses. Building something concrete teaches you more than studying alone, and is a much stronger signal to hiring managers and funders.
Stay Connected
Community
- Connect with others at EA Global conferences
- Find your local AI Safety community
Newsletters
- Transformer: Weekly newsletter covering latest AI developments
- Zvi Mowshowitz: Long articles covering basically everything going on with transformative AI
- AI Safety Events and Training: Weekly update on upskilling opportunities
- More resources on AISafety.com’s Staying Informed page
Podcasts
- Dwarkesh Patel - Deep research interviews with influential people in AI
- 80,000 Hours Podcast - Career advice and AI safety conversations
- Cognitive Revolution
Career Paths
Check out 80,000 Hours’ page on AGI careers for more information on the different areas.
Possible Areas
- Theoretical technical research: Singular Learning Theory (Timaeus), Agent Foundations, Scalable Oversight, Safety-by-Debate. See the Iliad conference and the people who attend it.
- Applied technical research: Evals, Control, Mechanistic Interpretability (upskilling resources, 80,000 Hours career review)
- Governance (80,000 Hours career review)
- Technical governance (open problems paper)
- Advocacy
- Operations (80,000 Hours career review, my own experience and advice)
Career Transition Resources
Mid-career people transitioning into AI Safety usually bring valuable skills, but lack AI Safety context and a network. These resources will help you build both.
Programs for Career Changers
- SuccessIf - Personalized career advising for experienced professionals transitioning to AI safety
- High Impact Professionals - For experienced professionals seeking high-impact roles
- CEA Career Pivot Bootcamp - Free 4-day online bootcamp to identify high-impact career paths, apply to opportunities, and build your network
Additional Resources
- Speak with an advisor. I recommend 80,000 Hours and AI Safety Quest.
- Open Philanthropy career transition funding: Financial support for career changes
- How to have an impact when the job market is not cooperating: Practical advice for finding a role in AI Safety and other impactful areas
- Why experienced professionals fail to land high-impact roles: Focus on building context in the field