Join an incubator program of emerging AI safety contributors and help shape the future of responsible AI.
Safe AI Germany (SAIGE) is a newly established, nationwide initiative dedicated to promoting the safe and responsible development of artificial intelligence (AI). Applications for the Spring 2026 cohort of the Incubator Program are now
open! It is a three-month, part-time, remote-first intensive program focusing on AI Safety.
Why is this important?
AI is improving at an unprecedented pace, and with it grows the potential for catastrophic risk. Yet, to date, only a small number of people work full-time on making AI safe.
What value does this program bring to you?
This incubator is designed to be the ultimate launchpad for a career pivot into AI Safety.
Real-world impact:
In this cohort, you will work in small teams (up to 5 people) under the guidance of established AI Safety researchers and policy experts from institutions such as Bonn University, Ellis Institute Tübingen, MIRI, MATS, Pivotal, FIG, the Stanford Center for AI Safety, and many more. For a full list
of projects and mentors, see here.
Build your track record:
The hardest part of entering this field is getting that first credible project on
your CV. This program gives you the mentorship, structure, and platform to start this track record.
Topics:
You can apply to work on targeted projects in Technical AI Safety, AI Governance & Policy, Technical AI Governance, or Communications & Field-Building.
Commitment:
The program runs for about three months, from April 20 to July 19, 2026. The 10–20 hour/week commitment is designed to run alongside your current studies or full-time job, allowing you to rigorously test your fit for the field without taking on major career risk.
The Perks:
Alongside your mentored project, you will grow your research skills in workshops, deepen your knowledge of AI systems and safety, connect with other emerging AI Safety contributors, receive personalized career advice, and gain priority access to our global speaker series.
Prerequisites:
While prior knowledge and/or experience in AI Safety may be helpful, it is not a requirement. We intentionally open our program to mentees with diverse backgrounds and career stages, e.g., computer science, math, law, or program management. If you are new to AI Safety, the program will bring you up to speed on the fundamentals before project work begins.
Browse the mentor projects here and apply before March 22, 23:59 CET via: https://safeaigermany.fillout.com/mentee-applications-stage1
For any questions, simply email info@safeaigermany.org.