Astra Fellowship

Astra is Constellation’s flagship fellowship, built to accelerate AI safety research and talent.


As AI advances at unprecedented speed, preparing for its risks is critical. Astra brings exceptional people into the field and connects them with leading mentors and research opportunities.

Complete this expression of interest form to be notified when our next cohort launches!


Astra is a fully funded, 3-6 month, in-person program at Constellation’s Berkeley research center.

Fellows advance frontier AI safety projects with guidance from expert mentors, plus dedicated research management and career support from Constellation’s team.

 

Over 80% of Astra’s first cohort are now working full-time in AI safety roles at organizations such as Redwood Research, METR, Anthropic, OpenAI, Google DeepMind, the Center for AI Standards and Innovation, and the UK AI Security Institute. This round, we want to go further: placing even more people into the highest-impact roles, and helping fellows launch new initiatives to tackle urgent but neglected problems.

Trusted by the best

What we're looking for


We’re looking for talented people who are excited to pursue new ideas and projects that advance safe AI. You may be a strong fit if you:
  • Are motivated to reduce catastrophic risks from advanced AI
  • Bring technical or domain-specific experience relevant to the focus areas (e.g., technical research, security, governance, policy, strategy, field-building)
  • Would like to transition into a full-time AI safety role or start your own AI safety-focused organization

Prior AI safety experience is not required. Many of our most impactful fellows entered from adjacent fields and quickly made significant contributions. If you're interested but not sure you meet every qualification, we’d still encourage you to apply.

Eli Lifland & Romeo Dean, AI Futures Project

“Astra was a really important program for us. We first started working with Daniel Kokotajlo through Astra, and the scenario we started developing during the program eventually became AI 2027. After Astra, we also co-founded the AI Futures Project together.”

Michael Chen, METR

"I worked with METR during Astra, and joined METR’s policy team immediately after the fellowship. Participating in Astra directly led to my current role.”

Aryan Bhatt, Redwood Research

“Astra was an incredibly important opportunity. I was able to work closely with my mentor Buck, who taught me a lot about doing good research. That eventually led me to my role at Redwood Research, where I now run an entire team.”

Martin Soto, Member of Technical Staff, UK AISI

"I'm really glad I participated in Astra! Endless conversations (both with my mentor and everyone else at Constellation), ranging from theoretical alignment to frontier governance, were an invaluable source of learning, knowledge and opportunities."

Mentors

Mentors work across five focus areas: Strategy, Empirical Research, Governance & Policy, Security, and Field Building.

  • Daniel Kokotajlo, Executive Director | AI Futures Project
  • Eli Lifland, Researcher | AI Futures Project
  • Thomas Larsen, Researcher | AI Futures Project
  • Hazel Browne, Senior Program Associate | Coefficient Giving
  • Jake Mendel, Associate Program Officer | Coefficient Giving
  • Alex Tamkin, MTS | Anthropic
  • Collin Burns, MTS | Anthropic
  • Erik Jones, MTS | Anthropic
  • Ethan Perez, MTS | Anthropic
  • Fabien Roger, MTS | Anthropic
  • Ben Angel Chang, Formerly OSTP, CSET
  • Michael Chen, METR
  • Fynn Heide, Safe AI Forum
  • Isabella (Fengyu) Duan, Safe AI Forum
  • Saad Siddiqui, Safe AI Forum
  • Keri Warr, Anthropic
  • Nicholas Carlini, Anthropic
  • Buck Shlegeris, Redwood Research
  • Arden Koehler, 80,000 Hours
  • Alexandra Bates, Constellation
  • Henry Sleight, Constellation

Application

Applications open: Aug 28
Applications close: Sep 26

Decisions & Onboarding

Acceptances sent: Nov 6
Onboarding finishes: Dec 31

Program

Program starts: Jan 5
Program ends: Mar 31
Extension starts: Mar 31
Extension ends: Jun 30
Fellowship Benefits

We provide the resources and support needed for fellows to pursue full-time research.


Stipends

Competitive financial support for the duration of the program.


Research Budget

~$15K per fellow per month for compute.


Visa Support

We provide support and guidance for international applicants navigating the visa process.

Additional benefits

1. Workspace & Community: Ongoing collaboration at Constellation’s Berkeley research center, with access to ~150 network participants and regular AI safety-focused convenings (e.g., shared daily meals, seminars, workshops, tabletop exercises, conferences).
2. Mentorship & Research Management: Weekly mentorship from senior experts and research management support from Constellation’s team (via 1:1s, small group meetings, office hours, Slack collaboration).
3. Placement Services: Many fellows are expected to join organizations participating in Astra; others are actively connected to opportunities across our network.
4. Incubation Services: Advisory support for fellows launching new projects and organizations (e.g., business operations, communications, hiring, fundraising, and more).


How to apply

Applications for our January 2026 cohort are now officially closed!

Complete this expression of interest form to be notified when our next cohort launches—possibly as soon as Summer 2026!



Explore other programs and discover new ways to engage with our network