AI Safety

Coefficient Giving is seeking proposals for biosecurity projects

The philanthropic fund seeks proposals to prevent engineered pandemics, citing unprecedented AI and biotech convergence.

Deep Dive

Coefficient Giving, a major philanthropic organization, has announced a significant new funding initiative through its Biosecurity and Pandemic Preparedness team. The group has issued a Request for Proposals (RFP) aimed at tackling what it describes as increasingly urgent existential risks at the intersection of artificial intelligence and biotechnology. The fund explicitly cites concerns that AI could lower barriers for malicious actors seeking to develop novel biological weapons, and that advanced AI systems might themselves leverage biotech in harmful ways. With a submission deadline of May 11, 2026, the process begins with a simple 500-word expression-of-interest form, and applications are open to academics, nonprofits, industry teams, and independent researchers.

The initiative plans to deploy more than $100 million in grant funding in 2026 alone, signaling a major acceleration in spending. Funding is targeted across four strategic categories: transmission suppression (e.g., stockpiling PPE, developing air filtration tech), technical safeguards and governance (e.g., DNA synthesis screening, AI misuse classifiers), policy and advocacy, and field-building efforts such as fellowships and accelerators. The RFP is part of a broader strategy to 'spend more and spend faster' in response to the perceived acceleration of dual-use risks. For individuals without a specific project, Coefficient is also running a separate Career Transition Development Funding program to attract new talent into the biosecurity field.

Key Points
  • Over $100 million in grants available in 2026 for biosecurity projects, focusing on AI-bio convergence risks.
  • Four funding categories: transmission suppression, tech safeguards/governance, policy/advocacy, and field-building (e.g., fellowships, accelerators).
  • Application starts with a 500-word expression of interest, due May 11, 2026, open to academics, nonprofits, industry, and independents.

Why It Matters

The RFP represents a major, fast-moving capital infusion into a field grappling with catastrophic risks supercharged by rapid AI advancement.