Co-Found Lens Academy With Me. (We have early users and funding)
AI safety startup with early users and funding aims to prevent superintelligence extinction risk through scalable education.
Lens Academy, an AI safety education startup founded by technical generalist Luc Brinkman, is actively seeking a co-founder to help scale its mission of preventing existential risk from superintelligence. The platform focuses specifically on teaching the case for AI existential risk, why alignment is difficult, and strategic thinking about solutions—rather than covering all possible AI risks. With early users and funding already secured, the organization operates a scalable education model costing under $10 per student through volunteer facilitators and automated operations.
The co-founder role offers flexibility for either technical or non-technical generalists, with responsibilities spanning strategy, product development, marketing, community management, and fundraising. The technical stack includes a Python backend, a React with TypeScript frontend, and Supabase with PostgreSQL. Brinkman emphasizes seeking someone "intelligent and agentic" who can work effectively remotely and has demonstrated deep motivation on previous projects. The platform uses active learning methods with measured outcomes and aims for viral growth, with students referring others within one week rather than five.
Co-founder matching presents unique challenges, as Brinkman acknowledges that "vibes are as important as skills" for this partnership role. The position involves dividing work based on strengths and interests, with the option to contract out specific needs. Lens Academy represents a growing effort within the AI safety community to address what its founder sees as a critical shortage of people who deeply understand and can act on superintelligence risks.
- Scalable education model costs under $10 per student using volunteer facilitators and automated operations
- Seeks co-founder for Python/React/Supabase stack platform with focus on strategy, product, or growth roles
- Already has early users and funding, and focuses exclusively on superintelligence existential risk education
Why It Matters
Addresses a critical shortage of people who understand AI existential risks as superintelligence development accelerates.