Open Source

Vercel will train AI models on your code

Free-tier users have 10 days to opt out of having their code used for AI training.

Deep Dive

Vercel, the popular platform for deploying web frameworks such as Next.js, has introduced a significant update to its terms of service and privacy policy. The change, which has gone viral on developer forums, states that code hosted by users on the platform's Hobby or Free tiers will be used by default to train Vercel's AI models. Users are automatically opted in and have only a 10-day window from the policy's announcement to opt out via their account settings if they want their code excluded from the training data pool.

This policy grants Vercel a broad license to use, reproduce, and analyze user code for the purpose of "improving our services and machine learning models." The move is part of a growing trend of platform providers leveraging user-generated content to build proprietary AI capabilities, but it raises immediate questions about intellectual property, the transparency of consent processes, and the expectations of developers using free services. The reaction has been swift, with many developers objecting to the opt-out (rather than opt-in) approach and worrying about the implications for private repositories and proprietary business logic hosted on the platform.

Key Points
  • Vercel's updated terms allow it to train AI models on code from Hobby and Free plan users by default.
  • Affected users have only a 10-day window to manually opt out of having their code used for model training.
  • The policy change highlights growing tensions between platform AI development and user data ownership/consent.

Why It Matters

Developers must actively protect their code IP, as free services increasingly use user data to build commercial AI products.