Linux lays down the law on AI-generated code, says yes to Copilot, no to AI slop, and humans take the fall for mistakes — after months of fierce debate, Torvalds and maintainers come to an agreement
After months of debate, Torvalds and maintainers set strict rules for AI contributions, barring AI tools from carrying 'Signed-off-by' tags.
The Linux kernel project has resolved a months-long, fierce debate within the open-source community by issuing its first formal policy on AI-generated code. Led by Linus Torvalds and senior maintainers, the new guidelines pragmatically allow the use of AI coding assistants such as GitHub Copilot but establish strict transparency and accountability measures. The core rule is that an AI tool cannot be credited with the traditional 'Signed-off-by' tag, which carries legal weight under the Developer Certificate of Origin (DCO); instead, contributors must add a new 'Assisted-by' tag to clearly flag any AI involvement, ensuring the project's commit history keeps an auditable trail of human and machine collaboration.
The policy's most significant consequence is that it places legal responsibility for the code, and for any resulting bugs, security vulnerabilities, or licensing issues, squarely on the human submitter; the AI tool itself cannot be a signatory or bear liability. This decision addresses core concerns about code quality ('AI slop'), security audits, and the legal framework of open-source licensing. It also sets a major precedent for large-scale open-source projects, balancing the productivity benefits of AI assistance against the need for maintainer oversight and legal hygiene in one of the world's most critical software codebases.
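Taken together, the rules above imply a split in the commit trailers: the human submitter still certifies the patch under the DCO, while the AI's involvement is disclosed separately. A commit message under the new policy might look like the following sketch; the subject line, tool name, and contributor identity are illustrative assumptions, and the article does not quote the exact trailer syntax:

```text
mm: fix use-after-free in page cache lookup

An AI assistant suggested the initial fix; the logic was reviewed,
tested, and reworked by the submitter before posting.

Assisted-by: GitHub Copilot
Signed-off-by: Jane Developer <jane@example.com>
```

The 'Assisted-by' line gives maintainers an auditable flag to weigh during review, while the 'Signed-off-by' line keeps legal accountability with the human who submitted the patch.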
- New policy mandates a transparent 'Assisted-by' tag for AI-generated code; the legally binding 'Signed-off-by' tag remains reserved for human contributors.
- Human developers retain full legal responsibility for all AI-assisted contributions, including bugs and security flaws.
- The decision, led by Torvalds, ends months of debate and sets a precedent for major open-source projects using AI tools.
Why It Matters
Sets a critical legal and procedural standard for accountability as AI coding tools become ubiquitous in software development.