After April 24, the code you show Copilot becomes training data for the next AI model. The default is "yes."
What Is This?
GitHub announced on March 25, 2026 that starting April 24, Copilot Free, Pro, and Pro+ user interaction data will be used for AI model training. This includes input code, outputs, cursor context, and feedback — virtually everything from your coding sessions.
The key is the default setting. It's opt-out, meaning your data is automatically collected unless you manually disable it. Previous opt-out preferences are preserved, but new users or those who never changed settings are "opted in" by default.
Copilot Business and Enterprise users are not affected. Students and teachers are also exempt.
GitHub CPO Mario Rodriguez argues that "real developer interaction data improves model accuracy, security, and bug detection." He cited improved acceptance rates after incorporating Microsoft employee data.
What Changes?
| | Before | After April 24 |
|---|---|---|
| Training default | Opt-in (manual consent) | Opt-out (auto collection) |
| Data scope | Product improvement telemetry | Code snippets, I/O, feedback |
| Private repos | Not used for training | Can be collected during Copilot use |
| Data sharing | GitHub internal | Microsoft and affiliates |
The most controversial part is private repos. GitHub says "at rest" private repo content isn't used, but code from private repos can be collected while Copilot is active. The deliberate use of "at rest" is telling.
The Register called it "private* repositories" — with an asterisk. Community reaction has been cold: 59 thumbs-down vs 3 rockets in GitHub's discussion, with virtually no positive comments from non-GitHub staff.
Watch out for secrets
Copilot has no built-in way to exclude sensitive files (.env, credential stores) from its context. Simply opening your IDE could send that data to Microsoft. If you've been putting secrets directly in repos, fix that habit now.
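One quick check, a sketch using only git and grep (the filename patterns here are my own assumptions; extend them for your stack), lists tracked files whose names suggest secrets:

```shell
# List tracked files whose names look like secrets (illustrative patterns).
# Run from the repository root before opening it in a Copilot-enabled IDE.
git ls-files \
  | grep -E '(^|/)\.env(\..+)?$|credential|secret|\.pem$|\.key$' \
  && echo 'WARNING: secret-looking files are tracked in this repo' \
  || echo 'No obviously secret-named files tracked'
```

This only catches suspicious filenames; secrets embedded inside ordinary source files need a dedicated scanner.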
How to Opt Out Right Now
- Change the opt-out setting
  Go to github.com/settings/copilot/features → Privacy → "Allow GitHub to use my data for AI model training" → set to Disabled.
- Check org accounts
  Personal and org Copilot policies are separate. If you use a personal Pro account for work, double-check both.
- Audit secrets management
  Ensure no secrets are in files Copilot can access. Review your .gitignore.
- Evaluate alternatives
  If the data policy concerns you, consider Cody (Sourcegraph), Continue (open source), or local LLM coding assistants. Anthropic uses opt-in with a discount.
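For the .gitignore review step above, a minimal starting point might look like this (patterns are illustrative, not exhaustive; adjust for your stack):

```gitignore
# Keep common secret-bearing files out of the repo entirely
.env
.env.*
*.pem
*.key
credentials*
```

Remember that .gitignore only prevents new files from being tracked; anything already committed stays in history until it is removed and rotated.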
For team leads
Notify your team about this change. If freelancers or contractors use personal Copilot Pro accounts on your private repos, company code could end up in training data unintentionally.