SoundCloud, a site where artists post music, has changed its rules. The company can now train AI (artificial intelligence) on music uploaded by its users. In other words, AI programs can study your songs to learn how to make music. Many musicians are angry because they never agreed to this.
The new rule took effect in February 2024. It says that when you upload a song, you “agree” that SoundCloud can use it for AI training. Big artists with label contracts (like those at Universal Music) are protected, but small, independent musicians are not.

Why Are Musicians Upset?
No Choice to Say No
Musicians cannot opt out of this rule. If they want to stop AI from learning from their music, they must delete their songs from SoundCloud. Some artists, like Adam Humphreys, have already removed their work.
Here’s why they are worried:
- No Payment: AI companies may use their music for free.
- No Credit: Their songs could help create AI music without their names attached.
- Lost Control: Anyone could copy their style using AI.
SoundCloud Explains the Change
SoundCloud says the rule helps improve its platform. In a statement, the company said:
- AI will make better playlists and recommendations.
- It will fight fake accounts and stolen music.
- Tools like “no AI” tags stop bad companies from stealing songs.
They promise AI will “support human artists, not replace them.” But musicians want proof.
Other Websites Doing the Same
SoundCloud is not alone. Other sites changed rules too:
- X (Twitter): Lets AI learn from your posts.
- LinkedIn: Uses your job history to train AI.
- YouTube: Allows companies to train AI on videos, but only if they have licensing deals.

What Happens Now?
Musicians and groups such as Fairly Trained are calling for change. They want:
- Opt-In Choices: Let artists say “yes” before their music is used to train AI.
- Payment: Pay musicians if AI companies use their work.
- Clear Rules: Explain how AI is used and who benefits from it.
SoundCloud says it will keep talking with artists and share updates on what happens next.