
Mark / Disable AI Generated Songs

The platform is increasingly flooded with AI-generated songs (especially the Release Radar), making it harder for users to discover authentic, human-created music. To improve the listening experience, Spotify should introduce a clear label for AI-generated songs and provide an option to filter them out entirely.

Comments
Dalberon

@AxMn I know about autotune. I mean that the tech predates computer chips, and while it has likely been ported to code by now, I have never seen source code for any autotune.

 

Humans train on a smaller set of music than the AI, so by your logic all human music is even bigger theft than AI music. This argument is so silly. We have rules in place for how much something must be changed to count as original, and most human and AI music passes this test.

 

jonny99

Today I opened my Discover Weekly playlist and, out of 30 tracks, 11 were AI generated with fake so-called artists. It's a shame.

Now some good news: today I could set Spotify to play in so-called lossless format, 24-bit/44.1 kHz FLAC. But please stop sending me the AI **bleep**.

AxMn

Let's simplify things.

 

Someone steals a car, let's make it your car, breaks it down to its component parts, then builds a new, slightly different car with those exact same parts, maybe adds a splash of new paint. Is the car still stolen goods? Spoiler alert: yes it is. And I'd suspect you'd be pretty **bleep** off to boot.

 

Now imagine this times, say, one hundred million; this is what AI music is made from. Stolen goods en masse, billions of dollars of goods.

 

Remember the '90s, when the FBI would raid some kid's grandmother's house because he used her internet to download a few tracks? Last I checked, this is still illegal in most parts of the USA, so why does AI get a pass when it downloads BILLIONS of these same tracks, not to listen to, but to cut up, repackage, and sell?

 

Theft is theft.

Dalberon

@AxMn

Good example, except they don't "steal" your car in the example. They copy it and change just enough to be legal, and then it is their design. You can go out and look at the cars passing on the road and see this is clearly done almost exclusively.

It isn't theft to use other artists for training purposes as long as the result doesn't match what one of them already did. Your definition means every human who didn't pop out of their mom snapping their fingers and singing stole music. I don't think you understand how training an AI works.

 

AxMn

From Copyright.gov:

 

"Uploading or downloading works protected by copyright without the authority of the copyright owner is an infringement of the copyright owner's exclusive rights of reproduction and/or distribution. Anyone found to have infringed a copyrighted work may be liable for statutory damages up to $30,000 for each work infringed and, if willful infringement is proven by the copyright owner, that amount may be increased up to $150,000 for each work infringed. In addition, an infringer of a work may also be liable for the attorney's fees incurred by the copyright owner to enforce his or her rights."

 

As per US regulations, it does not even matter what they used it for; all that matters is how they obtained most of their material. Many platforms illegally TORRENTED the bulk of the training music, and this has already been backed up by concrete evidence in several cases. There's no realistic way any of them could even remotely cover the licensing cost of millions of artists' complete catalogs, plus approval to use their likenesses in any reproductions.

 

And this was certainly willful infringement in most cases. So let's assume a conservative 1 billion songs were used in training, which is pretty reasonable considering how detailed these AIs are getting. At a fine of $150,000 a pop, as per US copyright regulations, that should work out to somewhere in the realm of one hundred and fifty trillion US dollars ($150,000,000,000,000 USD), per AI platform that was illegally trained on illegally obtained music.
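For anyone who wants to check the math, here is a quick back-of-the-envelope sketch. The 1 billion song count is the commenter's assumption above, not an established figure; the $150,000 cap per willfully infringed work comes from the Copyright.gov text quoted earlier.

```python
# Back-of-the-envelope estimate using the assumptions stated above.
songs_used_in_training = 1_000_000_000   # assumed figure: 1 billion infringed works
max_fine_per_work = 150_000              # USD, statutory cap for willful infringement

total_exposure = songs_used_in_training * max_fine_per_work
print(f"${total_exposure:,} USD")        # -> $150,000,000,000,000 USD (150 trillion)
```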

 

The law isn't really clear on how recipients of services that use these illegally obtained materials should be treated, but I'd hazard a guess that most countries will probably update their laws in the near future to clarify that grey zone. Either way, people using these services should know how the sausage is made and how many artists have been infringed on for them to make their slop.

 

Now, there are some platforms that claim to have trained on ~150 million licensed songs, but those are the outliers.

 

PS: Retired software engineer with 25 years of professional IT service for Fortune 500 companies, pharmaceuticals, and governments foreign and domestic, specializing in data analytics, code refactoring, and logistics systems, which included developing and training early proto-AIs for automation purposes. I do tend to oversimplify to make very difficult concepts easier to understand for those with slightly less experience.

 

Mic drop...?

xringeling

Fantastic idea, this is exactly what I need.

Spotifest

Release Radar is supposed to show new music releases from artists I follow (~60) or am interested in.

Today, around 10 of my 30 Release Radar songs were from low-quality AI artists. "Don't play this artist" works for one, but they keep popping up, since these low-effort pastiches are mass-produced under AI-generated artist names with AI-generated logos. I can see where this is going. Keep up the pace, Spotify, you're losing ground.

lucky4d9

12/30 Release Radar songs today were trash-sounding AI slop. I tell Spotify not to play from these artists as they pop up, but this is a legitimate plague. I'd be telling Moses to GTFO already.

 

Something desperately needs to be done.  

Dalberon

Apologies @AxMn, you and I were talking about different topics. If they stole the material for training, that is theft. It isn't about the AI any more than stealing a truck would be about moving loads of gravel. I'm all for supporting artists, and if they use art to train an AI they should be paying for it.


jonny99

As I said in an earlier post, it seems like we are discussing two different topics in this thread: whether it's legal or not. This thread was started about the question of whether we subscribers should be forced to receive AI-generated music from Spotify or not. The training of AI on real artists' work is, in my mind, a legal question and should be handled on a different level by the authorities. So why don't those of you who want to discuss the legality start your own thread on that topic? It will make the topics a little easier to follow. Give us subscribers an option to opt out of (or in to) AI-generated music with fake so-called artists.

Don't mean any harm by this comment.