- cross-posted to:
- technology@beehaw.org
Maven, a new social network backed by OpenAI’s Sam Altman, found itself in controversy today when it imported a huge number of posts and profiles from the Fediverse and then ran AI analysis to alter the content.
This is what I’ve been saying the entire time. It sucks, and it’s wrong, but the Fediverse is built from the ground up as an open sharing platform, where our data is shared with anyone. It shouldn’t be used like this, but there is nothing to stop anyone from doing it. To change that would alter federation at a core level.
I would rather my content be open to the world for however it wants to use it than owned by a single company that gets to profit off aggregating and selling it.
Fully agree. The annoyances of free and open are vastly outweighed by the benefits.
Yeah, but doesn’t Hubzilla (https://hubzilla.org/page/info/discover) apply a privacy layer to how its content is distributed? The issue also lies in how the social network is implemented in relation to its purpose: in the Hubzilla vs. Lemmy case, for instance, it’s a social network vs. a public board.
That doesn’t mean it’s licensed to be used in for-profit software.
I’ve had this argument with other people, but essentially, at this point there is no licensing beyond server ownership, and most servers don’t have any license defined. Even if they do, then sure, they did something wrong… but how would you ever prove it or enforce it? The only way to actually disallow them is to switch from open federation to closed, which goes against what we’re trying to build with federation.
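To make the open vs. closed federation point concrete, here is a minimal sketch (hypothetical, not taken from Lemmy, Mastodon, or any real server) of what allowlist-based federation amounts to: the server simply refuses activities from actors whose instance isn’t on an explicit list. The domain names and function are placeholders for illustration only.

```python
# Hypothetical sketch of "closed" (allowlist) federation: an ActivityPub inbox
# that only accepts activities from instances on an explicit allowlist.
# Nothing else about the protocol changes; the openness is removed at the door.
from urllib.parse import urlparse

ALLOWED_INSTANCES = {"lemmy.world", "beehaw.org"}  # example domains only

def accept_activity(activity: dict) -> bool:
    """Return True if the activity's actor belongs to an allowlisted instance."""
    actor = activity.get("actor", "")
    domain = urlparse(actor).hostname or ""
    return domain in ALLOWED_INSTANCES

# Example: an activity from an unknown instance is dropped.
incoming = {"type": "Create", "actor": "https://maven.example/users/importer"}
print(accept_activity(incoming))  # False -> the activity is rejected
```

The trade-off is exactly the one described above: this stops unknown importers, but it also stops every new instance from federating until an admin manually approves it.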
There have been instances before where LLMs gave up clues as to what sources they used. When that happens, they can be sued.
I’m okay with people using our data for whatever, since it’s all open and it should be. But I’d rather put in a little bit of effort to make for-profit use technically illegal. It’s better than nothing.
If it ends up being ruled that training an LLM is fair use so long as the LLM doesn’t reproduce the works it is trained on verbatim, then licensing becomes irrelevant.