BlueMonday1984

joined 9 months ago
[–] BlueMonday1984 23 points 1 month ago (2 children)

Okay, personal thoughts:

This is just gut instinct, but it feels like generative AI is going to end up a legal minefield once the many lawsuits facing OpenAI and others wrap up. Between Nashville's ELVIS Act, the federal COPIED Act bill, the solid case for denying fair use protection, and the flood of suits already coming down on the industry, I suspect gen-AI will come to be seen by would-be investors as legally risky at best and a lawsuit generator at worst.

Also, Musk would've been much better off commissioning someone to make the image he wanted rather than grabbing a screencap Aicon openly said he was not allowed to use and laundering it through some autoplag. Moral and legal issues aside, he'd have gotten something much less ugly to look at.

[–] BlueMonday1984 15 points 1 month ago (6 children)

~~Kendrick~~ Zitron dropped - it mainly focuses on Prabhakar Raghavan's recent kicking upstairs and Google's bleak future.

The main highlight was this snippet:

> I am hypothesizing here, but I think that Google is desperate, and that its earnings on October 30th are likely to make the street a little worried. The medium-to-long-term prognosis is likely even worse. As the Wall Street Journal notes, Google's ad business is expected to dip below 50% market share in the US in the next year for the first time in more than a decade, and Google's gratuitous monopoly over search (and likely ads) is coming to an end. It’s more than likely that Google sees AI as fundamental to its future growth and relevance.

[–] BlueMonday1984 9 points 1 month ago (1 child)

In other news, there's been a statement on AI training that's racked up over 10k signatures, which unsurprisingly lambastes the rampant stealing that went into creating the autoplag machines:

Now, I'm way too much of a fan of sidenotes, so I'll whip one out:

Beyond simple content theft being publicly lambasted, I suspect even licensed use of artists' work for gen-AI will ignite some controversy - if Eagan Tilghman's run-in with backlash last year is any indication, any use of gen-AI, regardless of context, will be met with hostility.

[–] BlueMonday1984 3 points 1 month ago

> I found the git master branch naming controversy a bit misguided, since to my mind the analogy was more “master copy” or “master recording” than “master of a slave”. This isn’t IDE. Who names their VCS branch “slave”?

In a better world, this would've probably been a solid argument for letting the master/slave naming convention stick around. We don't live in a better world.

[–] BlueMonday1984 13 points 1 month ago

Parents Sue School That Gave Bad Grade to Student Who Used AI to Complete Assignment

> An old and powerful force has entered the fraught debate over generative AI in schools: litigious parents angry that their child may not be accepted into a prestigious university.

The tabloids are gonna be going nuts over this.

[–] BlueMonday1984 10 points 1 month ago* (last edited 1 month ago) (6 children)

Update: The QRTs are mainly sneering, but this one's particularly good.

EDIT: Against my better judgment, I'm letting another sidenote come out:

If you wanna encourage people to drop the master/slave naming scheme, this guy probably gave you a good bit of ammo. Changing a random naming scheme is a pretty low-priority task under most circumstances, but it gets a lot more tempting when it lets you distance yourself from people like this.
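For what it's worth, the mechanical part really is low-effort - here's a rough sketch of the usual rename, assuming a repo whose default branch is still `master` and a remote named `origin` (the forge's own default-branch setting still has to be flipped in its web UI):

```sh
# Rename the local branch, then publish it and set it as the upstream
git branch -m master main
git push -u origin main

# Update the local record of which branch the remote treats as its default
git remote set-head origin main

# Once the forge's default branch has been switched over, drop the old name
git push origin --delete master
```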

[–] BlueMonday1984 4 points 1 month ago* (last edited 1 month ago)

I can pull finer bars out my arse than this fucking farce

This is probably Grok - creativity from him's pretty sparse

Choom thinks he's DOOM, but he won't beat him any time soon

With how much crack this whack goes through, he'll forget this before noon

(I'm no MF DOOM, but anything I can put out will beat this artless twat any day)

[–] BlueMonday1984 15 points 1 month ago

I'd be happy if the VCs responsible for this bubble died penniless, but I'll take them losing a lot of money

[–] BlueMonday1984 12 points 1 month ago* (last edited 1 month ago) (2 children)

> Now that the content mafia has realized GenAI isn’t gonna let them get rid of all the expensive and troublesome human talent, it’s time to give Big AI a wedgie.

Considering the massive(ly inflated) valuations floating around Big AI and the mountains of stolen work powering the likes of CrAIyon, ChatGPT, DALL-E and others, I suspect the content mafia is gonna try and squeeze every last red cent they can out of the AI industry.

[–] BlueMonday1984 12 points 1 month ago (9 children)

Considering Glaze and Nightshade have been around for a while, and that I talked about sabotaging scrapers back in July, it arguably already has.

Hell, I ran across a much smaller scale case of this a couple days ago:

Not sure how effective it is, but if Elon's stealing your data for his autoplag no matter what, you might as well try to force-feed it as much poison as you can.

[–] BlueMonday1984 8 points 1 month ago

> I’m trying to think of how you monetize eyeball scans and the first thing that comes to mind (well, after being able to break biometric security) is training an AI to generate fake but passable eyeballs to undercut the use of iris scans as an anti-bot tool.

Silicon Valley's basically an AI cult at this point, so I can see your case.

[–] BlueMonday1984 7 points 1 month ago* (last edited 3 weeks ago) (2 children)

New piece from The Atlantic: The Age of AI Child Abuse is Here, which delves into a large-scale hack of Muah.AI and the widespread problem of people using AI as a child porn generator.

And now, another personal sidenote, because I cannot stop writing these (this one's thankfully unrelated to the article's main point):

The idea that "[Insert New Tech] Is Inevitable^tm^" (which Unserious Academic interrogated in depth, BTW) took a major blow when NFTs crashed and burned in full view of the public and rapidly turned into a pop-culture punchline.

That, I suspect, is helping to fuel the large-scale rejection of AI and resistance to its implementation - Silicon Valley's failure to make NFTs a thing has taught people that Silicon Valley can be beaten, that resistance is anything but futile.
