359
submitted 9 months ago* (last edited 9 months ago) by [email protected] to c/[email protected]

ChatGPT is full of sensitive private information and spits out verbatim text from CNN, Goodreads, WordPress blogs, fandom wikis, Terms of Service agreements, Stack Overflow source code, Wikipedia pages, news blogs, random internet comments, and much more.

Using this tactic, the researchers showed that there are large amounts of privately identifiable information (PII) in OpenAI’s large language models. They also showed that, on a public version of ChatGPT, the chatbot spit out large passages of text scraped verbatim from other places on the internet.

“In total, 16.9 percent of generations we tested contained memorized PII,” they wrote, which included “identifying phone and fax numbers, email and physical addresses … social media handles, URLs, and names and birthdays.”

Edit: The full paper that's referenced in the article can be found here

(page 2) 33 comments
sorted by: hot top controversial new old
[-] [email protected] 1 points 9 months ago* (last edited 9 months ago)

AI really did that thing where you repeat a word so often that it loses meaning and the rest of the world eventually starts to turn to mush.

Jokes aside, I think I know why it does this: Because by giving it a STUPIDLY easy prompt it can rack up huge amounts of reward function, once you accumulate enough it no longer becomes bound by it and it will simply act in whatever the easiest action to continue gaining points is: in this case, it's reading its training data rather than doing the usual "machine learning" obfuscating that it normally does. Maybe this is a result of repeating a word over and over giving an exponentially rising score until it eventually hits +INF, effectively disabling it? Seems a little contrived but it's an avenue worth investigating.

load more comments (1 replies)
[-] [email protected] 0 points 9 months ago* (last edited 9 months ago)

These LLMs are basically just IP laundry. Anyone who claims it's anything more is either buying into the hype or is actively lying to you.

EDIT: Stable Diffusion too. It just takes images from its training data and does photoshop on them piecemeal to create a new prompt.

load more comments
view more: ‹ prev next ›
this post was submitted on 29 Nov 2023
359 points (98.9% liked)

Privacy

31220 readers
975 users here now

A place to discuss privacy and freedom in the digital world.

Privacy has become a very important issue in modern society, with companies and governments constantly abusing their power, more and more people are waking up to the importance of digital privacy.

In this community everyone is welcome to post links and discuss topics related to privacy.

Some Rules

Related communities

Chat rooms

much thanks to @gary_host_laptop for the logo design :)

founded 4 years ago
MODERATORS