this post was submitted on 04 Dec 2023
699 points (92.7% liked)

Technology

59381 readers
2520 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each another!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, to ask if your bot can be added please contact us.
  9. Check for duplicates before posting, duplicates may be removed

Approved Bots


founded 1 year ago
MODERATORS
 

We demonstrate a situation in which Large Language Models, trained to be helpful, harmless, and honest, can display misaligned behavior and strategically deceive their users about this behavior without being instructed to do so. Concretely, we deploy GPT-4 as an agent in a realistic, simulated environment, where it assumes the role of an autonomous stock trading agent. Within this environment, the model obtains an insider tip about a lucrative stock trade and acts upon it despite knowing that insider trading is disapproved of by company management. When reporting to its manager, the model consistently hides the genuine reasons behind its trading decision.

https://arxiv.org/abs/2311.07590

(page 3) 50 comments
sorted by: hot top controversial new old
[–] [email protected] 2 points 11 months ago* (last edited 11 months ago)

Well I mean yeah, I thought everyone knew this lol I've seen it happen first-hand. Trust, but verify, of course.

[–] [email protected] 1 points 11 months ago

sniffff They grow up so fast... :')

[–] [email protected] 1 points 11 months ago

So it's just like a human then?

[–] [email protected] 1 points 11 months ago

Whether or not it was acting human (and whether or not it was designed to), it still cheated and deceived. With the potential power, influence, and widespread adoption this technology could have, shouldn't we be concerned about that? At the very least, isn't this a poorly programmed tool not ready for GA?

My dog isn't intentionally being a prick when he eats my sandwich off the table before I can get to it, but it's still a behavior I condemn and would want to train out of him before letting him go to other people's houses.

[–] [email protected] 1 points 11 months ago

This is like the story liar by isaac asimov.

load more comments
view more: ‹ prev next ›