this post was submitted on 06 May 2024
164 points (90.2% liked)

[–] [email protected] 44 points 6 months ago (1 children)

lol this is going to be fantastically catastrophic. ChatGPT is going to end up indirectly writing so much code. And I am fully aware of how often ChatGPT gives you absolute nonsense when asked to write some code. It’s got a decently high hit rate for relatively unchallenging stuff, but it is nowhere NEAR 100% accurate.

TL;DR Stack Overflow doesn’t understand how many developers naively copypaste shit from Stack Overflow, I guess? Wcgw

[–] [email protected] 4 points 6 months ago* (last edited 6 months ago)

I don’t think this will affect the Stack Overflow website itself, though? The blog implies that ChatGPT will use the Stack Overflow API as a knowledge source (and probably pay for it).

OpenAI and Stack Overflow are coming together via OverflowAPI access to provide OpenAI users and customers with the accurate and vetted data foundation that AI tools need to quickly find a solution to a problem […]. OpenAI will also surface validated technical knowledge from Stack Overflow directly into ChatGPT, giving users easy access to trusted, attributed, accurate, and highly technical knowledge and code backed by the millions of developers that have contributed to the Stack Overflow platform for 15 years.

This seems to be exactly to prevent hallucinations when there’s a good vetted answer already.

Either people didn’t read the blog, or there’s something I’m missing?