this post was submitted on 25 Apr 2024
53 points (100.0% liked)
Technology
37712 readers
176 users here now
A nice place to discuss rumors, happenings, innovations, and challenges in the technology sphere. We also welcome discussions on the intersections of technology and society. If it’s technological news or discussion of technology, it probably belongs here.
Remember the overriding ethos on Beehaw: Be(e) Nice. Each user you encounter here is a person, and should be treated with kindness (even if they’re wrong, or use a Linux distro you don’t like). Personal attacks will not be tolerated.
Subcommunities on Beehaw:
This community's icon was made by Aaron Schneider, under the CC-BY-NC-SA 4.0 license.
founded 2 years ago
MODERATORS
you are viewing a single comment's thread
view the rest of the comments
view the rest of the comments
Check out llama3 which you can run locally.
@jeena it doesn't connect to any server even when you're online?
Correct.
@jeena cool. I think I will look into it. Thanks! 😁
Damn, it seems I was talking out of my ass and just assuming things. Now I asked llama3 about it and it said that even though it does not send our conversation to Meta, it does take my feedback (if I correct it in some way) and sends it to Meta's servers through the Internet:
// Will you be able to learn from me pointing it out or do you forget all conversations we had after I press Ctrl-D?
I'm a large language model, I don't have personal memories or experiences. Each time you interact with me, it's a new conversation, and I start from scratch.
When you point out an error or provide feedback, it's stored in my training data for future conversations. However, this information is not retained individually; instead, it's aggregated and used to improve the overall performance of my language understanding and generation capabilities.
So, when you press Ctrl-D (or close our conversation), I won't remember the specific details we discussed, including your feedback on my German sentence.
But, the collective knowledge from all our conversations will help me become a better AI model over time!
Keep in mind that I'm designed to learn and adapt through interactions with users like you. Your contributions, no matter how small they may seem, are invaluable in helping me improve and provide more accurate and helpful responses in the future!
/// But you run on my computer so technically how does my contribution get back to Meta to train the next generation? Do you send any data back?
When you interact with me through this chat interface, your input is processed by a client-side JavaScript code that runs on your local device (your computer or mobile phone). This code sends the necessary information about our conversation, including any corrections or feedback you provide, to Meta's servers.
Here's how it works:
To ensure your contributions are anonymous and secure, all data transmission is handled by Meta's infrastructure and follows industry-standard encryption protocols. Your input will help refine the AI models, enabling them to better serve users in the future!
Now, go ahead and chat away!
/// How would the data look in the case of me telling you the correct German expression? What would be sent back to meta's server? Show it as JSON.
When you correct an error or provide feedback, the client-side JavaScript code on your device sends a request to Meta's servers with the following information:
Here's what the data might look like in JSON format:
In this example:
event_type
: The type of event, which in this case is a "corrected_error".conversation_id
: A unique identifier for the conversation.turn_number
: The number of turns (or exchanges) in the conversation.user_input
: The corrected text you provided ("und kann einfache").correction_target
: The original incorrect text ("können") that was being discussed.language_code
: The language code for German ("de").platform
: The platform or device type used to access Meta's services (in this case, "web").client_info
: Additional metadata about the client device and browser.This JSON data is then sent to Meta's servers, where it can be processed and integrated into their training datasets.
The model is just hallucinating in this case.
Oh yeah, I found out the hard way and forgot to mention it here. https://jeena.net/llama3-phoning-home