this post was submitted on 23 Oct 2024
26 points (96.4% liked)

Apple

[–] [email protected] 16 points 1 week ago (2 children)

“Do you want me to use ChatGPT to do that?”

No. I don’t. I really really don’t.

[–] [email protected] 15 points 1 week ago

You can turn off its ability to hand a request off to ChatGPT when it can’t resolve the request on its own.

[–] [email protected] 12 points 1 week ago

IMHO, this is how ChatGPT integrations should work. Default to a local model or a private-cloud model, and if that doesn’t work, ask the user whether they want the query sent to an external LLM. And let people turn off the external LLM prompts entirely.
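Roughly like this, as a sketch of that flow (the types and names here are made up for illustration, not Apple’s actual Apple Intelligence / Siri API):

```swift
// Hypothetical sketch only -- these protocols and names are invented for
// illustration and are not Apple's real API.

enum AssistantError: Error {
    case couldNotResolve
}

protocol LanguageModel {
    func answer(_ query: String) async throws -> String
}

struct Assistant {
    let onDeviceModel: LanguageModel      // local model, tried first
    let privateCloudModel: LanguageModel  // first-party private-cloud fallback
    let externalModel: LanguageModel      // third-party LLM (e.g. ChatGPT)

    // Global setting: if false, queries never go to the external LLM at all.
    var externalFallbackEnabled: Bool = false

    // Per-query consent prompt ("Do you want me to use ChatGPT to do that?").
    var askUserConsent: () async -> Bool = { false }

    func respond(to query: String) async throws -> String {
        // 1. Try the on-device model first.
        if let reply = try? await onDeviceModel.answer(query) {
            return reply
        }
        // 2. Then the private cloud model.
        if let reply = try? await privateCloudModel.answer(query) {
            return reply
        }
        // 3. Only with the setting enabled *and* explicit per-query consent
        //    does the request leave the private stack.
        guard externalFallbackEnabled, await askUserConsent() else {
            throw AssistantError.couldNotResolve
        }
        return try await externalModel.answer(query)
    }
}
```

The point is that the external step is gated twice: a global toggle plus a per-query prompt, so turning it off really means nothing leaves the device or private cloud.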

Let’s make it opt-in. And make opting out very prominent.