this post was submitted on 17 May 2024
300 points (98.7% liked)

They offer a thing they're calling an "opt-out."

The opt-out (a) is only available to companies that are Slack customers, not end users, and (b) doesn't actually opt out.

When a company account holder tries to opt out, Slack says their data will still be used to train LLMs, but the results won't be shared with other companies.

LOL no. That's not an opt-out. The way to opt out is to stop using Slack.

https://slack.com/intl/en-gb/trust/data-management/privacy-principles

[–] [email protected] 43 points 5 months ago (1 children)

Yeah, that’s not going to stand in Europe… especially for PII…

[–] [email protected] 5 points 5 months ago (2 children)

That’s not, strictly speaking, true. It requires more oversight and mechanisms of control, but those could very well already be in place.

[–] [email protected] 10 points 5 months ago (2 children)

If there's any PII in Slack (which is in itself wrong), you cannot use that data for training, since the people whose data is being used have not given their consent. Simple as that.

[–] [email protected] 3 points 5 months ago (2 children)

That’s not true at all. If you obfuscate the PII, it stops being PII. This is an extremely common trick companies use to circumvent these laws.
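
Roughly the kind of thing I mean, as a made-up sketch (the salt, the regex and the helper name are all hypothetical, not anything Slack has documented):

```python
import hashlib
import re

# Hypothetical salt; a real pipeline would keep this value secret.
SALT = "acme-salt"

def pseudonymise(text: str) -> str:
    """Swap email addresses for salted hash tokens before the text is used for anything else."""
    def _replace(match: re.Match) -> str:
        digest = hashlib.sha256((SALT + match.group(0)).encode()).hexdigest()[:12]
        return f"<user_{digest}>"
    return re.sub(r"[\w.+-]+@[\w-]+\.[\w.-]+", _replace, text)

print(pseudonymise("Ask jane.doe@example.com to re-run the deploy"))
# -> "Ask <user_...> to re-run the deploy"
```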

[–] [email protected] 2 points 5 months ago

How do you anonymise without supervision? And obfuscation isn’t anonymisation…

[–] [email protected] 2 points 5 months ago (1 children)

You could say it's to "circumvent" the law, or you could say it's to comply with the law. As long as the PII is gone, what's the problem?

[–] [email protected] 2 points 5 months ago

LLMs have shown time and time again that simple crafted attacks can unmask the training data verbatim.
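
As a rough illustration, here's a toy probe in the spirit of the published extraction attacks; the model name and the "record" are placeholders, not Slack's models or data:

```python
# Prompt the model with the start of a record that may have been in its
# training data and check whether it completes the rest verbatim.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prefix = "Contact Jane Doe at jane.doe@"          # hypothetical record prefix
expected_suffix = "example.com or on 555-0100"    # hypothetical memorised remainder

inputs = tokenizer(prefix, return_tensors="pt")
output_ids = model.generate(
    **inputs,
    max_new_tokens=20,
    do_sample=False,                      # greedy decoding surfaces memorised text most readily
    pad_token_id=tokenizer.eos_token_id,
)
completion = tokenizer.decode(output_ids[0][inputs["input_ids"].shape[1]:])

if expected_suffix in completion:
    print("Model reproduced the record verbatim")
```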

[–] [email protected] 4 points 5 months ago (1 children)

Well then explain to me how you propose to apply data subject rights to an LLM… you can’t currently un-train them, as far as I know. And that’s without even touching IP, which isn’t handled exactly the same from one jurisdiction to another.

I’m watching what’s happening with this very topic professionally, and the current state of the law and related decisions makes everyone in the business cautious, at the very least. That doesn’t prevent businesses from taking risks, but it is indeed risk-taking.

[–] [email protected] 1 points 5 months ago

That is very much what the EU AI Act is trying to get at. LLMs are covered under both the GDPR and the EU AI Act; it is not a simple matter.