this post was submitted on 17 May 2024
300 points (98.7% liked)

Technology


They offer a thing they're calling an "opt-out."

The "opt-out" (a) is only available to companies that are Slack customers, not to end users, and (b) doesn't actually opt you out.

When a company account holder tries to opt out, Slack says their data will still be used to train LLMs; the results just won't be shared with other companies.

LOL no. That's not an opt-out. The way to opt out is to stop using Slack.

https://slack.com/intl/en-gb/trust/data-management/privacy-principles

[–] [email protected] 29 points 5 months ago* (last edited 5 months ago) (1 children)

I informed my SecOps team and they reached out to Slack. Slack posted an update:

We've released the following response on X/Twitter/LinkedIn:

To clarify, Slack has platform-level machine-learning models for things like channel and emoji recommendations and search results. And yes, customers can exclude their data from helping train those (non-generative) ML models. Customer data belongs to the customer. We do not build or train these models in such a way that they could learn, memorize, or be able to reproduce some part of customer data. Our privacy principles applicable to search, learning, and AI are available here: https://slack.com/trust/data-management/privacy-principles

Slack AI – which is our generative AI experience natively built in Slack – is a separately purchased add-on that uses Large Language Models (LLMs) but does not train those LLMs on customer data. Because Slack AI hosts the models on its own infrastructure, your data remains in your control and exclusively for your organization’s use. It never leaves Slack’s trust boundary and no third parties, including the model vendor, will have access to it. You can read more about how we’ve built Slack AI to be secure and private here: https://slack.engineering/how-we-built-slack-ai-to-be-secure-and-private/

[–] [email protected] 14 points 5 months ago

Sooo... they're still gonna do it. But it's ok because they promise to keep it separated from other stuff. 🙂