Slack trains machine-learning models on user messages, files and other content without explicit permission. The training is opt-out, meaning your private data will be leeched by default. Making matters worse, you’ll have to ask your organization’s Slack admin (human resources, IT, etc.) to email the company to request an opt-out. (You can’t do it yourself.) Welcome to the dark side of the new AI training data gold rush.
Corey Quinn, an executive at DuckBill Group, spotted the policy in a blurb in Slack’s Privacy Principles and posted about it on X (via PCMag). The section reads (emphasis ours), “To develop AI/ML models, our systems analyze Customer Data (e.g. messages, content, and files) submitted to Slack as well as Other Information (including usage information) as defined in our Privacy Policy and in your customer agreement.”
The opt-out process requires you to do all the work to protect your data. According to the privacy notice, “To opt out, please have your Org or Workspace Owners or Primary Owner contact our Customer Experience team at feedback@slack.com with your Workspace/Org URL and the subject line ‘Slack Global model opt-out request.’ We will process your request and respond once the opt out has been completed.”
I’m sorry Slack, you’re doing fucking WHAT with user DMs, messages, files, etc? I’m positive I’m not reading this correctly.
— Corey Quinn (@QuinnyPig) May 16, 2024
The company replied to Quinn’s message on X: “To clarify, Slack has platform-level machine-learning models for things like channel and emoji recommendations and search results. And yes, customers can exclude their data from helping train those (non-generative) ML models.”
It’s unclear when the Salesforce-owned company snuck the tidbit into its terms. It’s misleading, at best, to say customers can opt out when “customers” doesn’t include the employees working within an organization. They have to ask whoever handles Slack access at their business to do it, and there’s no guarantee the admin will oblige.
Inconsistencies in Slack’s privacy policies add to the confusion. One section states, “When developing AI/ML models or otherwise analyzing Customer Data, Slack can’t access the underlying content. We have various technical measures preventing this from occurring.” However, the machine-learning model training policy seemingly contradicts this statement, leaving plenty of room for confusion.
In addition, Slack’s webpage marketing its premium generative AI tools reads, “Work without worry. Your data is your data. We don’t use it to train Slack AI. Everything runs on Slack’s secure infrastructure, meeting the same compliance standards as Slack itself.”
In this case, the company is referring to its premium generative AI tools, which are separate from the machine-learning models it trains without explicit permission. However, as PCMag notes, implying that all of your data is safe from AI training is misleading at best when the company apparently gets to pick and choose which AI models that statement covers.
Engadget tried to contact Slack via multiple channels but didn’t receive a response at the time of publication. We’ll update this story if we hear back.