We learned last week that Slack may be using customer private data, including messages, content, and files, to train some of its AI models. According to Slack's own communications and published privacy policy, all customers have been automatically enrolled in this data-sharing program, with an opt-out option. On its surface, this represents a huge risk to businesses that use Slack for privileged and private communications, including intellectual property and legally protected information such as PII and PHI. The reality is muddier in terms of what information Slack is actually using and how it is being used. Ars Technica has published a good article with helpful context and a timeline (linked below).
At this time, we're recommending that all users submit the formal opt-out email required to exclude private corporate data from the training of Slack's AI models.
Additional information can be found below:
- User Outcry as Slack Scrapes Customer Data for AI Model Training – SecurityWeek
- Privacy principles: search, learning and artificial intelligence | Legal | Slack
- How We Built Slack AI To Be Secure and Private – Slack Engineering
- Slack users horrified to discover messages used for AI training | Ars Technica
If you have any additional questions or concerns about this information, feel free to reach out to me directly.