AI data protection guide 2024: Secure use of ChatGPT, Copilot and Co.

May 3, 2024 · 6 min read

This blog post is based on a post by VISCHER, a renowned Swiss law firm that has made a name for itself in data protection and AI well beyond Switzerland. You can find the original post on the subject here!

In the world of artificial intelligence (AI), tools such as ChatGPT Team, ChatGPT Enterprise, the OpenAI API, Copilot for Microsoft 365, and Azure OpenAI offer a wide range of options for companies and individuals to stay on the safe side. ChatGPT Team and ChatGPT Enterprise are specifically designed to meet the needs of teams and companies, offering advanced features and security measures that are essential for working in professional environments. The OpenAI API allows developers to build and integrate custom AI solutions, giving them full control over how AI is used and implemented in their projects. Copilot for Microsoft 365 integrates seamlessly with the familiar Office applications and helps users work more efficiently, while Azure OpenAI combines the power of OpenAI's advanced models with the reliability and scalability of Microsoft's cloud infrastructure. Each of these tools offers distinct benefits and security features that let users harness the power of AI while remaining protected and compliant with data protection standards.
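To illustrate what building on the OpenAI API can look like in practice, here is a minimal sketch using the official openai Python SDK. It assumes the package (v1.x) is installed, that an API key is available in the OPENAI_API_KEY environment variable under an account covered by the appropriate Data Processing Agreement, and that the model name is only an example; adapt it to your own contract and setup.

```python
# Minimal sketch: a chat completion request via the OpenAI API.
# Assumes the `openai` package (v1.x) is installed and that
# OPENAI_API_KEY is set for an account with a suitable DPA in place.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o",  # example model name; choose according to your contract
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "List three GDPR duties when using AI tools."},
    ],
)

print(response.choices[0].message.content)
```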

Why is data protection important with AI tools?

Data protection in AI tools is of central importance for companies for several reasons. First, compliance with legal regulations such as the GDPR not only keeps a company on solid legal ground but also avoids potential penalties and sanctions. Second, protecting users' privacy and personal data is critical to maintaining the trust and credibility of the company; failure to do so can result in serious reputational damage. In addition, companies must ensure that the data and insights generated by AI meet ethical standards and do not lead to discriminatory or biased results. In a world where data is a valuable asset, the correct handling of personal data is a crucial component of a company's long-term success and sustainable growth.

Current data protection laws and AI regulation

The EU AI Act is a step towards regulating artificial intelligence. It establishes standards to make AI systems safe, transparent and environmentally friendly. The law divides AI systems into different risk categories, with specific regulations for each category. In particular, generative AI models such as OpenAI's GPT must meet higher transparency requirements. This includes disclosing that content was generated by AI, preventing illegal content from being created, and publishing summaries of copyrighted data used for training. The aim is to strike a balance between promoting innovative technologies and protecting fundamental rights and people's safety. For more information, visit the European Parliament website.

Advantages and disadvantages of common AI tools

Here is a detailed summary of common AI tools and what you need to know about them in terms of data protection (source: Vischer, 2024):

  • ChatGPT by OpenAI
  • Microsoft Copilot and Azure OpenAI Service
  • Google Gemini and Vertex AI

1. ChatGPT from OpenAI

OpenAI provides ChatGPT in various versions, from free plans to paid corporate plans. While the free and basic versions for private customers are not suitable for processing sensitive data, corporate versions such as “ChatGPT Enterprise” and the “OpenAI API” allow data protection-compliant use because a Data Processing Agreement (DPA) can be concluded. These versions also include confidentiality obligations, which are sufficient for trade secrets but not for legal professional secrecy.

Since January 2024, OpenAI has also been offering “ChatGPT Team” for smaller companies: unlike ChatGPT Enterprise, it is no longer necessary to purchase several hundred accounts; you can start with as few as two seats.

This version is similar to “ChatGPT Enterprise” and the “OpenAI API” in terms of use and likewise allows a Data Processing Agreement (DPA) to be concluded. There is also a confidentiality clause, and the content is not used for training purposes. For customers in the EEA and Switzerland, OpenAI Ireland Ltd. acts as the contractual partner, which offers legal advantages. The DPA is automatically concluded with this contractual partner, while the EU standard contractual clauses continue to be agreed with OpenAI, LLC.

The ChatGPT Free and ChatGPT Plus services in the EU are currently provided by the US company OpenAI. However, in accordance with the new “Europe Terms of Use,” which apply from February 15, 2024, OpenAI's Irish subsidiary assumes responsibility for customers in Switzerland and the European Economic Area (EEA).

By the way, you can find out how to use ChatGPT without a login in another post from us.

Source: Vischer, 2024

2. Microsoft Copilot and Azure OpenAI Service

Microsoft has different contractual rules for its AI tools. The “Copilot” available via Edge and the web allows Microsoft to use input and output to improve the service and grants Microsoft a license to use this data, which makes it unsuitable for corporate use.

Microsoft has also been offering “Copilot Pro” for private users since January 2024. This version, which runs under the service contract for private customers, is also not intended for use in companies, as the necessary contractual agreements are missing.

For business users, Microsoft offers “Copilot with commercial data protection” in the Edge browser. It is part of the M365 business packages. Users who want to use this service sign in with their M365 business account, and a special privacy notice appears:

“Your personal and company information is protected in this chat.”

One might assume that Microsoft's “Copilot with commercial data protection,” which is available in the Edge browser, falls under business contracts such as the “Microsoft Customer Agreement” (MCA) and the “Data Processing Addendum” (DPA). However, this is not the case, as users must accept additional terms that apply to private customers. Microsoft guarantees that the data will not be used for its own purposes, that it will only be stored as long as the browser window is open, and that no license to use the data is transferred to Microsoft. However, “Copilot with commercial data protection” is excluded from the DPA, and Microsoft acts as the data controller.

The use of Microsoft's “Copilot with commercial data protection” in companies therefore raises data protection issues: Microsoft normally acts as a processor on the customer's behalf, which is not the case here. Companies should therefore be careful about entering personal data into this Copilot, as there is no Data Processing Agreement (DPA). Although the Irish Microsoft entity acting as the contractual partner is subject to the GDPR and therefore provides adequate data protection, it is considered a third party under data protection law, and companies in particular will not simply want to hand over their personal data to third parties.

For companies, contractual protection of data and confidentiality is ensured by “Copilot for Microsoft 365” or the “Azure OpenAI Service.” These services are purchased under business contracts such as the MCA and are covered by Microsoft's DPA, with the option of additional agreements for specific requirements. “Copilot for Microsoft 365,” available since January 2024, provides a clearer contractual basis for corporate use than “Copilot with commercial data protection.” However, users must be aware of the differences between these versions.

Source: Vischer, 2024
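For comparison, a company typically reaches the Azure OpenAI Service through a model deployment in its own Azure resource rather than a public consumer endpoint. The following is only a rough sketch using the openai Python SDK's AzureOpenAI client; the endpoint, deployment name, and API version shown are placeholders and depend entirely on your own Azure subscription.

```python
# Rough sketch: calling a model deployed in your own Azure OpenAI resource.
# Endpoint, deployment name, and API version are placeholders; replace them
# with the values configured in your company's Azure subscription.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://your-resource-name.openai.azure.com",  # placeholder
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",  # example API version
)

response = client.chat.completions.create(
    model="your-gpt-4-deployment",  # the deployment name chosen in Azure
    messages=[{"role": "user", "content": "Summarize our confidentiality duties."}],
)

print(response.choices[0].message.content)
```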

3. Google Gemini (formerly Bard) and Vertex AI

Google's “Gemini” is a free AI chatbot that is primarily intended for private users and, as Google itself points out, is not suitable for processing personal data. Its use in everyday working life is therefore limited. In contrast, “Vertex AI,” Google's corporate service, is operated under the Google Cloud Platform terms of use and with a Data Processing Agreement (DPA). Vertex AI is suitable for use with personal data and confidential information, but processing data subject to professional or official secrecy requires additional contractual assurances from Google.

Source: Vischer, 2024

Summary and outlook

In summary, in 2024 AI tools such as ChatGPT Team, ChatGPT Enterprise, the OpenAI API, Copilot for Microsoft 365 and Azure OpenAI are crucial for data protection in companies. These tools provide specialized features and security measures to meet businesses' needs. Data protection is of central importance in order to comply with legal requirements and maintain trust. The EU AI Act introduces new standards to make AI systems safe and transparent. Especially for processing personal data and trade secrets, it is important to choose versions of AI tools that comply with data protection regulations. The outlook shows that companies must continuously adapt to changing legal and technological frameworks in order to fully utilize the benefits of AI. You can also find more about this topic in our blog post Focus on data security and GDPR compliance.

