Sofia GPT Data Handling and Security

This article explains how data is handled by Sofia GPT.

Important

This article focuses on the security information most relevant to Sofia GPT.

For complete details on Azure OpenAI data security, please see the Data, privacy, and security for Azure OpenAI Service article from Microsoft.

Important

In short, all components of Sofia GPT (the AI engine, data, etc.) are contained within OnePlan's Azure subscription. This means that using Sofia GPT is no more or less secure than using OnePlan in general. See Sofia GPT Architecture and Security for more information on Sofia GPT's architecture and how it relates to the overall security of the AI.

How is data processed by Sofia GPT?

The following chart is directly pulled from the Data, privacy, and security for Azure OpenAI Service article from Microsoft. Please visit this article for more information on the Azure OpenAI service security and data handling.

[Image: Azure OpenAI Service data flow diagram]

What data is retained by Sofia GPT?

Specifically for Sofia GPT, prompt and completion data may be stored temporarily by the Azure OpenAI Service, in the same region as the resource, for up to 30 days to support abuse monitoring and review.

Microsoft explains that "this data is encrypted and is only accessible to authorized Microsoft employees for (1) debugging purposes in the event of a failure, and (2) investigating patterns of abuse and misuse to determine if the service is being used in a manner that violates the applicable product terms." See Data, privacy, and security for Azure OpenAI Service for more information.

Is my data used to train Sofia GPT?

No, user data is not used to train the AI.

Sofia GPT uses derivative engine models built on top of the OpenAI API hosted in Azure. Under Azure OpenAI's privacy policy, data gathered from users' interactions with the AI is not used to train or improve OpenAI's models.

Can data retained by Sofia GPT be flushed?

Currently, there is no option to directly flush data retained by Sofia GPT.

However, in the future OnePlan may consider opting out of the logging and human review process (which requires data logging and retention) established by Microsoft to review potentially abusive content. Opting out of this process would ensure that no data history is stored in the Azure OpenAI Service.

See the "Can a customer opt out of the logging and human review process?" section from the Data, privacy, and security for Azure OpenAI Service article for more information.
