[Beta] Use Code Interpreter AND stay compliant with data privacy regulations

Hello OpenAI Community,

Like many of you, we’re big fans of the ChatGPT Code Interpreter. Its power and versatility have transformed how we think about data analysis. However, we also understand that in some situations, data privacy regulations and requirements can limit our ability to fully utilize this tool.

With that in mind, we’re excited to announce the development of a new product designed to enhance data privacy in the ChatGPT Code Interpreter. Our tools provide an added layer of protection, allowing you to maintain the integrity of your analysis while ensuring full data privacy.

This product is designed to integrate seamlessly with the ChatGPT Code Interpreter, providing a more secure environment for data analysis across various fields, particularly those where privacy is crucial.

We believe in the power of community and the value of user feedback. That’s why we’re currently looking for beta testers to help us refine and improve this new product. If you’re interested in getting early access and playing a part in shaping the future of privacy-friendly data analysis, we would love to hear from you.

Please reach out to us at contact.plausibleai@gmail.com to join our beta program.

How exactly are you “enhancing the security” of the Code Interpreter?

The system is already running on a temporary virtual machine and users have the option of disabling logging.

Manipulation of the web interface via an SDK, scraping, or reverse engineering is against the Terms of Service, so I’m wondering what exactly is being proposed?


Probably trying to sell some GitHub API code interpreter back to the people who wrote it.

They’re talking about “data privacy regulations,” presumably meaning GDPR?
One approach that could help here would be anonymizing the data before uploading it, and then de-anonymizing it on download.
As long as the anonymization happens within the protected region and cannot be reversed from the uploaded data (which is the hard part), this might be GDPR compliant.
There’s a whole field of running analytics on anonymized, or even encrypted, data.
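For what it’s worth, here is a rough sketch of that anonymize/de-anonymize idea in Python. This is my own illustration, not whatever @DataPrivacyPreserved has built: it assumes a pandas DataFrame with hypothetical identifier columns ("name", "email"), replaces their values with random tokens before upload, and maps the tokens back after you download the results. The mapping stays on your machine. Note that this is only pseudonymization, so quasi-identifiers left in the other columns could still allow re-identification.

```python
# Minimal sketch of the anonymize-before-upload idea described above.
# Column names and file paths are hypothetical. This is pseudonymization
# with a locally kept mapping, NOT full anonymization.
import secrets
import pandas as pd

def pseudonymize(df: pd.DataFrame, id_columns: list):
    """Replace identifier values with random tokens; return the new frame
    and a token -> original mapping that never leaves your machine."""
    token_for = {}  # (column, original value) -> token
    out = df.copy()
    for col in id_columns:
        out[col] = [
            token_for.setdefault((col, value), f"{col}_{secrets.token_hex(8)}")
            for value in out[col]
        ]
    reverse = {token: value for (_, value), token in token_for.items()}
    return out, reverse

def deanonymize(df: pd.DataFrame, mapping: dict) -> pd.DataFrame:
    """Map tokens in the downloaded results back to the original values."""
    out = df.copy()
    for col in out.columns:
        if out[col].dtype == object:
            out[col] = out[col].map(mapping).fillna(out[col])
    return out

# Hypothetical usage: strip identifiers locally, upload only the
# pseudonymized file to Code Interpreter, then restore after download.
df = pd.read_csv("customers.csv")
safe_df, mapping = pseudonymize(df, ["name", "email"])
safe_df.to_csv("customers_pseudonymized.csv", index=False)  # upload this file
# ... run the analysis in Code Interpreter, download results.csv ...
results = pd.read_csv("results.csv")
restored = deanonymize(results, mapping)
```

Repeated values get the same token, so group-bys and joins on those columns still work inside Code Interpreter without exposing the real identifiers.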

That being said – is that what @DataPrivacyPreserved has built? Seems unlikely, because if they did, there would be more details available about their approach.
Anything that is “secret” is, 99% of the time, actually just snake oil.

Thanks! It’s along those lines. We specialize in data privacy and AI/ML, with a team of experienced PhDs who have previously developed similar systems for products such as Amazon Alexa and for Google. We are currently testing our solution with a select group of beta users. If you are interested, we would be delighted to share more details with you and provide access to our application.

Of course. Our solution does not violate the Terms of Service. Please see my reply to @jwatte below.

Thank you for your reply; I hope you understand that I need to check.

Please see my reply below. If you’re not interested, you don’t have to add false information. P.S. In general, it’s good to be optimistic. Try it out.

We kindly request a reference from a publicly accessible website to support this information. Thank you for your cooperation.

In the meantime, this topic will be unlisted.


I searched for plausibleai and found the following (ref):

[screenshot of search results]

However, there is https://plausible.io/, but their listed contact email address is hello@plausible.io, which is not the same as yours, contact.plausibleai@gmail.com.
