The app's popularity comes with a warning: businesses need to lock down data before it leaks.

OpenAI shipped its ChatGPT app for iPads and iPhones just a week ago, but it has already become one of the most popular applications of the last two years, with more than half a million downloads in the first six days. That's a real achievement, but also a challenge: that's half a million potential data vulnerabilities.

Not one to rest on its laurels, OpenAI has now made this year's favorite smart assistant (so far) available in 41 additional nations. There's little doubt that this has been one of the most successful software/service introductions of all time, but that doesn't change the inherent risk of these technologies.

Keep the red flag flying

The popularity of the app should wave a red flag for IT leaders, who must redouble efforts to warn staff not to input valuable personal or corporate data into the service. The danger is that data gathered by OpenAI has already been attacked once, and it's only a matter of time until someone gets at that information. After all, digital security today isn't about if an incident happens, but when.

To borrow a phrase from Apple's playbook, the best way to protect data online is not to put the information there in the first place. That's why iPhones and other products from Cupertino (via China, India, and Vietnam) work on the principle of data minimization: reduce the quantity of information collected, and take pains to avoid sending it to servers for processing. That's a great approach, not only because it reduces the quantity of information that can slip out, but also because it reduces the opportunity for humans to make mistakes in the first place.

The humans are coming

We don't have that protection with ChatGPT apps.
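A little of that data-minimization discipline can be applied to chatbot use as well: scrub obvious identifiers from a prompt before it ever leaves the device. The sketch below is purely illustrative; the patterns and the `redact` function are my own assumptions, not any vendor's tooling, and real data-loss-prevention products go far beyond simple regular expressions.

```python
import re

# Illustrative patterns only; real DLP tooling is far more thorough.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "phone": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
}

def redact(prompt: str) -> str:
    """Replace obvious identifiers with placeholders before a prompt is sent anywhere."""
    for label, pattern in PATTERNS.items():
        prompt = pattern.sub(f"[{label.upper()} REDACTED]", prompt)
    return prompt

print(redact("Contact jane.doe@example.com or +1 555 010 1234 about the Q3 figures."))
# Prints: Contact [EMAIL REDACTED] or [PHONE REDACTED] about the Q3 figures.
```

Even a filter this crude would catch some of the accidental sharing described below; the point is that minimization has to happen before the data reaches the service, because nothing can be recalled afterward.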
Beyond a wholesale ban on using the service and application on managed devices, IT admins are almost completely reliant on trust to ensure their staff don't share confidential data with the bot. Still, humans are humans, so no matter how stern the exhortations against such use, we can be certain some people will accidentally share confidential data through the app. They may not even realize they are doing it, seeing it simply as the equivalent of searching the web. It's a threat similar to shadow IT: humans accidentally trading confidential information for what seems to be convenience.

Private dancer

IT must consider the App Privacy label OpenAI has attached to its product on the App Store. That label makes it clear that when using the app, the following data is linked to the user:

- Contact info: email, name, phone number.
- User content: "other" user content.
- Identifiers: user ID.
- Usage data: product interaction.
- Diagnostics: crash, performance, and other diagnostic data.

OpenAI's own Privacy Policy, available online, should also be explored, although the company has not disclosed the training data it uses for its latest bots. The challenge is that IT must weigh the limits of these disclosures against the inevitability of human nature.

Regulators are already concerned about the privacy implications. In Canada, privacy regulators are investigating the company's privacy practices, with similar activity under way in Europe. (OpenAI seems concerned enough about these investigations that it has warned it may shut up shop in Europe if the law proves too rigorous.)

Purple haze

The deluge of activity around generative AI in general, and ChatGPT in particular, should not mask the sweeping repercussions of these technologies, which offer vast productivity benefits but threaten job security at mass scale.
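For organizations that do opt for the wholesale-ban route on managed, supervised Apple devices, the app can be blocked with an MDM restrictions payload. The fragment below is a sketch, not a complete profile: it assumes `com.openai.chat` is ChatGPT's bundle identifier (verify this in your MDM console), uses placeholder identifier and UUID values, and omits the enclosing `.mobileconfig` wrapper.

```xml
<!-- Restrictions payload fragment (com.apple.applicationaccess).
     Applies to supervised iOS devices only; full profile wrapper omitted. -->
<dict>
    <key>PayloadType</key>
    <string>com.apple.applicationaccess</string>
    <key>PayloadVersion</key>
    <integer>1</integer>
    <key>PayloadIdentifier</key>
    <string>com.example.restrictions.chatgpt</string>
    <key>PayloadUUID</key>
    <string>00000000-0000-0000-0000-000000000000</string>
    <key>blacklistedAppBundleIDs</key>
    <array>
        <!-- Assumed bundle ID for the ChatGPT app; confirm before deploying. -->
        <string>com.openai.chat</string>
    </array>
</dict>
```

Unsupervised and BYOD devices can't be blocked this way, which is why the policy-and-education approach remains the front line.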
In the short term at least, IT admins should do their utmost to ensure these consumer-simple products don't threaten confidential business data. For that to happen, users must be warned not to share data with these services until such use has been ratified under company security and privacy policy.

Please follow me on Mastodon, or join me in the AppleHolic's bar & grill and Apple Discussions groups on MeWe.