ChatGPT Clone Apps Privacy Risks

We analyzed the 10 highest-ranked unofficial ChatGPT clone apps – most of which rely on OpenAI’s GPT-3 technology – in Apple’s App Store and Google’s Play Store, and identified the biggest privacy risks.
Simon Migliano

ChatGPT Clones Privacy Analysis

Android Apps

  • Intrusive personal data collection: All 10 top-ranked ChatGPT clone apps in Google’s Play store that we analyzed collect and share data with poor privacy protections.
  • Device fingerprinting: 5 apps allow third-party trackers to fingerprint users’ devices.
  • IP addresses: 2 apps collect and share users’ IP addresses with third parties.
  • TikTok/ByteDance: 1 app (ChatGPT AI Writing Assistant, 100,000 installs) tracks location data and shares with ByteDance, Amazon and others.
  • Risky permissions: 3 apps have permissions with privacy impacts, including 2 that request the “record audio” permission even though their speech functions did not work in-app.
  • Risky inactive code functions: Almost all apps contain code with privacy impacts – location, camera, read/write storage, and access to photos and videos – without the corresponding permissions in place.
  • Cash-grab apps: 9 apps take advantage of OpenAI’s currently free GPT-3 tech: 3 charge for access, and another offers a paid ad-free tier.
  • Fly-by-night apps: 3 apps were removed from Google Play during our research.

iOS Apps

  • Intrusive personal data collection: All 10 top-ranked ChatGPT clone apps in Apple’s App Store that we analyzed collect and share data with poor privacy protections.
  • Logging chat content: 2 apps appeared to log question and answer content on their own servers in our tests.
  • Device fingerprinting: 6 apps allow third-party trackers to fingerprint users’ devices.
  • Phoning home: 1 app made over 300 server requests in just four minutes.
  • App Store privacy label inaccuracies: 7 apps had conflicts between data collection practices outlined in their official privacy labels and the results of our analyses.
  • Cash-grab apps: 9 apps take advantage of OpenAI’s currently free GPT-3 tech; 8 of them charge for access – as much as $15,000 per year.
  • Fly-by-night apps: 1 app was removed from the App Store during our research.

Rise of the ChatGPT Clones

ChatGPT has captured the collective imagination of the internet since its launch in November 2022.

Prompting much hand-wringing about its potential for everything from plagiarized schoolwork[1] to malware-on-demand,[2] the chatbot based on OpenAI’s language-processing AI model known as GPT-3 was also immediately seized upon for its commercial applications.[3]

The buzz around ChatGPT and AI chatbots in general also created an opportunity for those looking to turn a quick profit. ChatGPT is only officially available via a web browser interface on openai.com, which opened the door for mobile developers to flood app stores with unofficial ChatGPT clones.

While ChatGPT is currently free to use during what’s essentially an open beta phase, many of these clone apps charge users for the convenience of using a native app to access the OpenAI technology on their mobile device.

Many clone apps also mine the personal data of OpenAI’s fast-growing userbase: ChatGPT exceeded 1 million users within a week of launch and interest has only grown since then.

With our extensive experience of conducting privacy analysis of mobile apps, we decided to do a deep dive into this suddenly popular niche to identify and quantify the current level of risk associated with this new type of mobile app.

We performed a range of analyses, including using the open-source HTTPS proxy mitmproxy to capture apps’ network traffic and determine what personal data they were actively sharing, and with whom.
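A minimal sketch of the kind of check applied to each captured request is below. The key names and the sample payload are hypothetical examples, not taken from any specific app: the point is simply to flag requests whose path or body carries common device datapoints.

```python
# Sketch (hypothetical key names) of scanning a captured request's path and
# body for common device datapoints that, taken together, can contribute to
# a device fingerprint.

SUSPECT_KEYS = ("device_id", "idfa", "gaid", "os_version", "screen_size", "carrier")

def suspect_datapoints(path: str, body: str) -> list[str]:
    """Return the suspect keys found in a single captured request."""
    haystack = path + "\n" + body
    return [key for key in SUSPECT_KEYS if key in haystack]

# Example against a fabricated request resembling tracker traffic:
hits = suspect_datapoints(
    "/v1/events",
    '{"device_id": "abc123", "os_version": "13", "screen_size": "1080x2400"}',
)
print(hits)  # ['device_id', 'os_version', 'screen_size']
```

In practice a check like this would run inside a mitmproxy addon over every flow, but the classification logic is the same.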

We did a close analysis of all apps’ privacy policies. For iOS apps, this included privacy labels on their App Store listings. We also examined Android apps’ source code to identify risky functions and permissions.

What we found was a clutch of grubby, parasitical apps that added no value for users and instead intruded on users’ privacy to profit from their data.

Our network analysis revealed the great majority of apps to be sharing numerous datapoints about users’ devices, such as screen size or network operator, which on their own might seem innocuous but in aggregate can be used to potentially “fingerprint” devices.
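Why do innocuous-seeming datapoints matter in aggregate? Because hashed together they form a quasi-stable identifier. The sketch below is purely illustrative – the attribute names are hypothetical examples, not fields observed in any particular app:

```python
# Illustrative sketch: individually innocuous device attributes, hashed
# together, yield a quasi-stable device fingerprint. Attribute names are
# hypothetical examples.
import hashlib
import json

def fingerprint(datapoints: dict) -> str:
    """Hash a canonical form of the device attributes into a short ID."""
    canonical = json.dumps(datapoints, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

device = {
    "screen": "1170x2532",
    "os_version": "16.3",
    "model": "iPhone13,2",
    "carrier": "ExampleTel",
    "timezone": "Europe/London",
    "language": "en-GB",
}

stable_id = fingerprint(device)
# The same attribute set always yields the same ID, so a tracker can
# re-identify the device across apps without any explicit identifier.
```

No single field above identifies a user, but the combination is often unique enough to track a device across apps and sessions.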

While none of the apps appears to be outright malicious, we found one app to be sharing data with ByteDance, a company under a cloud when it comes to privacy and data security,[4] which should give anyone pause before installing it.

There’s also the ethical consideration of charging sometimes ludicrous amounts for access to a free product while monetizing user data. While weekly charges were typically around the $5 mark, or a not insignificant $260 over the course of a year, one app charged a hair-raising $290 a week, which works out at over $15,000 per year.

ChatGPT Android App Analysis

The following table summarizes the findings of our analysis of the 10 top-ranked ChatGPT apps on Google’s Play Store. Scroll to the right to see all data columns.

For more detailed findings, download the datasheet

Biggest Privacy Risks: Android

  • Device fingerprinting: Five apps share enough detailed information about users’ devices that it could potentially be used to identify and track them.
    • TalkGPT – Talk to ChatGPT
    • Open Chat – AI Chatbot App
    • ChatGPT
    • Chatteo – Chat with AI
    • ChatGPT 3: Chat GPT AI
  • IP addresses: Two apps collect and share users’ IP addresses with third parties.
    • TalkGPT – Talk to ChatGPT (shares with Everest Tech)
    • ChatGPT AI Writing Assistant (shares with Google)
  • Risky functions with matching permissions: Two apps have risky functions with the appropriate permission also in place:
    • TalkGPT – Talk to ChatGPT: location code plus both coarse and fine location permissions.
    • ChatGPT AI Writing Assistant: numerous code functions that relate to accessing images from users’ media libraries, along with the required permissions, largely in support of creating profile pictures.

      Note: we did not find any evidence of malicious usage of these permissions; however, their presence leaves the door open to potential abuse in future app updates.

  • Most common types of risky inactive code functions:
    • Location (7 apps)
    • Camera (4 apps)
    • Read/write storage (3 apps)

    While these functions currently lack the permissions needed to be active, future app updates could introduce them.

  • Cash-grabs: As well as potentially profiting from user data, three apps reliant on OpenAI’s GPT-3 tech charged for access after a handful of initial chats, despite the technology currently being free to access. Another app charges to remove advertising. The most expensive Android app was ChatGPT AI Writing Assistant, which offers a $4.99-per-week subscription – almost $260 over a year.
  • Disappearing act: Given the privacy risks of many of these unofficial ChatGPT apps and the trend towards charging for the experience, it concerned us to see three of the apps suddenly yanked from the Play Store. There is no clear evidence of why this happened, but it raises the possibility that unscrupulous developers are taking a smash-and-grab approach to user data and subscription fees while trying to stay under the radar.
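The active-versus-inactive distinction above boils down to comparing risky API references found in decompiled code against the permissions declared in AndroidManifest.xml. The snippet below is a simplified illustration, not our actual tooling: the RISKY_APIS mapping and the sample manifest are hypothetical, but the comparison is the same in spirit.

```python
# Hedged sketch of the permissions-vs-code comparison: parse a decoded
# AndroidManifest.xml for declared permissions, then label each risky API
# reference as active (permission present) or inactive (permission absent).
# The RISKY_APIS mapping and sample manifest are illustrative assumptions.
import xml.etree.ElementTree as ET

ANDROID_NS = "{http://schemas.android.com/apk/res/android}"

# Risky API references one might find in decompiled sources, mapped to the
# permission each would need in order to be active.
RISKY_APIS = {
    "getLastKnownLocation": "android.permission.ACCESS_FINE_LOCATION",
    "Camera.open": "android.permission.CAMERA",
    "MediaRecorder.start": "android.permission.RECORD_AUDIO",
}

def declared_permissions(manifest_xml: str) -> set[str]:
    """Collect android:name values from <uses-permission> elements."""
    root = ET.fromstring(manifest_xml)
    return {
        elem.get(f"{ANDROID_NS}name")
        for elem in root.iter("uses-permission")
    }

def classify(found_apis: list[str], perms: set[str]) -> dict[str, str]:
    """Label each risky API as 'active' or 'inactive'."""
    return {
        api: "active" if RISKY_APIS[api] in perms else "inactive"
        for api in found_apis
    }

manifest = """<manifest xmlns:android="http://schemas.android.com/apk/res/android">
  <uses-permission android:name="android.permission.ACCESS_FINE_LOCATION"/>
</manifest>"""

print(classify(["getLastKnownLocation", "Camera.open"],
               declared_permissions(manifest)))
# {'getLastKnownLocation': 'active', 'Camera.open': 'inactive'}
```

An “inactive” result is not harmless: as noted above, a future update need only add the missing permission to switch the dormant code on.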

Most Potentially Risky App

The single most concerning app in terms of data privacy was TalkGPT – Talk to ChatGPT (100,000 installs). We found the following red flags:

  • Tracks users’ precise location data and shares it with ByteDance, the controversial publisher of TikTok, and Amazon, along with adtech firms Appodeal and InMobi.
  • Screenshot of code functions relating to location data collection and tracking from ChatGPT AI Writing Assistant.

  • In addition to location-based permissions, the app also requests permission to record audio. Tapping the speech function in the app prompts the permission request; however, once granted, the function does not work, raising the potential for abuse of the permission.
  • The privacy policy lacks professionalism. As well as permitting intrusive data collection, it is inconsistent in how it refers to the developer: TweetsOnGo in the Play Store and BlueTo in the privacy policy. The contact email domain, number1.co.il, is inconsistent with both.
  • The app collects and shares users’ IP addresses.
  • The app shares users’ device fingerprints with at least five third parties: Facebook, AdColony, Everest Technologies, Criteo, and Google.

ChatGPT iOS App Analysis

The following table summarizes the findings of our analysis of the 10 top-ranked ChatGPT apps on Apple’s App Store. Scroll to the right to see all data columns.

For more detailed findings, download the datasheet

Biggest Privacy Risks: iOS

  • Logging user chats: In most of our network traffic tests, we only observed the content of users’ chats with the AI in direct calls to the OpenAI API made over the secure HTTPS protocol. In that scenario, access to chat content is limited to OpenAI and remains inaccessible to the developer of the unofficial app. However, in tests of two apps, we observed chat content in requests to the developers’ own first-party servers rather than to servers on OpenAI domains. This opens up the possibility that the developers could log this data and use it for their own purposes. The two apps in question were:
    • Alfred – Chat with GPT 3
    • Chat w. GPT AI – Write This
  • Fingerprinting: Six apps share enough detailed information about a user’s device that it could potentially be used to identify and track them.
    • Open Chat – AI Chatbot
    • Genie – GPT AI Assistant
    • Alfred – Chat with GPT 3
    • Chat w. GPT AI – Write This
    • Wiz AI Chat Bot Writing Helper
    • Chat AI: Personal AI Assistant
    • Screenshot of device info shared by Alfred ChatGPT app.

  • Chatty chatbot: Open Chat – AI Chatbot (id1559479889) phones home over 300 times within four minutes of launching the app. It uses the server requests to share fingerprints of users’ devices with Facebook and Google.
  • App Store privacy label conflicts: Four apps use their App Store privacy labels to claim that they collect no data and yet have privacy policies, which require additional clicks to view, that admit to collecting personally identifiable information and sharing it with third parties.
    • ChatGPT – GPT 3
    • Chat w. GPT AI – Write This
    • Wiz AI Chat Bot Writing Helper
    • Wisdom Ai – Your AI Assistant
  • Cash-grabs? As well as potentially profiting from user data, eight apps reliant on OpenAI’s GPT-3 tech charged for access after a handful of initial chats, despite the technology currently being free to access. The most audacious was Genie – GPT AI Assistant, which offers a $290-per-week subscription – over $15,000 a year. It also offers a $3,500 annual package.
  • Screenshot of subscription screen in the Genie iOS app.
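The chat-logging distinction described in the first bullet above comes down to a host check on each captured request: chat content sent straight to OpenAI stays between the user and OpenAI, while chat content in a request to the developer’s own server could be logged by the developer. A minimal sketch (the non-OpenAI domain is a made-up example):

```python
# Sketch of the host check behind the chat-logging finding. The developer
# domain below is a hypothetical example, not one of the apps tested.
from urllib.parse import urlparse

OPENAI_DOMAINS = ("openai.com",)

def chat_request_risk(url: str) -> str:
    """Classify where a chat request is headed."""
    host = urlparse(url).hostname or ""
    direct = any(host == d or host.endswith("." + d) for d in OPENAI_DOMAINS)
    return "direct-to-OpenAI" if direct else "first-party (developer could log)"

print(chat_request_risk("https://api.openai.com/v1/completions"))
# direct-to-OpenAI
print(chat_request_risk("https://chat.exampledev.app/v1/ask"))
# first-party (developer could log)
```

Applied to a mitmproxy capture, a check like this immediately surfaces apps that proxy chat content through their own infrastructure.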

Methodology

We identified the 10 top-ranked apps in the Apple App Store and Google Play Store for the search term “ChatGPT”.

We used open-source tools, including mitmproxy, to analyze network traffic in our dedicated testing environment and to identify risky functions within the Android apps’ code.

We also analyzed privacy policies and store pages to determine data collection and sharing policies for each app.

The authors of all our investigations abide by the journalists’ code of conduct.

Additional research by Agata Michalak

References

[1] https://www.theguardian.com/technology/2022/dec/31/ai-assisted-plagiarism-chatgpt-bot-says-it-has-an-answer-for-that

[2] https://www.zdnet.com/article/people-are-already-trying-to-get-chatgpt-to-write-malware/

[3] https://hackernoon.com/11-business-ideas-that-could-become-a-reality-with-chatgpt

[4] https://fortune.com/2022/12/22/tiktok-data-privacy-blunder-china-bytedance-bans-journalist-accounts/