A new app offering to record your phone calls and pay you for the audio so it can sell the data to AI companies is, unbelievably, the No. 2 app in Apple’s U.S. App Store’s Social Networking section.
The app, Neon Mobile, pitches itself as a moneymaking tool offering “hundreds or even thousands of dollars per year” for access to your audio conversations.
Neon’s website says the company pays 30¢ per minute when you call other Neon users, and up to $30 per day for making calls to anyone else. The app also pays for referrals. Neon first ranked No. 476 in the Social Networking category of the U.S. App Store on September 18 but jumped to No. 10 by the end of the day yesterday, according to data from app intelligence firm Appfigures.
On Wednesday, Neon was spotted in the No. 2 position on the iPhone’s top free charts for social apps.
Earlier on Wednesday morning, Neon also ranked No. 7 among all apps and games overall, then climbed to No. 6 among top apps.
According to Neon’s terms of service, the company’s mobile app can capture users’ inbound and outbound phone calls. However, Neon’s marketing claims the app records only your side of the call unless the call is with another Neon user.
That data is being sold to “AI companies,” Neon’s terms of service state, “for the purpose of developing, training, testing, and improving machine learning models, artificial intelligence tools and systems, and related technologies.”
The fact that such an app exists and is permitted on the app stores is an indication of how far AI has encroached into users’ lives and areas once thought of as private. Its high ranking within the Apple App Store, meanwhile, is proof that there is now some subsection of the market seemingly willing to exchange their privacy for pennies, regardless of the larger cost to themselves or society.
Despite what Neon’s privacy policy says, its terms include a very broad license over users’ data, in which Neon grants itself a:
…worldwide, exclusive, irrevocable, transferable, royalty-free, fully paid right and license (with the right to sublicense through multiple tiers) to sell, use, host, store, transfer, publicly display, publicly perform (including by means of a digital audio transmission), communicate to the public, reproduce, modify for the purpose of formatting for display, create derivative works as authorized in these Terms, and distribute your Recordings, in whole or in part, in any media formats and through any media channels, in each instance whether now known or hereafter developed.
That leaves plenty of wiggle room for Neon to do more with users’ data than it claims.
The terms also include an extensive section on beta features, which come with no warranty and may be riddled with bugs and other issues.

Though Neon’s app raises many red flags, it may be technically legal.
“Recording only one side of the phone call is aimed at avoiding wiretap laws,” Jennifer Daniels, a partner with the law firm Blank Rome‘s Privacy, Security & Data Protection Group, tells TechCrunch.
“Under [the] laws of many states, you have to have consent from both parties to a conversation in order to record it … It’s an interesting approach,” says Daniels.
Peter Jackson, a cybersecurity and privacy attorney at Greenberg Glusker, agrees, telling TechCrunch that the language around “one-sided transcripts” sounds like a backdoor way of saying that Neon records users’ calls in their entirety but may simply remove what the other party said from the final transcript.
In addition, the legal experts pointed to concerns about how anonymized the data may really be.
Neon claims it removes users’ names, emails, and phone numbers before selling data to AI companies. But the company doesn’t say how the AI partners or other buyers could use that data. Voice data could be used to make fake calls that sound like they’re coming from you, or to build AI voices modeled on your own.
“Once your voice is over there, it can be used for fraud,” says Jackson. “Now this company has your phone number and essentially enough information — they have recordings of your voice, which could be used to create an impersonation of you and do all sorts of fraud.”
Even if the company itself is trustworthy, Neon doesn’t disclose who its trusted partners are or what those entities are allowed to do with users’ data further down the road. Neon is also subject to potential data breaches, as is any company that holds valuable data.

In a brief test by TechCrunch, Neon did not offer any indication that it was recording the user’s call, nor did it warn the call recipient. The app worked like any other voice-over-IP app, and the caller ID displayed the inbound phone number, as usual. (We’ll leave it to security researchers to attempt to verify the app’s other claims.)
Neon founder Alex Kiam didn’t return a request for comment.
Kiam, who is identified only as “Alex” on the company website, operates Neon from a New York apartment, a business filing shows.
A LinkedIn post indicates Kiam raised money from Upfront Ventures a few months ago for his startup, but the investor hadn’t responded to TechCrunch’s inquiry as of the time of writing.
Has AI desensitized users to privacy concerns?
There was a time when companies looking to profit from data collection through mobile apps handled this type of thing on the sly.
When it was revealed in 2019 that Facebook was paying teens to install an app that spied on them, it was a scandal. The following year, headlines buzzed again when it was discovered that app store analytics providers operated dozens of seemingly innocuous apps to collect usage data about the mobile app ecosystem. There are regular warnings to be wary of VPN apps, which often aren’t as private as they claim. There are even government reports detailing how agencies regularly purchase personal data that’s “commercially available” on the market.
Now AI agents regularly join meetings to take notes, and always-on AI devices are on the market. But at least in those cases, everyone is consenting to a recording, Daniels tells TechCrunch.
In light of this widespread usage and sale of personal data, there are likely now those cynical enough to think that if their data is being sold anyway, they may as well profit from it.
Unfortunately, they may be sharing more information than they realize and putting others’ privacy at risk when they do.
“There is a tremendous desire on the part of, certainly, knowledge workers — and frankly, everybody — to make it as easy as possible to do your job,” says Jackson. “And some of these productivity tools do that at the expense of, obviously, your privacy, but also, increasingly, the privacy of those with whom you are interacting on a day-to-day basis.”