Experts reveal the sneaky way your phone is listening to your conversations – and how to stop it

For a long time it was dismissed as a myth and denied by major tech companies.

But experts have revealed that eavesdropping on conversations has become a multi-billion-dollar industry.

Earlier this week, a leading marketing firm leaked information that confirmed how companies are using microphones on smart devices to listen to you before selling the data to advertisers.

‘You could be talking to one of your friends about a holiday to Portugal over a phone call, and then a day later or the same day, what do you see? An advert for a trip,’ data security expert Andy LoCascio told DailyMail.com.

The first slide of CMG’s leaked pitch deck describes how their Active Listening software listens to your conversations and extracts real-time intent data

The deck goes on to break down the process step by step, from identifying a “data trail” left by consumers’ conversations and online behavior to creating targeted digital ads.

Last week’s leak came from a pitch deck from CMG, a marketing partner of Facebook, Amazon and Google.

The deck, which was apparently created specifically for potential clients, contained detailed information about CMG’s “Active Listening” software, which collects data from people by listening to their conversations.

According to LoCascio, Active Listening software can be enabled through any app on an Android or iPhone device. Other devices, such as smart home assistants, can also listen in.

What’s more, these devices are listening almost all the time, not just when you’re deliberately using your microphone to make calls or talk to Alexa, for example.

“For most devices, there is no device state when the microphone is inactive. It is almost always active when Siri or another voice assistant is present on the device,” LoCascio said.

Companies that want to record and sell your voice data often gain access to your microphone through apps.

Normally, apps get permission to use your microphone through a clause “buried in the myriad permissions you accept when installing a new app,” he added.

This means that many users give permission for their data to be tapped without even realizing it.

“The problem is that consent is an all-or-nothing Faustian deal,” said Sharon Polsky, a data protection expert and consultant.

“So many websites say, ‘We collect information from you and about you. If you use our website, you have given your consent to everything we do.’ You can’t opt out,” she added.

LoCascio explained that CMG and other companies get away with this even in states with wiretapping laws that prohibit recording someone without their knowledge, such as California.

“Just to be absolutely clear, there are no laws on this. If we give someone permission to use the microphone on our device, and we click away from all the other terms of service that none of us have ever read, then they can certainly use it,” LoCascio said.

According to Polsky, that lack of protective legislation has created “an entire data broker industry that is now worth billions.”

Google, Amazon and Facebook are explicitly named as CMG customers, but these tech giants have denied using CMG's Active Listening software

The rapid growth of this sector is partly due to the development of highly sophisticated large language models, such as ChatGPT.

These extremely powerful AI tools will make it easier and faster for advertisers and other third parties to sift through our voice data for valuable information, LoCascio said.

“All I have to do is grab one of those transcripts, throw it into the ChatGPT box, and then ask it a simple question. Like, ‘Please tell me what product and services I can sell to someone based on this conversation,’” he explained.

Once that voice data is captured, it can be sold to advertisers to inform and drive targeted marketing. But it can also be sold to other customers, who may use it for entirely different reasons.

“They could be recording those conversations for all sorts of purposes,” LoCascio said.

“It’s one thing to say they’re doing it for advertising, and they can say that, but they’re blindly selling that information to other people. And they’re not scrubbing it, so they’re essentially selling an audio transcript,” he added.

Other examples of voice data buyers include insurance companies, which want to create personalized insurance rates, and the federal government, Polsky said.

“One of the consumers of our information — information about us — everything from our opinions, our preferences, our relationships, our travel routes — is the government,” she said.

And there are even more devious entities who also want to get their hands on our voice data, such as “people on the dark web who want to make a profit by scamming us,” Polsky said.

That means sharing your Social Security Number or other sensitive personal information could put you at risk of identity fraud, LoCascio said.

CMG is an American media conglomerate based in Atlanta, Georgia. The company provides broadcast media, digital media, advertising and marketing services and generated revenues of $22.1 billion in 2022.

CMG did not respond to DailyMail.com’s request for comment.

The leaked deck details the six-step process the company’s Active Listening software uses to collect consumers’ voice data from seemingly any device with a microphone.

It is not clear from the slideshow whether the Active Listening software is listening in continuously or only at specific times when the phone’s microphone is activated, such as during a phone call.

When asked whether lawmakers will take steps to protect the public from this kind of surveillance, LoCascio said it is highly unlikely, and that it probably wouldn’t make a meaningful difference anyway.

Disable an app’s microphone access in just 3 steps

Step 1: Open the Settings app on your phone.

Step 2: Scroll to the app you want to change the settings for and tap on it. This will open a menu where you can see everything that app has access to.

Step 3: If the app has access to your microphone, you will see “Microphone” with an on/off switch next to it. Toggle that switch to OFF to ensure that the app cannot use your microphone.
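Android users comfortable with the command line can also revoke an app’s microphone permission over adb, Android’s debugging tool. The sketch below only prints the command rather than running it, since it requires adb to be installed and a device connected with USB debugging enabled; the package name is a placeholder, not a real app.

```shell
# Placeholder package name — list installed apps with: adb shell pm list packages
PKG="com.example.someapp"

# "pm revoke" withdraws a granted runtime permission from an app;
# android.permission.RECORD_AUDIO is the microphone permission.
CMD="adb shell pm revoke $PKG android.permission.RECORD_AUDIO"
echo "$CMD"
```

The app can still request the permission again later, so this is equivalent to flipping the toggle in Settings, not a permanent block.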

“They can write as many laws as they want, but the bottom line is we absolve them all the moment we click, ‘Yes, I agree to the terms of your service,’” he continued.

That is why it is important for device users to understand the privacy risks of blindly accepting an app’s Terms and Conditions without reading them.

To prevent your voice data from being captured and sold, LoCascio recommends checking all your apps and deleting anything you don’t use regularly.

Once you’ve narrowed down the number of apps, take a look at the ones that remain and think critically about which apps you trust and which you don’t.

For the apps you don’t trust, change their settings to block microphone access. That should prevent them from potentially listening in on your conversations.

And if you downloaded an app for a specific purpose and you no longer need it, delete it, LoCascio said.

If you granted microphone access when you downloaded it, leaving the unused app on your phone means it can keep listening to your conversations at any time.

Polsky added that it’s best to turn off your phone and other devices when you’re not using them.

And ultimately, the most important thing is to educate yourself about the privacy risks your devices pose, she said.

“You can’t trust anyone these days,” LoCascio said.