After implanting its always-on, always-listening speaker in your living room, tech giant Amazon wants to take the next step: peer into the depths of your soul and understand your feelings.

In a recent interview with VentureBeat, Alexa chief scientist Rohit Prasad said Amazon's Alexa team is beginning to analyze the sound of users' voices to recognize their mood or emotional state.

Naturally, like every other tech company that monetizes users' time and attention, Amazon professes good intentions behind the change. And to be fair, emotion analysis can serve some positive purposes.

According to Prasad, the initiative will enable Alexa to personalize and improve customer experiences, reduce frustration by responding to queries based on users' emotional states, or scan voice recordings to diagnose disease.

Amazon says the plan is to recognize when you're frustrated and try to help with that

But that’s not all there is to Amazon’s emotion analysis project.

An anonymous source first revealed details of Amazon's plans to track frustration and other emotions to MIT Technology Review last year, saying that Amazon views it as a way to stay ahead of competitors.

Thanks to its first-mover advantage, Amazon holds the lion's share of the smart speaker market.

But it is facing fierce competition from Google Home, which launched in late 2016.

Powered by AI and machine learning algorithms, smart assistants heavily rely on user data to improve their performance and features. Amazon already has a fair amount of data about its customers from its voice recordings, but not nearly as much as Google does, and it will only be a matter of time before the Home puts Alexa in its rear-view mirror.

That is, unless Amazon manages to compel its customers to give up more of their data to train its data-hungry algorithms.

A cynical take is that emotion recognition could help Amazon play user-data catch-up with Google

Emotion analysis could be a defining factor in this regard. Presently, Alexa can only respond to short queries and can’t participate in a complicated conversation.

According to Prasad, better understanding of user emotions will lead to lengthier conversations with the AI assistant.

Naturally, lengthier conversations will enable Amazon to gather more data about its customers’ tastes and preferences and target them in profitable ways.

Visual online platforms like Facebook monitor your every move: every post you like, share, click on, or even pause over. They use this data to show you ads you're more likely to click on, a business model that has been under a harsh spotlight in recent weeks.

Alexa’s interface with its customers is the speaker, which poses a different challenge.

The more Amazon knows about your preferences, the easier it will become for it to manipulate its assistant’s responses to your queries in profitable ways.

For instance, it might start making product suggestions in-between its responses based on the data it has about you.

But things will get creepy if Amazon really manages to discern your feelings in real time.

Tech giants have already proven we can’t trust them with our emotions.

Facebook’s privacy struggles aren’t a great sign that Big Tech can be trusted with so much personal data

Last year, a leaked report revealed that Facebook was considering allowing advertisers to target teenagers when they felt “insecure,” “worthless” and “in need of a confidence boost.”

A few years earlier, Facebook conducted a mass experiment that involved manipulating the emotions of hundreds of thousands of users.

How will Alexa evolve when it starts to know about your emotions? Imagine having a conversation with the AI assistant in which you ask it for advice on a topic, exactly the kind of exchange Amazon aims to achieve.

How can you trust Alexa to answer with your welfare in mind, and not the interests of its masters at Amazon headquarters?

Another way things could go wrong is when developers start integrating emotion features into their Alexa applications.

We've already seen how Cambridge Analytica used Facebook's vast store of data and news feed algorithms to target users with fake news and other questionable content, without Facebook being able, or willing, to stop it in time.

How long will it be before Amazon loses control over its platform and third-party developers start putting Alexa's increasingly smart features to malicious use?

Of course, this is all speculation.

Maybe Amazon intends to fight loneliness and help its customers live healthy lives.

But until it proves that it values those good-natured goals above its own bottom line and competitive edge, I’ll give myself the benefit of the doubt.

In the meantime, I suggest you think about what you're giving up the next time you utter Alexa's magical wake word.