Chris Middleton explains the implications of Amazon’s latest research – and its efforts to water down consumer privacy protections.

Scare stories about Amazon’s Alexa eavesdropping on customers are nothing new, and they are not surprising. The Echo and other devices powered by Amazon’s digital assistant are always-on products that wait for trigger words and skill-based instructions, so it is no secret that they listen to their owners. How else would they work?
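To make that clear, here is a minimal, hypothetical Python sketch of how a wake-word device of this kind generally works – not Amazon’s code – in which the device keeps only a short rolling buffer of audio locally and streams speech onwards once the trigger word is detected. The functions read_audio_frame(), detect_wake_word(), and stream_to_cloud() are illustrative stand-ins.

```python
# Illustrative sketch only: a simplified wake-word loop, not Amazon's implementation.
import collections
import time

FRAME_SECONDS = 0.03     # duration of each audio frame
PRE_ROLL_FRAMES = 16     # small rolling buffer held on the device before the trigger

def read_audio_frame() -> bytes:
    """Hypothetical microphone read; returns one frame of raw audio."""
    time.sleep(FRAME_SECONDS)
    return b"\x00" * 480

def detect_wake_word(frame: bytes) -> bool:
    """Hypothetical on-device keyword spotter for the trigger word."""
    return False

def stream_to_cloud(frames: list) -> None:
    """Hypothetical upload of the captured utterance for speech recognition."""
    pass

def listen_loop() -> None:
    # The microphone is always on, but only a short pre-roll buffer is kept
    # locally; audio is sent onwards only after the wake word fires.
    pre_roll = collections.deque(maxlen=PRE_ROLL_FRAMES)
    while True:
        frame = read_audio_frame()
        pre_roll.append(frame)
        if detect_wake_word(frame):
            stream_to_cloud(list(pre_roll))  # buffered audio plus what follows
            pre_roll.clear()
```

The point of the sketch is simply that listening is constant by design; what matters for privacy is what happens to the audio after the trigger fires.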

However, a recent Bloomberg report revealed that Amazon employs thousands of workers – both in-house and outsourced – to annotate voice recordings made after the ‘Alexa’ trigger word is heard.

According to Amazon, the purpose is to refine Alexa’s speech recognition facilities by analysing users’ speech patterns, and thus improve the services that customers receive from their devices.

This information has long been hiding in plain sight in Amazon’s Ts & Cs. “We use your requests to Alexa to train our speech recognition and natural language understanding systems,” the company says in its FAQs.

However, the Bloomberg report acknowledges that workers hear private conversations – including things that may be personal, upsetting, or even indicate criminal activity. Staff can discuss their worries on internal chat services, explains the report.

In a statement to Bloomberg, Amazon said that it has strict operational safeguards in place and zero tolerance for abuse, and that employees do not have direct access to data that could identify customers. The implication is that they may have indirect access.

The concept of a spy in the home that not only connects customers – voluntarily – to retail and loyalty opportunities, but also reveals their private lives to thousands of eavesdroppers is, to say the least, a privacy minefield, especially if relations between those workers or contractors and their employer break down.

More, the temptation that such a facility must offer to security agencies, police forces, hackers, and bad actors is overwhelming, particularly when Amazon sells facial recognition systems to the police, and other technologies into the home, such as smart doorbells, home hubs, security cameras, kitchen appliances, and other connected devices.

Indeed, US police have already attempted to call Alexa as a witness in a 2017 murder case.

Yet there are other dimensions to this story, not mentioned in the Bloomberg report and countless others like it. First, in February last year, security company ESET was one of several to find that the Echo and other smart-home devices contained significant security risks or flaws, beyond those set out by Bloomberg.

ESET advised users to set up voice recognition so that only they can use Alexa, to delete recordings of past interactions, to switch off Alexa when devices are not in use, and to protect their Amazon accounts with two-factor authentication. That’s good advice.

Second, Amazon is moving into ever more markets, such as healthcare, insurance, and financial services. In such a vast, amorphous, and aggressive organisation, the lack of ‘Chinese walls’ between departments must be an ongoing concern. For example, the linking of life insurance with fitness programmes and wearable devices is already happening, with US provider John Hancock the first to do so last year.

Regulators must look at Amazon’s growing potential to influence and manipulate markets and create monopolistic positions within them – conceivably informed by data that no other organisation has access to.

Third, Amazon no longer solely provides audio interactions with Alexa-powered devices; it is also moving into cameras, such as its Echo Look ‘style assistant’ device, launched last year in the US. This is designed to connect shoppers not only to the company’s automated, AI-powered services, but also to human advisors.

Put another way, customers are inviting Amazon staff into their own bedrooms and asking them to rate what they are wearing. Are these devices always on? Like any Alexa device, might they be activated accidentally by trigger words or phrases? Are these advisors incentivised to up-sell the customer more product? And can Amazon staff watch people in their homes at will?

Without wishing to be sensationalist, any device/service that encourages anyone into a commercial relationship with strangers via a camera in their bedroom must, surely, be looked at by regulators.

And fourth – perhaps most significantly for longer-term cyber security and privacy concerns – Amazon is actively researching a world of customer interactions beyond the ‘Alexa’ trigger word.

In April last year, Ruhi Sarikaya, head of the Alexa Brain group, explained how the company is training its digital assistant to learn more about individual users and to remember personal details about them.

More, the company is developing Alexa’s ‘context carryover’ feature, which will allow it to understand and respond to follow-up questions without the ‘Alexa’ trigger. “Our goal is to enable more natural interaction with all of these IoT devices, and for these devices to more proactively engage with us,” explained Sarikaya.
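To make the idea concrete, the sketch below shows one way ‘context carryover’ can work in a dialogue system, with slots from a previous turn reused to resolve a follow-up question. The code is a hypothetical Python illustration, not Amazon’s implementation, and all of the names in it are invented.

```python
# Illustrative sketch only: slot carryover between dialogue turns.
session_context = {}  # per-user state carried between turns

def handle_turn(user_id: str, intent, slots: dict) -> str:
    """Resolve a turn, inheriting missing intent and slots from the previous one."""
    previous = session_context.get(user_id, {"intent": None, "slots": {}})
    merged_slots = {**previous["slots"], **{k: v for k, v in slots.items() if v}}
    resolved_intent = intent or previous["intent"]
    session_context[user_id] = {"intent": resolved_intent, "slots": merged_slots}
    return f"intent={resolved_intent}, slots={merged_slots}"

# "What's the weather in Seattle today?" followed by "What about tomorrow?"
print(handle_turn("user-1", "GetWeather", {"city": "Seattle", "date": "today"}))
print(handle_turn("user-1", None, {"date": "tomorrow"}))  # 'city' carries over
```

In the second call, the ‘city’ slot is inherited from the first – the kind of natural follow-up Sarikaya describes – but it also shows why such a system must retain state about the user between utterances.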

“The profound difference in this emerging era is that with the benefit of AI and machine-learning technologies, Alexa and similar services can learn about you and conform to your needs, instead of you having to conform to the system’s interaction model.”

Except that is precisely what Alexa users are doing: conforming to Amazon’s interaction model. And the clear direction of travel is for the assistant to power devices that have natural language interactions with their owners, and which may actively intervene in conversations and decisions.

That means they will not only always be on, but always monitoring conversations and learning about their owners. But because these are cloud-linked devices, not ‘islands’ of intelligence in a walled-off home, this information may be made available to thousands of people worldwide, and linked to cameras, hubs, security devices, and more.

And let’s not forget that Alexa-powered devices are increasingly being sold as corporate solutions too, with rising numbers of enterprise skills. So Amazon may be monitoring boardrooms and meeting rooms too.
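For readers unfamiliar with what a ‘skill’ looks like in practice, here is a minimal custom-skill handler written with the Alexa Skills Kit SDK for Python (ask-sdk-core). The ‘BookRoomIntent’ meeting-room scenario is invented purely for illustration; it is not a real Amazon enterprise skill.

```python
# Illustrative sketch of a minimal Alexa custom skill using the ASK SDK for Python.
# The intent name and speech text are invented for this example.
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_intent_name

class BookRoomIntentHandler(AbstractRequestHandler):
    """Handles a hypothetical 'BookRoomIntent' for a meeting-room booking skill."""

    def can_handle(self, handler_input):
        return is_intent_name("BookRoomIntent")(handler_input)

    def handle(self, handler_input):
        speech = "OK, I have booked the meeting room for the next hour."
        return handler_input.response_builder.speak(speech).response

sb = SkillBuilder()
sb.add_request_handler(BookRoomIntentHandler())

# Entry point when the skill backend is deployed as an AWS Lambda function.
lambda_handler = sb.lambda_handler()
```

Every utterance handled this way passes through Amazon’s cloud before the handler ever sees it – which is why the boardroom raises the same monitoring questions as the living room.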

So what protections do consumers and others have in such a fast-emerging world? In the US, proposed legislation to outlaw unauthorised recordings from Internet of Things (IoT) devices has just been watered down – by an alliance of companies that includes Amazon and Google.

The Keep Internet Devices Safe Act was passed by the Illinois Senate on 10 April 2019. Originally, the bill sought to ban recordings from being kept by the device and/or the manufacturer without explicit permission, and to give owners the right to sue manufacturers that did not comply.

However, after aggressive lobbying from the Internet Association – a vendor group that is ramping up its presence at regulation and privacy conferences – the bill has been amended to remove the proposed ability for consumers to initiate private legal actions.

According to the Association, the change was made to prevent frivolous or meritless litigation, but the presence of industry muscle at the negotiating table is becoming increasingly commonplace – and its message isn’t subtle. The need for such an organisation is questionable, given that regulators and legislators are already grappling with a technology sector that has become too powerful and unaccountable.

And that’s not all. The Internet Association has succeeded in amending the wording of the bill to say, “No private entity may turn on or enable a digital device’s microphone unless the registered owner or person configuring the device is provided certain notices in a consumer agreement (instead of a written policy).”

Put another way, companies have persuaded legislators to push the onus back onto consumers to educate themselves about what they are agreeing to.

However, as many of us see friction of any kind as a disincentive to use digital services, the likelihood is that most people will click, select, or say ‘Agree’ without having much idea of what they are agreeing to. ‘Certain notices’ is hardly a beacon of clarity, and the person agreeing may be a minor, or someone with no connection to the household at all.

That missed opportunity is no recipe for security by design in a connected world. The legislators blinked first, and – perhaps – Alexa recorded them doing it.
