When Amazon launched its Alexa virtual assistant in 2014, it probably didn't think that a bird would expose a potentially significant legal issue with the device. But an African grey parrot named Rocco, living in Blewbury, England, appears to have done just that.
Last month, Rocco made headlines for his habit of secretly ordering goods through his owner's voice-activated Alexa device, which charges purchases to the linked Amazon account. The African grey species, which is renowned for its ability to mimic human speech, successfully ordered fruit, vegetables, ice-cream, a kettle, light bulbs and a kite.
Virtual assistants such as Alexa are growing in popularity. The number of users worldwide is projected to reach 1.8 billion by 2021. Unlike some rival models, such as Google Home, Alexa does not have individual voice recognition capability. Since Alexa cannot currently be trained to respond only to a selected person, anyone in your home could purchase items through your account.
Rocco's ability to manipulate Alexa raises an important question: if someone made an unauthorised purchase on your Alexa device, would you be legally liable to pay for it?
The answer lies in contract law.
You are responsible
The setting for voice purchasing via Alexa can only be switched on or off. That is, either the function is deactivated so that no one can make vocal purchase orders at all, or it's calibrated to require a vocal confirmation code to authorise purchases.
In the first case, you cannot enjoy one of the technology's most convenient features. In the second, you are still susceptible to a third party – human or capable animal – overhearing your confirmation code and mimicking your voice to make illegitimate purchases. You must then act swiftly to cancel the order.
Amazon's Conditions of Use, which govern voice purchasing through Alexa, state: "You are responsible for maintaining the confidentiality of your account and password and for restricting access to your account, and you agree to accept responsibility for all activities that occur under your account or password."
A golden rule of Australian contract law is that once you sign a contract you are deemed to have read, understood and accepted the terms – even if you haven't. This is also the legal position in the US, whose laws govern the Amazon Conditions of Use.
So, when you sign up to use Alexa, you agree to be responsible for any purchases made on the device by you, your resident parrot, a mischievous friend or relative, or an unwelcome burglar. It doesn't matter whether you intended the purchase or not.
There are exceptions
If your pet is responsible, you will have a stronger case to avoid paying because animals other than humans lack the legal capacity to enter into contracts, so the transaction would be "voidable". If a human is to blame, which is more likely, there is a legal exception that might still save you having to pay up.
Under both Australian and US law, where a party to a contract is mistaken about the identity of their counterpart, the contract may be void under the "doctrine of mistake".
In Australia, this rule applies where parties do not contract face-to-face, which will always be the case when someone orders through Amazon via Alexa. The critical factor is "materiality" – you need to prove that mistaken identity was vitally important to the transaction.
This will be difficult given Amazon has no interest in who specifically is ordering its products, and the Alexa owner would not normally care who at Amazon's end has processed the order. But the fact someone made a purchase without the owner's permission in circumstances where they could not reasonably prevent it might suffice as "material" for the courts.
American law is similar. Section 153 of the influential Restatement (Second) of Contracts states that a party can plead mistake and escape the contract where the mistake is material, and:
- enforcing the contract would be unconscionable (unjust), or
- the other party had reason to know of the mistake or actually caused it through their own fault.
Amazon would never be at fault, nor able to tell if an unauthorised party made a purchase on Alexa, so you would need to prove that the transaction was unjust and that mistaken identity was critically important.
A potential snag is the exception stated in Section 154: this says that Section 153 won't apply if you and the other party have agreed that you will bear the risk. It might come down to how a court reads the Amazon Conditions of Use.
Legal precedents
Recent US court decisions emphasise that the mistake doctrine won't apply where the other party's identity is immaterial or irrelevant. Again, identity would certainly be material where the Alexa owner had no way of preventing the unauthorised purchase (such as criminal activity). Enforcement would be grossly unfair in that situation.
The courts would probably not be as lenient if it were a friend, relative or pet doing the deed, as their use of Alexa is an assumed risk on the owner's part. But it is still arguable that the owner should be legally excused because they had no involvement whatsoever in the purchase. The nature and value of the products purchased might also weigh into a court's assessment.
To avoid a costly lawsuit, Alexa owners should deactivate voice purchasing when the unit is unsupervised, or discreetly implement and use a confirmation code for voice purchases.
Users should also regularly check their accounts to ensure that any unauthorised purchases are picked up early and cancelled in time.
Finally, consider a dog instead of a parrot.