Amazon is being sued over its smart assistant's recordings of children.
Two US cases allege the firm lacks consent to create voiceprints that could let it keep track of a youngster's use of Alexa-enabled devices and thus build up a "vast level of detail about the child's life".
Amazon has said it only stores data once a device-owner has given it permission to do so.
And it says parents can delete a child's profile and recordings.
Lawyers in the cases are seeking damages for the two plaintiffs, as well as for others who are being invited to join class-action lawsuits in nine states where Amazon is alleged to be in breach of privacy laws.
Amazon said in January more than 100 million devices featuring Alexa had been sold worldwide, ranging from its own Echo speakers to third-party products including headphones, fridges and televisions.
"Amazon has a longstanding commitment to preserving the trust of our customers and their families, and we have strict measures and protocols in place to protect their security and privacy," a spokeswoman told the BBC.
How Alexa works
Software on enabled devices listens out for a wake word - which can be set to "Alexa", "Amazon" or "computer". If the word is detected, audio captured just prior to it, as well as what was said immediately afterwards, is transmitted to Amazon's computer servers for processing.
Because mistakes are sometimes made, recordings can be transmitted when the wake word is not actually used.
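The mechanism described above - keeping a short rolling buffer so that audio from just before the wake word can be included in the clip - can be illustrated with a minimal sketch. This is a conceptual simulation only, not Amazon's actual implementation: the frame format, the wake-word list, and the end-of-utterance marker are all assumptions made for illustration.

```python
from collections import deque

WAKE_WORDS = {"alexa", "amazon", "computer"}  # illustrative word list
PRE_ROLL_FRAMES = 3  # how many frames of pre-wake-word audio to keep

def process_stream(frames):
    """Simulate wake-word capture over a stream of (word, audio) frames.

    A small rolling buffer holds recent audio so that, when a wake word
    is detected, the clip sent off for processing can include what was
    said just before it. Returns the list of captured clips.
    """
    pre_roll = deque(maxlen=PRE_ROLL_FRAMES)
    clips = []
    capturing = None
    for word, audio in frames:
        if capturing is not None:
            capturing.append(audio)
            if word == "<end>":  # assumed end-of-utterance marker
                clips.append(capturing)
                capturing = None
        elif word in WAKE_WORDS:
            # Start a clip that includes the buffered pre-wake-word audio.
            capturing = list(pre_roll) + [audio]
        else:
            pre_roll.append(audio)
    return clips
```

Note that in such a design, a word misheard as a wake word would trigger capture in exactly the same way, which is why recordings can be transmitted even when the wake word was never actually used.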
The recordings are stored, allowing Amazon to build a model of each user's voice characteristics. This helps the service adapt to quirks in the way different people make requests, and lets it provide tailored responses to different users in the home.
Registered users can prevent this happening by withdrawing consent. They also have the option to actively train the system to better recognise their voice by repeating a series of phrases.
Human operators sometimes listen to the clips to tag them in order to help the machine-learning system involved become more accurate.
Users can delete stored utterances via an app or via Amazon's website. In addition, they can ask Alexa to delete the last recording or last day's worth of recordings via a voice command.
Two class action cases are being pursued, one filed in Los Angeles on behalf of an eight-year-old boy and the other in Seattle on behalf of a 10-year-old girl.
The children are said to have used Alexa to tell jokes, play music, recognise movie references, solve maths problems and answer trivia questions.
In both cases, the children had interacted with Echo Dot speakers in their homes, and in both cases the parents claimed they had never agreed for their child's voice to be recorded.
The complaints say Alexa devices could have been designed to send only a digital query, rather than a voice recording, to Amazon's servers. Processing the audio locally would have disadvantages, however, such as potentially driving up the cost of the devices and making it harder for Amazon to deploy updates to its voice-recognition technology.
Alternatively, it is suggested that Amazon could automatically overwrite the recordings shortly after they have been processed, although this might affect the smart assistant's ability to deliver personalised replies.
Even if neither of these options were adopted, the plaintiffs suggest that more could be done to ensure children and others were aware of what was going on.
"At no point does Amazon warn unregistered users it is creating persistent voice recordings of their Alexa interactions, let alone obtain their consent to do so," the complaints state.
"Neither the children nor the parents have consented to the children's interactions being permanently recorded."