Just like Geppetto, toy makers have been looking for ways to make their toys come to life since the mid-19th century. After inventing the phonograph, Thomas Edison even remarked that it could "make Dolls speak, sing, cry". This fascination with making the inanimate animate now seems like a not-too-distant dream, as advances in robotics and artificial intelligence (AI) push toys to dizzying new heights. With the news that AI toy maker Snorble (an AI-powered sleep companion for young children) received a $10m investment from GK Venture Partners, interest in the area only seems to be growing. But for all the magic of seeing a child speak to a toy, should there also be concerns over privacy and the unknown effects of these interactions on children's development?
While AI might be relatively new to the world of toys, the fascination with getting dolls to speak to children has been around much longer. In the mid-1800s, small bellows and reeds were put inside dolls to mimic a human voice saying words like "papa". In the 1920s, dolls like Dolly Rekord used wind-up mechanisms to create the impression of a doll singing a nursery rhyme. Since then, a barrage of speaking toys has been introduced to the market: Chatty Cathy, Teddy Ruxpin, Tickle Me Elmo and Barbie, to name just a few, have all had vocal mechanisms to entice children. Although popular with children, talking toys have been criticised for not allowing a child's imagination to flourish. Should this concern be even greater when AI is involved?
AI in toys is not necessarily a new thing. One of the first popular toys with an artificial-intelligence-like system was the Tamagotchi. The toy, whose name is a rough Japanese portmanteau of the words for "egg" and "watch", took the market by storm in the late 90s and early 00s. For those who don't remember, or are perhaps unaware, the Tamagotchi was a small device with a 2D screen that displayed a "pet" requiring feeding and attention through three small buttons. Like a real pet, the Tamagotchi could get ill, go to the bathroom, become potty trained and perform a whole host of attention-seeking actions. The pet was programmed to demand an ever-increasing amount of attention, and could die if neglected. While the Tamagotchi may not have had what we would consider AI today, it created a blueprint for interactive toys that mimicked real life for children.
In more recent years, AI toys have primarily made the news for negative reasons. Starting in 2014, Genesis Toys sold My Friend Cayla, an interactive doll that used the internet to speak to children and answer their questions. The biggest issue, however, was that Cayla recorded conversations with children (and everyone else around the doll), and this data could then be shared with third-party companies, something no parent wants. Mattel's Hello Barbie ran into trouble for similar privacy-related concerns. Through its interactions, the doll could build personas for children, even though most of its users were under the age of 10. Further concerns were raised about the possibility of hackers gaining access to the doll to spy on children or their families.
Without a doubt, concerns should be raised over privacy when it comes to children and smart toys. How the data is used and monitored is of vital importance if we are to create a safe environment for children. Another concern is the potential for these toys to supplant real relationships: why would a child choose to make friends when Hello Barbie can offer conversation without human pitfalls? But perhaps these anxieties are premature. Recent research found that between 2020 and 2021, smart toys made up only 0.1% of the whole toy industry. Perhaps toys like Snorble will begin to change that, and we'll see an increase in the use of AI in children's toys. AI toys can certainly be used to develop good habits and to educate young children, but the ethical and security concerns cannot be ignored.