The BigBrotherAward 2015 in the “Technology” Category goes to

the “Hello Barbie”, represented by its makers, Mattel and Toytalk.

What on earth could be so terrible about a doll that comes with a microphone, speakers and Wi-Fi? Go ahead, you only need to accept the Terms and Conditions! You’re not afraid that your children might be spied upon, are you?

Not to worry: “Hello Barbie” won’t record anything unless a button is pressed.

And then it will send the recording to the Cloud. To Toytalk.

Toytalk is a US company that specialises in speech recognition for children. With its funny little smartphone apps, Toytalk has earned multi-million dollar investments, followed by a partnership with toy maker Mattel.

Speech recognition is not a bad thing as such. It makes it easier to get inconvenient tasks done, and it helps many of us to overcome barriers in everyday life.

We can dictate texts without getting tennis elbow. We can write text messages while driving, without running over a child. We can call our bank without having to argue with a bad-tempered call centre agent.

There is nothing inherently bad about data either. Using data, diseases can be cured, accidents prevented, and deficiencies in politics can be uncovered.

Many services in the digital age are improved through the collection of data.

We can quickly find those Google search results that have been searched many times before. We can avoid traffic jams and bottlenecks that other Google users are caught in at the moment. We can pinpoint the best restaurant in the area, wherever we may be, thanks to other guests’ ratings.

For the use of all these data, we often pay by donating our own data to the large pool. Sometimes this can be a very fair deal. But often it is not.

If all our e‑mails are read so that advertisements relevant to our grandmother’s recent death can be placed next to them, if Amazon knows about our unplanned pregnancy even before we do, or if Facebook can recognise us in photos whether we want it or not – then all of that is just a small preview of the power that lies in all that hoarded data.

Our behaviour is meticulously registered, analysed and interpreted, and manipulated through targeted action. Most of the time the purpose is to sell us something. In short, our behaviour is being monetised.

As informed and mature people, we need to develop a sensitivity as to which data we give into which hands, and what is being done with it. We must walk the fine line between a golden future of digital progress – and the submission of our lives and interactions to the profit interests of a few large companies.

As a mature human being, one has to balance costs and benefits:

If I disclose where I’m going, I can reach my destination more quickly and comfortably?

Okay, deal!

If I let my calendar be hosted online in the US, it will be easier to synchronise between my smartphone and my laptop?

Well, alright then …

I am supposed to allow eavesdropping on my living room, in order to be able to talk to my TV set?

Now, hold on for a moment!

Yes, George Orwell foresaw a lot. But he didn’t think that we’d be buying the panoptic telescreens voluntarily!

And he didn’t imagine them to come in the shape of Barbie dolls either.

Speech recognition, like so many other computing services, has moved to the Cloud. Algorithms are constantly improving, so devices understand us better and better. New functions can be deployed without updating the device software, and on the device itself, storage space and valuable battery life are saved. On the other hand, the large computers in the Cloud get to know ever more of us and about us, in order to give us the most personal treatment possible.

If speech recognition takes place in the Cloud, however, we give up control over the processes behind the services. If, for example, we initiate a search by saying “OK, Google”, or change the channel on our TV using a spoken command, then we are not talking to our device but to a server farm somewhere out there in the vast reaches of the Internet.

And if our smartphone writes a text message to Mum, enters an appointment in our calendar or answers our question, then it is exactly this server farm that remote-controls our device.

A server farm in the hands of a large company, where sound recordings from all over the world are collected, analysed, and stored.

Is this something you are conscious of in everyday life?

Have you ever considered this even once?

One thing is very important to these companies: in order to make it worthwhile to invest in these emerging digital markets, our dear customers need to be made familiar with these technologies as early as possible. Because what was normal in the children’s room is all the more normal later in life.

It begins with Barbie. “Bob the Builder” and Playmobil figures will follow, if customers play along. But Mattel knows the public mood very well: in Europe, “Hello Barbie” will not be introduced at all for now, due to anticipated privacy concerns.

Using Wi-Fi, microphones and speakers, the “Hello Barbie” will let our children talk with a server farm somewhere on the Internet. A technology that is very dubious from the point of view of privacy legislation and without a clear, sensible use case is being put right into children’s rooms.

To make an actual dialogue possible, “Hello Barbie” dolls remember what our children say to them. Later in the conversation they can draw on this information in order to imitate a real friend. And what if the dear parents worry about what their children might tell the doll – sorry, server farm – about themselves? Not to worry! Of course Mattel will let you in on the collected data: there will be a daily or weekly e‑mail with a transcript of all the worries, dreams and secrets that their child has entrusted to their best friend, the server farm of Mattel and Toytalk.

Therefore: heartfelt congratulations on the BigBrotherAward in the Technology category, “Hello Barbie”, Mattel and Toytalk.

[Image above: Mike Licht, Creepy 'Hello Barbie' Doll Will Spy on Your Kids, cc-by-2.0]
