A shrink’s take on Apple’s new personal assistant application

Dec 8, 2011 13:08 GMT  ·  By

Dr. Keith Ablow, the author of ‘Inside the Mind of Casey Anthony’, has been using Apple’s personal assistant for a while now, and he has reached the conclusion that Siri is potentially harmful. Very harmful, in fact.

“From my perspective as a psychiatrist, Siri, the iPhone’s virtual assistant, could prove more toxic psychologically than violent video games or some street drugs,” Dr. Ablow writes for Fox News.

Anyone reading the opening lines of his analysis will probably be curious about his arguments. And I can tell you right now they’re not bad. But they might not be 100% accurate either.

Dr. Ablow starts by enumerating some of the complex commands Siri is able to take and respond to. He offers examples like serving up results for pizza restaurants based on your location.

The psychiatrist is careful to point out that it’s not Siri’s powerful ability to carry out these tasks that poses a threat. What worries him is the verbal communication between the iPhone user and the personal assistant.

Apple gave Siri a voice, and even a sense of humor - an all-too-human trait, as everyone can agree. And this is where it gets interesting, says Dr. Ablow.

“Siri is even funny. Tell her you love her, and she replies, ‘All you need is love. And your iPhone’. Or, ‘You are the wind beneath my wings’.”

“Funny, right? Well, not really—not when you stop to consider that you have just been coaxed to interact with a virtual entity,” he explains. “Perhaps without thinking about it, you have tacitly agreed to use a proper name to refer to a computer program, to agree the computer program has a gender, to laugh at “her” quips and to rely on her to guide you to places to eat or to give you a reminder about when to call home.”

The shrink acknowledges that even some fellow psychiatrists would say “this is all entirely harmless.” He believes otherwise.

“But I believe that personifying machines and interacting with them as quasi-beings actually dumbs down our interpersonal skills and encourages us to treat other people like machines,” Dr. Ablow writes. “Ultimately, it diminishes our ability to empathize with one another, because we’ve been chatting up a non-existent person and can get used to considering real people as essentially non-existent, too,” he concludes.

Editor’s note: I, for one, believe in common sense - something that most psychiatrists seem to overlook. I must stress that Dr. Ablow seems to be right about most of the points raised in his analysis. But I wouldn’t go as far as saying Siri will make me think less of fellow humans. If anything, its glitches will always act as a reminder that it’s a machine working with algorithms. Granted, a convincing one at that.

What’s your take on this? Sound off in the comments.