A.I.: Annoyingly Intrusive – Lessons from Clippy the Paper Clip
By Cecilia Sepp, CAE, ACNP, LPEC
Hammers are useful tools. They can help put things together or help take them apart. You pick up a hammer when you need it; otherwise, it sits in the toolbox waiting until you are ready to use it. But what if your hammer jumped out of the toolbox every time you walked by, forced itself into your hand, and started trying to hammer things, even if you didn’t want it to? It would take your time and energy to keep stopping the hammer, and it would interrupt your flow of thought, work, and focus.
The whole world is talking about artificial intelligence and how we can use these tools from our digital toolbox to “get things done,” very much like a hammer. A.I. is supposed to help us do things faster and better, and to keep us more focused. We are told that A.I. will make suggestions and support us. And yet, increasingly in my own experience, I find that it is just annoyingly intrusive.
It takes up RAM, slows my computer down, and constantly interrupts when I am trying to work. Between telling Siri no, I am not talking to you; telling Microsoft Copilot no, I do not need you to write something for me; and telling Gmail no, I do not want a synopsis of the message (I want to read it!), I barely have enough time to enter my daily calories in my fitness app, where I also have to fight off “helpful” A.I. tools.
I find A.I. less and less useful and less and less helpful. You might be thinking to yourself, “Well, you know, Cecilia, maybe what you need to do is get a paid account with ChatGPT or Claude or some other A.I. tool. Then you would get it.” However, that is not really the problem, since I do like to use Google Gemini – when I choose to use it. Not when it tells me to use it.
The problem with the intrusion of artificial intelligence is this: companies and organizations have adopted and integrated it in a way that borders on harassment. In their mad dash to not be left behind, to save money, and to jockey for position, they forgot one thing: tools are supposed to be useful. They are not supposed to get in the way.
Which brings me to the Microsoft tool, Clippy the Paper Clip. Remember Clippy? It was that cute little icon that popped up in Microsoft Office when you were trying to work. Do you know why Clippy was retired? Interestingly, I used Google A.I. Assistant to answer that question:
“Clippy was hated primarily because it was intrusive and condescending, interrupting users with unhelpful or obvious advice while hindering their productivity. Its persistent, unwanted presence, coupled with a design that was seen as patronizing and occasionally creepy, made it feel more like a nuisance than a helpful assistant. Users also felt a lack of control, as the assistant often failed to learn preferences or adapt its suggestions, despite being designed to foster empathy and connection.”
Hmm. Sounds a lot like how I describe A.I. tools today: intrusive and interrupting. Now, you might say, “Well, you know, Cecilia, today’s A.I. tools do learn to write like you, learn your preferences, and will mimic empathy and connection.” Notice that I say “mimic” because it’s just software, folks. It isn’t human.
Clippy was an early version of an A.I. tool that was meant to make things better and to be helpful but ended up being quite the opposite. Tools are meant to be wielded by humans when humans need them, not when the tool decides it wants to be used. We need to take a stand on how far we are willing to allow A.I. to go before it takes over every system we have.
Humans like to anthropomorphize the things around us, hence naming software tools Clippy or Claude. It makes us feel more comfortable in our environment, but it also distances us from the problems these tools cause. Many of my friends are switching to Claude and say they really like it, but I wonder if Claude will suffer the same fate as Clippy. A.I. tools are too much like the friend who is going to stay on your couch for just a few days but never leaves. Eventually you have to kick them out because they are not the friend you thought they were.