The Dark Side of AI Gift-Giving: When Chatbots Go Wrong
As a seasoned Santa, I’ve had my fair share of gift-giving experiences. But this year, I decided to try using AI to shop for my loved ones. I asked ChatGPT to play Santa, but its gift for my mom was deadly.
My Family’s Archetypes
I provided ChatGPT with detailed portraits of my family members, but it boiled them down into neat, purchasable archetypes. My sister became a “tasteful, athletic, high-functioning adult with opinions.” My mom emerged as “high taste, low tolerance for stuff, emotionally anchored to place.” My dad landed as “brilliant, opinionated, under-invested in clothes, ready for upgrades.”
AI’s Gift Ideas
ChatGPT’s gift ideas relied on a phantom version of my family – a certain kind of woman: tidy, neutral, unfussy. For my sister, it suggested cookie-cutter gift ideas, including cookie cutters and a claw clip. But my sister bakes brownies and cakes, rarely cookies, and never uses a claw clip.
The Skipping Rope Debacle
ChatGPT suggested a “premium smart skipping rope (counter + metrics)” for my mom, who has had two knee surgeries and whose athletic abilities aren’t what they once were. I laughed, and my dad responded, “Is ChatGPT trying to kill your mother?”
The Wrong Kind of Gift-Giving
There are many ways to be wrong in gift-giving. There’s wrong because you guessed. There’s wrong because you didn’t listen. There’s wrong because you bought something you wanted them to have. The skipping rope wasn’t just a miss. It was a bright, chirpy example of what happens when a system doesn’t know the difference between “fitness” as a category and “fitness” as a complicated, emotional, physical reality.
Source: Link