We recently talked about times AI got it really wrong, and here are 20 of the most ridiculous stories you shared.
1. The fake initiative
Our execs usually send out a hype email right before the annual employee morale survey, emphasizing wins from the past year, basically trying to put people in a positive frame of mind.
Last year’s included the announcement of a major new program we knew employees really wanted. But it was a bit surprising, because it fell in an area my team was responsible for, and we were out of the loop, despite advocating strenuously for this over the years. So I went to the exec to a) convey enthusiasm for his newfound dedication to launching this program and b) ask what support he needed from my team/get us involved again. It turned out the program wasn’t launching at all; he had just asked AI to edit the email to make it sound more exciting and appealing, and it had done so by … launching my initiative.
2. The predator
Sometimes at work my colleague uses AI in Photoshop to extend or clean up the background in a photo. We had a photo of a senior staff member outside: the background shows a building to the left of him and some trees and a road to the right, but it was portrait and we needed landscape. He asked Photoshop to extend the background on the right.
It generated a scary-looking woman creeping up behind the staff member.
3. The nickname
I was on a Zoom call with AI notetaking software and was referring to a colleague named Bridget – but on the transcription, every time I specifically mentioned her name, it appeared as “Piglet.” This did not happen when others on the call said “Bridget”! It looked like that was just my nickname for her. I was so embarrassed.
4. The fake charity
My company hired an account manager who insisted he was a phenomenal writer and asked if he could contribute to our blog. The first pieces were just AI slop, so I politely thanked him and said we had plenty of posts already.
So he posts a third “article” on his own LinkedIn account in which the AI described how our company collaborated with the CDC on researching a certain disease and publishing a groundbreaking study. Then we apparently went into underserved communities and funded a bunch of clinics and immunizations. NONE of this happened. It was hours before I saw it and forced him to take it down, and there were many surprised comments and shares. Months later, we were nominated for an award on our commitment to caring for vulnerable populations.
5. The transcript
I forgot the meeting was being transcribed and was talking to my cat while waiting for others to show up. “Baby, let me put it in” was at the top of the transcript to my absolute horror.
I was talking about his ear mite drops.
6. The grievance meetings
At my former workplace, the HR director did not know that her AI notes tool was recording her confidential grievance meetings with the union representatives and sending a full recap after each one to everyone on the calendar invite, even if they weren’t in attendance. We got an email after a bit saying no one was allowed to use AI note takers any longer.
7. The “verifiable information”
Me: I’m doing a competitor analysis on [product type] for [customer segment]. Please give me an overview of all the [product type] products offered by banks in [my country] for this type of customer.
AI: (gushes) Sure! What a fantastic question, you’re a genius! (paraphrasing). Here is the overview.
Me: (searching for one of the product names listed … cannot find it anywhere) I can’t find this product anywhere. Did you make it up?
AI: Ooooh … did you mean actual products? Sorry! In future I’ll only reference verifiable information.
Me: (eye roll, crying into coffee mug, closes AI window)
(It continued to manufacture content.)
8. The job description
My mother is on the board of a wildlife habitat nonprofit. They work with wetland preservation and with both bats and owls. They were looking for a new director, so someone on the search committee decided to have AI make up the job listing. It included several useful traits (a reasonable amount of education, experience with fundraising, etc.) – but it also said the position required “five years’ experience teaching birds to fly.”
They rewrote the job listing.
9. The performance review
I had an employee request to use an AI to take notes during her performance review. The summary was one line: “No meaningful conversation took place.” I was glad I decided to take pen and paper notes because it was a very productive conversation indeed. Apparently the AI disagreed!
10. The baby announcement
At the end of a meeting, a colleague asked their boss to stay on the line for a couple of minutes. The colleague then confidentially shared the great news that they were expecting a baby, and they and their boss talked about a few next steps to plan for parental leave. The AI notetaker then sent out notes to everyone who had attended the meeting with the headline, “Colleague Is Having a Baby.”
11. “Dazzling you”
I’ve been involved in beta-testing and quality-controlling AI translation output because my employer wants to see if it has utility in professional use cases. Here are some highlights:
– In an AI translation of a report about elder abuse, it randomly inserted the word “child” in front of the word “abuse” in various places. The concept of “child” did not appear in the source text at all.
– Every single abbreviation in the text was incorrect in a different way every single time. There was not a single correct abbreviation, and not a single abbreviation was translated the same way twice.
– The word “negro” was randomly inserted into a sentence for no apparent reason. This was early in my exposure to AI translation and I had no idea it could mess up that badly, so I spent ages trying to figure out whether there were hidden racist dogwhistles in the source text. A colleague of mine also had a recurring problem of the word “bitch” randomly being inserted into sentences.
– Random misnegation – for example, the statement “more work is being done” was translated as “no more work is being done.”
– It translated the standard “Dear Sir or Madam” opening of a letter as “Dazzling you.”
– Rewording the source text in the source language rather than translating it. Yes, all the settings were configured correctly.
12. The PowerPoint
I asked Copilot to create a table comparing two things. It did an okay job. Then Copilot asked me if I wanted a PowerPoint slide of the table. I said sure, since I was going to put it into PowerPoint anyway. Copilot created the ugliest PowerPoint I have ever seen. Three slides (I only needed one) with a color scheme of lavender, salmon pink, and orange. The background of each slide had kind of a plaid pattern a coworker said reminded her of her grandmother’s couch. A random picture in a cartoon cloud shape.
However, that is better than our company’s internal AI. It doesn’t have the ability to output content into PowerPoint, Excel, etc., but it thinks it does. It’ll offer to create one for you and then do nothing. Coworkers have spent ages trying to figure out where AI is saving their non-existent files.
13. The comp titles
I work in publishing and I wanted to do some research on competing titles for a potential book we had in the pipeline. Asked AI for the bestselling current books on the topic, and it came up with a list that had some really interesting titles on it – great, I thought, I’ve never heard of half of these so we definitely need to check them out. Yep – turns out the AI had just completely made them up.
14. The editable document
Me: Copilot, can you turn this scanned PDF into an editable Word document?
Copilot: Sure thing, Another Kristin, here you go!
Me (after opening the file): Copilot, this file is completely blank.
Copilot: Sorry, I made a mistake, here it is!
Me: (opens second file, sees that it is also blank, closes AI window and puts in request for OCR software)
15. The attack
A friend of mine showed me an AI summary of a meeting where the AI notetaker decided to attack someone for no reason – in the middle of the notes about what everyone was saying, it inserted, “Jane contributes nothing to the conversation.” I guess it was accurate because the coworker had been quiet since that part of the meeting wasn’t relevant to her projects … But why did it do that???
16. The scam
We work with a lot of small businesses just starting up, and as a result are often asked to recommend professional services. Knowing this, a client passed on a discreet warning about the bookkeeping firm we’d recommended to them. They had issues with their accounting software, QuickBooks, and called for help. It was right when Google started providing AI summaries for everything, and apparently their account rep pulled the phone number for QuickBooks’ support out of the AI summary, rather than off the website.
You can probably guess where this is going. The number wasn’t legit, but instead put him in contact with a scammer who’d managed to astroturf their way into the AI summary. The account rep gave the scammer full access to our client’s accounting software before he realized his error. Our client didn’t share a lot of details about the damages — I got the sense that they were saying very little because they were planning legal action — but they wanted to let us know so we wouldn’t recommend them again.
17. The transcript, part 2
A woman I work with introduced herself before an online presentation. Her last name is Buckman. The AI transcriber recorded her introduction as “Hi, I’m Amelia. F*ck, man, it’s nice to see you all today.”
18. The transcript, part 3
Two people stayed on the call after the rest of the team had left and complained about others on the project. Not only did the transcription record this, it tagged the individuals being discussed in the summary as an action item: “@Jane needs to stop dragging her feet and get her sh*t together.”
19. The equipment
I recently saw a ~$50,000 piece of industrial equipment damaged and taken out of commission for about a month because Google AI search told a worker that the tightening torque of a screw was 50 ft*lb instead of 50 in*lb.
This resulted in them over-tightening the screw by a factor of 12, which unfortunately didn’t strip the threaded hole (which would have been a smaller problem) but instead warped a bearing assembly that required a full rebuild at considerable difficulty and expense.
The kicker is that the correct torque value was clearly printed in the service manual that is stored in the machine.
20. The privacy expert
We once had an IT person come into a meeting to talk about the importance of data privacy and security who didn’t realize he had an AI notetaker signed in until someone pointed it out.