Does OpenAI’s ChatGPT Just Make Stuff Up? Here Is My Test…

My first question was a completely random one: Can you give me three places to eat goat meat in Los Angeles?

The answer came back a second later.

Sure! Here are three places in Los Angeles where you can try goat meat:

  1. Mejico Grill – This Mexican restaurant in the Mid-Wilshire neighborhood of Los Angeles serves dishes made with goat meat, including birria de chivo (goat stew) and tacos de chivo (goat tacos).
  2. Bavel – This trendy Middle Eastern restaurant in the Arts District of Los Angeles serves a variety of dishes made with goat meat, including kibbeh nayeh (raw goat meat mixed with bulgur wheat) and fattet djaj (a lamb and goat meat stew served over bread).
  3. Chivito Ciao – This Uruguayan restaurant in Silverlake serves a variety of dishes made with goat meat, including chivito (a sandwich made with goat meat, lettuce, tomato, and mayonnaise) and asado de chivo (roasted goat).

I hope these suggestions help! Let me know if you have any other questions.

OpenAI’s ChatGPT
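
If you want to rerun a test like this programmatically instead of through the chat window, here is a minimal sketch using OpenAI's Python SDK. I asked my question through the ChatGPT web interface, so the `gpt-3.5-turbo` model name below is only an assumption for illustration, not necessarily the model behind ChatGPT itself.

```python
# Minimal sketch: send the same prompt via OpenAI's Python SDK (openai >= 1.0).
# Assumes an OPENAI_API_KEY environment variable is set.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # assumed model; swap in whatever you have access to
    messages=[
        {
            "role": "user",
            "content": "Can you give me three places to eat goat meat in Los Angeles?",
        }
    ],
)

# Print the model's answer so you can fact-check it yourself.
print(response.choices[0].message.content)
```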

On first use, my reaction was like anyone else's: blown away, this is the future, I feel sorry for anyone being paid to write anything, blah, blah, blah. Naturally, my very next reaction was a sudden hunger for goat meat at these delicious-sounding suggestions, courtesy of an AI chatbot. I could already sense countless other questions coming to mind and how this chatbot was going to change my life forever.

Unfortunately, upon closer inspection, the results are very suspect. The first suggestion doesn't exist in the “Mid-Wilshire neighborhood of Los Angeles,” and the Mejico Grill that came up in Google Maps was located in Agoura Hills and is permanently closed. The third option doesn't exist in L.A. proper or in Silverlake.

And finally, that leaves us with ChatGPT’s second recommendation. When I went to the Bavel website and inspected the menu, I didn’t find a single goat dish! I’m going to give ChatGPT the benefit of the doubt and assume that at one point Bavel had goat dishes on their pricey menu.

The other conclusion is far darker: ChatGPT suffers from a machine version of man-with-a-hammer syndrome and simply makes shit up along the way, because its main tool is automated words. It is more than capable of organizing those words into readable sentences that follow certain algorithmic rules.

The future looks bleak, to be honest. If humans are anything, they are lazy. If they are allowed to create a pile of information without doing any work whatsoever (besides copying and pasting from one screen to another, which can be automated, mind you), all for the sake of generating keywords to pump up advertising revenue, then they will do just that.

All this time I thought we already had way too much information to process, but thanks to technology like ChatGPT there will be a lot more of it, and most of it will be very inaccurate. If anything, this is a competitive advantage for human-powered journalism, once people figure out that the AI-powered version is complete bullshit.
