Go Go Gadget....Artificial Intelligence
Or, how to get stranded on a date. Nothing in between.
I might be in the wrong Yahoo chat room, but I see an emerging challenge with the misuse of AI in the workplace that I don't see addressed much.
Several times, I've been stuck fielding questions or authoritative comments about Cyber Crucible, Inc., about myself, or about my team that are patently incorrect, and it defies logic why they are said at all.
Please notice I left "technical topics" off of that list. I'm not talking about someone needing to be educated on a topic - that's healthy, and we *should* be sharing knowledge on things we know a lot about!
Rather than get lost in specific examples, let's do a fun little dating analogy. Admittedly, a game I have not played in many years.
Your date shows up with a bunch of information about you, which they're happy to share with you, even though a bit of thought should have given them cause for pause.
"How do you live without driving?"
Sir/Miss -- you are literally sitting in my car right now. I picked you up and brought you here to the restaurant in said car.
Sounds more horrifying than normal dating, I assume.
Would you go on a second date? Unlikely. You might even pay for their cab home.
I have seen this type of Twilight Zone engagement around ten times in the past nine months or so. The only explanation I can come up with, since most people are not severely mentally ill, is that AI is spitting out incorrect questions for them to ask, or incorrect information, and it is not being reviewed before it goes out.
In my company, we've even queried different AI tools just to make sure there's no craziness in the AI answers. There's some improvement to be had, for sure; AI is not as foolproof as Inspector Gadget. But we didn't get wildly, inappropriately wrong information.
So while I may occasionally be confused with Dennis Underwood the water treatment manager when someone questions my experience in cybersecurity, a 15-second double-check of whatever AI spit that out probably would have saved some unspoken embarrassment.
I'm not going to use crude language, and there really isn't "an assistant" they can truthfully blame. A real EA would likely NOT give their boss bullet points on Cyber Crucible's metallurgical operations.
So I just try to move the conversation past what amounts to Family Guy skit material.
I can't be the only one.
The examples I gave are roughly as farcical as the real ones from the past several months. They're not more frequent, thank goodness, or I'd start wondering if I'm on The Truman Show.
Maybe we just need to double-check what our AI tools spit out, BEFORE it gets emailed around.