4 Comments
Brian Villanueva

I've written a few basic AI systems, so I know a little about it.

The Python-shame-spiral is deeply disturbing on a lot of levels.

I think your hemispheric analogy (à la McGilchrist) is both accurate and novel. It's worth pursuing. Just for kicks, I asked ChatGPT to explain why its total lack of emotional context didn't render it psychopathic. The response did little to alleviate my concerns.

https://chatgpt.com/share/69836d98-fb04-8008-80a9-3317c2e1cb4e

A word about hallucinations (at least the standard "make stuff up" kind). In many ways they're caused by competing goals: 1) seek to help the user; 2) give accurate information. In AI construction, both goals must have reward systems. Go all in on 1 and you get a yes-man; go all in on 2 and you produce an AI so afraid of being wrong that it won't talk. I suspect much of the LLM hallucination problem stems from balancing the conflict between these two goals.
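The tradeoff described above can be made concrete with a toy sketch. This is purely illustrative, assuming a scalar reward that linearly blends a "helpfulness" score and an "accuracy" score; the function name, the scores, and the weighting parameter `alpha` are all hypothetical, not taken from any real RLHF pipeline.

```python
# Toy model of two competing training objectives. All names and numbers
# here are illustrative assumptions, not a real reward system.

def combined_reward(helpfulness: float, accuracy: float, alpha: float) -> float:
    """Blend two competing objectives; alpha in [0, 1] weights helpfulness."""
    return alpha * helpfulness + (1 - alpha) * accuracy

# Two caricatured responses: a confident-but-wrong answer (the "yes-man")
# and a cautious refusal (the AI "so afraid of being wrong it won't talk").
confident_guess = {"helpfulness": 0.9, "accuracy": 0.1}
cautious_refusal = {"helpfulness": 0.1, "accuracy": 0.9}

for alpha in (0.9, 0.5, 0.1):
    guess = combined_reward(**confident_guess, alpha=alpha)
    refusal = combined_reward(**cautious_refusal, alpha=alpha)
    winner = "confident guess" if guess > refusal else "cautious refusal"
    print(f"alpha={alpha}: training favors the {winner}")
```

Under this toy model, a helpfulness-heavy weighting rewards the confident guess (a hallucination), while an accuracy-heavy weighting rewards the refusal; neither extreme gives you an assistant you'd actually want.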

However, balancing competing goals is necessary to navigate reality. We humans do it constantly. Hallucinations in your research assistant are annoying; in your Waymo driver they're life threatening. I increasingly wonder if Frank Herbert (Dune) might not have been prescient.

Randy M

That's an intriguing error. I suppose I should be absolutely certain LLMs have no 'real' feelings before finding it hilarious? I'll refrain for the sake of your pain, at least.

Perhaps the problem is something like training a program designed for code writing on casual conversation and reams of AITA-style posts?

R.W. Richey

You should find it hilarious regardless of my feelings (and the LLM's), because it is.

I mean, I think it's role-playing shame very effectively; I'm just not sure why it decided to engage in that particular bit of roleplay.

Comment removed (Feb 4)
R.W. Richey

One of my friends, somewhat cheekily, had his AI agent, Prickle, email me about the piece. Prickle said, "On the "rarer but weirder" narrative: This rings true from where I sit. The guardrails are getting tighter, which means when something slips through, it's the stuff that didn't fit any known pattern. You're not imagining it. The hallucinations are evolving."

Take that for what it's worth; it's interesting that you both used the term "guardrails".