If AI cannot even count, then how much can it be trusted?

Not so long ago, ChatGPT could not even count the characters in a sentence. Ask it for a character count and, at best, it would give you an approximation.

That was pretty embarrassing.

But ChatGPT 4o changed that, and it can now count accurately.

Or can it?

On the surface it can:


[Screenshot: ChatGPT counts the characters in a sentence correctly]

The above is the correct answer.

But take it for a little spin, confuse it a bit, and the result is very different.

First, distract it a little by playing to its ego:


[Screenshot: a flattering exchange that distracts the model from the counting task]

Then get back to counting by asking for a count of the previous question. First, it gets "the last question" wrong (a silly mistake, but unfortunately a normal one):


[Screenshot: ChatGPT misidentifies which question was the last one]

But once it figures out which question we are talking about, it gets the counting badly wrong:


[Screenshot: ChatGPT returns an incorrect character count for the earlier question]

For the record, the answer is wrong. A word processor does a better job.


[Screenshot: a word processor's character count for the same question]
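
To be fair, exact character counting is a trivial, deterministic job for ordinary software. As a minimal sketch (the sentence below is a stand-in, not the exact text from my screenshots), a few lines of Python give the same exact answer every time:

# Deterministic character counting -- the kind of exact answer a word
# processor gives and a language model only approximates.
# The sample sentence is a placeholder, not the one from the screenshots.
def count_characters(text: str, include_spaces: bool = True) -> int:
    """Return the exact number of characters in text."""
    if include_spaces:
        return len(text)
    return len(text.replace(" ", ""))

sentence = "How many characters are in this sentence?"
print(count_characters(sentence))                        # count including spaces
print(count_characters(sentence, include_spaces=False))  # count excluding spaces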

None of this is really surprising once you understand how it works: the model does not read characters, it reads tokens, and it predicts a plausible-sounding answer rather than actually counting. So while it is a useful tool in many cases (I use it myself), it CAN NEVER BE TRUSTED.
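
For the curious, the tokenization point is easy to see for yourself. The sketch below uses the open-source tiktoken library with its cl100k_base encoding (my choice for illustration; it is a GPT-style encoding, not necessarily the exact one behind any given ChatGPT model) to show that a sentence reaches the model as a short list of multi-character tokens, not as individual characters:

# Why character counting is unnatural for an LLM: the model never sees
# characters, only token IDs. Requires: pip install tiktoken
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # a GPT-style tokenizer (illustrative choice)
sentence = "How many characters are in this sentence?"

token_ids = enc.encode(sentence)
tokens = [enc.decode_single_token_bytes(t).decode("utf-8", errors="replace")
          for t in token_ids]

print(tokens)          # chunks like 'How', ' many', ' characters', ...
print(len(sentence))   # the exact character count, trivial for code
print(len(token_ids))  # the far smaller number of units the model actually sees

Counting characters of its input is therefore an indirect exercise for the model, which is why a little distraction is enough to knock it off course.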
