But if you actually want real, accurate answers, you’re not going to get them.

Equally, the AI has no way to vet its results for accuracy.

“By its nature, AI models won’t produce factual results.”

Maybe glue would make this cheese stick better? (Quin Engle / Unsplash)

So we should always treat their responses with caution.

Galdon-Clavell gave us a pretty chilling example.

Here’s another example.

Not a pizza ingredient, no matter what Google tells you. (Scott Sanker / Unsplash)

“It’ll just guess because it doesn’t have a human-level understanding of context.”

This means that AI models can never be any better than their sources.

This includes content farms, SEO junk, conspiracy theories, and so on.

A brain with cables

AI cannot reason like a human.PhonlamaiPhoto / Getty Images

In the end, though, it might not make much difference.