This discussion has been archived.
No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
by Java Pimp (98454) writes:
We also tend not to put people exhibiting these behaviors in decision-making positions.
Except when we put one of them in charge of our country... Twice....
by TWX (665546) writes:
yeah, this part stood out to me:
A LLM will be just as confident when saying something completely wrong -- and obviously so, to a human -- as it will be when saying something true.
It pretty much empowers ignorance as if it were equivalent to knowledge and experience, and asserts that the ignorant person's views are fully valid even when based on bogus "research."
Asserting things while ignorant or actively wrong is what a conman does, because what a conman relies on is confidence: that's where the "con" part comes from. AI may as well be a conman.
by larryjoe (135075) writes:
yeah, this part stood out to me:
A LLM will be just as confident when saying something completely wrong -- and obviously so, to a human -- as it will be when saying something true.
I find striking similarities between LLMs and very confident humans. Neither has any problem making statements confidently, because both are oblivious to, or completely discount, any possibility of being incorrect.
I remember on one occasion my coworkers and I went to visit a renowned expert who was a VP and fellow at Amazon, in addition to holding a bunch of national awards. During the conversation we asked him a question, and he responded with an answer we knew was completely incorrect, because we were experts in that field. Yet he acted supremely confident ... just like an LLM.