Why Does AI Hallucinate? Can It Be Fixed? w/ EyeLevel.AI CEO Neil Katz

1:08:35
 
Content provided by Jeff Wilser. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Jeff Wilser or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process described here: https://sv.player.fm/legal.

As most people who have played with AI know, it can make stuff up, which is often referred to as "hallucinating." (I like to think of it as "bullshitting.")

It’s one of the trickiest problems vexing the entire AI space. Why does AI do this? How widespread is the problem? What are the solutions, and is it even something that CAN be solved?

To unravel all this, we speak with Neil Katz, the founder of EyeLevel.AI, a company developing solutions to make AI more accurate (and less likely to hallucinate) for private companies. They're building what they call the "truth serum" for AI.
We dive deep into the world of AI hallucinations, one of the least understood (and most important) topics in the space.
I very much enjoyed this conversation.
Find Neil and EyeLevel.AI at:
https://www.eyelevel.ai/
