How I fixed Gemini’s biggest flaw with one simple sentence

phillynewsnow (Site Moderator - Staff) · Oct 11 · #1

    Credit: Rita El Khoury / Android Authority





    It’s a known fact that ChatGPT, Gemini, and most AI chatbots sometimes hallucinate answers. They make things up out of thin air, lie to please you, and contort their answers the moment you challenge them. Although such instances are becoming rarer, they still happen, and they completely ruin trust. If I never know when Gemini is telling the truth and when it’s lying to me, what’s the point of even using it?

    That’s why I mostly gravitate towards Perplexity for my search queries. I like seeing the sources in front of me and being able to click to read more from trustworthy sites. Then it occurred to me: There’s a way I can make Gemini behave more like Perplexity, and all it takes is a single sentence!
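    The article is truncated before it reveals the author's exact sentence, so that wording stays unknown here. But the general technique it describes, attaching a standing instruction that tells Gemini to cite its sources on every query, can be sketched. The instruction wording below and the commented SDK call are illustrative assumptions, not the author's actual method:

    ```python
    # Sketch: prepend a hypothetical source-citing instruction to every query.
    # The exact sentence the author used is not reproduced in the article
    # excerpt above; this wording is an illustrative stand-in.
    CITE_SOURCES = (
        "Always list the web sources you relied on, with links, "
        "and say so explicitly when you are unsure."
    )

    def build_prompt(user_query: str) -> str:
        """Combine the standing instruction with the user's question."""
        return f"{CITE_SOURCES}\n\nQuestion: {user_query}"

    prompt = build_prompt("What phones support Qi2 charging?")

    # The combined prompt could then be sent to Gemini, e.g. via the
    # google-generativeai SDK, which also accepts a system instruction:
    #   import google.generativeai as genai
    #   model = genai.GenerativeModel(
    #       "gemini-1.5-flash", system_instruction=CITE_SOURCES)
    #   response = model.generate_content("What phones support Qi2 charging?")
    ```

    In Gemini's app the same effect comes from stating the instruction once in a chat (or in saved preferences) rather than wiring it up in code; the point is that one standing sentence reshapes every subsequent answer.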


