Amazing.

  • pezhore@lemmy.ml
    8 months ago

    Google Maps has been hounding me to try AI searching, but I cannot for the life of me think of a reason why I would ever want to do so.

    This is just another reason why AI sucks at concrete data searching. Give it a prompt for coming up with a new recipe for some dinner dish, sure. Help me formulate listicles for my shitty blog, okay.

    But asking AI to find the nearby hospital, grocery store, or arcade seems fucking ludicrous. Fix your search indexers for real places; don’t offload that to hallucinating LLMs.

    • AcausalRobotGod@awful.systemsOP
      8 months ago

      It’s really annoying because SEARCH IS REALLY GOOD ALREADY and “AI searching” is adding a bad, buggy frontend over either the already good search or a worse version of search (vector DB searches).

  • Mii@awful.systems
    8 months ago

    I used to believe public institutions like the WHO wouldn’t jump onto any stupid bandwagon, but here we are.

    But at least they acknowledge it’s all bullshit on their own website, lol.

    WHO takes no responsibility for any conversation content created by Generative AI. Furthermore, the conversation content created by Generative AI in no way represents or comprises the views or beliefs of WHO, and WHO does not warrant or guarantee the accuracy of any conversation content. Please check the WHO website for the most accurate information. By using WHO Sarah, you understand and agree that you should not rely on the answers generated as the sole source of truth or factual information, or as a substitute for professional advice.

    • 0xb0ba@awful.systems
      8 months ago

      I hate this cop-out everyone uses when they stuff AI into their system: “By using it, it’s actually your fault for believing the information,” while also making it the preferred or even only way to try to get information. This is unethical, especially in healthcare.