• IndefiniteBen · 6 months ago

    Aren’t these just search answers, not the GPT responses?

    • lurch@sh.itjust.works · 6 months ago

      No, that’s an AI-generated summary that Bing (and Google) show for a lot of queries.

      For example, if I search “can i launch a cow in a rocket”, it suggests it’s possible to shoot cows with rocket launchers and machine guns and names a shooting range that offers it. Thanks, Bing… I guess…

      • swope@kbin.social · 6 months ago

        You think the culture wars over pronouns have been bad, wait until the machines start a war over prepositions!

      • kromem@lemmy.world · 6 months ago

        You’re incorrect. This is being done with search matching, not by an LLM.

        The LLM answers Bing added appear in the chat box.

        These are Bing’s version of Google’s OneBox, which predated their relationship with OpenAI.

          • kromem@lemmy.world · 6 months ago (edited)

            They’ve updated what’s powering that box; see my other response to your similar comment, with the image.

          • kromem@lemmy.world · 6 months ago

            Yes, they’ve now replaced the legacy system with one using GPT-4, hence the citations incorporated into a summary description, same as in the chat format.

            Try the same examples as in OP’s images.

        • lunarul@lemmy.world · 6 months ago

        And you can abuse that by asking two questions in one. The summarized yes/no answer will only address the first one, and you can put whatever you want in the second, like “drink battery acid” or “drive drunk”.

    • kromem@lemmy.world · 6 months ago

      Yes, you are correct. This was a feature Bing added to match Google’s OneBox answers, and it isn’t using an LLM but likely search matching.

      Bing shows the LLM response in the chat window.