• GetOffMyLan@programming.dev · 6 hours ago

    I mean they literally do analyze text. They’re great at it. Give it some text and it will analyze it really well. I do it with code at work all the time.

    Because they are two completely different tasks. Asking them to recall information from their training is a very bad use. Asking them to analyze information passed into them is what they are great at.

    Give it a sample of code and it will very accurately analyze and explain it. Ask it to generate code, though, and the results are wildly varied in accuracy.

    I’m not assuming anything; you can literally go and use one right now and see.

    • apotheotic (she/her)@beehaw.org · 4 hours ago

      The person you’re replying to is correct, though. They do not understand, and they do not analyse. They generate (roughly) the most statistically likely answer to your prompt, which may very well end up being text representing an accurate analysis. They might even be incredibly reliable at doing so. But this person is just pushing back against the idea of these models actually understanding or analysing. It’s slightly pedantic, sure, but it’s important to distinguish in the world of machine intelligence.
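      For what it’s worth, that “most statistically likely answer” mechanism can be sketched in a few lines. This is a hypothetical toy bigram model with greedy decoding (all tokens and probabilities invented for illustration), not how any real LLM is implemented:

```python
# Toy bigram "language model": each token maps to a probability
# distribution over possible next tokens. This is a made-up miniature
# stand-in for an LLM's learned next-token distribution.
BIGRAMS = {
    "the": {"code": 0.6, "model": 0.4},
    "code": {"compiles": 0.7, "runs": 0.3},
    "model": {"predicts": 1.0},
}

def most_likely_next(token):
    """Return the highest-probability next token, or None at a dead end."""
    dist = BIGRAMS.get(token)
    return max(dist, key=dist.get) if dist else None

def generate(prompt_token, max_tokens=5):
    """Greedy decoding: repeatedly append the most probable continuation."""
    out = [prompt_token]
    while len(out) < max_tokens:
        nxt = most_likely_next(out[-1])
        if nxt is None:
            break
        out.append(nxt)
    return " ".join(out)

print(generate("the"))  # "the code compiles"
```

      Real models condition on the entire prompt rather than one token, and usually sample from the distribution instead of always taking the maximum, but the loop has the same shape: no step in it is “understanding”, just repeated next-token prediction.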

      • GetOffMyLan@programming.dev · 3 hours ago

        I literally quoted the word for that exact reason. It just gets really tiring: every time you talk about AIs, someone has to make this point. We all know they don’t think or understand in the same way we do. No one gains anything from it being pointed out constantly.

        • apotheotic (she/her)@beehaw.org · 3 hours ago

          You said “they literally do analyze text” when that is not, literally, what they do.

          And no, we don’t “all know” that. Laypeople have no way of knowing whether AI products currently in use have any capacity for genuine understanding and reasoning, other than the fact that the promotional material uses words like “understanding”, “reasoning”, and “thought process”, and people talking about it use the same words. The language we choose to use is important!

          • GetOffMyLan@programming.dev · 2 hours ago

            No it’s not. It’s pedantic and arguing semantics. It is essentially useless and a waste of everyone’s time.

            It applies a statistical model and returns an analysis.

            I’ve never heard anyone object when you say they used a computer to analyze it.

            It’s just the same “AI bad” bullshit, and it’s tiring in every single thread about them.

            • apotheotic (she/her)@beehaw.org · 14 minutes ago

              I never made any “AI bad” arguments (in fact, I said that they may be incredibly well suited to this). I just argued for the correct use of words, and you hallucinated.