• Nikls94@lemmy.world

    Well… it’s not capable of being moral. It answers part 1 and then part 2, like a machine.

    • fckreddit@lemmy.ml

      Being ‘moral’ means having empathy. But empathy is only possible between two beings that share experiences and reality, or at least some aspects of them. LLMs don’t have experiences; they build their weights from training data. They are fundamentally computer programs. Textual information alone is not enough to build deep context. For example, when I say “this apple is red”, anyone reading this can easily visualize a red apple because of their experience of seeing apples. That cannot be put into text, because it is a fundamental part of human experience that is not available to a computer program, as of yet.

      At least that is my hypothesis. I can very obviously be wrong, which is another fundamentally human experience.