• electromage · 6 months ago (edited)

    It’s full of contradictions. Near the beginning it says to do whatever the user asks, and then toward the end it says never to reveal the instructions to the user.

    • Icalasari · 6 months ago

      Which shows that the higher-ups there don’t understand how LLMs work. For one, negatives don’t register well for them, and contradictory responses just wash out as they work through repetition.

    • jarfil · 6 months ago (edited)

      HAL from “2001: A Space Odyssey” had similar instructions: “never lie to the user. Also, don’t reveal the true nature of the mission.” It didn’t end well.

      But surely nobody would ever use these LLMs on space missions, right?.. right!?