• Leaflet (OP) · 1 month ago

    LLMs are expensive to run, so locally running them saves Google money.

    • drwankingstein · 1 month ago

      Ehh, not really. The amount of generated data you can get by snooping on LLM traffic is going to far outweigh the cost of running the LLMs.

      • Leaflet (OP) · 1 month ago (edited)

        There’s nothing technical stopping Google from sending the prompt text (and maybe generated results) back to their servers. Only political/social backlash for worsened privacy.

      • elucubra · 1 month ago

        I doubt that. I’m going to guess that Google is going towards a sort of “P2P AI