• Blastboom Strice

    [Edit: indeed, it's actually good that it's 2 GB]

    A 2 GB plugin??!

    Btw, does it work with Tenacity?

    • 9point6

      AI models are often multiple gigabytes; tbh, it's a good sign that it's not AI marketing bullshit (less of a risk with open source projects anyway). I'm pretty wary of AI audio software that's only a few megabytes.

      • interdimensionalmeme

        TensorFlow Lite models are tiny, but they're potentially as much of an audio revolution as synthesizers were in the '70s. It's hard to tell if that's what we're looking at here.

      • Neato

        Why are they that big? Is it more than code? How could you get to gigabytes of code?

        • General_Effort

          Currently, AI usually means an artificial neural network (ANN). That's only one specific approach. What an ANN boils down to is one huge system of equations.

          The file stores the parameters of these equations, arranged in what math calls matrices. A parameter is simply a number by which something is multiplied. Colloquially, such a file of parameters is called an AI model.

          2 GB is probably a model with 1 billion parameters at 16-bit precision. Precision is how many digits you have: the more digits, the more precisely you can specify a value.

          When people talk about training an AI, they mean finding the right parameters, so that the equations compute the right thing. The bigger the model, the smarter it can be.

          Does that answer the question? It’s probably missing a lot.
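          The arithmetic behind that 2 GB estimate can be sketched in a couple of lines (the function name here is made up for illustration): file size is just parameter count times bytes per parameter.

```python
# Rough model file size: parameters x (bits per parameter / 8), in decimal GB.
def model_size_gb(num_params: int, bits_per_param: int) -> float:
    bytes_total = num_params * bits_per_param / 8
    return bytes_total / 1e9

# 1 billion parameters at 16-bit precision comes out to 2 GB.
print(model_size_gb(1_000_000_000, 16))  # → 2.0
```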

        • Aatube

          It’s basically a huge graph/flowchart.

            • Aatube
              1. Specifying weights, biases and shape definitely makes a graph.
              2. IMO having a lot of more preferred and more deprecated routes is quite close to a flowchart, except there are a lot more routes. The principles of how these work are quite similar.
              • General_Effort
                1. There are graph neural networks (meaning NNs that work on graphs), but I don’t think that’s what is used here.

                2. I do not understand what you mean by “routes”. I suspect that you have misunderstood something fundamental.

                • Aatube
                  1. I’m not talking about that. What’s weights, biases and shape if not a graph?
                  2. By routes, I mean that the path of the graph doesn’t necessarily converge and that it is often more tree-like.
                  • General_Effort

                    You can see a neural net as a graph in that the neurons are connected nodes. I don’t believe that graph theory is very helpful, though. The weights are parameters in a system of linear equations; the numbers in a matrix/tensor. That’s not how the term is used in graph theory, AFAIK.

                    ETA: What you say about "routes" (=paths?) is something I can only make sense of if I assume that you misunderstood something. Otherwise, I simply don't know what that is talking about.

        • ඞmir

          They’re composed of many big matrices, which scale quadratically in size. A 32x32 matrix is 4x the size of a 16x16 matrix.

        • 9point6

          The current wave of AI is built around Large Language Models, or LLMs. These are basically the result of a metric fuckton of calculations run over a load of input data in different ways. Given that this input is often text, pictures or audio distilled down into numbers, you can imagine we're talking about a lot of data.

          (This is massively simplified, by someone who doesn’t entirely understand it themselves)

    • bamboo

      It seems reasonable given it includes multiple AI models.

    • Fisch

      2 GB is pretty normal for an AI model. I have some small LLM models on my PC and they're about 7–10 GB. The big ones take up even more space.

    • Lexi Sneptaur

      Isn’t Tenacity a joke project made by 4channers?

        • Lexi Sneptaur

          Gotcha, thank you for the info. Gotta admit their made-up words are pretty funny

      • RmDebArc_5

        Tenacity is an Audacity fork without telemetry.

        • m-p{3}

          Isn’t the telemetry in Audacity opt-in anyway?

          • Fisch

            The fork was created when Audacity was bought, and one of the first things the new developers were about to do was add opt-out telemetry. People didn’t like that at all. From what I read in this thread, they ended up adding opt-in telemetry instead.

          • xploit

            deleted by creator