• themachine · 5 months ago

    Just look at the bitrate of what you are streaming, multiply it by 3, then add a little extra for overhead.
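
The rule of thumb above can be sketched as a quick calculation (the 20% overhead figure here is my own assumption, not from the comment):

```python
# Back-of-envelope upload estimate: per-stream bitrate x number of
# concurrent streams, plus some headroom for protocol/container overhead.
# The 20% headroom is an assumed figure, not a measured one.
def required_upload_mbps(stream_bitrate_mbps, streams=3, overhead=0.2):
    return stream_bitrate_mbps * streams * (1 + overhead)

print(round(required_upload_mbps(8), 1))  # 8 Mbit/s x 3 streams + 20% -> 28.8
```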

  • WFloyd · 5 months ago

    I have 35 Mbps upload from the ISP and limit each stream to 8 Mbps. This covers direct streaming all my 1080p content plus a 4K transcode as needed.
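
As a quick sanity check of those numbers (a minimal sketch, nothing Jellyfin-specific):

```python
# How many capped streams fit in a given upload pipe,
# using the figures from the comment above.
upload_mbps = 35
per_stream_cap_mbps = 8

max_streams = upload_mbps // per_stream_cap_mbps
print(max_streams)  # 4 concurrent 8 Mbps streams fit in 35 Mbps
```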

  • Faceman🇦🇺 · 5 months ago

    Are you transcoding?

    4 Mbit/s per client for 1080p is generally a workable minimum for the average casual watcher if you have H.265-compatible clients (and a decent encoder, like a modern Intel CPU, for example), or 6-8 Mbit/s per client if it's H.264 only.

    Remember that the bitrate-to-quality curve for live transcoding isn't as good as a slow, non-real-time encode done the brute-force way on a CPU. So if you have a few videos that look great at 4 Mbit/s, don't assume your own transcodes will look quite that nice: you're using a GPU to get it done as quickly as possible with acceptable quality, not as slowly and carefully as possible for the best compression.
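
A rough way to combine these per-codec rules of thumb into an upload estimate (the helper name, the 7 Mbit/s midpoint for H.264, and the 20% overhead are my assumptions, not from the comment):

```python
# Rule-of-thumb per-client 1080p bitrate floors from the comment above,
# in Mbit/s. 7 is the midpoint of the quoted 6-8 H.264 range (assumption).
MIN_1080P_MBPS = {"h265": 4, "h264": 7}

def upload_needed(clients_by_codec, overhead=0.2):
    """Total upload (Mbit/s) for a mix of transcoding clients,
    with ~20% assumed protocol overhead."""
    total = sum(MIN_1080P_MBPS[codec] * n
                for codec, n in clients_by_codec.items())
    return total * (1 + overhead)

# Two H.265-capable clients plus one H.264-only client:
print(round(upload_needed({"h265": 2, "h264": 1}), 1))  # 18.0
```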

  • SigHunter · 5 months ago

    My family is very satisfied with 6 Mbit/s per stream (some HEVC, most H.264). They see it as high quality. Three streams would be 18-20 Mbit/s.

  • Possibly linux · 5 months ago

    How expensive is internet for you? If it's cheap, go overkill and don't worry about it.

  • Diabolo96 · 5 months ago (edited)

    I don’t have a Jellyfin server, but 1 MB/s (8 Mbit/s) for each person watching 1080p (3.6 GB per hour of content for each file) seems reasonable. ~3 MB/s (24 Mbit/s) upload, and at least as much download, should work.
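
The unit conversions behind those figures, spelled out:

```python
# Sanity-check the comment's conversions: a constant 1 MB/s stream
# is 8 Mbit/s on the wire and ~3.6 GB per hour of content.
MBps = 1                           # megabytes per second
mbit_per_s = MBps * 8              # 8 bits per byte
gb_per_hour = MBps * 3600 / 1000   # MB/s x seconds per hour -> GB

print(mbit_per_s, gb_per_hour)  # 8 3.6
```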

    • GenderNeutralBro · 5 months ago

      1mbps is awfully low for 1080. Or did you mean megabyte rather than megabit?

      • Diabolo96 · 5 months ago (edited)

        I had a hunch that writing the actual upload/download speed rather than Mbps was probably wrong. My bad, my internet-provider lingo is rusty.

        • GenderNeutralBro · 5 months ago

          Gotcha. Typically lowercase b=bit and uppercase B=Byte, but it’s hard to tell what people mean sometimes, especially in casual posts.

          Come to think of it, I messed up the capitalization too. Should be a capital M for mega.

    • dysprosium · 5 months ago

      Why don’t people use Mb/s and MB/s, which would make it so much clearer what you’re talking about?

      • SigHunter · 5 months ago (edited)

        Back in the day, the rule was mbit (megabit) for data in transfer (network speed) and MB (megabyte) for data at rest, like on HDDs.

        • dysprosium · 5 months ago (edited)

          So mbit/s instead of Mbit/s ? But the M in Mega is always capitalized though, except the k in kilo.

          • realbadat · 5 months ago

            Bigger number sounds better for the ISP.

          • the magnificent rhys · 5 months ago

            @Moneo @SigHunter Networking came to be when there were lots of different implementations of a ‘byte’. The PDP-10, for example, was prevalent at the time the internet was being developed, and it supported variable byte lengths of up to 36 bits per byte.

            Network protocols had to support every device regardless of its byte size, so protocol specifications settled on bits as the lowest common unit size, while referring to 8-bit fields as ‘octets’ before 8-bit became the de facto standard byte length.

          • bitwaba · 5 months ago

            The real answer?

            Data is transmitted in packets. Each packet has a packet header, and a packet payload. The total data transmitted is the header + payload.

            If you’re transmitting smaller packet sizes, it means your header is a larger percentage of the total packet size.

            Measuring in megabits is the ISP telling you: “Look, your connection is good for X amount of data. How you choose to use that data is up to you. If you want more of it going to your packet headers instead of your payload, fine. A bit is a bit is a bit to your ISP.”
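
That header-versus-payload share can be made concrete (the 40-byte IPv4+TCP header with no options is an assumption; real packets vary):

```python
# Illustration of header overhead: an assumed 40-byte header
# (20 B IPv4 + 20 B TCP, no options) on every packet. Smaller payloads
# spend a larger share of the advertised bit-level bandwidth on headers.
HEADER_BYTES = 40

def header_share(payload_bytes):
    """Fraction of each packet spent on headers."""
    return HEADER_BYTES / (HEADER_BYTES + payload_bytes)

for payload in (1460, 536, 100):
    print(payload, round(header_share(payload) * 100, 1), "%")
```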

    • lud · 5 months ago

      The best format, imo, is MB/s and Mbit/s.

      It avoids all confusion.