Running Dockerized qBittorrent v4.6.0 (64-bit) on Ubuntu 23.10. Seeding 29 torrents, leeching 0.

According to Glances, it's using 18 GB of memory, which seems high. Do I have a problematic setting somewhere, or is this typical behavior?

  • empireOfLove@lemmy.one · 1 year ago

    You may have cache settings that are storing large amounts of torrent data in memory. That's not inherently bad: fewer disk I/O hits extend disk life and improve performance, unless your system is memory-constrained and it's affecting other programs.

    How are you viewing memory usage? Virtual memory and true physical (resident) memory usage can be displayed very differently on a lot of Linux systems.
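
    One quick sanity check, not specific to qBittorrent: compare the process's virtual size against its resident set by reading /proc/<pid>/status. A huge VmSize next to a much smaller VmRSS means most of that "memory" isn't actually physical RAM. A minimal sketch (the PID is whatever qbittorrent-nox reports, e.g. via top):

    ```python
    #!/usr/bin/env python3
    # Minimal sketch: print virtual vs. resident vs. swapped memory for one PID.
    # Assumes a Linux host with /proc mounted; pass the qbittorrent-nox PID.
    import sys

    def mem_fields(pid: int) -> dict:
        wanted = ("VmSize", "VmRSS", "VmSwap")
        fields = {}
        with open(f"/proc/{pid}/status") as f:
            for line in f:
                key, _, rest = line.partition(":")
                if key in wanted:
                    fields[key] = rest.strip()  # e.g. "1234567 kB"
        return fields

    if __name__ == "__main__":
        pid = int(sys.argv[1])
        for key, value in mem_fields(pid).items():
            print(f"{key:7s} {value}")
    ```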

  • buskbrand@lemm.ee · 1 year ago

    Are you running the container in rootless mode (perhaps via Podman)?

    Rootless containers run on an emulated network stack (slirp4netns for Podman; I'm not sure about rootless Docker), since the runtime doesn't have the privilege to touch the real one, which is the point of running rootless.

    This emulation uses a decent amount of memory, and torrent clients in particular open a lot of connections. My slirp4netns process eats up several gigabytes whenever the torrent container is active.
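
    If you want to see how much of the total that emulation accounts for, you can add up the resident memory of the slirp4netns processes on the host. A rough sketch (assuming Linux /proc and the usual process name; rootless Docker uses a different helper):

    ```python
    #!/usr/bin/env python3
    # Rough sketch: sum the resident memory of all slirp4netns processes.
    # Assumes Linux /proc; "slirp4netns" is the usual helper for rootless Podman.
    import os

    total_kb = 0
    for entry in os.listdir("/proc"):
        if not entry.isdigit():
            continue
        try:
            with open(f"/proc/{entry}/comm") as f:
                if f.read().strip() != "slirp4netns":
                    continue
            with open(f"/proc/{entry}/status") as f:
                for line in f:
                    if line.startswith("VmRSS:"):
                        total_kb += int(line.split()[1])  # value is in kB
        except OSError:
            continue  # process exited or isn't readable

    print(f"slirp4netns resident memory: {total_kb / 1024:.1f} MiB")
    ```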

  • Brickfrog@lemmy.dbzer0.com · 1 year ago

    You may want to update your post to mention which build of qBittorrent 4.6.0 you're on.

    e.g. are you using the build with Libtorrent 1.2.x or Libtorrent 2.0.x?
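
    If the Web UI is enabled, you can check without digging through image tags. A small sketch (assuming the Web UI is reachable at localhost:8080 and local connections are auth-whitelisted; otherwise log in first):

    ```python
    #!/usr/bin/env python3
    # Sketch: ask a running qBittorrent Web UI which libtorrent it was built with.
    # Assumes the Web UI is at localhost:8080 and localhost is auth-whitelisted;
    # adjust the URL for your setup.
    import json
    import urllib.request

    BASE = "http://localhost:8080"

    with urllib.request.urlopen(f"{BASE}/api/v2/app/buildInfo") as resp:
        info = json.load(resp)

    print("libtorrent:", info.get("libtorrent"))
    print("qt:        ", info.get("qt"))
    print("bitness:   ", info.get("bitness"))
    ```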

    Libtorrent 2.0.x does tend to use more memory at runtime (especially when you have many actively uploading/downloading torrents), but it's fine overall; the OS/kernel knows to reclaim memory from qBittorrent for other applications as needed. In other words, if nothing is crashing, you should be okay.

    That said, I've mainly tinkered with Libtorrent 2.0.x clients on Windows (Deluge & qBittorrent), so there might be something I'm missing that's specific to Linux or Docker. qBittorrent with "Physical memory (RAM) usage limit" set to its maximum will basically let Libtorrent use as much memory as it likes… it's lower-priority memory, so in theory, as long as everything else in Windows is working, other applications can still request memory and run properly. Funny enough, I don't think Deluge even has a RAM usage limit setting, so Deluge with Libtorrent 2.0.x will happily use all the memory available to it.
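
    On the qBittorrent side, the same Web API exposes the cache/RAM-limit preferences, so you can see exactly what the container is running with. Another sketch under the same assumptions (localhost:8080, no login needed for local connections):

    ```python
    #!/usr/bin/env python3
    # Sketch: list qBittorrent's memory/cache-related preferences over the Web API.
    # Same assumptions as above: Web UI on localhost:8080, localhost auth-whitelisted.
    import json
    import urllib.request

    BASE = "http://localhost:8080"

    with urllib.request.urlopen(f"{BASE}/api/v2/app/preferences") as resp:
        prefs = json.load(resp)

    for key, value in sorted(prefs.items()):
        if "memory" in key or "cache" in key:
            print(f"{key} = {value}")

    # On libtorrent-2.x builds the "Physical memory (RAM) usage limit" shows up in
    # this list (the exact key name depends on the version); it can be changed by
    # POSTing to /api/v2/app/setPreferences.
    ```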