• Fosheze@lemmy.world
      7 months ago

      It’s a dynamically-sized list of objects of the same type stored contiguously in memory.

      dynamically-sized: The size of it can change as needed.

      list: It stores multiple things together.

      object: A bit of programmer-defined data.

      of the same type: All the objects in the list are defined the same way.

      stored contiguously in memory: If you think of memory as a bookshelf, then all the objects in the list are stored right next to each other on the bookshelf rather than spread across it.
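      A minimal sketch of that definition in Rust, whose standard `Vec` is exactly this kind of list (the names `make_shelf` and `shelf` are just for illustration):

      ```rust
      // "Bookshelf" of i32s: same type, contiguous, dynamically sized.
      fn make_shelf() -> Vec<i32> {
          let mut shelf = vec![10, 20, 30]; // a list of objects of the same type
          shelf.push(40);                   // dynamically sized: it grows as needed
          shelf
      }

      fn main() {
          let shelf = make_shelf();
          // Stored contiguously: element i sits exactly i slots past element 0.
          let base = shelf.as_ptr();
          for (i, &x) in shelf.iter().enumerate() {
              assert_eq!(unsafe { *base.add(i) }, x);
          }
          println!("{:?}", shelf); // prints [10, 20, 30, 40]
      }
      ```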

      • kbotc@lemmy.world
        7 months ago

        Dynamically sized but stored contiguously makes the systems performance engineer in me weep. If the lists get big, the allocator is going to churn through so much memory.
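        A sketch of that churn in Rust (`Vec`'s exact growth factors are an implementation detail; the point is that every capacity jump means the old buffer was reallocated and its contents copied):

        ```rust
        // Record every distinct capacity a growing Vec passes through.
        fn growth_capacities(n: u64) -> Vec<usize> {
            let mut v: Vec<u64> = Vec::new();
            let mut caps = vec![v.capacity()];
            for i in 0..n {
                v.push(i);
                if v.capacity() != *caps.last().unwrap() {
                    // The whole buffer was just reallocated and copied.
                    caps.push(v.capacity());
                }
            }
            caps
        }

        fn main() {
            // A thousand pushes already trigger a string of reallocations.
            println!("{:?}", growth_capacities(1000));
        }
        ```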

        • Killing_Spark@feddit.de
          7 months ago

          Contiguous storage is very fast to iterate over, though, which often offsets the cost of allocation.

          • Slotos@feddit.nl
            7 months ago

            Modern CPUs are also extremely efficient at dealing with contiguous data structures. Branch prediction and caching get to shine on them.

            Avoiding memory accesses, or helping the CPU fetch everything up front, shifts the bottleneck of the computation into a different physical domain entirely.

        • :3 3: :3 3: :3 3: :3@lemmy.blahaj.zone
          7 months ago

          Which is why you should:

          1. Preallocate the vector if you can guesstimate the size
          2. Use a vector library that won’t reallocate the entire vector on every single addition (like Rust’s Vec, which doubles its capacity whenever it runs out of space)

          Memory is fairly cheap. Allocation time not so much.
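          A quick sketch of point 1 in Rust: `Vec::with_capacity` allocates the buffer once up front, so subsequent pushes never move it (the function name here is just for illustration, and checking the data pointer is only a way to observe that no reallocation happened):

          ```rust
          // Preallocate, then verify the buffer never had to be reallocated.
          fn pushes_without_realloc(n: usize) -> bool {
              let mut v: Vec<usize> = Vec::with_capacity(n); // one allocation up front
              let before = v.as_ptr();
              for i in 0..n {
                  v.push(i); // never exceeds capacity, so never reallocates
              }
              before == v.as_ptr() // pointer unchanged: the buffer never moved
          }

          fn main() {
              println!("{}", pushes_without_realloc(10_000)); // prints true
          }
          ```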

        • yetiftw@lemmy.world
          7 months ago

          MATLAB likes to pick the smallest available spot in memory to store a list, so for loops that grow a matrix it’s recommended to preallocate the space with a matrix of zeros!