A newly discovered trade-off in the way time-keeping devices operate on a fundamental level could set a hard limit on the performance of large-scale quantum computers, according to researchers from the Vienna University of Technology.

  • shortwavesurfer@monero.town · 58 points · 1 year ago

    whether entropy will have the final say on just how powerful quantum computers can get, only time will tell.

    I wonder if this pun was on purpose

  • souperk@reddthat.com · 39 points · 1 year ago

    Scientist:

    That means: Either the clock works quickly or it works precisely – both are not possible at the same time.

    Engineer: Explain that to my manager please!

    Also, Engineer: Well, what if we accounted for error rate and fixed precision post-processing?

  • NeoNachtwaechter@lemmy.world · 17 points · 1 year ago

    the measure of time is bound by the limits of physics itself.

    It feels like a relief.

    We already had quantum theory telling us not to chop energy into infinitely small pieces, and the uncertainty principle kept us from doing the same to space.

    Now we know that time behaves in much the same way.

  • Arthur Besse@lemmy.ml · 14 points · edited · 1 year ago

    I really hope that there isn’t a cryptographically-relevant quantum computer built in our lifetimes, but we should still assume that there likely will be and accordingly should switch everything to use (hybrid) post-quantum cryptography ASAP.

    • jayrhacker@kbin.social · 2 points · 1 year ago

      Why not? I’ve got a hard drive I lost the keys to that I’d like to recover, and having all the old secrets out in the open would be really interesting.

      • Arthur Besse@lemmy.ml · 3 points · 1 year ago

        It isn’t expected that a quantum computer will be able to instantly break symmetric encryption, as is used in full disk encryption. It will give an enormous advantage (halving the number of bits of security) but attacking that will still require a large amount of time and energy. What a CRQC will very quickly break is the asymmetric primitives, as used in TLS, encrypted email and chats, etc.

        On the other hand, using default parameters from not so long ago, it is cheaper than you might expect to brute-force your disk passphrase already today without a quantum computer… which is why you should use a stronger key derivation function (in addition to a strong passphrase, of course).
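To make that concrete, here is a rough back-of-the-envelope sketch in Python. The attacker numbers (`GUESSES`, `HASHES_PER_SEC`) are made-up illustrations, not measurements; the point is only that brute-force cost grows linearly with the KDF's per-guess work factor.

```python
# Back-of-the-envelope brute-force cost model. All numbers are
# illustrative assumptions, not measurements of any real attacker.

GUESSES = 10**9          # assumed size of the attacker's candidate passphrase list
HASHES_PER_SEC = 10**6   # assumed attacker throughput for a single KDF iteration

def crack_time_seconds(kdf_iterations: int) -> float:
    """Seconds to try every candidate: guesses * iterations / speed."""
    return GUESSES * kdf_iterations / HASHES_PER_SEC

# An old low default vs. a deliberately slow modern setting:
print(f"1k iterations: {crack_time_seconds(1_000) / 86_400:,.0f} days")
print(f"1M iterations: {crack_time_seconds(1_000_000) / 86_400:,.0f} days")
```

Raising the iteration count by a factor of a thousand multiplies the attacker's bill by the same factor, which is why modern full-disk-encryption setups favor slow, memory-hard KDFs such as Argon2.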

      • Plopp@lemmy.world · 2 points · 1 year ago

        Isn’t that symmetric encryption? Quantum computers aren’t really that beneficial against symmetric encryption, iirc, since the attack can’t be parallelized very efficiently (and quantum computers are fairly slow per operation).
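For reference, the standard reasoning here is Grover's algorithm: searching an unstructured space of 2^k keys takes on the order of 2^(k/2) quantum operations, so the effective security level of a symmetric key is roughly halved rather than destroyed. A minimal sketch:

```python
# Grover's search examines an unstructured space of 2**k keys in roughly
# 2**(k/2) quantum operations, so k bits of symmetric security become ~k/2.

def effective_bits_under_grover(key_bits: int) -> int:
    """Effective security level against an idealized Grover attacker."""
    return key_bits // 2

for k in (128, 256):
    print(f"{k}-bit key: ~{effective_bits_under_grover(k)} bits vs. a quantum attacker")
```

Even 2^64 quantum operations is an enormous amount of serial work, and Grover's iterations cannot be parallelized efficiently, which is why AES-256 is generally still considered quantum-safe.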

  • Diabolo96@lemmy.dbzer0.com · 10 points · 1 year ago

    My internet is pretty garbage right now; I can’t even open the article. Can someone share the relevant parts? Thanks in advance.

    • btp@kbin.social (OP) · 21 points · 1 year ago

      A newly discovered trade-off in the way time-keeping devices operate on a fundamental level could set a hard limit on the performance of large-scale quantum computers, according to researchers from the Vienna University of Technology.

      While the issue isn’t exactly pressing, our ability to grow systems based on quantum operations from backroom prototypes into practical number-crunching behemoths will depend on how well we can reliably dissect the days into ever finer portions. This is a feat the researchers say will become increasingly more challenging.

      Whether you’re counting the seconds with whispers of Mississippi or dividing them up with the pendulum-swing of an electron in atomic confinement, the measure of time is bound by the limits of physics itself.

      One of these limits involves the resolution with which time can be split. Measures of any event shorter than 5.39 × 10⁻⁴⁴ seconds, for example, run afoul of theories on the basic functions of the Universe. They just don’t make any sense, in other words.
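That cutoff is the Planck time, t_P = √(ħG/c⁵). As a quick sanity check, it can be computed directly from the standard CODATA constants:

```python
import math

# Planck time: t_P = sqrt(hbar * G / c**5), using CODATA constant values.
HBAR = 1.054_571_817e-34   # reduced Planck constant, J*s
G = 6.674_30e-11           # gravitational constant, m^3 kg^-1 s^-2
C = 2.997_924_58e8         # speed of light, m/s

t_planck = math.sqrt(HBAR * G / C**5)
print(f"Planck time: {t_planck:.3e} s")   # roughly 5.39e-44 s
```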

      Yet even before we get to that hard line in the sands of time, physicists think there is a toll to be paid that could prevent us from continuing to measure ever smaller units.

      Sooner or later, every clock winds down. The pendulum slows, the battery dies, the atomic laser needs resetting. This isn’t merely an engineering challenge – the march of time itself is a feature of the Universe’s progress from a highly ordered state to an entangled, chaotic mess in what is known as entropy.

      “Time measurement always has to do with entropy,” says senior author Marcus Huber, a systems engineer who leads a research group at the intersection of Quantum Information and Quantum Thermodynamics at the Vienna University of Technology.

      In their recently published theorem, Huber and his team lay out the logic that connects entropy as a thermodynamic phenomenon with resolution, demonstrating that unless you’ve got infinite energy at your fingertips, your fast-ticking clock will eventually run into precision problems.

      Or as the study’s first author, theoretical physicist Florian Meier puts it, “That means: Either the clock works quickly or it works precisely – both are not possible at the same time.”
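A toy numerical illustration of that trade-off (a cartoon stochastic clock, not the actual theorem from the paper): if a clock's elementary ticks arrive at random, Poisson-style, then averaging n ticks per reported unit shrinks the relative jitter like 1/√n, at the cost of the clock running n times slower.

```python
import random

# Cartoon stochastic clock: elementary ticks are exponential waiting times.
# Bundling n ticks into one reported "unit" improves relative precision
# like 1/sqrt(n), but makes each unit n times slower to produce.

def relative_jitter(ticks_per_unit: int, trials: int = 20_000) -> float:
    """Standard deviation of the reported unit length, relative to its mean."""
    random.seed(0)  # deterministic runs for reproducibility
    lengths = []
    for _ in range(trials):
        lengths.append(sum(random.expovariate(1.0) for _ in range(ticks_per_unit)))
    mean = sum(lengths) / trials
    var = sum((x - mean) ** 2 for x in lengths) / trials
    return var ** 0.5 / mean

fast = relative_jitter(1)     # quick clock: one tick per unit, very noisy
slow = relative_jitter(100)   # 100x slower clock: roughly 10x less jitter
print(f"fast clock jitter ~{fast:.2f}, slow clock jitter ~{slow:.2f}")
```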

      This might not be a major problem if you want to count out seconds that won’t deviate over the lifetime of our Universe. But for technologies like quantum computing, which rely on the temperamental nature of particles hovering on the edge of existence, timing is everything.

      This isn’t a big problem when the number of particles is small, but as they increase in number, the risk that any one of them could be knocked out of its critical quantum state rises, leaving less and less time to carry out the necessary computations.

      Plenty of research has gone into exploring the potential for errors in quantum technology caused by a noisy, imperfect Universe. This appears to be the first time researchers have looked at the physics of timekeeping itself as a potential obstacle.

      “Currently, the accuracy of quantum computers is still limited by other factors, for example the precision of the components used or electromagnetic fields,” says Huber.

      “But our calculations also show that today we are not far from the regime in which the fundamental limits of time measurement play the decisive role.”

      It’s likely other advances in quantum computing will improve stability, reduce errors, and ‘buy time’ for scaled-up devices to operate in optimal ways. But whether entropy will have the final say on just how powerful quantum computers can get, only time will tell.

      This research was published in Physical Review Letters.

    • NeoNachtwaechter@lemmy.world · +2/−1 · 1 year ago

      Maybe we need more quantity

      More quantity of time?
      For you?
      Declined.
      You’ll get your 60-70 years like everybody else. :-)