I highly recommend disabling JavaScript by default in your browser, then whitelisting the sites you use frequently that need JavaScript to function.

The privacy benefit: when you read articles or visit new websites, most of the time the page doesn't need JavaScript to function, so a lot of ads and tracking scripts simply never load.

The security benefit is massive. First, if you visit a bad link carrying malware that depends on JavaScript, the malware won't run. Second, if you follow a link to a service you use and JavaScript doesn't work there, the domain isn't on your whitelist, so you can tell in real time that it's a fake page and not the real website you intended to visit.

Bonus tip: for the sites you need that can't work without JavaScript, try replacing them with JavaScript-free websites or open-source apps.

Disclaimer: Stay cautious. This recommendation will improve your privacy and security, but it does not protect you from everything.
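
If you use uBlock Origin, one way to set this up is with its per-site `no-scripting` switch in the My rules pane. This is just a sketch, not the only way; `example.com` is a placeholder for a site you trust:

```
no-scripting: * true
no-scripting: example.com false
```

The first rule disables JavaScript everywhere by default; the second re-enables it on a whitelisted site. Most browsers can also do the same thing natively in their site-permission settings.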

  • Emotet@slrpnk.net · 4 months ago

    15-20 years ago, I’d have agreed with you. But apart from a select few news sites and exceedingly rare static sites, what percentage of websites most users use day to day actually function even minimally without JavaScript?

    I’m convinced that in practice, most users would be conditioned to whitelist pretty much every site they visit due to all the breakage. Still a privacy and security improvement, but a massive one? I’m not sure.

    Very happy to be convinced otherwise.

    • smeeps@lemmy.mtate.me.uk · 4 months ago
      Yep, software dev here for a static marketing site for a product. We are in a constant battle with PMs and SEO who want Google tracking, Facebook, TikTok, A/B testing, cursor tracking, etc. We’re trying to keep page-speeds fast while loading megabytes of external JS code…

      Luckily, all that can be blocked by the end user without affecting the site; all you'd lose is some carousels and accordions etc. that are done with a few lines of code.

    • montar@lemmy.ml · 4 months ago
      Tried it and can confirm: almost every webpage, even static ones that could be dead simple, needs a truckload of bloated JS loaded from external servers.

    • ModerateImprovement@sh.itjust.worksOP · 4 months ago (edited)
      I've been doing this for a long time and I'm super comfortable with it, as most of the websites I come across don't need JavaScript to function.

      Even for the websites I need that do require JavaScript, I try to replace them with open-source apps and clients.

      If you want, list the websites you use that require JavaScript and I can suggest alternatives.

      Happy to help you.

      • Emotet@slrpnk.net · 4 months ago
        It’s great that it works for you and that you strive to spread your knowledge. Personally, I’m quite happy with my DNS filtering/uBlock Origin and restrictive browser approach and already employ alternatives where feasible in my custom use case.

        Thanks for your offer, though!

    • s38b35M5@lemmy.world · 4 months ago
      I agree that most websites don't load without JavaScript, but you don't need seven or more different domains with JavaScript allowed for the main site to work. Most sites load their own scripts plus six Google domains (including Tag Manager), Facebook, etc. I whitelist the website itself and leave the analytics and tracking domains off.
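
      In uBlock Origin's dynamic filtering (My rules), that split looks roughly like this; the domain names here are illustrative, not taken from any specific site:

      ```
      example.com example.com * noop
      example.com googletagmanager.com * block
      example.com google-analytics.com * block
      example.com connect.facebook.net * block
      ```

      The rule format is `source destination type action`: first-party requests are left alone while the tracking domains stay blocked.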

  • The Hobbyist@lemmy.zip · 4 months ago (edited)
    You're suggesting a whitelisting approach, which I used for a long time. But in the end I was so annoyed that most websites required me to enable JavaScript individually because they were otherwise broken, while I only wanted to block it on specific webpages, that I switched to a blacklisting approach, which I recommend to keep some sanity. But that's just my opinion :)

  • exu@feditown.com · 4 months ago
    You'd end up whitelisting so many sites that it makes this approach worthless, in my opinion.

    Instead, I've settled on blocking scripts by default and whitelisting subdomains until the site works. It does require more time and effort, but it's probably the only way to meaningfully block parts of JavaScript, apart from just not using that website.
    Depending on how exactly you do this, you'll end up with a huge filter list. Mine in uBlock Origin is 245 kB when exported.
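
    An exported list like that is mostly static filters. A minimal sketch of the pattern in uBlock Origin's static filter syntax, with placeholder domains:

    ```
    ! Block every script by default
    *$script
    ! Exceptions: subdomains a given site actually needs to work
    @@||example.com^$script,domain=example.com
    @@||cdn.example.net^$script,domain=example.com
    ```

    Each new broken site means adding a few more `@@` exception lines, which is where the size comes from.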

    • Aggravationstation@feddit.uk · 4 months ago
      You do end up whitelisting a lot of sites, but seeing that a site is using Javascript this way lets you weigh up if you feel it’s worth the security risk to enable it or not.

  • moreeni@lemm.ee · 4 months ago
    I've been doing this for a few years and eventually got tired of whitelisting websites. I've gone as far as using NoScript for fine-grained control, but what's the point? If you need a single JS feature, or a single article on a domain, you end up letting everything run once you grant the permissions, so why bother?

    Better to keep JS on and run an up-to-date browser with a custom DNS that filters out known malicious websites. Also, don't visit random links; that's actually good advice.

  • shortwavesurfer@lemmy.zip · 4 months ago (edited)
    This, combined with a filtering DNS-over-HTTPS or DNS-over-TLS provider such as Control D, is golden.

    They have several levels on their free DNS. Level zero has no filtering at all, level one filters out known malware, level two filters out known malware and advertising, and level three filters out known malware, advertising, and social media.
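
    For reference, Control D publishes free DoH endpoints along these lines. This is from memory, so check their documentation before relying on it; in particular, the numbering may not line up exactly with the levels described above:

    ```
    https://freedns.controld.com/p0   # malware
    https://freedns.controld.com/p1   # malware + ads
    https://freedns.controld.com/p2   # malware + ads + tracking
    https://freedns.controld.com/p3   # malware + ads + tracking + social
    ```

    Any of these can be pasted into a browser's or OS's secure-DNS setting.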

    • Peffse@lemmy.world · 4 months ago
      I had to switch to uMatrix after NoScript broke all embedded media. It still seems to block all third-party scripts, but it allows first-party scripts by default, so website breakage is less severe.
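
      That default behavior corresponds to uMatrix ruleset lines roughly like these; a simplified sketch of its out-of-the-box rules, not a complete dump:

      ```
      * * * block
      * * css allow
      * * image allow
      * 1st-party * allow
      * 1st-party frame allow
      ```

      Everything third-party is blocked unless it's CSS or an image, while first-party requests (including scripts) are allowed.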

  • Blizzard@lemmy.zip · 4 months ago
    I prefer to block the bad scripts with uBlock/AdGuard and let the rest run.

  • Mikelius@lemmy.ml · 4 months ago
    The security part is the reason I use NoScript to do this. We've all typo'd the addresses of sites we visit, I'm sure. But if I mistype the address of a site I frequently visit and see JavaScript disabled, it forces me to recheck that I'm on the right site. Granted, it's only happened once that I didn't realize I'd made a typo until seeing it was disabled, but it only takes one time to lose everything…

    Not sure the fingerprinting concerns are too major for me either. Hopefully, in most scenarios I'm flagged as a bot or crawler and left out of some data that would otherwise have been collected. Who knows. I imagine JavaScript enables way more fingerprinting than blocking it reveals, though.

    • Tangent5280@lemmy.world · 4 months ago
      For critical functionality like banking, it's also a good idea to put a bookmark to the right site on the toolbar and then only ever access it via that bookmark.