Reading the docs, it seems like Podman is the replacement for Docker. You could try containerd/nerdctl, but Podman is likely the best option for you. The RHEL 10 docs even say it supports the older Docker config options.
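If it helps, the drop-in story on the CLI side looks roughly like this (a sketch, assuming the podman-docker compatibility package that RHEL 9 ships carries over to RHEL 10):

```
# Sketch: assumes RHEL 10 keeps the podman-docker shim package from RHEL 9
sudo dnf install -y podman podman-docker

# podman-docker installs a `docker` command that just calls podman,
# so existing scripts and muscle memory keep working
docker run -d --name web -p 8080:80 docker.io/library/nginx

# the same container shows up through the native CLI
podman ps
```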
Also, sometimes: it is scary BECAUSE it is familiar.
Oh, but it does. It helps soooo much. Something about the pinpoint pressure in just the right spot just hits so good… But you can’t just keep biting it forever, and generally the itch comes back again soon after.
Tailscale/Headscale/WireGuard is different from a normal VPN setup.
VPN: you tunnel into a remote network and all your connections flow through as if you’re on that remote network.
Tailscale: your devices each run the daemon and basically create a separate, encrypted, dedicated overlay network between them no matter where they are or what network they’re on. You can make an exit node (or advertise a subnet route) so traffic can leave the overlay for a specific CIDR on a local network, but without that, the only devices on the network are the devices connected to the overlay. I can set up a set of servers to be on the Tailscale overlay and only on that network, and they will only serve data to the devices also on the overlay, and they can be distributed anywhere without any crazy router configuration or port forwarding or NAT or whatever.
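For example, the subnet-route/exit-node piece is just a flag on whichever machine should bridge out (a sketch; the CIDR is a placeholder, and advertised routes still have to be approved in the admin console):

```
# On the one machine that should bridge the overlay to its LAN:
sudo tailscale up --advertise-routes=192.168.1.0/24   # placeholder CIDR for that LAN
# or offer to route *all* traffic for peers that opt in:
sudo tailscale up --advertise-exit-node

# On any other device, see who's on the overlay:
tailscale status
```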
Honestly, that sounds like a keepalived replacement or equivalent. I went with keepalived because I’m also using the IP for the Proxmox cluster itself, so it had to live outside kube, but the idea is the same. If all you’re using the IP for is kube, go with kube-vip! But let us know how it works!
You’ll want to look into keepalived to set up a shared IP across all worker nodes in the cluster and either forward directly, or set up HAProxy on each node to do the forwarding from that keepalived IP to the ingresses.
I’m running 6 kube nodes (on Talos) in a 3-node Proxmox cluster. Both HAProxy and keepalived run on the 3 Proxmox nodes to manage the IP and route traffic to the appropriate backend. HAProxy just lets me migrate nodes and still have traffic hit an ingress kube node.
Keepalived manages which node is the active one, and therefore which one listens on the IP, based on communication between the nodes and a simple local script that catches when a node can’t serve traffic.
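Roughly, the two configs end up looking like this (sketch only; the VIP, interface, backend IPs, and the check script are placeholders rather than my real setup):

```
# /etc/keepalived/keepalived.conf (sketch)
vrrp_script chk_ingress {
    script "/usr/local/bin/check_ingress.sh"   # hypothetical script: exit non-zero when this node can't serve traffic
    interval 2
    fall 2
    rise 2
}

vrrp_instance VI_1 {
    state BACKUP
    interface eth0              # placeholder interface name
    virtual_router_id 51
    priority 100                # give each node a different priority; highest healthy node holds the VIP
    virtual_ipaddress {
        192.168.1.50/24         # the shared IP everything points at
    }
    track_script {
        chk_ingress
    }
}

# /etc/haproxy/haproxy.cfg (excerpt, sketch) -- runs on all three nodes and
# forwards whatever hits the VIP to the kube nodes currently serving ingress
frontend ingress_https
    bind *:443
    mode tcp
    default_backend kube_ingress

backend kube_ingress
    mode tcp
    balance roundrobin
    server talos-1 10.0.0.11:443 check   # placeholder node IPs
    server talos-2 10.0.0.12:443 check
    server talos-3 10.0.0.13:443 check
```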
I’ve tried them all, and in the last year or so, none of them work anymore. They fixed the glitch.
I feel like people always miss the point of a Nintendo console. While it’d be cool to have HDR and OLED and such, it’s not really a big deal. It doesn’t really impact my ability to have fun with Mario Kart or Mario Party. It doesn’t change my enjoyment of Animal Crossing, etc.
I’ve got a PS5 hooked to a 4K HDR 120 Hz screen and regularly enjoy the performance and graphical fidelity, and there are still games I’m picking a Switch up to play. Heck, I still play on a Wii fairly frequently. The controllers, game types/styles, game quality, accessibility, replayability, etc. are what keep me coming back.
It should never be just “we want to have a child so we will”. That’s self-centered, short-sighted, and irresponsible.
Anyone looking to have children should think through, at a minimum:
To bring a child into a bad environment, with no time or money to spend on them, is to bring a child into this world set up for failure, and it only puts a drain on the system, the resources, the climate, the relatives, etc.
People (in Japan and elsewhere around the world) are choosing not to have children because of the less-than-favorable conditions outlined above, among many others.
On one hand, I absolutely abhor blanket governmental data collection and the storage of that data, both from a personal privacy, independence, and freedom point of view and from a “you know they’ll just leak the data and then everyone will have it” standpoint.
On the flip side:
In March, President Trump signed an executive order calling for the federal government to share data across agencies
Any sane company or government would have already done this… not sharing data between agencies/silos leads to inaccuracies, duplication of data and work (wasted time/money), and additional complexity in data storage and gathering, plus it creates multiple attack surfaces for data breaches.
Also, I read that as “if one agency needs something they can ask the other one for it,” which has likely been happening for centuries at this point, and this is just another “Trump said we need to do what’s already happening so he can look smart and like he’s doing something besides golfing and accepting foreign bribes”.
Is there a reason not to have the lossless/original files on the server? What I mean is, you could set up one of the myriad self-hosted music streaming apps here, and the vast majority will transcode on the fly to lossy, appropriately compressed files for streaming, or even for downloading to remote devices for offline listening.
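Under the hood that transcoding step is basically just an ffmpeg call, something like this (filenames and bitrate are placeholders; the apps pick the codec/bitrate per client for you):

```
# Keep the FLAC on the server; hand the phone a small Opus copy on the fly
ffmpeg -i album/track01.flac -c:a libopus -b:a 128k /tmp/track01.opus
```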
So they’re agreeing that the last set of 8 movies wasn’t a “faithful adaptation”?
I mean, they skipped lots of stuff from the books, so maybe this’ll fill that in? But at the same time, we’ve been through this already; who is going to care to rehash it all? Find a new story!
To be fair, the traditional web models were falling apart prior to AI as well. We’ve gone so far past “ad driven” that everything has to be full of ads and clickbait to drive revenue just to run the infrastructure, let alone pay for the pages’ creation and upkeep. Journalists and developers, services and goods are all using adword soup to try to get anything close to a useful revenue stream, and it’ll just keep getting worse until we figure out a better business model. We’re going to see more and more paywalls to try to make up for that, but a large portion of people on the internet won’t want to spend money on quality sources when they used to be able to get them for free. It’s been a race to the bottom for a while, and it’s at a point that isn’t sustainable long term. AI just accelerates that to the next level.
We don’t even get the laugh track!
It’s okay, we’ll just get rid of the regulations on everything else so this one fits the norm.
Have it sync the backup files from the -2- part. You can then copy them out of the Syncthing folder to a local one with a cron job to rotate them. That way you get the sync offsite, and you can keep the copies out of the rotation as long as you want.
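The copy-out-and-rotate part can be a tiny cron script, something like this sketch (paths and the 30-day retention are placeholders):

```
#!/bin/sh
# Run nightly from cron, e.g.:  0 3 * * * /usr/local/bin/rotate-backups.sh
SRC=/srv/syncthing/backups          # folder Syncthing keeps in sync with the source machine
ARCHIVE=/srv/backup-archive         # local-only folder Syncthing never touches
DST="$ARCHIVE/$(date +%F)"          # dated copy for today

mkdir -p "$DST"
cp -a "$SRC"/. "$DST"/

# prune dated copies older than 30 days
find "$ARCHIVE" -mindepth 1 -maxdepth 1 -type d -mtime +30 -exec rm -rf {} +
```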
I still don’t get the game price hate. There are plenty of other things to hate on Nintendo for, but do an inflation check on $60 between 2010 and now and it’s about $88. It’s way past time for prices to go up, and in real terms games are still much cheaper than they were in the ’80s and ’90s.
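The inflation check is just a CPI ratio; with rough CPI-U figures (about 218 for 2010, about 321 now):

$$\$60 \times \frac{321}{218} \approx \$88$$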
Do I like that it finally followed the market and increased? No. But new PS5 and Xbox games have been this expensive for quite a while. Indie games are able to fill the market space below that, and that’s where my money goes anyway. This will just continue that trend.
Not to mention the used market will continue thanks to physical copies that are slowly being dropped from other platforms.
Same here. This seems like something we’d have had in place 20 years ago, and they just layered facial recognition on top in the last 5-10.
That would be awesome, and I regularly do so on vacations, but let’s be real here: I like having a job so I can have a house and food and pay for goods and services when necessary. Being constantly connected is a basic requirement and responsibility for employment, so I’m going to choose the connection with the least impact on my daily life.
And no one was surprised.