

For sure, but as long as clickbait works they’ll keep doing it.


I mean yeah, but why? Like what did you like about it?


If they had made the deck more powerful, the old ones would suddenly have been obsolete.
I’m pretty sure it has more to do with current chip technology not actually having changed that much in the, what, two years since the Deck first released?
Also, “obsolete” is a pretty strong word for something that - if it had stronger internals - would likely end up being more expensive than the current models.


To be fair, “an entire x” does have a markedly different connotation than just “x”. The emphasis is that it’s, well, the entirety of x. It’s the difference between “I ate the cereal” and “I ate all the cereal”.


Claymore (the end was kinda mid)
Genuinely curious - why do you like it? I see this near the top of every “best anime of all time” list. I watched it a few years ago and I thought it was absolutely horrible. Like 2 or 3 out of 10.
I feel like the only reason I can see is “the main character is a bad guy”, but that doesn’t excuse trope-y terrible writing, flat characters, and mid-2000s animation that aged horribly. Am I missing something?


Make them optional lmao. I don’t have a 4K screen, haven’t ever had one, and won’t buy one for a very long time. Why am I storing these assets I will never use?


Honestly, it’s because a bunch of programs I used disappointed me (performance, functionality, [being a web app at all], etc.) and I figured it couldn’t be that hard to do it better. In some cases I was right, in most I was wrong. As it turns out, though, I really like programming, so I guess I’m stuck here.


I mean, to be fair, those errors aren’t really meant for you (the end user) in the first place.
I’m not sure I understand your point about fallthrough having to be explicit
As far as I understand it, every case has to end with a break (or some other jump) or it’s a compiler error - which makes sense from the “fallthrough is a footgun” C perspective. But fallthrough isn’t the implicit behavior in C# like it is in C - the absence of a break wouldn’t fall through even if it wasn’t a compiler error. Fallthrough only happens when you explicitly use goto case.
But break is what you want 99% of the time, and fallthrough is explicit. So why does break also need to be explicit? Why isn’t it just the default behavior when there’s nothing at the end of the case?
It’s like saying “my hammer that’s on fire isn’t safe, so you’re required to wear oven mitts when hammering” instead of just… producing a hammer that’s not on fire.
From what I saw on the internet, the justification (from MS) was literally “C programmers will be confused if they don’t have to put breaks at the end”.
the ergonomics expected of modern languages.
As someone learning C# right now, can we get some of those “modern ergonomics” for switch statements 💀
I can’t believe it works the way it does. “Fallthrough logic is a dumb footgun, so that has to be explicit rather than the default. But C programmers might get confused somehow, so break has to be explicit too.”
I miss fallthrough logic in languages that don’t have it, and the “goto case” feature is really sick, but like… c’mon, there’s clearly a correct way here and it isn’t “there is no default behavior”.
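For contrast, here’s a rough sketch of the default I’m talking about, using Rust’s match purely as an illustration (not claiming C# could just drop break tomorrow): arms never fall through, grouping is explicit, and there’s no break to forget.

```rust
fn describe(n: u32) -> &'static str {
    match n {
        0 => "zero",
        1 | 2 => "one or two", // explicit grouping instead of fallthrough
        _ => "something bigger",
    }
}

fn main() {
    for n in 0..4u32 {
        println!("{n}: {}", describe(n));
    }
}
```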


Generators, probably. It’s the one thing I genuinely miss about Python when I work in Rust.
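The closest stand-in on stable Rust that I know of is std::iter::from_fn, which wraps a stateful closure up as an Iterator - a rough sketch (Fibonacci is just a placeholder example):

```rust
// A stateful closure wrapped by std::iter::from_fn acts like a generator:
// it produces values lazily, one per call, and keeps its own state.
fn fibonacci() -> impl Iterator<Item = u64> {
    let (mut a, mut b) = (0u64, 1u64);
    std::iter::from_fn(move || {
        let next = a;
        (a, b) = (b, a + b); // destructuring assignment, stable since 1.59
        Some(next)
    })
}

fn main() {
    // Consumed lazily, like iterating over a Python generator.
    for n in fibonacci().take(10) {
        println!("{n}");
    }
}
```

It works, but it’s nowhere near as readable as just writing yield.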


Ick. At the very least, I’ve seen it a LOT less in VSC. The fact that something as simple as rainbow brackets uses the freemium model in IntelliJ sucks. I mean, the fact that it’s not a built-in setting is dumb too, but that’s beside the point.


I feel like it’s like pointers.
“Variable” refers to the label, i.e. a box that can contain anything (like void* ptr is a pointer to [something we don’t know anything about]).
“Immutable” describes the contents, i.e. the stuff in the box can’t change (like int* ptr describes that the pointer points to an int).
Rust makes it very obvious that there’s a difference between constants and immutable variables, mainly because constants must be compile-time constants.
What do you call it when a variable can’t change after its definition, but isn’t guaranteed to be the same on each function call? (E.g. x is an array that’s passed in, and we’re just checking if element y exists.)
It’s not a constant - the contents of that label are “changing” - but the label’s contents can’t be modified inside the scope of that function. So it’s a variable, but immutable.
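A small Rust sketch of that distinction (the names are made up, purely for illustration): LIMIT has to be a compile-time value, while haystack and needle are immutable bindings that hold a different value on every call.

```rust
// A constant has to be a compile-time value.
const LIMIT: usize = 10;

// `haystack` and `needle` are immutable variables: they hold a different
// value on every call, but can't be reassigned inside this scope.
fn contains(haystack: &[i32], needle: i32) -> bool {
    haystack.iter().any(|&x| x == needle)
}

fn main() {
    let data = vec![1, 2, 3]; // immutable binding holding a runtime value
    println!("{}", contains(&data, 2)); // true
    println!("{}", LIMIT);
}
```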


The freemium model and the constant “are you sure you don’t want to pay?” nagging from some IntelliJ plugins are insulting enough that it’s hard to believe any developer would praise it. Presumably this doesn’t happen in vscode because it can’t happen in vscode, not because people aren’t shameless enough to do it there.


That depends on your definition of correct lmao. Rust’s len() explicitly counts raw UTF-8 bytes, because that’s what’s actually stored in the string. There are many times where that value is more useful than the grapheme count.
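Quick illustration of the different counts (standard library only - actually counting grapheme clusters needs an external crate like unicode-segmentation):

```rust
fn main() {
    // "é" written as 'e' plus a combining acute accent (U+0301)
    let s = "e\u{301}";
    println!("{}", s.len());           // 3: raw UTF-8 bytes
    println!("{}", s.chars().count()); // 2: Unicode scalar values
    // As a single grapheme cluster it would count as 1, but std doesn't
    // do grapheme segmentation - that's what unicode-segmentation is for.
}
```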


I think it was this issue. Looks like maybe it got fixed some time this year? Iunno, I’ll look into it at some point.
code that’s been written today has been made obsolete by a language feature in the latest nightly build
I mean, couldn’t you say that about any language? There’s lots of old C code that’s obsoleted by features in C11. There’s lots of stuff written in Python today that’s obsoleted by stuff in the 3.13 alpha. It’s just kinda how things go.
Doesn’t the edition system prevent this from being too big of an issue anyway?
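As a toy example of what editions guarantee (this is just an illustration - build it with rustc --edition 2015): async only became a keyword in the 2018 edition, so a crate that declares the older edition keeps compiling unchanged on today’s compilers.

```rust
// Build with: rustc --edition 2015 main.rs
// `async` is reserved in the 2018+ editions, but in a 2015-edition crate
// it's still an ordinary identifier, so this keeps compiling today.
fn async() -> u32 {
    42
}

fn main() {
    println!("{}", async());
}
```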


Rule number 2: stop dismissing performance questions just because of something some guy said decades ago. Performance matters, learning about performance matters, and answers like yours don’t help anyone.
Did they ask if they should optimize, or did they ask which one generates more performant assembly? Which one of those questions did you answer?
Maybe they already measured and already know this is a bottleneck. Maybe they’re curious whether match statements are a slow abstraction (e.g. in Python, it’s essentially a chain of if/else; in Rust it’s often compiled to a jump table). Maybe the given example code is only partially representative of the actual code this is being applied to.
It’s so irritating to look up performance-related questions when this answer is at the top (and middle, and bottom) of every thread. I swear half the reason every piece of modern software runs like shit is because nobody bothered to learn how to optimize, and now everyone just parrots that phrase instead of saying “I don’t know”.
There are tons of little “premature” optimizations you can do that aren’t evil. Choosing the right data structure (how random is the access? Are you using keys? Does it need to be sorted?). Estimating time complexity and load size (e.g. “I’m parsing [11 million | 2] files, I should probably [keep time complexity in mind | ignore time complexity completely]”). Structuring loops in a way that’s easy for compilers to auto-vectorize - usually it’s not any harder to read what the loop is doing, so why not do it right away?
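On that last point, this is the shape I mean - plain slice iteration with a trip count the compiler can see, which usually auto-vectorizes (an illustration, not a benchmark):

```rust
// Shaped for auto-vectorization: iterate over the slices directly so the
// hot loop has no bounds checks and the trip count is obvious.
fn scale_add(out: &mut [f32], a: &[f32], b: &[f32], k: f32) {
    for ((o, &x), &y) in out.iter_mut().zip(a).zip(b) {
        *o = x * k + y;
    }
}

fn main() {
    let a = vec![1.0f32; 1024];
    let b = vec![2.0f32; 1024];
    let mut out = vec![0.0f32; 1024];
    scale_add(&mut out, &a, &b, 0.5);
    println!("{}", out[0]); // 1.0 * 0.5 + 2.0 = 2.5
}
```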
Yes, I’m bitter =(


Is PyCharm’s semantic highlighting still kinda ass? That’s the biggest thing that stopped me from using it over VSC. As of like May this year, I remember there still being an open issue tracking it.
QOwnNotes
It’s a desktop app, but it can sync with self-hosted cloud servers. It’s also literally just text/Markdown files.