My biggest gripe with cooking instructions is the non-specificity. “Stir pasta frequently”? How frequently? How continuously? Tell me in unit Hertz
I won’t accept my pasta at anything lower than 120Hz.
Not sure your pasta will survive that kind of speed…
They just pasta way
Oh gnocchi didn’t.
The human eye cannot see more than 24Hz, so why bother
I don’t understand the basis of the 24Hz limit rumor. My monitors are 144Hz, and if I limit them to 60Hz and move my mouse around I see fewer residual mouse cursors “after-images” than I do at 144Hz. That’s a simplified test that shows that the eye can perceive motion artifacts beyond 60Hz.
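To put rough numbers on that cursor test (the swipe speed here is just an assumed figure to make the math concrete): each refresh draws the cursor at a new spot, so a fast swipe leaves a trail of discrete copies, spaced further apart at lower refresh rates.

```python
# Back-of-the-envelope for the mouse-cursor "after-image" test.
# The swipe speed is an assumption, not a measured value.
mouse_speed_px_per_s = 2000  # assumed speed of a quick swipe

def gap_between_copies_px(refresh_hz):
    """Distance the cursor travels between two consecutive refreshes."""
    return mouse_speed_px_per_s / refresh_hz

for hz in (60, 144):
    frame_ms = 1000 / hz
    print(f"{hz} Hz: {frame_ms:.1f} ms/frame, "
          f"~{gap_between_copies_px(hz):.0f} px between cursor copies")
```

At 60Hz the copies sit roughly 33 px apart versus roughly 14 px at 144Hz, which is why the trail looks so different between the two settings.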
The eye can perceive the flicker of LEDs driven by rectified 60Hz AC; it’s very annoying.
I could never tell whether people claiming they couldn’t see more than 24 Hz/FPS were serious or just excusing poor game optimization. They were either fanboys defending a shoddy product, or simply had terrible eyes. But even in the latter case I think you’d still be able to tell the difference in smoothness.
It’s one of those things that once you experience a higher framerate in games it’s very hard to go back to a lower setting.
I find it hard to get used to in movies/shows though. My TV has an option to insert frames for smoother playback to make it appear to run at a higher Hz, but it often looks unnatural. It was hard getting used to The Hobbit movie (I think it was Desolation of Smaug) that ran at 48 FPS. And Avatar: The Way of Water constantly switched between lower and higher frame rates for regular and action scenes; it was such a jarring experience.
iirc 24Hz is just the minimum that the movie industry found creates the illusion of a moving image.
I believe 24Hz works in movies because of the way cinemas are set up. The image projected onto the canvas in a dark/dim room “burns in” (persistence of vision, I believe the term is), which can make it appear smoother. This is why they can get away with it in cinemas. Plus it’s also a consistent 24Hz, which games (and Way of Water) aren’t.
People used this excuse for games, to make games more “cinematic”, but that was just an absolute horseshit excuse for games being poorly optimised. Especially if the framerate wasn’t locked to 24FPS, and because home monitors and TVs don’t work the same as cinema projectors.
I’m sure if all cinemas and media moved to a higher framerate/Hz it would eventually just feel normal though. It just often takes a lot of time to get used to, especially for cinema experiences.
I used to have a 4k tv I used as a monitor. It was 60hz. When I was tired, my eyes would vibrate back and forth trying to play nice with the frame rate, blurring everything up. Very difficult to read. Huge increase in headaches.
Switched to 120hz tv (all other specs equal) and the problem stopped entirely and hasn’t resurfaced in the 6 years since.
A person may not notice it directly, but it does matter.
I don’t really notice in movies and stuff but those are so damned chaotic anyway that it probably really doesn’t matter as much. (I don’t like live action, it’s difficult af to follow)
I haven’t noticed in games really but i mostly play console where that’s not really something you can usually tweak
It’s often weird how people don’t notice it much when you turn a setting on or off. But then I usually whip out the UFO site and they’re immediately convinced (it’s also easier to explain).
I have to say that on the PS5 the framerate differences have been quite noticeable. Especially first-party titles that support performance mode to go up to 60+ FPS instead of a usually locked 30, like in God of War and Horizon games.
I haven’t really run into ps5 issues, but then physical media is very difficult to find for 5, so I only have 4 games for ps5 vs 50+ for ps4 (I don’t buy digital games, ever).
But I guess I don’t really pay much attention to it either. As long as it works well enough I don’t usually mess with the display settings other than turning gamma waaaaaaay up so I can see shit properly… my tv doesn’t support hdr, which I think became standard in 2017, or anything newer than that which newer games are built to use, so I mostly just leave the defaults alone. I definitely notice some games are smoother than others, but that could just as easily be the texture pack or resource utilization as well.
Back when I was playing games on my phone, I’d actually turn down the refresh… sure this game can run at 120, but it can also run at 30 or 60, let’s see what the lowest I can stand is! I don’t do that anymore, but it was good for battery life :)
I think it’s the limit below which most people see motion as jittery. You may be able to tell higher FPS settings apart, but above 24 hertz most people shouldn’t be able to see discrete steps.
That’s at least how I’ve come to understand it
24Hz is the lower limit. People will perceive 24Hz as a smooth sequence, especially with motion blur, while anything below it will start to look choppy. Of course humans can perceive higher frequencies. But 24Hz became the standard because celluloid film was expensive, especially in the early days of cinema. The fewer frames you need to shoot, the less film you need to buy and develop. And film back then was probably not sensitive enough for the shorter exposure times that come with higher frame rates.
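Rough numbers behind the film-cost point: standard 4-perf 35mm film runs 16 frames per foot, so doubling the frame rate doubles the footage (and the stock and processing cost) for the same runtime.

```python
# Footage consumed per minute of runtime on standard 4-perf 35mm film.
FRAMES_PER_FOOT = 16  # 4-perf 35mm standard

def feet_per_minute(fps):
    """Feet of film exposed per minute of screen time at a given frame rate."""
    return fps * 60 / FRAMES_PER_FOOT

for fps in (24, 48):
    print(f"{fps} fps -> {feet_per_minute(fps):.0f} ft of film per minute of runtime")
```

That’s 90 ft per minute at 24 fps versus 180 ft at 48 fps, before you even get to the exposure-time problem.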
Sooo…just curious how you explain this?
Just another of those internet image optical illusions. You won’t be fooling anyone on here 🧐
No, it can see much more. Bonus: your brain can ‘see’ more than 100Hz too. Google Bundesen TVA. Source: I worked on programs to measure it for my gf’s PhD. Also I play FPS :D
only 120hz?! I refuse to eat any pasta below 2.4ghz
Those are some loose standards… I only accept pasta at 1.21 Jiggawatts and 88mph.
Just imagine the chaos when you run the microwave at the same time!
What kind of dumb instructions are those?
Stirring exactly once is enough in most cases.
Maybe a graph of how strong the bond gets over time for 2 elements?