I personally don’t think there has been a massive spike in productivity using a computer between when PCs usually had 256-512 MB of RAM and now
For general use/day-to-day stuff like web browsing, sure, I agree, but what about things like productivity and content creation? Imagine throwing a 4K video at a machine with 512 MiB of RAM - it would probably have trouble even playing it, let alone editing/processing it.
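(Back-of-the-envelope, assuming uncompressed 8-bit RGBA frames: one 3840×2160 frame is about 3840 × 2160 × 4 ≈ 33 MB, so a player buffering even a dozen frames, plus the OS itself, is already pressing against 512 MiB before any editing software loads.)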
Your original comment mentioned general purpose computers. Video production definitely isn’t general purpose.
What do you mean by productivity?
Video production is something you can do on a general-purpose computer because it runs a flexible OS that allows for a wide range of use cases, as opposed to a purpose-built embedded system that only performs the tasks it was designed for - hence, not general purpose. I believe that was their point anyway, not just a computer for office work or whatever.
Yup, exactly this.
Video production is general-purpose computing just like opening a web browser to look at pictures of cats is - the former is just way more resource-intensive. It’s done in software that runs on an OS that can run a dozen other things, which in turn runs on a CPU that can usually run other OSes - as opposed to a purpose-built system meant to do very specific things, with software often written specifically for it.
We’ve had video editing software available on most personal computers since at least 1999 with iMovie and 2000 with Windows Movie Maker. IMO that’s all general computer users need.
Professional-level video production is not general computing; it’s very niche. Yes, it’s nice that more people have access to this level of software, but is it responsible?
The post does raise some real issues: increasing hardware specs are not consequence-free. Rapidly increasing hardware requirements have meant most consumers have needed to upgrade their machines. Plenty of those machines could still have been in operation to this day. There is a long trail of e-waste behind us that is morally reprehensible.
You don’t need to be a “professional” to edit 4K videos at home; people do that every day with videos they took on their effing phone.
And that’s the point. What people do with their computers today requires far more resources than what they did in the late 90s. I’m sorry, but it’s completely idiotic to believe that most people could get by with 256-512 MB of RAM.
“Morally reprehensible” - give me a break. You simply don’t know what you’re talking about, so just stop.
So what are you suggesting - that everyone stick to 640x480 even though many smartphones today shoot 4K/60?
Everyone should keep their current devices as long as possible (until the device breaks or can no longer run work-related software) to reduce the upgrading culture. You can shoot 4K now - that’s great! Keep the device even if the latest model supports 8K video. The same applies to other hardware/software features.
Somewhat agree. Manufacturers releasing successive models at less than a year’s interval now is ridiculous, and you buying each new one is even more so, but on the other hand, using the same phone for 5-6 years just because you can is also a bit drastic (even if you swap the battery midway through, by the time the second one is dead the phone will be obsolete). It’s maybe a bit more doable with computers, especially since you can upgrade one component at a time. 2-3 years seems doable for a phone.
I mean, it’s not that crazy - I’m writing this on a Moto Z2 Play. It was released in June 2017, so it’s not long until year 6, but I hope it goes longer. It’s perfectly usable, runs most apps fine, and can even run TFT.
Phones haven’t changed that much recently; this model has a great screen, 4 GB of RAM (more than some laptops that are still being released!), and a decent chip. The only issue is that the battery is under 3000 mAh, but I know of a few models from around the same time that went up to 5000 mAh.
You do get better mileage running an OS like LineageOS and being de-Googled, since a lot of Google’s tracking processes kill the battery and slow things down.