this post was submitted on 29 Jun 2023
41 points (100.0% liked)
Technology
Since I love playing devil's advocate, here are a couple of points in their defense:
Multi-GPU video cards: Pretty much dead now; it's just not efficient.
64-bit computing: At the time it was indeed slightly overhyped, because while your OS might have been 64-bit, most software was still 32-bit, games in particular. So games couldn't really use more than 4 GB of memory. And that stayed the norm for years after this article (it's from 2008; 64-bit Windows had been out for ages, yet three years later the original Skyrim release was still 32-bit, and a game shipping a 64-bit binary was a big deal at the time). Now most software is 64-bit, and yes, NOW it's standard.
High definition: Depends, did they mean HD (720p) or Full HD (1080p)? Because the former certainly didn't last long for most people; Full HD replaced it real quick and stayed around for a while. Of course, if they meant Full HD then hell no, they were hella wrong: it's been mainstream for ages and is only now being replaced by 1440p and 4K UHD.
iPhone: The FIRST one as a singular product really didn't live up to the hype. It was missing features that old dumbphones had. Of course the overall concept very much did revolutionize the phone market.
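As a quick sanity check on the 4 GB figure above: a 32-bit pointer can only address 2^32 distinct bytes, which works out to exactly 4 GiB of virtual address space per process. A minimal sketch in Python (the interpreter-bitness check is a common idiom, not anything specific to the article):

```python
import sys

# A 32-bit pointer can address 2**32 distinct bytes: the 4 GiB ceiling
# that 32-bit games and apps ran into.
limit_bytes = 2 ** 32
print(limit_bytes // 1024 ** 3, "GiB")  # prints: 4 GiB

# On a 64-bit interpreter sys.maxsize is 2**63 - 1, so this prints True;
# on a 32-bit build it would print False.
print(sys.maxsize > 2 ** 32)
```

(In practice the usable limit was even lower, since the OS reserves part of that address space, which is why 32-bit games often hit trouble well before 4 GB.)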
Well, to be fair, changes like the switch to 64-bit are always very slow (especially when they're not forced by dropping 32-bit support entirely). But I don't think it was overhyped; it just takes time, and more RAM was definitely needed to achieve the kinds of games/apps we have now.
Well, by 2008 we'd had consumer-grade 64-bit CPUs for 5 years and technically had had 64-bit Windows for 3, but it was a huge mess. There was little upside to using 64-bit Windows in 2008, and 64-bit computing had been hyped up pretty hard for years. You can easily see how one might think it wasn't worth the effort in the personal computer space.
I feel like it finally reached a turning point in 2009 and became genuinely useful in the early to mid 2010s. 2009 gave us the first GOOD 64-bit Windows version with mass adoption (Windows 7), and in the 2010s we started getting 64-bit builds of mainstream software (2010 for Photoshop, 2014 for Chrome, 2015 for Firefox).
It was different for Linux and servers in particular, of course, where a lot of open source software had official 64-bit builds in the early 00s already (Apache in 2003, for example).