Ask Lemmy
A Fediverse community for open-ended, thought provoking questions
Rules:
1) Be nice and have fun
Doxxing, trolling, sealioning, racism, and toxicity are not welcome in AskLemmy. Remember what your mother said: if you can't say something nice, don't say anything at all. In addition, the site-wide Lemmy.world terms of service also apply here. Please familiarize yourself with them.
2) All posts must end with a '?'
This is sort of like Jeopardy. Please phrase all post titles in the form of a proper question ending with a '?'.
3) No spam
Please do not flood the community with nonsense. Actual suspected spammers will be banned on sight. No astroturfing.
4) NSFW is okay, within reason
Just remember to tag posts with either a content warning or a [NSFW] tag. Overtly sexual posts are not allowed; please direct them to either !asklemmyafterdark@lemmy.world or !asklemmynsfw@lemmynsfw.com.
NSFW comments should be restricted to posts tagged [NSFW].
5) This is not a support community.
It is not a place for 'how do I?' type questions.
If you have any questions regarding the site itself or would like to report a community, please direct them to Lemmy.world Support or email info@lemmy.world. For other questions check our partnered communities list, or use the search function.
6) No US Politics.
Please don't post about current US Politics. If you need to do this, try !politicaldiscussion@lemmy.world or !askusa@discuss.online
Reminder: The terms of service apply here too.
My wife and I regularly joke that one day we'll harass our kids to help us with our neural interfaces, but I don't think that sort of thing will happen any time soon.
When I was a kid in the '80s, a lot of people could already afford computers. They weren't so cheap that everyone had them, but they were affordable to a fair number of people if they really wanted one. A C64 cost $595 at launch; that's under $2,000 in today's dollars.
The biggest barrier to computers was that they weren't "user friendly". If you wanted to play a simple video game, you needed to know some basic command line instructions. When I wanted to set up my first mouse for my 8086, it involved installing drivers and editing config.sys and autoexec.bat. You couldn't really do anything with a computer in those days unless you were willing to nerd out.
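For anyone who never had the pleasure, a minimal sketch of what that looked like (the paths and file names are illustrative; every mouse vendor shipped its own driver):

```
REM In CONFIG.SYS: load the mouse driver at boot
DEVICE=C:\MOUSE\MOUSE.SYS

REM Or in AUTOEXEC.BAT: run the memory-resident (TSR) version instead
C:\MOUSE\MOUSE.COM
```

Get a path wrong and the mouse simply didn't exist until you fixed the line and rebooted.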
At the same time, nerding out on a computer could easily get you deep into the guts of your machine in a functional way. I learned that the only way I could play video games at night was to open up the computer and disconnect the speaker wire so it wouldn't alert my parents. I also learned that I could "hack" Bard's Tale by opening up the main file with debug and editing it so that the store would sell an infinite number of "Crystal Swords".
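For flavor, here's a rough sketch of that kind of DEBUG session (the file name, addresses, and byte value are made up for illustration, and the trailing notes are explanations, not typed input; DEBUG refused to write .EXE files back, hence the classic rename trick):

```
REN BARDS.EXE BARDS.BIN          DEBUG won't write .EXE files, so rename first
DEBUG BARDS.BIN
-S 0100 FFFF "Crystal Sword"     search the loaded file for the item name
-E 4A3C 63                       patch a byte at an address the search printed
-W                               write the modified file back to disk
-Q
REN BARDS.BIN BARDS.EXE
```

No SDK, no source code: just raw bytes and a lot of trial and error.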
Today there are 2 cell phones for every human on earth. Kids walk around with supercomputers in their pockets. But they've become so "user friendly" that you barely even need to be literate to operate one. That's generally a good thing, but it removes an incentive to figure out how the stuff works. Most people only bother with that if they're having trouble getting it working in the first place.
At the same time, it's gotten much harder to make changes to your computer. The first Apple was a pile of circuits you needed to solder together. You can't even remove the battery on a modern one (without jumping through a lot of hoops). If you edit some of your games, it's more likely to trigger some piracy or cheat protection than to let you actually change anything.
There are still large communities of computer nerds, but your average person today basically treats computers like magic boxes.
I'd expect that kind of gap in other areas. Take 3D printing as an example. You can get one now for a few hundred bucks. They're already used in industry, but at this point they're still very fiddly. The people who have them at home are comfortable doing stuff like troubleshooting, flashing firmware, wading through bad documentation, and even printing custom upgrades for their printer.
IMO it's the opposite: millennials grew up in the period when you had to have a certain amount of computer knowledge to use technology. Today's kids don't use computers so much as phones, and on phones everything is super simplified compared to the '90s-2000s computers we had to deal with. I think from here technology will only become easier to use, to the point that new generations will actually have less technical knowledge than the previous one.
I think we will be the exception, because we came up when you had to figure out how to get things to work, whereas everything today is intuitive and seamless and just works.
I'm able to troubleshoot newer tech when it doesn't work, whereas it seems like Gen Z can't, because they haven't had the experience of trying to figure out how it all works.
Maybe?
I mean, there are boomers who were engineers in their youth who are complete idiots with modern technology, especially computers.
But as an elder millennial myself, I can kind of see it happening to me too. While I do enjoy technology and gadgets, I just don't have a need for all of it, nor the time to tinker like I did in my youth. For instance, I haven't bothered at all with Apple devices, so I'm kinda clueless about how to navigate them. The last time I used an Apple product was around 2008, when I was using macOS in college.
I don't think so. I honestly think we grew up in a time that encouraged learning how these things work. Anecdotally, younger people don't seem as interested, because everything's always "just worked" for them; they were raised on iPads. I already see them struggling with technology similarly to boomers when it isn't immediately obvious how to do something.
Yes. Now that tech has been refined and turned into fashion appliances, 20-somethings have no curiosity about tech and no desire to bend it to their will. Learning the underpinnings of tech bores them. I'm a boomer and feel like I grew up at the perfect time for a hacker/engineer. Tech was much simpler when I started out. It took work/programming to get your Commodore 64 to do anything interesting.
The biggest difference is the UI improvements. You don't need to know how to use a command line; you can just click an icon. A lot of tech is largely the same as it ever was, but with an easier-to-use interface. Going forward, things like palm readers and facial ID scanners will simplify this even further, since you won't even need to know which icon to tap. For better or worse.
I think everyone can learn how to use new tech; it's more a question of whether you still want to.
For example, I don't feel the need to get into TikTok... but if TikTok had existed 15 years ago, I would have.
There are still old people using CLI text-based browsers on a dial-up connection who never felt the need to upgrade to a more visual way of browsing the web... even though they could learn it.
At a certain age you just stop giving a fuck about new things, maybe.
I'm 39 and I'm already starting to get bad with certain parts of technology, so absolutely yes. That said, I'm also getting to the point where I'm starting not to care anymore.
You forgot Gen X, who were the first generation to really have access to the internet at a young age, but had to work at it.
I'm Gen X and have both my boomer parents and my 'digital native' kids come to me for help with technology.
Your question reminded me of an interesting article I read a while back: Gen Z Is Apparently Baffled by Basic Technology.
It's kind of a clickbait title, but I think it's still interesting. Technology is definitely generational, and I'm sure there are some things millennials will be better prepared to use in old age, but there will likely be lots of new tech that will be a struggle to learn.
I think it could go both ways. Another commenter mentioned the younger generation being used to things just working. You pull it out of the box, power on, good to go. When it comes to troubleshooting, that's where they seem to fall behind.
On another note, I have a friend who is the same age as me and grew up with a pretty similar upbringing. He's the type that if it doesn't work, it's not worth using. He couldn't get a Bluetooth speaker to connect to a computer, so he got frustrated and played without sound.
I think it will really just depend on what that specific person was exposed to. Admittedly, there have been a few times I got confused at a card reader. I'm used to swiping or inserting the chip, and some card readers are contactless, so it took me a few minutes to realize I was supposed to tap and go.
They are just as bad with current tech. Those of us who grew up as the internet was becoming more than just BBS and college databases had to learn the tech to use it.
Now everything "just works", so nobody needs to learn anything. Nothing is made to be repaired, so if something breaks you just buy a new one. The younger generations can't even type properly on a keyboard, even though they've been using them their entire lives.
With corporate monopolies, more advanced AI, and failing education systems, it will only get worse.
I think there are fewer tech-savvy people in the world than people think. Most people are capable of using tech just enough to do what they're trying to do, and don't take the time to learn to use newer technology, preferring to stick with what they know.
On the other hand, people with a desire to keep up with new technology probably will.
Maybe kids have a leg up on the boomers who only had slide rules growing up, but I believe tech literacy is much lower than people realize. Beyond the bare minimum of using email and browsing the web, most people just aren't using computers in a deep way, including kids who grow up consuming content on tablets. Touch screens actively obscure the complexity of computers to make them more intuitive.
This research was published nearly 10 years ago, but I think it's still relevant today: https://www.nngroup.com/articles/computer-skill-levels/
Maybe some of us will, but not all of us. I'm already better at troubleshooting than my Gen Z coworkers, because they never had to go through what we went through to make things work, or at least they don't know how to google for solutions the way I do.
I'm 40 and I've never scanned a qr code in my life. So yes, absolutely.
Most of the basic tech issues and dumb questions I deal with at work are for people over 50 or under 25. Younger GenX and Millennials generally pick things up quickly and have no problem with basic troubleshooting.
It feels like some people will struggle with technology and for others it will be effortless, no matter what era they're from. Some people are curious about these things and want to know how machines work. It's the curiosity, systematic thinking, and joy of learning that remain relevant.
Lots of interesting comments! I really enjoyed this thread. Two things I'd add:
- I think "technology" should really be referred to as "a technology". For instance, judging Gen Z against a technology (like a photocopier) that predates their birth seems a bit unfair. As a Gen Xer, I don't think it was fair to be judged for growing up with calculators instead of slide rules. I love old tech, but I'm not kidding myself: it's old tech, not the only tech.
- Also, shouldn't the organisation adapt instead? If new hires are more comfortable watching videos for training instead of reading procedures, or taking photos of things with their phone instead of using the photocopier, isn't that just fine? It's not my preference, but isn't it best for me to adapt rather than them?
It's not that I don't have generational pride. I like my generation; we were and are adaptable. I just can't imagine that subsequent generations won't be as adaptable to things I can't even imagine yet.
That depends on the UX/UI developers. When the app tries to be smart and make every interaction a conversation, I immediately want to abandon it. Linux is spoiling me with its user-friendly "do what I tell you" philosophy.
As a member of the Oregon Trail Generation (the sweet spot on the boundary between Gen X and Millennial), I think people who were in elementary school in the '80s have a pretty special skill set: we can use "old" technology, we were frequently the ones who had to help our elders with it, and we have watched new technology (home computers, the Internet, smartphones) come into being and mature.
So we didn't just learn how to use tech, we learned how to grow with tech as it grew.
I'm guessing large language models - imitative so-called "AI" - are going to go through that same sort of growth and change arc over the next couple of decades. It's likely I'll be pretty mystified by it, but hopefully my kids will be playing with it and growing with it as it matures.
Technology is quickly becoming less and less about the underlying technologies and more about how the large corporations want you to use their product. I was briefly a volunteer website administrator for a small non-profit, and despite having done freelance web development 15 years ago and knowing HTML and several other web technologies, it was a struggle, because they used Google on the backend, and everything in Google was unintuitively laid out and impossible to do without going through the Google interface. I often joked in frustration that I was a Google administrator, not a web administrator.
Another example was some Linksys wireless mesh extenders I bought. The setup process involved using a privacy-invasive phone app to connect over Bluetooth. It would try for 5 minutes and then just fail with no error code. There was no manual setup process. There was no log file. When it didn't work after 5 minutes of trying, it told you to call a phone number that was always busy, and calling blocked the 5-minute connection process, since the phone was needed for both. Eventually, after about 6 hours, it just randomly started working.
Combine that with people biologically becoming less able and willing to learn as they get older, and it's pretty likely that millennials will eventually get left behind even if they try to keep up to date.
I am in my late 30s. I grew up without cell phones, computers, etc. I didn't hear about the internet before it was available at the nearby library.
However...
I study computer science. I love tech, gadgets, etc.
My theory, though, is that at some point my interest will drop and I'll just stick with what I know. Just like I stopped caring about TV, pop culture, fashion, etc.
You get other things to do, and less time for the fun stuff.