There's magic?
Only if you believe in it. Many CEOs do. They're very good at magical thinking.
I have a counterargument. From an evolutionary standpoint, if you keep doubling computer capacity exponentially, isn't it extraordinarily arrogant of humans to assume that their evolutionarily stagnant brains will remain relevant for much longer?
You can make the same argument about humans that you do about AI, but from a biological and societal standpoint. Barring any jokes about certain political or geographical stereotypes, humans have gotten "smarter" than we used to be. We are very adaptable, and with improvements to diet and education, we have managed to stay ahead of the curve. We didn't peak at hunter-gatherer. We didn't stop at the Renaissance. And we blew right past the industrial revolution.

I'm not going to channel my "Humanity, Fuck Yeah" inner wolf howl, but I have to give our biology props. The body is an amazing machine, and even though we can look at things like the current crop of AI and think, "Welp, that's it, humans are done for," I'm sure a lot of people thought the same at other pivotal moments in technological and societal advancement. Here I am, though, farting Taco Bell into my office chair and typing about it.
You can compare human intelligence now to centuries ago on a simple linear scale. Neural density has not increased, by any stretch of the imagination, the way transistor density has. But I'm not just talking about density, I'm talking about scalability that is infinite: an infinite scale of knowledge and data.
Let's face it, people are already not that intelligent; we are just smart enough to use the technology of other, smarter people. And then there are computers: they are growing in intelligence with an artificial evolutionary pressure being exerted on their development, and you're telling me that's not going to continue until they surpass us in every way? There is very little to stop computers from being intelligent on a galactic scale.
Computer power doesn't scale infinitely, unless you mean building a world-mind and powering it off of the spinning singularity at the center of the galaxy like a Type III civilization, and that's sci-fi stuff. We still have to worry about bandwidth, power, cooling, code, and everything else that goes into running a computer. It doesn't just "scale". There is a lot that goes into it, and it does have a ceiling. Quantum computing may alleviate some of that, but I'll hold my applause until we see some useful real-world applications for it.
Furthermore, we still don't understand how the mind works yet. There are still secrets to unlock and ways to potentially augment and improve it. AI is great, and I fully support the advancement of the technology, but don't count out humans so quickly. We haven't even gotten close to human-level intelligence, with GOFAI or anything since, and maybe we never will.
As I said, that answer seems incredibly arrogant in the face of evolutionary pressure and exponential growth.
You can believe whatever you want, but I don't think it's arrogant to say what I did. You are basing your view of humanity on what you think humanity has done, and basing your view of AI on what you think it will do. Those are fundamentally different and not comparable. If you want to talk about the science-fiction future of AI, we should talk about the science-fiction future of humanity as well. Let's talk about augmenting ourselves, extending lifespans, and all of the good things that people think we'll do in the coming centuries. If you want to look at humans and say that we haven't evolved at all in the last 3000 years, then we should look at computers the same way. Computers haven't "evolved" at all. They still do the same thing they always have. They do a lot more of it, but they don't do anything "new". We have found ways to increase the processing power and the storage capacity, but a computer today has the same limits as the one that sent us to the moon. It's a computer, and incapable of original thought. You seem to believe that just because we throw more RAM and processors at it, that will somehow change things, but it doesn't. It just means we can do the same things, but faster. Eventually we'll run out of things to process and data to store, but that won't bring AI any closer to reality. We are climbing the mountain, but you speak like we have already crested. We've barely left base camp in the grand scheme of artificial intelligence.
Holy wall of unparagraphed word salad. Again, you are not understanding what is and isn't an evolutionary process: a disease can wipe out half a species, and that is considered a process of evolution. You don't have to be intelligent about it; all you have to do is continue to increase complexity due to an external force, and that is it. That's all that is needed to have an evolutionary force.
With computers we don't have to know what we are doing (to recreate consciousness), we just have to select for better, more complex systems (the same way evolution did for humans), which is the inevitable result of progress. Do you think computers are going to stop improving? The roadmaps for chip architecture for the next ten years don't seem to suggest it's slowing down yet.
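To make that concrete, here's a toy selection loop (just an illustrative sketch; the genome size, mutation rate, and fitness function are all made-up numbers). Nothing in it knows what a "good" answer looks like ahead of time, yet the result keeps improving, because we keep whichever random variant scores better:

```python
import random

# Toy "blind selection" loop. Nothing here knows what the final answer should
# look like; we just keep whichever random variant scores better. The genome
# size, mutation rate, and fitness function are all made-up illustration values.
GENOME_LEN = 64
MUTATION_RATE = 0.02
GENERATIONS = 2000

def fitness(genome):
    # The "environment": more 1-bits scores higher. It doesn't care how the
    # genome got that way.
    return sum(genome)

def mutate(genome):
    # Flip each bit with a small probability.
    return [bit ^ (random.random() < MUTATION_RATE) for bit in genome]

best = [random.randint(0, 1) for _ in range(GENOME_LEN)]
for _ in range(GENERATIONS):
    child = mutate(best)
    if fitness(child) >= fitness(best):  # selection: keep the better variant
        best = child

print(fitness(best), "out of", GENOME_LEN)  # typically climbs close to the maximum
```

Variation plus selection, no designer required. That's the whole trick.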
And like the fractalization of coastlines, facts, knowledge and data are completely unlimited: the deeper you look, the more there is.
On top of all of this, you have the fact that progress has constantly been accelerating in a way that human intelligence is incapable of perceiving accurately.
Therefore computer intelligence is going to vastly outpace our own. And very soon, too.
Ahh, we are getting into the insult round of tonight's entertainment. I'll break this reply down for you.
It seems our definitions differ slightly, yes.
That, and the ability to self-actuate your own evolution. You see, that's where we differ on the definition of an evolutionary force. We didn't have some greater will forcing us down a path of evolution. There was no force. There was trial and error. The ones that lived long enough to fuck survived, the rest didn't. Reproduction is a fundamental aspect of evolution. Computers can't reproduce. We have to facilitate that ourselves, through iterating on various aspects of computers. Right now we can fake it with increased processing power, increased memory, and more elegant code, but at the end of the day, without some form of reproductive system that doesn't rely on us, the computer can't exceed our grasp. If it could, we'd see genuinely self-accelerating growth, not the steady, human-driven doubling of Moore's Law. We can't make them do more than what they already do. We can just make them do it faster.
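To put rough numbers on the difference (the generation count and both growth models are made-up illustration values, not real roadmap figures): steady doubling is what we actually get, and the runaway curve is what you'd need for machines to exceed our grasp.

```python
# Back-of-the-envelope comparison, all numbers made up for illustration.
# "Steady": we improve the machines by the same factor each generation
# (Moore's-Law-style, humans turning the crank).
# "Self-accelerating": each generation also speeds up the *next* improvement
# step, which is the runaway scenario, and not what we actually observe.
GENERATIONS = 10  # say, one hardware generation every ~2 years over 20 years

steady = 1
accelerating = 1
for gen in range(1, GENERATIONS + 1):
    steady *= 2             # same doubling every generation
    accelerating *= 2**gen  # the improvement factor itself keeps growing

print(f"steady doubling:   x{steady:,}")
print(f"self-accelerating: x{accelerating:.2e}")
```

Big numbers either way, but the first curve only grows as fast as we push it.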
Yeah, sure, and I can cram a hundred monkeys in a room with a hundred typewriters and come up with a better love story than Twilight, but it's gonna take time. Not Shakespeare time, but a few weeks at least. That's the thing, though: the evolution of any system doesn't happen overnight. We didn't wake up one day, walk out of our cave, and create TikTok. Evolution is a long process. You forget all of the things that happened before we figured out that our thumbs weren't solely for sticking up our own asses. There are millions of years that you aren't accounting for. Billions of attempts to create what we take for granted. Consciousness. You say that we don't have to know what we are doing, and you are right, we don't, but it's a crap-shoot with quadrillion-to-one odds.
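Just for a sense of scale (the odds and the attempt rate are made-up numbers): a quadrillion-to-one blind shot, at roughly one attempt per second, works out to tens of millions of years of waiting, which is exactly the kind of timescale evolution had to burn.

```python
# Rough waiting time for a "quadrillion-to-one" crap-shoot.
# Both the odds and the attempt rate are made-up illustration numbers.
odds = 1e15                  # one success per quadrillion attempts (assumed)
attempts_per_year = 3.15e7   # roughly one blind attempt per second (assumed)

expected_attempts = odds     # geometric distribution: expected tries = 1/p
expected_years = expected_attempts / attempts_per_year

print(f"expected wait: ~{expected_years:.1e} years")  # ~3.2e7, tens of millions of years
```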
Again, we can store as much data as we want; it won't make AI happen. We haven't spontaneously seen life form in libraries, but they have been storing data in them for thousands of years. Consciousness isn't data. If that's all you want, ChatGPT is passing the bar. It still can't tell me it loves me, and mean it.
Funny, you seem to think that you perceive it pretty well...
A well thought out conclusion I'm sure is based on all of the facts you failed to present. Bravo.
The laws of physics still apply. We already have to do all kinds of crazy tricks to make transistors as small as they are and keep them from leaking electrons all over the place due to quantum tunneling. The best thing we've figured out how to do is just pile on more CPU/GPU cores.
It's also arrogant to assume we will continue on this exponential, industrial-revolution growth of the last 300 years and not plateau as a species again for the next thousand. We could be looking at an eon of just burnin' away our oil while we try to cling more and more to whatever other energy impinges on this pitiful little planet, trapped in our local space, unable to use our pathetic spacecraft to push us any further.
The laws of physics are no less or more applicable to our own biology in terms of complexity, density, scale, and information capacity, and in most ways biology is far less efficient and accurate than its silicon counterpart.
There is nothing to suggest that the growth in computer intelligence is going to stop, or that it's doing anything but just getting started.
Apart from your use of "infinite", I agree: there is no reason we shouldn't be able to surpass nature with synthetic intelligence. The time computers have existed is a mere blip on a historic scale, and computers surpassed us at logic games like chess, and at math, long ago.
Modern LLMs are just the current stage; before that, you could say it was pattern recognition. We had OCR in the 80s as probably the most practical example. It may seem like there is a long time between the breakthroughs, but 40 years is nothing compared to evolution.
I have no doubt strong AI will be achieved eventually, and when we do, I have no doubt AI will surpass our intelligence in every way very quickly.
As a counterargument to that: companies have been trying to make self-driving cars work for 20 years. Processing power has increased a million-fold and the things still get stuck. Pure processing power isn't everything.
If you keep doubling the number of fruit flies exponentially, isn't it likely that humanity will find itself outsmarted?
The answer is no, it isn't. Quantity does not quality make, and all our current AI tech amounts to ways of breeding fruit flies that fly left or right depending on what they see.
Magic as in street magician, not magic as in wizard. Lots of the things that people claim AI can do are like a magic show: amazing if you look at it from the right angle, and with the right skill you can hide the strings holding it up, but if you try to use it in the real world, it falls apart.
I wish there was actual magic
It would make science very difficult.
What if it magically made it easier?
Mmm irrational shit makes rationality harder
Look at quantum mechanics
Everything is magic if you don't understand how the thing works.
I wish. I don't understand why my stomach can't handle corn, but it doesn't lead to magic. It leads to pain.
Have you eaten hominy corn? The nixtamalisation process makes it digestible.
I don't have access to that, sadly. I'm pretty sure my body would reject it anyway, at least from my reading on what it is.
Sam Altman will make a big pile of investor money disappear before your very eyes.
The masses have been treating it like actual magic since the early stages and are only slowly warming up to the idea that it's calculations. Calculations of things that are often more than the sum of their parts, as people are starting to realize. Well, some people anyway.
oh the bubble's gonna burst sooner than some may think
Next week, some say
If you're a techbro, this is the new magic shit, man! To the moooooon!
If only.