submitted 10 months ago* (last edited 10 months ago) by Grayox@lemmy.ml to c/asklemmy@lemmy.ml

I feel as if CEOs worried about the dangers of AI are equating the end of capitalism with the end of the world. What are your thoughts?

[-] kromem@lemmy.world 8 points 10 months ago* (last edited 10 months ago)

Capitalism.

There's a great economics paper from 1937 called "The Nature of the Firm," which in part won its author, Ronald Coase, the Nobel in economics.

It hypothesized that the key reason large corporations exist is the high transaction cost of labor: finding someone for a role, hiring them, onboarding, and so on.

The idea resurfaced years ago with things like Uber: it used to be that driving people around meant getting a cab medallion and making a career of it, but tech lowered the transaction costs enough that you could do it as a side gig.

So what's the advantage of a massive corporation when nearly all transaction costs drop to nothing?

Walmart can strongarm a mom and pop because it has in-house counsel defending against or filing suits. But what if a legal AI can do an equivalent job to in-house counsel for $99 instead of $10k in billable hours? There's a point of diminishing returns where Walmart outspending Billy Joe's mart just doesn't make sense anymore. And as that threshold gets pulled back further, Walmart's competitive advantages shrink.
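A back-of-the-envelope version of that point (the $99 and $10k figures are the hypotheticals above, not real market prices):

```python
# Hypothetical per-matter legal costs from the comment above.
in_house_counsel = 10_000  # large firm, billable hours per matter
legal_ai = 99              # small firm using an AI legal service

# Today the large firm effectively outspends the small one ~100x
# per legal matter, which is the strongarming advantage.
ratio = in_house_counsel / legal_ai
print(f"Spending advantage per matter: roughly {ratio:.0f}x")

# If the AI output is genuinely equivalent, the advantage that
# matters is the gap in outcomes, not dollars, and it collapses.
gap = in_house_counsel - legal_ai
print(f"Dollars spent for no additional edge: ${gap:,}")
```

The arithmetic is trivial on purpose: the argument is that once cheap substitutes exist for a corporation's expensive internal functions, outspending stops buying a better result.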

And this is going to happen for nearly everything other than blue-collar labor, an area where local small and medium-sized businesses will be more competitive at hiring quality talent than large corporations that force people into crap jobs for crap pay because they've made themselves the only show in town.

AI isn't going to kill off humanity. We're doing a fine job of that ourselves, and our previous ideas about AI have mostly turned out to be BS predictions. What's actually arriving reflects humanity at large in deep, persistent ways (for example, the key jailbreaking method right now is an appeal to empathy). Humanity around the hump of the normal distribution is much better than the small minority of psychopaths who end up overrepresented in managerial roles.

i.e. AI will be kinder to humanity than most of the humans in positions of power and influence.

[-] Lanthanae@lemmy.blahaj.zone 2 points 10 months ago

such as the key jailbreaking method right now being an appeal to empathy

Honestly, that's the most optimistic thing to come out of this. A potential AGI singularity is still terrifying to me... but this does take the edge off a bit.

this post was submitted on 21 Nov 2023
36 points (73.1% liked)