submitted 1 year ago* (last edited 1 year ago) by db0@lemmy.dbzer0.com to c/singularity@lemmy.fmhy.ml

A lot of billionaire money is going into preventing an AI apocalypse instead of actual pressing problems of humanity.

top 4 comments
Peanutbjelly@sopuli.xyz 5 points 1 year ago

wow. was expecting to agree heavily with the article, but it felt like reading intangible fluff. is that just me?

when they do talk directly about the issue, they say things like "AI models that scrape artists' work without compensation," which is not how i would phrase 'actual concerns.' there's no mention of things like "models specifically built to manipulate the general populace," which i see still gets no attention. having models learn from the world we live in to create a general-use model is not the issue, unless you're thinking about biases. i'm an artist, and for 20 years i've witnessed how terrible artist communities are at understanding things like copyright and economic imbalance/redirection. this is unfortunate because artists are getting screwed, but the take on it is just wrong. this article doesn't touch any of it in a meaningful way.

there are definitely issues with alignment and interpretability that need to be understood as we move forward, and that's where the effort should be focused in these academic settings. if you want to focus on the existential, you should be looking directly at current models and how they could conceptually steer towards or away from such threats, while working on understanding and interpreting those models. we aren't just giving superpowers to a narrow RLHF model.

at least they mentioned climate change, since climate and economic imbalance are really the main existential risks we currently face. A.I. development is likely the only thing that will help us.

i think melanie mitchell is supposed to be interviewed on MachineLearningStreetTalk soon, and i assume she will have a good take on it.

so, TLDR: we likely won't have progress without interpretability, and that should be kept in mind while developing better machines. don't read this article, just wait for the melanie mitchell interview, as i'm sure she's heated after that frustrating munk debate.

moozogew@lemmy.fmhy.ml 2 points 1 year ago

Yeah, it annoys me that so many people don't seem to realise AI and automation are pretty much our only real hope of fixing our climate issues. Automated construction will allow more efficient buildings which, when designed using AI tools, can easily incorporate all the newest and best design methods. AI tools will likewise help scientists and engineers design and run experiments which are totally out of the scope of current practice.

Martineski@lemmy.fmhy.ml 2 points 1 year ago

Please add the date of the article to the title. Copy this:

(article from 5.07.2023)

db0@lemmy.dbzer0.com 3 points 1 year ago

Singularity | Artificial Intelligence (ai), Technology & Futurology


About:

This sublemmy is a place for sharing news and discussions about artificial intelligence, core developments of humanity's technology, and the societal changes that come with them. Basically a futurology sublemmy centered around AI, but not limited to AI only.

Rules:
  1. Posts that don't follow the rules, and aren't brought into compliance after it's pointed out that they break them, will be deleted no matter how much engagement they got, then reposted by me in a way that follows the rules. I'll wait a maximum of 2 days for the poster to comply with the rules before I decide to do this.
  2. No Low-quality/Wildly Speculative Posts.
  3. Keep posts on topic.
  4. Don't make posts with links to paywalled articles as their main focus.
  5. No posts linking to reddit posts.
  6. Memes are fine as long as they are quality and/or can lead to serious on-topic discussions. If we end up having too many memes, we will make a meme-specific singularity sublemmy.
  7. Titles must include information on how old the source is in this format dd.mm.yyyy (ex. 24.06.2023).
  8. Please be respectful to each other.
  9. No summaries made by LLMs. I would like to keep quality of comments as high as possible.
  10. (Rule implemented 30.06.2023) Don't make posts with links to tweets as their main focus. Melon decided that the content on the platform is going to be locked behind a login requirement, and I'm not going to force everyone to make a twitter account just so they can see some news.
  11. No AI-generated images/videos unless their role is to represent new advancements in generative technology that are not older than 1 month.
  12. If the title of the post isn't the original title of the article or paper, then the first thing in the body of the post should be the original title, written in this format: "Original title: {title here}".

Related sublemmies:

!auai@programming.dev (Our community focuses on programming-oriented, hype-free discussion of Artificial Intelligence (AI) topics. We aim to curate content that truly contributes to the understanding and practical application of AI, making it, as the name suggests, “actually useful” for developers and enthusiasts alike.)

Note:

My posts on this sub are currently VERY reliant on getting info from r/singularity and other subreddits on reddit. I'm planning to at some point make a list of sites that write/aggregate the kind of news this community is about, so we can get news faster and not rely on reddit as much. If you know any good sites, please dm me.

founded 1 year ago