submitted 3 weeks ago by L3s@lemmy.world to c/technology@lemmy.world

Greetings everyone,

We wanted to take a moment to let everyone know about the !business@lemmy.world community on Lemmy.World, which hasn't gained much traction. Additionally, we've noticed occasional complaints about business-related news being posted in the Technology community. To address this, we want to encourage our community members to engage with the Business community.

While we'll still permit technology-related business news here unless it becomes overly repetitive, we kindly ask that you consider cross-posting such content to the Business community. This will help foster a more focused discussion environment in both communities.

We've interacted with the mod team of the Business community, and they seem like a dedicated and welcoming group, much like the rest of us here on Lemmy. If you're interested, we encourage you to check out their community and show them some support!

Let's continue to build a thriving and inclusive ecosystem across all our communities on Lemmy.World!

submitted 1 hour ago by ylai@lemmy.ml to c/technology@lemmy.world
submitted 35 minutes ago by Xatolos@reddthat.com to c/technology@lemmy.world
submitted 1 hour ago by boem@lemmy.world to c/technology@lemmy.world
submitted 3 hours ago by nave@lemmy.ca to c/technology@lemmy.world
On Being an Outlier (www.goethe.de)
submitted 2 hours ago by JoBo@feddit.uk to c/technology@lemmy.world

Proponents of AI and other optimists are often ready to acknowledge the numerous problems, threats, dangers, and downright murders enabled by these systems to date. But they also dismiss critique and assuage skepticism with the promise that these casualties are themselves outliers — exceptions, flukes — or, if not, they are imminently fixable with the right methodological tweaks.

Common practices of technology development can produce this kind of naivete. Alberto Toscano calls this a “Culture of Abstraction.” He argues that logical abstraction, core to computer science and other scientific analysis, influences how we perceive real-world phenomena. This abstraction away from the particular and toward idealized representations produces and sustains apolitical conceits in science and technology. We are led to believe that if we can just “de-bias” the data and build in logical controls for “non-discrimination,” the techno-utopia will arrive, and the returns will come pouring in. The argument here is that these adverse consequences are unintended. The assumption is that the intention of algorithmic inference systems is always good — beneficial, benevolent, innovative, progressive.

Stafford Beer gave us an effective analytical tool for evaluating a system without getting sidetracked by arguments about its intent rather than its real impact. This tool is called POSIWID, which stands for "The Purpose of a System Is What It Does." This analytical frame provides "a better starting point for understanding a system than a focus on designers' or users' intention or expectations."

submitted 9 hours ago by boem@lemmy.world to c/technology@lemmy.world

cross-posted from: https://lemmy.world/post/14276504

Odours have a complex topography, and it’s been mapped by AI

submitted 20 hours ago* (last edited 17 hours ago) by umami_wasbi@lemmy.ml to c/technology@lemmy.world

If a stamp has a barcode, why not just let people who have printers at home print it directly on the envelope? This would eliminate the need to buy physical stamps, and with it the possibility of buying counterfeit ones.


The software maker will use the Recommended section of the Start menu, which usually shows file recommendations, to suggest apps from the Microsoft Store.


Google provides cloud computing services to the Israeli Ministry of Defense, and the tech giant has negotiated deepening its partnership during Israel’s war in Gaza, a company document viewed by TIME shows.

The Israeli Ministry of Defense, according to the document, has its own “landing zone” into Google Cloud—a secure entry point to Google-provided computing infrastructure, which would allow the ministry to store and process data, and access AI services.

Project Nimbus is a controversial $1.2 billion cloud computing and AI agreement between the Israeli government and two tech companies: Google and Amazon. Reports in the Israeli press have previously indicated that Google and Amazon are contractually barred from preventing specific arms of the Israeli state from using their technology under Project Nimbus. But this is the first time the existence of a contract showing that the Israeli Ministry of Defense is a Google Cloud customer has been made public.

Google recently described its work for the Israeli government as largely for civilian purposes. “We have been very clear that the Nimbus contract is for workloads running on our commercial platform by Israeli government ministries such as finance, healthcare, transportation, and education,” a Google spokesperson told TIME for a story published on April 8. “Our work is not directed at highly sensitive or classified military workloads relevant to weapons or intelligence services.”


When Adobe Inc. released its Firefly image-generating software last year, the company said the artificial intelligence model was trained mainly on Adobe Stock, its database of hundreds of millions of licensed images. Firefly, Adobe said, was a “commercially safe” alternative to competitors like Midjourney, which learned by scraping pictures from across the internet.

But behind the scenes, Adobe was also relying in part on AI-generated content to train Firefly, including content from those same AI rivals. In numerous presentations and public posts about how Firefly is safer than the competition due to its training data, Adobe never made clear that its model actually used images from some of these same competitors.


cross-posted from: https://lemmy.world/post/14246943

I found this talk really helpful in understanding the broader context of open source's recent difficulties (see xz vulnerability, Redis license change, etc.)

I am one of the people who has immensely enjoyed using open source at a personal level (and have done a tiny bit of contributing). I've seen and read a lot about burnout in open source, and about the difficulty independent maintainers face in making a living off their work while companies make billions using it and only ever interact with the maintainer to demand more unpaid labor. But I've never seriously considered how we got to this point, or what it might take to move to a more sustainable world of thriving, fair open source.


Couch Potato Report predicts half of Canada will be without traditional TV by 2026


I was just watching a TikTok in which a Black girl explained how race is a social construct. This felt wrong to me, so I decided to fact-check her.

(she was right, BTW)

Now, I've been using Microsoft's Copilot, which is baked into Bing right now. It's fairly robust, and sure, it has its quirks, but by and large it cuts out the middleman of having to find facts on your own and gives a breakdown of whatever you're looking for, followed by a list of the sources it got its information from.

So I asked it a simple straightforward question:

"I need a breakdown on the theory behind human race classifications"

And it started to do so, quite well in fact. It listed the historical context behind the question and was just bringing up Johann Friedrich Blumenbach, a German physician, naturalist, physiologist, and anthropologist. He is considered a main founder of zoology and anthropology as comparative, scientific disciplines, and has been called the "founder of racial classifications."

But right in the middle of the breakdown on him, all the previous information disappeared, and it said: I'm sorry, I can't provide you with this information at this time.

I pointed out that it had been doing so, and quite well.

It said that no, it had not provided any information on the subject, and suggested we perhaps look at another topic.

Now, nothing I did could have fallen under some sort of racist context. I was looking for historical scientific information. But Bing, in its infinite wisdom, felt the subject was too touchy and will not even broach it.

When others, be they corporations or people, start to decide which information a person can and cannot access, that is a damn slippery slope we had better level out before AI rolls out en masse.

PS: Google had no trouble giving me the information when I requested it. I just had to look up his name on my own.


Should just use Linux, tbh.


Jesus, again already?


The encounter between a Rivian driver and uninformed Tesla owner highlights 'a need for better education and communication within the EV community,' the Rivian driver says.



53483 readers
2612 users here now

This is a most excellent place for technology news and articles.

Our Rules

  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below; to ask if your bot can be added, please contact us.
  9. Check for duplicates before posting; duplicates may be removed.

Approved Bots

founded 10 months ago