this post was submitted on 07 Jan 2024

Technology


one passage of note:

Where does all of this leave the Firefox browser? Surman argued that the organization is very judicious about rolling AI into the browser — but he also believes that AI will become part of everything Mozilla does. “We want to implement AI in a way that’s trustworthy and benefits people,” he said. Fakespot is one example of this, but the overall vision is larger. “I think that’s what you’ll see from us, over the course of the next year, is how do you use the browser as the thing that represents you and how do you build AI into the browser that’s basically on your side as you move through the internet?” He noted that an Edge-like chatbot in a sidebar could be one way of doing this, but he seems to be thinking more in terms of an assistant that helps you summarize articles and maybe notify you proactively. “I think you’ll see the browser evolve. In our case, that’s to be more protective of you and more helpful to you. I think it’s more that you use the predictive and synthesizing capabilities of those tools to make it easier and safer to move through the internet.”

[–] averyminya@beehaw.org 7 points 10 months ago (1 children)

I've been hopeful for an external hardware device, something akin to Mythic AI's analog hardware. It essentially offloads the heavy-duty work normally done by the GPU, runs it at far lower power consumption with about 98-99% accuracy, then sends the output data back to the computer to be digitized. Adding more tensor cores just drives up power consumption, which is already an issue.
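As a rough illustration of that 98-99% figure: analog in-memory compute does the matrix-vector multiplies in the analog domain, and the accuracy loss comes from device noise and conversion back to digital. A minimal sketch, assuming a made-up noise level (not an actual Mythic AI spec):

```python
import numpy as np

rng = np.random.default_rng(0)

def analog_mvm(weights, x, noise_std=0.01):
    """Toy model of an analog in-memory matrix-vector multiply.

    The 'analog' path adds Gaussian noise to the exact result to mimic
    DAC/ADC conversion and device variation; noise_std is an assumption
    chosen for illustration, not a measured figure.
    """
    exact = weights @ x
    noise = rng.normal(0.0, noise_std * np.abs(exact).mean(), exact.shape)
    return exact + noise

W = rng.normal(size=(256, 256))
x = rng.normal(size=256)

digital = W @ x          # exact digital reference
analog = analog_mvm(W, x)  # noisy "analog" result

rel_err = np.linalg.norm(analog - digital) / np.linalg.norm(digital)
print(f"relative error: {rel_err:.3%}")
```

Under these assumptions the relative error lands around a percent or two, i.e. the same ballpark as the 98-99% accuracy claimed for analog accelerators.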

That company in particular was using this method for real-time AI tracking in cameras, but I feel like it could easily be adapted to effectively eliminate the work NVIDIA is doing for GPUs. Why brute-force AI with power and tensor cores when a couple of wires and some voltage can sift through the same or larger models at the same or faster speeds with, well, okay, about 98-99% accuracy? It could be a simple hardware attachment via PCIe, or hell, even USB, with a small bottleneck for conversion times. I just used an app to upscale a photo locally on my phone; it took about 14 minutes (Xperia 1 IV). I could easily have offloaded that work to an analog AI device. We are nearly at the point where we can just run "AI*" on a phone at nearly PC speeds.

All this to say: local AI indeed. The only way AI works is when everyone has access to it. Give full, free access to everybody and the fear of corporate interference drops drastically. There are plenty of models available online not made by Google or Microsoft, pushing whatever or harvesting data back (remember to firewall your programs if you run them locally). Ideally, tagsets would be open-sourced, but in a capitalist world I could also see independent artists selling models of their work under a license.

* Of course, "AI" here as a broad-spectrum term encompassing model-based projects, LLMs for assistants, and generative imaging, not actual AI as a semi-autonomous intelligence.

[–] jarfil@beehaw.org 2 points 10 months ago* (last edited 10 months ago) (1 children)

Neuromorphic hardware seems to be best suited as an extension of RAM storage. It doesn't need to use the DAC/ADC approach of Mythic AI, some versions are compatible with a CMOS process, and could be either integrated directly into the processor, maybe as an extension of the cache or a dedicated neural processing module, or into RAM modules.
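For context on what the DAC/ADC round-trip costs in precision: each conversion clips and quantizes the signal, so an N-bit converter can introduce at most half a quantization step of error per pass. A toy sketch (the 8-bit width and the [-1, 1] range are assumptions for illustration):

```python
import numpy as np

def dac_adc_roundtrip(x, bits=8, lo=-1.0, hi=1.0):
    """Model one DAC/ADC round-trip: clip to range, snap to 2**bits levels."""
    levels = 2 ** bits - 1
    step = (hi - lo) / levels
    x = np.clip(x, lo, hi)
    return np.round((x - lo) / step) * step + lo

signal = np.linspace(-1.0, 1.0, 1001)
max_err = np.max(np.abs(dac_adc_roundtrip(signal) - signal))
half_step = (2.0 / 255) / 2
print(f"worst-case error: {max_err:.5f} (half a step is {half_step:.5f})")
```

That per-conversion error is one reason a CMOS-compatible neuromorphic process, which skips the conversions entirely, is attractive.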

It's pretty clear that current NN processing solutions, by repurposing existing hardware, are bound to get replaced by dedicated hardware with a fraction of the power requirements and orders of magnitude larger processing capabilities.

Once some popular use cases for large NNs have been successfully proven, we can expect future hardware to come with support for them, so it also makes sense to make plans for software that can use them. And yes, local AI... and possibly trainable locally.

[–] averyminya@beehaw.org 2 points 10 months ago

Oh yeah, Intel's version of that was looking promising too.