this post was submitted on 07 Jan 2024
103 points (100.0% liked)
Technology
you are viewing a single comment's thread
view the rest of the comments
Neuromorphic hardware seems best suited as an extension of RAM storage. It doesn't need to use the DAC/ADC approach of Mythic AI; some versions are compatible with a CMOS process and could be integrated either directly into the processor (perhaps as an extension of the cache, or as a dedicated neural processing module) or into RAM modules.
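For anyone unfamiliar with the DAC/ADC approach mentioned above: Mythic's parts do the multiply-accumulates in analog, so digital activations get converted to analog on the way in (DAC) and back to digital on the way out (ADC), which is where a lot of the precision/power trade-off lives. A rough toy sketch of that round trip in Python (the bit widths and noise figure are made-up illustrative values, not real device specs):

```python
import numpy as np

def quantize(x, bits):
    """Uniform quantization to the given bit width over the signal's own range."""
    x_max = np.abs(x).max() + 1e-12
    step = 2 * x_max / (2 ** bits - 1)
    return np.round(x / step) * step

def analog_mvm(weights, activations, dac_bits=8, adc_bits=8, noise_std=0.01):
    """Matrix-vector product the way an analog compute-in-memory array does it:
    digital activations -> DAC -> analog multiply-accumulate (noisy) -> ADC."""
    x_analog = quantize(activations, dac_bits)                     # DAC
    y_analog = weights @ x_analog                                  # analog MACs
    y_analog += np.random.normal(0.0, noise_std, y_analog.shape)  # analog noise
    return quantize(y_analog, adc_bits)                            # ADC

def digital_mvm(weights, activations):
    """Reference: the same product done fully digitally, no conversions."""
    return weights @ activations

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 16))
x = rng.normal(size=16)
print("analog :", analog_mvm(W, x))
print("digital:", digital_mvm(W, x))
```

A CMOS-compatible neuromorphic design that stays digital (or keeps the data resident in memory) avoids that conversion overhead entirely.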
It's pretty clear that current NN processing solutions, which repurpose existing hardware, are bound to be replaced by dedicated hardware with a fraction of the power requirements and orders of magnitude greater processing capability.
Once some popular use cases for large NNs have been proven, we can expect future hardware to ship with support for them, so it also makes sense to plan for software that can take advantage of it. And yes, local AI... possibly even trainable locally.
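On the software side, "making plans" mostly means writing device-agnostic code now, so that a future NPU or neuromorphic backend is just another device string. A minimal PyTorch-flavoured sketch of that pattern (the backends checked here are simply the ones that exist today):

```python
import torch

def pick_device():
    """Pick the best available accelerator, falling back to CPU."""
    if torch.cuda.is_available():
        return torch.device("cuda")
    if getattr(torch.backends, "mps", None) and torch.backends.mps.is_available():
        return torch.device("mps")
    return torch.device("cpu")

device = pick_device()
model = torch.nn.Linear(128, 10).to(device)   # toy model, runs on whatever was found
x = torch.randn(32, 128, device=device)
print(device, model(x).shape)
```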
Oh yeah, Intel's version of that was looking promising too.