[-] GamingChairModel@lemmy.world 2 points 3 days ago

Ok so most monitors sold today support DDC/CI controls for at least brightness, and some support controlling color profiles over the DDC/CI interface.

If you get some kind of external ambient light sensor and plug it into a USB port, you might be able to configure a script that controls the brightness of the monitor based on ambient light, without buying a new monitor.
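Something like this rough sketch could work, assuming a Linux machine with ddcutil installed and an ambient light sensor exposed through the kernel's iio subsystem (the sysfs path and the lux-to-brightness mapping below are guesses you'd have to adapt):

```python
# Rough sketch: map an ambient light reading to monitor brightness over DDC/CI.
# Assumes ddcutil is installed and the sensor path below exists on your system;
# both the path and the lux-to-brightness curve are placeholders to adapt.
import subprocess
import time

SENSOR = "/sys/bus/iio/devices/iio:device0/in_illuminance_raw"  # hypothetical path

def read_lux() -> int:
    with open(SENSOR) as f:
        return int(f.read().strip())

def set_brightness(percent: int) -> None:
    # VCP feature 0x10 is the standard DDC/CI brightness control.
    subprocess.run(["ddcutil", "setvcp", "10", str(percent)], check=True)

while True:
    lux = read_lux()
    # Crude mapping, clamped to 5-100%.
    brightness = max(5, min(100, lux // 10))
    set_brightness(brightness)
    time.sleep(30)
```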

[-] GamingChairModel@lemmy.world 1 points 3 days ago

> They chiplet past 500

I don't know if I'm using the right vocabulary; maybe "die size" is the wrong way to describe it. But the Ultra line packages two Max SoCs with a high-performance interconnect, so the whole package does use about 1000 mm^2 of silicon.

My broader point is that much of Apple's performance comes from their willingness to actually use a lot of silicon area to achieve that performance, and it's very expensive to do so.

[-] GamingChairModel@lemmy.world 2 points 3 days ago

Apple does two things that are very expensive:

  1. They use a huge physical area of silicon for their high-performance chips. The "Pro" line of M chips has a die size of around 280 square mm, the "Max" line is about 500 square mm, and the "Ultra" line is possibly more than 1000 square mm. This is incredibly expensive to manufacture and package.
  2. They pay top dollar to get exclusive rights to TSMC's new nodes. They lock up the first year or so of TSMC's manufacturing capacity at any given node; only after that does capacity open up for designs from other TSMC clients (AMD, NVIDIA, Qualcomm, etc.). That means you can go out and buy an Apple device made on TSMC's latest node before AMD or Qualcomm have even announced the product lines that will use it.

Those are business decisions that others simply can't afford to follow.

[-] GamingChairModel@lemmy.world 1 points 3 days ago

> The biggest problem they are having is platform maturity

Maybe that's an explanation for desktop/laptop performance, but I look at the mobile SoC space where Apple holds a commanding lead over ARM chips from Qualcomm, and where Qualcomm has better performance and efficiency than Samsung's Exynos line, and I'm thinking a huge chunk of the difference between manufacturers can't simply be explained by ISA or platform maturity. Apple has clearly been prioritizing battery life and efficiency for 10+ generations of Apple Silicon in the mobile market, and has a lead independent of its ISA, even as it trickled over to the laptop and desktop market.

[-] GamingChairModel@lemmy.world 6 points 4 days ago

Well, specifically, they're promising battery life that beats Qualcomm's implementation of an ARM laptop SoC.

Qualcomm is significantly behind Apple. I'm not convinced that the ISA matters all that much for battery life. AMD's x86_64 performance per watt blew Intel's out of the water in recent generations, and Qualcomm/Samsung's ARM chips can't compete with Apple's ARM chips in the mobile, tablet, or laptop space.

[-] GamingChairModel@lemmy.world 3 points 4 days ago

To be honest, no. I mainly know about JPEG XL because I'm acutely aware of the limitations of standard JPEG for both photography and high-resolution scanned documents, where noise and real-world messiness cause all sorts of problems. Something like QOI seems ideal for synthetic images, but I don't work with those much, so I don't know the limitations of PNG as well.

[-] GamingChairModel@lemmy.world 3 points 4 days ago

> You say that it is sorted in the order of most significants, so for a date it is more significant if it happend 1024, 2024 or 9024?

Most significant to least significant digit has a strict mathematical definition, which you don't seem to be following, and it applies to all numbers, not just numerical representations of dates.

And most importantly, the YYYY-MM-DD format is extensible into hh:mm:ss too, within the same schema, out to the level of precision appropriate for the context. I can identify a specific year when the month doesn't matter, a specific month when the day doesn't matter, a specific day when the hour doesn't matter, and on down to minutes, seconds, and decimal fractions of a second to whatever precision I'd like.
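To make the sorting and precision points concrete, here's a minimal Python sketch (the sample dates are made up): a plain string sort of YYYY-MM-DD / ISO 8601-style values gives chronological order, and truncating from the right just drops precision.

```python
# ISO 8601 / YYYY-MM-DD strings sort lexicographically in chronological order.
timestamps = [
    "2024-05-15T09:30:00",
    "2023-11-02",
    "2024-05-05",
    "2024",     # year only: month/day don't matter
    "2024-05",  # month only
]

# Plain string sort == chronological sort, no date parsing needed.
print(sorted(timestamps))

# Truncating keeps the most significant parts; the timestamp below is just an
# illustrative value.
ts = "2024-05-15T09:30:27.123456"
print(ts[:4], ts[:7], ts[:10], ts[:16])  # year, month, day, minute precision
```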

[-] GamingChairModel@lemmy.world 1 points 4 days ago

Sometimes the identity of the messenger is important.

Twitter was super easy to set up with the API to periodically tweet the output of some automated script: a weather forecast, a public safety alert, an air quality alert, a traffic advisory, a sports score, a news headline, etc.
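As a rough sketch of that kind of bot, using the third-party tweepy library against the old Twitter API (the credentials and forecast text are placeholders): run something like this from a cron job and the account becomes a subscribable, identity-backed feed.

```python
# Rough sketch of a scripted account posting a daily forecast via tweepy.
# Credentials and the forecast string are placeholders, not real values.
import tweepy

client = tweepy.Client(
    consumer_key="PLACEHOLDER",
    consumer_secret="PLACEHOLDER",
    access_token="PLACEHOLDER",
    access_token_secret="PLACEHOLDER",
)

forecast = "Tonight: clear, low 12°C. Tomorrow: sunny, high 24°C."  # from any weather source
client.create_tweet(text=f"Daily forecast: {forecast}")
```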

These are the types of messages that you'd want to subscribe to the actual identity, and maybe even be able to forward to others (aka retweeting) without compromising the identity verification inherent in the system.

Twitter was an important service, and that's why there are so many contenders trying to replace at least part of the experience.

[-] GamingChairModel@lemmy.world 22 points 4 days ago

This isn't exactly what you asked, but our URI/URL schema is basically a bunch of missed opportunities, and I wish it was better designed.

Ok so it starts off with the scheme name, which makes sense. http: or ftp: or even tel:

But then it goes into the domain name system, which suffers from the problem that the root, then the top-level domain, then the domain, then progressively smaller subdomains go right to left. www.example.com requires the system to look up the root domain to see who manages the .com TLD, then who owns example.com, then the www subdomain. Then, if a port number needs to be specified, it goes after the domain name, right next to the implied root domain. The rest of the URL, by default, goes left to right in decreasing order of significance. It's just a weird mismatch, and would make a ton more sense if it were all left to right, including the domain name.
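Here's a small illustration of that mismatch using Python's standard urllib.parse (the URL is made up):

```python
from urllib.parse import urlsplit

url = "https://www.example.com:8443/docs/2024/report.html?page=2#summary"
parts = urlsplit(url)

print(parts.scheme)    # 'https'
print(parts.hostname)  # 'www.example.com'  (hierarchy reads right to left: com -> example -> www)
print(parts.port)      # 8443
print(parts.path)      # '/docs/2024/report.html'  (hierarchy reads left to right)

# The hostname is the one piece whose hierarchy runs against the rest of the URL;
# reversing its labels would make the whole thing read broad-to-narrow, left to right.
print(".".join(reversed(parts.hostname.split("."))))  # 'com.example.www'
```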

Then don't get me started on how the www subdomain itself no longer makes sense. I get that the system was designed long before HTTP and the WWW took over the internet as basically the default, but if we had known that in advance, it would've made sense not to push www in front of every website domain throughout the '90s and early 2000s.

[-] GamingChairModel@lemmy.world 5 points 4 days ago

Your day-to-day use isn't everyone else's. We use dates and times for a lot more than "I wonder what day it is today." When it comes to recording events, or planning future events, pretty much everyone needs to include the year. And YYYY-MM-DD presents the digits exactly in order of significance, so a single-digit error matters in proportion to where it sits.

And no matter what, the first digit of a two-digit day or two-digit month is still more significant in a mathematical sense, even if you think you're more likely to need the day or the month. The 15th of May is only one digit off from the 5th of May, but that first digit in a DD/MM format is mathematically more significant and less likely to change from one day to the next.

[-] GamingChairModel@lemmy.world 3 points 4 days ago

Functionally speaking, I don't see this as a significant issue.

JPEG quality settings can run a pretty wide gamut, and the quality used obviously isn't apparent without viewing the file or analyzing the metadata. But if we're looking at metadata, JPEG XL reports that stuff, too.

Of course, the metadata might only report the most recent conversion, but that's still a problem with all image formats, where conversion between GIF/PNG/JPG, or even edits to JPGs, would likely create lots of artifacts even if the last step happens to be lossless.

You're right that we should ensure the metadata accurately describes whether an image has ever been encoded in a lossy manner, though. It's especially important for things like medical scans, where every pixel matters and needs to be trusted as coming from the sensor rather than being an artifact of the encoding process, to rule out some types of error. That's why I'm hopeful that a full JXL-based workflow for those images will preserve the details when necessary and give fewer opportunities for that type of silent/unknown loss of data to occur.

[-] GamingChairModel@lemmy.world 57 points 5 days ago
  • Existing JPEG files (which are the vast, vast majority of images currently on the web and in people's own libraries/catalogs) can be losslessly compressed even further with zero loss of quality. This alone means there are benefits to adoption, if nothing else for archival and for serving old content.
  • JPEG XL encoding and decoding is much, much faster than pretty much any other format.
  • The format works for both lossy and lossless compression, depending on the use case and need. Photographs can be encoded in a lossy way much more efficiently than JPEG and things like screenshots can be losslessly encoded more efficiently than PNG.
  • The format anticipates being useful for both screens and prints. WebP, HEIF, and AVIF are all optimized for screen resolutions and fall short at the truly high resolutions appropriate for print. The JPEG XL format isn't ready to replace camera RAW files, but there's room in the spec to accommodate that use case, too.

It's great and should be adopted everywhere, to replace every raster format from JPEG photographs to animated GIFs (or the more modern live photos format with full color depth in moving pictures) to PNGs to scanned TIFFs with zero compression/loss.
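As a rough sketch of the "recompress existing JPEGs losslessly" point from the first bullet above, assuming the cjxl/djxl command-line tools from libjxl are installed (file names are placeholders):

```python
# Recompress an existing JPEG into JPEG XL and compare sizes.
import os
import subprocess

src = "photo.jpg"   # an existing JPEG
dst = "photo.jxl"

# By default cjxl stores a JPEG input losslessly, keeping bit-exact reconstruction possible.
subprocess.run(["cjxl", src, dst], check=True)

print(f"{src}: {os.path.getsize(src)} bytes")
print(f"{dst}: {os.path.getsize(dst)} bytes")  # typically noticeably smaller

# djxl can turn the .jxl back into the original JPEG when it was stored this way.
subprocess.run(["djxl", dst, "roundtrip.jpg"], check=True)
```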
