[-] dgriffith@aussie.zone 1 points 1 month ago* (last edited 1 month ago)

As another poster has mentioned, M-Discs are written using a Blu-ray writer and are good for a few hundred years, in theory.

[-] dgriffith@aussie.zone 2 points 1 month ago

A Blu-ray USB drive and M-Discs are about the best you can get at present. Keep the drive unplugged when not in use; it'll probably last 10-20 years in storage.

Seeing as there hasn't been much advancement past Blu-ray, keep an eye out for something useful to replace it in the future, or at least get another drive when you notice them becoming scarce.

[-] dgriffith@aussie.zone 13 points 1 month ago* (last edited 1 month ago)

90% of users when they're presented with the UAC popup after doing something:

"Yes yes whateverrr"

[-] dgriffith@aussie.zone 33 points 1 month ago

"Never understood why smartphones are so super bright by default."

Because they have to compete with 50k lux outside and then scale to 600 lux indoors, then down to just a few lux in a darkened room.

Perhaps the brightness slider needs to be more logarithmic so you can slide from 0.001 percent to 100 percent more easily.
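
As a rough sketch of what I mean (nothing any phone vendor actually ships, and the 0.001-100 percent range is just the example above), a logarithmic slider mapping would look something like this:

```python
import math

def slider_to_brightness(position, min_pct=0.001, max_pct=100.0):
    """Map a linear slider position (0.0 to 1.0) onto a logarithmic
    brightness scale between min_pct and max_pct, in percent."""
    log_min = math.log10(min_pct)
    log_max = math.log10(max_pct)
    return 10 ** (log_min + position * (log_max - log_min))

# Halfway along the slider lands around 0.3% rather than 50%, so most
# of the travel is spent on the dim end where fine control is needed.
print(slider_to_brightness(0.5))
```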

[-] dgriffith@aussie.zone 1 points 1 month ago

I've got photos in Flickr dating from 1999 onwards. Ten thousand or so of them, and a couple of the early ones are now corrupted.

But they're my "other backup" for Google Photos, so I don't mind too much. I also have a USB Blu-ray drive at home that I use to periodically burn M-Discs that I hand out to a few relatives.

That's about as good as I can conveniently do for backup, and it's probably better than the single-point-of-failure box of negatives that my parents have in their cupboard.

[-] dgriffith@aussie.zone 8 points 1 month ago

"...when they're powered down."

There's no periodic cell refresh in flash memory like there is in DRAM. When USB sticks are plugged in, all you are doing is powering up the flash chip and interface ICs.

You'd have to read a block then write it back to actually refresh the stored charges in the cells.
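
A minimal sketch of that idea, assuming a Linux box and a stick that shows up as a raw block device (the /dev/sdX path is a placeholder, and the stick's filesystem should be unmounted first). Whether this genuinely tops up the charge depends on the stick's controller, since wear levelling may put the rewritten data in different physical cells, which still gets you freshly written copies:

```python
BLOCK_SIZE = 1024 * 1024   # rewrite in 1 MiB chunks
DEVICE = "/dev/sdX"        # placeholder path for the USB stick

# Read every block and write the same bytes back in place, so the
# flash controller stores a fresh copy of the data.
with open(DEVICE, "r+b", buffering=0) as dev:
    while True:
        offset = dev.tell()
        data = dev.read(BLOCK_SIZE)
        if not data:
            break
        dev.seek(offset)   # go back to the start of the block just read
        dev.write(data)    # rewrite the same data over itself
```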

[-] dgriffith@aussie.zone 1 points 2 months ago

I don't think there's anything commercially available that can do it.

However, as an experiment, you could:

  • Get a group of photos from a burst shot
  • Encode them as individual frames with a modern video codec, using e.g. VLC.
  • See what kind of file size you get with the resulting video output.
  • See what artifacts are introduced when you play with encoder settings.

You could probably script this kind of operation eventually, if you have software that can automatically identify and group the images.
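
As a rough sketch of the scripted version, this shells out to ffmpeg rather than driving VLC (the folder layout, glob pattern, and CRF value are just assumptions to experiment with):

```python
import subprocess

def encode_burst(burst_glob: str, output: str, crf: int = 28) -> None:
    """Encode a burst of JPEGs (matched by a glob pattern) into a single
    HEVC clip at one image per frame. Needs an ffmpeg build with glob
    pattern support, which is typical on Linux and macOS."""
    subprocess.run(
        ["ffmpeg", "-y",
         "-framerate", "1",                    # one source image per frame
         "-pattern_type", "glob", "-i", burst_glob,
         "-c:v", "libx265", "-crf", str(crf),  # HEVC, quality set by CRF
         output],
        check=True,
    )

# Compare the clip size against the sum of the original JPEGs, then
# re-run with different CRF values and look for introduced artifacts.
encode_burst("burst_0001/*.jpg", "burst_0001.mkv")
```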

[-] dgriffith@aussie.zone 92 points 2 months ago

Dammit now I have to reduce the block size of my discord-based cold storage filesystem.

[-] dgriffith@aussie.zone 2 points 2 months ago

They need to learn how to use their tools better. WinSCP does all that transparently for you if you press F4 on a file on a remote system. Or maybe they did and you just didn't see it...

It's quite a handy function when you're diving through endless layers of directories on a remote box looking for one config file amongst many.

[-] dgriffith@aussie.zone 8 points 2 months ago* (last edited 2 months ago)

Most of the time, what I get when asking it coding questions is a half-baked response with a logic error or five in it.

Once I query it about one of those errors it replies with, "You're right, X should be Y because of (technical reason Z). Here's the updated code that fixes it".

It will then give me some code that does actually work, but does dumb things, like recalculating complex but static values inside a loop. When I ask if there are any performance improvements it can make, suddenly it's full of helpful ways to improve the code that can make it run 10 to 100 times faster and fix those issues. Apparently if I want performant code, I have to explicitly ask for it.

For some things it will offer solutions that don't solve the issue I raise, no matter how many different ways I phrase the issue and try to coax it towards a solution. At that point it basically can't, and it gets bogged down in minor alterations that don't really achieve anything.

Sometimes when it hits that point I can say "start again, and use (this methodology)" and it will suddenly hit upon a solution that's workable.

So basically, right now it's good for regurgitating some statistically plausible information that can be further refined with a couple of good questions from your side.

Of course, for that to work you have to know the domain you're working in fairly well already, otherwise you're shit out of luck.

[-] dgriffith@aussie.zone 1 points 2 months ago* (last edited 2 months ago)

If library devs do versioning correctly, and you pin to major versions like "1.*" instead of just the "anything goes" of "*", this should not happen.

Your unit tests should catch regressions, if you have enough unit tests. And of course you do, because we're all operating in the dream world of, "I am great and everyone else is shit".
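
As a made-up illustration in Python requirements syntax (the package name is hypothetical), the difference between the two pinning styles is:

```
# anything goes: the next major release, breaking changes and all,
# lands in your build the moment it's published
somelib

# pinned to the 1.x series: minor and patch updates still flow,
# but a semver-respecting 2.0 won't be pulled in automatically
somelib==1.*
```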

[-] dgriffith@aussie.zone 1 points 2 months ago* (last edited 2 months ago)

The problem with Stack Overflow is that you need to know enough about the domain you're working in to describe it accurately enough to search for and find that previous great answer.

If you have no clue, and you naively ask the no-clue kinds of questions because you have no clue, then you get beaten over the head for not searching for the existing answer that you don't know how to search for.
