It's time to mentally prepare yourselves for this
(lemmy.world)
I'm just saying, but we did.
Pretty much every electronic thing you own that resembles a computer (phones, tablets, laptops, desktops, even your damned TV) uses UTC. Every. Single. One. Translates that time to "local" whenever it needs to.
So when your TV goes from 9:32 to 9:33, it's just showing the converted time from UTC each time.
Almost every device on the planet is keeping time in UTC.
Just because you don't see UTC on your device doesn't mean that's not what's happening. I had an issue where I needed to get into my computer's BIOS for something, and as soon as the BIOS loaded and showed the time, it was "wrong" because it was in UTC. I'm sure plenty of newer BIOS dialogs are configured to account for time zones now, so I might be unique in this, but it's still there.
Almost all computers count time as seconds from the epoch (midnight 1/1/1970 UTC). That then gets converted into a readable time, which may go through UTC first, but that's not how it's stored.
You're referring to UNIX time. And you're correct.
It's a count of how many seconds from midnight, January first, 1970, UTC.
Local computers update that time, still in UTC, from time servers (usually over NTP), then translate that UNIX time reading into a human-readable format in the local time zone.
All computers are still keeping track of time from the epoch in UTC.
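Something like this is all the "translate to local" step amounts to; a minimal sketch in POSIX C, where the formatting and variable names are just for illustration:

```c
/* Minimal sketch: one epoch count, rendered both as UTC and as the
 * configured local time zone. */
#include <stdio.h>
#include <time.h>

int main(void)
{
    time_t now = time(NULL);      /* seconds since 1970-01-01 00:00:00 UTC */
    struct tm utc, local;
    char buf[64];

    gmtime_r(&now, &utc);         /* break the count down as UTC */
    localtime_r(&now, &local);    /* ...and as local time (honors TZ/DST) */

    printf("epoch seconds: %lld\n", (long long)now);

    strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S", &utc);
    printf("UTC:           %s\n", buf);

    strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S %Z", &local);
    printf("local:         %s\n", buf);
    return 0;
}
```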
Unix time is far less universal in computing than you might hope. A few exceptions I'm aware of:
Converting between time formats is a common source of bugs and each one will overflow in different ways. A time value might overflow in the year 2036, 2038, 2070, 2100, 2156, or 9999.
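The 2038 one, for example, is what you get when the epoch count is stored as a signed 32-bit integer. A quick sketch of where that runs out (assuming a platform with a 64-bit time_t underneath, e.g. glibc on Linux, so the conversion itself still works):

```c
/* Sketch of the year-2038 limit: a signed 32-bit seconds-since-epoch
 * counter tops out at 2^31 - 1, which is 2038-01-19 03:14:07 UTC.
 * A counter that wraps would then read a date back in 1901. */
#include <stdio.h>
#include <stdint.h>
#include <time.h>

static void show(int32_t secs)
{
    time_t t = (time_t)secs;      /* widen back to the platform's time_t */
    struct tm utc;
    char buf[64];
    gmtime_r(&t, &utc);
    strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S UTC", &utc);
    printf("%11ld -> %s\n", (long)secs, buf);
}

int main(void)
{
    show(INT32_MAX);   /* last representable second: 2038-01-19 03:14:07 UTC */
    show(INT32_MIN);   /* what a wrapped counter reads next: back in 1901 */
    return 0;
}
```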
Also, Unix time is often managed with a separate nanoseconds component for increased resolution. Like C's struct timespec, the timestamps in modern *nix filesystems (ext4/xfs/btrfs/zfs), etc.
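For the curious, that nanoseconds field looks like this in practice; a minimal sketch using the standard clock_gettime call:

```c
/* Minimal POSIX sketch: read the realtime clock with nanosecond resolution.
 * tv_sec is the familiar seconds-since-epoch count; tv_nsec carries the
 * sub-second part. */
#include <stdio.h>
#include <time.h>

int main(void)
{
    struct timespec ts;
    if (clock_gettime(CLOCK_REALTIME, &ts) != 0) {
        perror("clock_gettime");
        return 1;
    }
    printf("seconds since epoch: %lld\n", (long long)ts.tv_sec);
    printf("nanoseconds part:    %09ld\n", ts.tv_nsec);
    return 0;
}
```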
Because you don't use Windows. Windows by default stores local time, not UTC, to the RTC. This behavior can be overridden with a registry tweak. Some Linux distro installer disks (at least Ubuntu and Fedora, maybe others) will try to detect whether your system has an existing Windows install and mimic this behavior if one exists (equivalent to timedatectl set-local-rtc 1), and otherwise default to storing UTC, which is the saner choice.
Storing local time on a computer that has more than one bootable OS becomes a particularly noticeable problem in regions that observe DST, because each OS will try to change the RTC by one hour on its first boot after the time change.
That's a nice theory; it would be a shame if I was only running Windows 10 on my desktop.
Spoiler: I am. No Linux or any other OS or bootloader in sight.
That's strange. As far as I can tell from web searches, every version of Windows still defaults to storing local time to the hardware clock, there are no reports of that changing with an update, and there is no exposed setting to configure this behavior outside of regedit. If you're curious enough, you can check the current setting in the registry at HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\TimeZoneInformation. Windows maintains the hardware clock as UTC if and only if the RealTimeIsUniversal value is present and nonzero.
I expect it's more likely that some other issue made the BIOS display an hour inconsistent with your local time zone: maybe a bug in the BIOS, a time zone offset setting within the BIOS, or a dead clock battery.
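If you'd rather check it programmatically than poke around in regedit, here's a rough, untested sketch in C using the documented Win32 registry API (link against advapi32):

```c
/* Sketch: read the RealTimeIsUniversal value and report whether Windows
 * is configured to keep the hardware clock in UTC. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DWORD value = 0;
    DWORD size = sizeof value;
    LSTATUS rc = RegGetValueA(
        HKEY_LOCAL_MACHINE,
        "SYSTEM\\CurrentControlSet\\Control\\TimeZoneInformation",
        "RealTimeIsUniversal",
        RRF_RT_REG_DWORD,          /* only accept a DWORD value */
        NULL,
        &value,
        &size);

    if (rc == ERROR_SUCCESS && value != 0)
        printf("RTC is being kept in UTC (RealTimeIsUniversal = %lu)\n",
               (unsigned long)value);
    else
        printf("RTC is being kept in local time (value missing, zero, or unreadable)\n");
    return 0;
}
```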
I'm not a typical example. I can check that reg setting later. My PC is a Dell Precision 7910 rack that I picked up second hand, running Windows 10 Pro that I installed myself.
It's joined to my homelab Active Directory domain, which has a GPO for setting NTP. I don't believe I've set any additional settings for time via policy.
The system is also set up for virtualization. I'm pretty sure Hyper-V is installed and I have VMware Workstation installed as well (I mainly use Workstation for VMs).
The main disappointment I have with this system is split between the limited GPU space and the BIOS, neither of which I can do much about. The GPU issue is that the rack orientation of the system doesn't allow much room for a GPU to breathe, so even a "good" GPU can't really get airflow unless it's a blower style; I don't have the money to be picky about my GPU and I was donated an RTX 2080 Ti Founders Edition, which is definitely not a blower-style cooler. Without hardware-hacking the system, the card thermal throttles very quickly and doesn't get very good performance numbers; IIRC it was measuring around the same performance as a GTX 1060 or so. I moved the GPU out of the case temporarily using a PCIe riser, which solved the immediate concern, and I'll be doing some minor modifications to the chassis to make it a more permanent option.
The BIOS issues are mainly that the tuning options either don't exist or are extremely limited. The BIOS will tell you about the CPU/RAM speeds and features, but won't necessarily give you options to change anything. I want to adjust the NUMA configuration on the unit to better match the hardware so my OS makes better threading decisions, but such options are unavailable through the normal means, and I haven't dug into the Dell command line tools for the BMC/IPMI, which may be able to adjust the settings. For anyone familiar with NUMA, what I'm seeing is that the first 80% or so of my CPUs show up in both NUMA nodes, and only the last eight are split: the next 4 cores are in node 0 and the last 4 are in node 1. I have 2x 14-core Xeon CPUs with HT, so having 20+ pCores in both NUMA nodes is creating some interesting stuttering issues. They're not super frequent, but they happen when the system is busy.
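If anyone wants to sanity-check a layout like that from inside Windows, this is roughly what I'd run; an untested sketch against the documented Win32 NUMA calls, nothing Dell-specific:

```c
/* Sketch: enumerate NUMA nodes and print the processor mask assigned to
 * each one, to confirm a lopsided node layout like the one described above. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    ULONG highest = 0;
    if (!GetNumaHighestNodeNumber(&highest)) {
        fprintf(stderr, "GetNumaHighestNodeNumber failed: %lu\n", GetLastError());
        return 1;
    }

    for (USHORT node = 0; node <= (USHORT)highest; node++) {
        GROUP_AFFINITY affinity;
        if (GetNumaNodeProcessorMaskEx(node, &affinity)) {
            printf("NUMA node %u: processor group %u, mask 0x%llx\n",
                   (unsigned)node, (unsigned)affinity.Group,
                   (unsigned long long)affinity.Mask);
        }
    }
    return 0;
}
```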
To my recollection, I have not run any of the Windows 10 cleanup scripts available around the internet, mainly because I'm a tech and I don't like not knowing what's happening or changing on my own system, though I did make a string of changes when I first installed Windows 10 related to optimizing for SSDs and other performance improvements. All performance based, nothing to do with the time.
Beyond that, it's a pretty typical Windows 10 professional install running on workstation hardware.
But would the moon work on a 24 hour system at all?
I can’t believe I just typed that as a serious comment
Didn’t Bajor have a 26 hour day? I’m now voting for Universal Bajoran Time
If you're setting moon time to the day/night cycle of the moon, yeah, it would actually have a much longer day, from what I understand.
I might be wrong, but to my best understanding, the moon is tidally locked to the earth, meaning the same side of the moon is always facing the surface of our planet. Which means the rotation of the moon, and the length of a day on the moon would be tied to how fast it orbits the earth.
You can tell the duration of an orbit by simply following the moon cycle ("new" moon (midnight) through "full" moon (noon) and back to a "new" moon). Based on this, unless I've made a serious error in my logic, a moon day would be something like 20-30 earth days.
If I Google it, the moon orbits earth approximately every 27.3 days. Which is about 655h 12m (27.3 days × 24 = 655.2 hours)... Give or take a few hours.
Our entire concept of time, days, months, and years, breaks on the moon. On earth, an hour is 1/24th of a rotation of the planet. A day is one full day/night cycle, a year is one orbit around the sun.
When you transpose this principle to the moon, an hour is 1/24th of a rotation of the moon, which happens to be 1/24th of a year, since one rotation of the moon is one orbit around the earth. So one day = one year on the moon.
So how do we measure time on the moon in a way that isn't completely insane? The only logical thing I can think of is to fundamentally lock the time zone of the moon to the earth. That the date, and maybe even the time, isn't based on the moon, but rather transposed from some definition of the same on earth.
This also leads me into a rant/discussion about time in SciFi. Once you leave the orbit and reference point of Earth, what is a day? An hour? A year? You have no point of reference on which to base such notions of time. Why is there a "night shift" in programs like Star Trek? Why is there really only one captain? Why does everything on these shows seem to occur during their idea of "daytime"?
Then they're negotiating with some alien race and say they'll reconvene tomorrow about something... Tomorrow, based on what? You're in space. It makes sense if they're in orbit of a planet, but then you get to see standoffs in the middle of fucking nothing, and they're like "you have 24 hours to decide". Okay. 24 hours based on what exactly?
I appreciated MiB's take on this in the film. They defined not only how much time they had to return the galaxy, but also what units that time was being counted in, which they could then calculate and convert to earth time.
This all sets aside relativity, since when you're moving near, at, or beyond the speed of light, you experience time differently (see: Interstellar), and gravity can affect this too, among other factors. But somehow, they just side-step the whole time thing and focus on the drama of it all. Viewers are too enthralled with the spectacle, not realizing that these Romulans, or Bolians, or Klingons, or Cardassians, or whatever, probably have a completely different idea of how much time their version of "one hour" or "one day" is.
It's fascinating and frustrating.
I love it and hate it all at the same time.
Time sucks. It's never correct, often ignored, and bluntly, a strange concept that isn't, IMO, well defined. We have the idea pretty well set up here on earth, based entirely on things happening on and to this planet, but if you take that reference point away, everything collapses.
Honestly, quite irrelevant. It's hidden away. It's not shown to us. It could use literally any frame of reference, like farts since the beginning of time; as long as it's converted for you, it doesn't matter.
I'm still technically correct. And we all know that's the best kind of correct.