When that eucharist looks moldy
:dean-smile: I like the piracy comm and most of the db0 users I see coming here; they don't deserve being thrown into the .world pit.
yay xfce sensors
3 aur/xfce4-sensors-plugin-nvidia-hddtemp_through_netcat-current 1.3.95-1 (+2 0.00) (Orphaned)
Sensors plugin for the Xfce panel with nvidia and hddtemp (through netcat) support
2 aur/xfce4-sensors-plugin-nvidia 1.4.4-2 (+26 0.00)
A lm_sensors plugin for the Xfce panel with nvidia gpu support
1 extra/xfce4-sensors-plugin 1.4.4-1 (198.8 KiB 808.5 KiB) [xfce4-goodies]
Sensors plugin for the Xfce panel
==> Packages to install (eg: 1 2 3, 1-3 or ^4)
==> 1
Sync Explicit (1): xfce4-sensors-plugin-1.4.4-1
[sudo] password for unperson:
resolving dependencies...
looking for conflicting packages...
Packages (8) exo-4.18.0-1 garcon-4.18.2-1 libwnck3-43.0-3 libxfce4ui-4.18.6-1 libxfce4util-4.18.2-1 xfce4-panel-4.18.6-1
xfconf-4.18.3-1 xfce4-sensors-plugin-1.4.4-1
Total Download Size: 2.66 MiB
Total Installed Size: 15.76 MiB
:: Proceed with installation? [Y/n]
Once upon a time if you pressed F1 on the desktop a full manual showed up that started with "how to use a mouse" and ended with registry hacks. It was contextual so if you were in the calculator it showed information about the calculator.
Today if you press F1 it opens Microsoft™ Edge™ with a Microsoft™ Bing™ search for "how to get help with windows". The results include some shitty youtube videos that somebody uploaded with that exact phrase.
How are you decoding the H.264 MVC video?
I doubt they worried about being condescending; lots of people fear that the official documentation will be too difficult and never read it. The logic is that the docs are arcana written by witches who know how to write programming languages, and the tutorials are written by regular girls who had to struggle to understand the language instead of the syntax just appearing in their heads.
I pretty much learned how to program from the official Python tutorial. I had been struggling for years before that; I had some notions but I couldn't put together anything really useful. The Python docs got me over the hump precisely because of what OP said: it starts from 0 and builds up until you have enough tools to write whatever project you have in mind. I imagine that having had to design and reason through every part of the language actually gives the writer a great sense of how it fits together and what the logical increments are.
Since then I always go first to whatever the language designers wrote (K&R's The C Programming Language, the Rust book, the PostgreSQL manual, etc.), and only once I feel I know enough do I complement it with other sources.
This approach extends to libraries as well: first I read whatever official docs there are, then I search the source code for the functionality I need to learn about, and only if that fails do I look elsewhere.
It seems like a slow method but it's so reliable that it works out for me. After a while of doing this you become the reference and people come ask you questions.
Python in particular is very well documented. There are two levels, the official tutorial that glosses over stuff and presents things conceptually, and the reference that tells you exactly what is happening and what the syntax does.
That whole chapter about the data model is really useful when you try to do anything fancy with Python. It's all in one page so you can Ctrl-F all the arcane definitions.
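To make that concrete, here's a toy sketch of my own (not from the docs) of the kind of thing that chapter covers: implement the right dunder methods and len(), indexing, iteration and printing all just work.

```python
# Toy example of the data model hooks (mine, not from the Python docs).
class Deck:
    """A tiny sequence type built by implementing data-model methods."""

    def __init__(self, cards):
        self._cards = list(cards)

    def __len__(self):                # len(deck)
        return len(self._cards)

    def __getitem__(self, position):  # deck[0], slicing, and iteration for free
        return self._cards[position]

    def __repr__(self):               # what the REPL prints
        return f"Deck({self._cards!r})"

deck = Deck(["A♠", "K♥", "7♦"])
print(len(deck))   # 3
print(deck[0])     # A♠
print(deck)        # Deck(['A♠', 'K♥', '7♦'])
```

Those `__len__`/`__getitem__`/`__repr__` names are exactly the arcane definitions you end up Ctrl-F-ing on that page.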
- mindustry: RTS with production lines. It's got a bit of a learning curve.
- hyperrogue: roguelike that takes place in the hyperbolic plane. It's more fun to not research how it works and just hit play: you die if a monster touches you, but the game will not let you make a losing move, so you only lose when you're "checkmated".
Of course you had to have something to drive the VGA outputs. Usually this meant a VIA, SiS, or UniChrome chip on the motherboard. Those chips often had no 3D acceleration at all, and a max resolution of 1280x1024. You were lucky to have shaders instead of fixed-function pipelines in 2008-era integrated graphics, and hardware-accelerated video decoding was unheard of. The best integrated GPUs were collaborations with nVidia that basically bundled a GPU with the mainboard, but those mainboards were expensive.
Windows Vista did not run well at all on these integrated chips, but nobody liked Windows Vista so it didn't matter. After Windows 7 was released, Intel started bundling their "HD Graphics" on CPUs and the on-die integrated GPU trend got started. The card in the picture belongs to the interim time where the software demanded pixel shaders and high-resolution video but hardware couldn't deliver.
They left a lot of work for the CPU to do: if you try to browse hexbear on them you can see the repainting going from top to bottom as you scroll. You can't play 720p video and do anything else with the computer at the same time, because the CPU is pegged. But if you put the 9500 GT in them then suddenly you can use the computer as an HTPC. It was not an expensive card, it was 60-80 USD, and it was a logical upgrade to a tower PC you already had to make it more responsive and enable it to play HD video.
Yes, it was the cheapest graphics card that could decode 1080p H.264 video in real time (and the acceleration worked in the Flash player). The 8500 GT could also do it but it was never popular. It made a huge difference when youtube became a thing.
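If anyone wants to try that acceleration on one of these cards today: on NVIDIA hardware of that generation the decoder is exposed through VDPAU, so something like the sketch below should work. This assumes a driver with working VDPAU plus the vdpauinfo and mpv packages installed, and the filename is made up.

```
# List which codecs the card can decode in hardware; H264 should show up for a 9500 GT.
vdpauinfo | grep -A 10 'Decoder capabilities'

# Ask mpv to use the VDPAU decoder instead of the CPU (it falls back to software if that fails).
mpv --hwdec=vdpau some-1080p-video.mkv
```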
I like it because it's designed. I think it's the only DE with actual designers continuously working on it. It has few options and doesn't feel hacked together.