

Yes, and it should probably be cheaper in Poland. But it’s really 17% more expensive in this case, not 44% (or 30% as the article calculates).
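Rough sanity check in Python, using the $70 / $101 figures from this thread and assuming the US price is quoted before sales tax while the Polish one includes 23% VAT:

```python
us_price = 70.0            # US price, assumed to be quoted before sales tax
pl_price_gross = 101.0     # Polish price, includes 23% VAT
pl_price_net = pl_price_gross / 1.23

print(f"Polish price net of VAT: ${pl_price_net:.2f}")                # ~$82
print(f"Premium over US price: {(pl_price_net / us_price - 1):.0%}")  # ~17%
```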


The Polish price includes 23% VAT, no?


“15 years is too long, it doesn’t match the state of the industry or technological progress.”
How is this too long? I would consider it a reasonable amount of time to receive security updates on a computer.
I have a notebook that I bought in 2012. It can run Ubuntu LTS 24.04, which is supported until 2034, without issue. There is no indication that the next release will stop supporting this hardware. I don’t see why Microsoft couldn’t provide this.
Plasma 6 is a significant upgrade for sure, especially on Wayland! I’d rate the crash frequency (on Fedora) at between once per week and once per month ;-)
Join the Debian Trixie upgrade fun today :) https://micronews.debian.org/
True, “Linux applications” (e.g. apt, dnf, pip, but also rm, sudo, and many more) would be more precise.
For Arch, it’s probably not so easy to define “essential” packages, since it supports many different bootloaders, for example. It is of course also a question of distro philosophy and target audience. Personally, I’ve noticed that “rm -r” as root prompts for every file on RHEL (presumably because it aliases rm to “rm -i” for root) but does not on Arch…
E: Removing essential system-critical packages is not permitted. This might break the system.
You can still do it if you really want, but even Linux rightly has some protections against breaking your system.
I’d say Mint is fine for gaming, as long as your hardware is supported. I’m using it with an Nvidia GPU on X11 and I can play all the games I want to play (Steam is Steam after all). My main gripe is that multi-monitor VRR doesn’t work on X11, but it hasn’t pushed me to another distro just yet…
For people/beginners who mostly want to game on a computer, I’d say that something “immutable” like Bazzite is actually one of the best options.


Yield is the percentage of chips that are functional. Roughly, you can think of it as the probability of a chip having 0 defects. The bigger the chip, or the higher the defect density, the lower this probability becomes. Chip designers will also include mitigation techniques (e.g. redundancy) to allow chips to work even with some defects.
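A minimal sketch of that intuition, using the common Poisson yield approximation (the die sizes and the 0.4 defects/cm² below are just illustrative numbers):

```python
import math

def zero_defect_yield(defect_density_per_cm2: float, die_area_mm2: float) -> float:
    """Probability that a die has zero defects (simple Poisson model)."""
    return math.exp(-defect_density_per_cm2 * die_area_mm2 / 100.0)

# Same defect density, different die sizes -- bigger dies yield worse.
for area in (50, 100, 300, 600):
    print(f"{area:>3} mm^2: {zero_defect_yield(0.4, area):.0%}")
```

Real chips then claw some of that back with the mitigation techniques mentioned above (redundancy, disabling defective blocks, etc.).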


Talking about the “yield” of a process doesn’t make any sense. Yield is a metric for a specific chip fabricated on a given process, and it depends heavily on the size of the chip and on mitigation techniques.
The “correct” metric to compare processes is defect density (in defects per square cm). Intel claims that their defect density is below 0.4 defects/cm²: https://www.tomshardware.com/tech-industry/intel-says-defect-density-at-18a-is-healthy-potential-clients-are-lining-up. This would be relatively high but not much worse than what TSMC has seen for their recent nodes: https://www.techpowerup.com/forums/threads/intel-18a-process-node-clocks-an-abysmal-10-yield-report.329513/page-2#post-5387835
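To make that concrete, a rough sketch with an assumed ~100 mm² die (not a figure from either article), again using the simple Poisson model:

```python
import math

die_area_cm2 = 1.0   # assumed ~100 mm^2 die, purely for illustration

for label, d0 in [("claimed Intel 18A upper bound", 0.4),
                  ("hypothetical mature node", 0.2)]:
    print(f"{label}: D0={d0}/cm^2 -> ~{math.exp(-d0 * die_area_cm2):.0%} yield")
```

The same D0 gives wildly different yields for a small mobile die vs. a big GPU, which is why quoting a single “process yield” number is misleading.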


What kind of junk energy is there to harvest from a car (in meaningful amounts)? I guess braking is the obvious answer, but that’s already covered by regenerative braking. Most car-based energy harvesting systems seem to employ speedbumps that clearly take useful (kinetic) energy away from the car (probably at a very poor efficiency).
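For a sense of scale (mass and speed below are assumptions, not figures from the article):

```python
mass_kg = 1500               # assumed mid-size car
speed_ms = 50 / 3.6          # assumed 50 km/h

kinetic_energy_j = 0.5 * mass_kg * speed_ms ** 2
print(f"~{kinetic_energy_j / 1000:.0f} kJ (~{kinetic_energy_j / 3600:.0f} Wh)")
# ~145 kJ (~40 Wh) of kinetic energy in total; whatever a speedbump harvester
# skims off has to be put back by the engine or battery afterwards.
```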


“How would a turbine that takes energy from the air current generated by a passing car decrease the energy of the car?”
Not sure where you got that idea from, but how would that generate a meaningful amount of energy? It seems very unlikely that such a system would ever recover the energy spent on its construction.
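Even a generous back-of-the-envelope (all numbers below are assumptions) suggests the haul per passing car is tiny:

```python
rho = 1.2             # air density, kg/m^3
area = 0.5            # assumed turbine swept area, m^2
gust = 5.0            # assumed airflow speed at the turbine, m/s
duration = 1.0        # assumed duration of the gust, s
cp = 0.4              # optimistic power coefficient (Betz limit ~0.59)

energy_j = 0.5 * rho * area * gust ** 3 * cp * duration
print(f"~{energy_j:.0f} J per passing car")
# ~15 J: even 30,000 cars/day only adds up to ~0.45 MJ, i.e. ~125 Wh --
# nowhere near paying back the energy cost of building and maintaining it.
```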


Sure, but those are completely different approaches. Dams have the advantage that they have a much larger capture area for water and that they can accelerate the water beyond the 10 m/s terminal velocity of raindrops.
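A quick per-kilogram comparison (the 100 m head is an assumed example):

```python
import math

g = 9.81
raindrop_j_per_kg = 0.5 * 10 ** 2        # kinetic energy at ~10 m/s terminal velocity
head_m = 100                             # assumed dam head
dam_j_per_kg = g * head_m                # potential energy released per kg
exit_speed = math.sqrt(2 * g * head_m)   # speed the water reaches, ~44 m/s

print(f"raindrop: ~{raindrop_j_per_kg:.0f} J/kg")
print(f"dam:      ~{dam_j_per_kg:.0f} J/kg (water at ~{exit_speed:.0f} m/s)")
```

Roughly 20× more energy per kilogram, on top of the dam collecting water from an entire catchment instead of a panel-sized surface.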


Raindrop energy harvesting is a rubbish idea. The raindrops simply don’t have a meaningful amount of energy to begin with: https://news.ycombinator.com/item?id=36907674
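Back-of-the-envelope for one square metre over a whole year (1000 mm of annual rainfall assumed, terminal velocity ~10 m/s as above):

```python
rain_m_per_year = 1.0                # assumed ~1000 mm annual rainfall
mass_kg = rain_m_per_year * 1000     # 1 m^3 of water per m^2 -> 1000 kg
terminal_velocity = 10               # m/s

energy_j = 0.5 * mass_kg * terminal_velocity ** 2
print(f"~{energy_j / 1000:.0f} kJ/m^2/year (~{energy_j / 3600:.0f} Wh/m^2/year)")
# ~50 kJ, i.e. ~14 Wh per square metre per YEAR, before any conversion losses.
```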
It says “UNSUPPORTED: VSYNC is not available on the Linux platform.” and runs at a stuttery 133 fps. This test shows 144 Hz: https://fpstest.org/refresh-rate-test/ The Nvidia settings app shows 144 Hz + VRR are active and I can see that the cursor is rendered at >70 fps.
I’m pretty sure that my desktop is drawn at 144 Hz (on the primary display) and xrandr also tells me that that’s the active mode. 🤷‍♂️
Edit: This is with Nvidia (proprietary drivers) and VRR monitors.
Is that generally an issue on Linux Mint / Cinnamon X11? I have a 144 Hz and a 70 Hz monitor and they seem to work fine…
I’m not sure I follow… Did the Fedora Council actually take a decision?
It’s either +44% (from $70) or -31% (from $101). Percentages are weird…
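Same two numbers, two different baselines (prices from this comment):

```python
us, pl = 70, 101
print(f"Poland vs US: {pl / us - 1:+.0%}")   # +44%
print(f"US vs Poland: {us / pl - 1:+.0%}")   # -31%
```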