I think the best choice is a cheap used PC, laptop, or server. Reduces electronic waste. I also host my own server on a 19 year old Dell Inspiron 1300.
They take up so much space though.
A lot of older equipment actually wastes more electricity.
But it will cut down on electronic waste.
Not always, especially laptops.
Not necessarily.
An i5-6500 has a TDP of 65W while an i5-13600K has a TDP of 150W.
If you get something modern with the performance of an i5-6500, it will be somewhat more efficient. The key point is that more performance uses more power.
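The TDP figures above translate directly into running costs. A minimal sketch of the yearly-cost arithmetic, assuming a hypothetical $0.30/kWh electricity rate and treating TDP as constant average draw (a pessimistic simplification: real idle draw, as noted further down the thread, can be single-digit watts):

```python
# Rough annual electricity cost for a server running 24/7.
# PRICE_PER_KWH is an assumed rate; substitute your local tariff.
PRICE_PER_KWH = 0.30
HOURS_PER_YEAR = 24 * 365  # 8760

def annual_cost(avg_watts: float) -> float:
    """Yearly cost in dollars for a constant power draw in watts."""
    kwh_per_year = avg_watts * HOURS_PER_YEAR / 1000
    return kwh_per_year * PRICE_PER_KWH

for label, watts in [("i5-6500 at 65 W", 65),
                     ("i5-13600K at 150 W", 150),
                     ("mini PC idling at 3 W", 3)]:
    print(f"{label}: ${annual_cost(watts):.2f}/year")
```

At these assumed numbers the 65 W box costs roughly $170/year and the 150 W box roughly $394/year, while a 3 W idle machine is under $8/year, which is why idle draw matters far more than peak TDP for a mostly-idle home server.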
If you buy a high-wattage CPU, that's on you. The Ryzen 7000 series also came out in 2022 and had many 65-watt CPUs that can outperform an i5-6500.
Yes, but also no. Older hardware is less power efficient, which is a cost in its own right, but it also decreases backup runtime during power failure and generates more noise and heat. It also lacks modern accelerated computing, like AI cores or hardware video encoders and decoders, if you are running those apps. Not to mention the lack of NVMe support, or a good NIC.
For me, a good compromise is recycling hardware upgrades every 4-5 years. A 19-year-old computer? I would not bother.
My 19-year-old laptop runs the web server just fine and only needs 450 MB of RAM even with many security modules. It produces minimal noise.
I have a Lenovo M710q with an i3-7100T that uses 3W at idle. I’m not mining bitcoin; the server is idle 23h a day if not more.
Bro, I am just hosting a WordPress backup, an RSS reader, and a few Python scripts
Yeah what I’ve always done is use the previous gaming/workstation PC as a server.
I just finished moving my basic stuff over to newer old hardware that’s only 6-7 years old, to have lots of room to grow and add to it. It’s a 9700K (8c/8t) with 32 GB of RAM and even a GTX 1080 for the occasional video transcode. It’s obviously overkill right now, but I plan to make it last a very long time.