Hello all. With the price of electricity (and other things) being a hot topic these days I was wondering about running costs of devices. I’m not at all clued up on electrics (wattage etc).
I have a Vero 4k+ and a Raspberry Pi4 both running OSMC and both stay on 24/7 (although not playing things 24/7, possibly 2 hours a day). Does anyone have any idea of a general cost per year for running each in the UK?
I do understand that I’m asking quite a vague, non-OSMC question, I’m just trying to get a general idea.
The Vero will use about 4W during normal use and less when idling.
If we assume you leave it on and it’s always doing something (no hard drive attached), then at 50p per kilowatt-hour it would be just over £15 a year to run.
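For anyone wanting to plug in their own numbers, the arithmetic behind an estimate like this is simple. Here’s a minimal sketch using the figures quoted above (4 W draw, 50p/kWh); note that a steady 4 W actually comes out a little above £15, so the quoted figure allows for some idle time below 4 W:

```python
# Annual running cost from average draw and electricity tariff.
# Assumed figures from the post above: ~4 W average draw, 50p per kWh.
watts = 4.0
price_per_kwh = 0.50  # GBP

kwh_per_year = watts * 24 * 365 / 1000       # watt-hours -> kWh over a year
cost_per_year = kwh_per_year * price_per_kwh

print(f"{kwh_per_year:.2f} kWh/year, about £{cost_per_year:.2f}/year")
# prints "35.04 kWh/year, about £17.52/year"
```

Swap in your own wattage and tariff to estimate any always-on device.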
I’m not familiar with the energy consumption of the Pi 4 but would expect it to be higher, particularly if you have a drive attached and are making use of the extra current its USB-C power supply can deliver.
Thank you so much for such a quick reply, this gives me an idea going forward. I may well re-arrange my setup (vero and other things) to switch off at the wall when I’m not in and overnight. Every little helps!
Based on this and this, the Pi 4 needs between 2.875 W (idle) and 6.4 W (100% CPU usage on every core), and around 3 W while playing a 1080p video.
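Plugging those figures into the same yearly-cost arithmetic gives a rough range for a Pi 4 left on 24/7. A sketch, assuming the 50p/kWh tariff from the earlier reply and the idle/full-load wattages just quoted:

```python
# Annual cost range for a Pi 4 running 24/7.
# Assumed figures: 2.875 W idle, 6.4 W all-cores-loaded, 50p per kWh.
price_per_kwh = 0.50  # GBP

def annual_cost(watts, price=price_per_kwh):
    """Cost of drawing `watts` continuously for one year."""
    return watts * 24 * 365 / 1000 * price

print(f"idle:      £{annual_cost(2.875):.2f}/year")   # ~£12.59
print(f"full load: £{annual_cost(6.4):.2f}/year")     # ~£28.03
```

In practice a media box spends most of its time near the idle end, so the real figure sits toward the bottom of that range.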
I have three RPi 4’s attached to PoE adapters and my switch shows them all drawing just shy of 4 watts each right now. My personal opinion is that it makes more sense to track down larger energy consumers before contemplating reducing your quality of life to save a couple of pennies a day. If you pick up a Kill A Watt meter, you will find tracking down where your electricity is being consumed a much easier task.
None of them were actively being used at the time I looked, which is representative of what the OP was getting at. I assume you’re wondering why they are reporting a bit higher than your chart. The switch is measuring the combined draw of the RPi and its PSU (the PoE adapter that converts the 48 V on the Ethernet cable down to 5 V).
I don’t have the hardware needed to run a test of CPU usage vs power consumption, but if someone could run a test between idle and 400% CPU load that would be really helpful.
For example, stepping from idle up to 400% on all cores in 25% intervals, and again from idle to 400% balanced across all cores in 25% steps, with stress or similar.
There will always be a certain load from the VideoCore RTOS running.
In terms of power consumption and cost, I don’t think you have to worry much about running a Pi 24/7.
How exactly would that be useful? The core concern when one broaches this subject is normally vampire power. The subject is completely legitimate, BUT one needs to be mindful of how much energy is actually getting used, not just that it is. There was a time when it was common for TVs and cable/satellite boxes to use 40-60 watts, and computers 100 watts or more, while just sitting there doing nothing useful. When electricity was cheap enough, few cared, and it just became supplemental heating in your home.

Nowadays some things have gotten much better, like TVs as shipped sometimes using less than one watt when off. The tradeoff there is the TV takes a full minute to come up instead of five seconds. In the menu you can turn down the power savings so it starts up faster, but that costs you an extra $3 per year in electricity. Is the extra annoyance worth it? Maybe the slow startup gets you into the habit of not turning the TV off as frequently. Since a TV can draw hundreds of watts while being used, that habit can quickly end up using more electricity than if one is not shy about turning it off.

How far is someone to go? When you’re looking at trying to get savings from a single device using less than five watts, remember that when you swap a single 60 watt light bulb for an LED, you’re no longer using the ~55 watts that you were. Admittedly light bulbs are the easy example, but they are a good one to show that some changes will have a much larger impact on energy savings than others.
@darwindesign you are right on all counts, I would just like to know. I would guess there is some sort of linear relationship from idle to full load, but I can’t prove it.
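If that guess holds, it’s easy to sketch as a hypothetical linear model. This is purely an assumption for illustration (using the idle and full-load figures quoted earlier in the thread); real hardware may well deviate from it, which is exactly what a measurement would show:

```python
def estimated_watts(load_fraction, idle_w=2.875, full_w=6.4):
    """Hypothetical linear interpolation between idle and full-load draw.

    load_fraction: 0.0 (idle) to 1.0 (all four cores at 100%).
    Default wattages are the Pi 4 figures quoted earlier in the thread.
    """
    return idle_w + (full_w - idle_w) * load_fraction

# e.g. a guess for 50% total CPU load (200% in top's per-core terms):
print(f"{estimated_watts(0.5)} W")
```

Comparing numbers like these against real meter readings at each load step would confirm or refute the linearity assumption.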
I can’t easily measure it that way, as my switch averages over a period of time. I also don’t think it would really give much insight, as how much power is consumed under actual use, at any given time, for any given user, is going to vary depending on what’s playing, what’s connected, how much hardware offload is happening, background services, etc. Outside of edge cases like someone with multiple tuners and a pile of hard drives, it is unlikely to be enough that most people should give it any consideration.