Help with math calculating power usage with solar

Hello, I have spent days trying to come up with something that works but I am convinced I am making this more difficult than it needs to be.

I have objects already for total house power, real house power, and all the objects that make up the total house power, i.e. fridge, TV, etc.

So in a scenario where I am generating, say, 800w but the house is using 1000w, the real power object would be 200w. The fridge is using 250w, and I want to be able to work out the percentage of watts the fridge is using.

I came up with fridge (250w) / total house power (1000w) = 0.25 then do real power which would be 200w x the 0.25 = fridge real power (50w)

Then I can use a time range to split the 50w into peak and non peak watts feed it into utility meter program to gather a months worth of numbers and calculate peak and off peak cost for each device whilst taking into account the solar savings.
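The arithmetic described above can be sketched as a single helper function (a sketch only; the names are illustrative, not actual Node-RED topics or sensors):

```javascript
// A sketch of the apportioning maths above in one helper function
// (names are illustrative, not actual Node-RED topics or sensors).
function deviceGridPower(deviceWatts, totalWatts, solarWatts) {
    const realPower = Math.max(totalWatts - solarWatts, 0); // grid import, never negative
    if (totalWatts <= 0) return 0;                          // avoid dividing by zero
    const share = deviceWatts / totalWatts;                 // device's fraction of the load
    return realPower * share;                               // watts attributed to the grid
}

// The example above: 1000w house load, 800w solar, 250w fridge
console.log(deviceGridPower(250, 1000, 800)); // 50, the fridge's "real power"
```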

I was originally feeding each device into a function to create a topic for each device, then feeding them into another function to do the calcs, but it is just getting messier as I go.

So then I tried feeding the fridge and total power in as msg.fridgewatts and msg.totalpower and using them as vars to output the percent, but I am getting a NaN error.

It currently looks like this;

var fridgepercent = (msg.fridgewatts / msg.totalpower);
msg.payload = fridgepercent;
return msg;

Then I would feed this into another function to do the real power * fridgepercent but again this is just getting messy.
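One possible cause of the NaN is that the two readings arrive on separate messages, so one property is always undefined when the function runs. A sketch of caching the latest reading per topic instead (plain JS standing in for a Node-RED function node; `context` is mimicked with an object, and the topic names are made up):

```javascript
// Sketch: cache the latest reading per topic, compute once both are present.
// "context" here mimics Node-RED's context.get/set with a plain object.
const context = {
    store: {},
    get(k) { return this.store[k]; },
    set(k, v) { this.store[k] = v; }
};

function onMessage(msg) {
    context.set(msg.topic, msg.payload);  // e.g. topic "fridge" or "total"
    const fridge = context.get("fridge");
    const total = context.get("total");
    if (fridge === undefined || total === undefined || total <= 0) {
        return null;                      // wait until both readings exist
    }
    return { payload: fridge / total };
}

console.log(onMessage({ topic: "fridge", payload: 250 })); // null (no total yet)
console.log(onMessage({ topic: "total", payload: 1000 }).payload); // 0.25
```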

I feel like I have been staring at it too long and have come back to it several times but I just feel like I am missing a more simple solution.

Any suggestions would be really appreciated.


It's interesting when you start going down this rabbit hole, isn't it!

So are you interested in real figures or are you just approximating? i.e. do you have a power monitor on your fridge that measures, say, the real power use every minute and calculates from that? Or are you just saying the fridge draws 3kwh per day, therefore the average is 250w?

The way I am approaching this at the moment is to record a time/energy profile of each of my devices during a normal cycle of operation (so in the case of a fridge that cycle is 24hrs). Others, such as a pool pump, are 2 hours.

I slice this into 1-minute intervals and then record the power draw for each minute whilst it is operating.

My objective is slightly different to yours - I want to maximize my self-consumption of solar.

So I have apportioned a base load of 1400 watts to our house, which covers things like the fridges, UPSs etc.

I then have all my other dispatchable devices (pool pump, heat pump, dishwasher, washing machine, clothes dryer, air con) recorded in timeslices and stored as an array.

I am working my way through a way to optimize the power usage of these devices based on the forecasted solar output for the day.

That maths is doing my head in at the moment.
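One rough way to start on that optimisation maths is a greedy search: slide the device's per-minute power profile across the forecast solar curve and pick the start minute that minimises grid import. This is only a sketch; the profile shape, forecast and approach are all illustrative assumptions, not the actual setup described above.

```javascript
// Greedy sketch: slide a device's per-minute power profile across a forecast
// solar curve and pick the start minute that minimises grid-imported energy.
function bestStartMinute(profileWatts, forecastSolarWatts) {
    let best = { start: 0, gridWh: Infinity };
    const lastStart = forecastSolarWatts.length - profileWatts.length;
    for (let start = 0; start <= lastStart; start++) {
        let gridWh = 0;
        for (let m = 0; m < profileWatts.length; m++) {
            // any shortfall between load and forecast solar comes from the grid
            const deficit = Math.max(profileWatts[m] - forecastSolarWatts[start + m], 0);
            gridWh += deficit / 60; // watt-minutes -> watt-hours
        }
        if (gridWh < best.gridWh) best = { start, gridWh };
    }
    return best;
}

// Toy forecast ramping up then down, and a flat 500w 3-minute load
const solar = [0, 100, 300, 600, 600, 300, 100, 0];
const pump = [500, 500, 500];
console.log(bestStartMinute(pump, solar)); // best start lands under the solar peak
```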

In your case though - rather than transferring the readings as message properties, why not store them in arrays/objects? Record them in realtime, including solar output, and then come back and do the maths from there based on whether they fall into peak or off-peak schedules.
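That record-now, calculate-later idea could look something like this (a minimal sketch; in Node-RED the buffer would live in flow/global context, and the topic names are invented):

```javascript
// Sketch of "record in realtime, do the maths later": buffer samples per
// topic, then query them back over a time window afterwards.
const samples = {}; // topic -> [{ ts, watts }, ...]

function record(topic, watts, ts = Date.now()) {
    (samples[topic] = samples[topic] || []).push({ ts, watts });
}

function averageWatts(topic, fromTs, toTs) {
    const hits = (samples[topic] || []).filter(s => s.ts >= fromTs && s.ts <= toTs);
    if (hits.length === 0) return 0;
    return hits.reduce((sum, s) => sum + s.watts, 0) / hits.length;
}

record("fridge", 240, 1000);
record("fridge", 260, 2000);
record("solar", 800, 1500);
console.log(averageWatts("fridge", 0, 3000)); // 250
```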


The way you describe it is a little confusing.
Are these objects real measurements, captured by a power or energy meter, or just calculations based on a simulation, like you would do in a spreadsheet and then inject into your flow?
What is "real power" in your case? I gather this is the (remaining) load at grid level. Does "real" mean only consumption (the direction of energy flow you are paying for, i.e. metered by your energy service provider), or also production (energy flowing out into the grid) at grid level?

Yes, cost is based on a tariff, and a standard household tariff plan is based on energy, not power. Or do you have some other kind of tariff/plan, where min/max/average power (typically over a 15min interval) is also part of the tariff?
Again, I am unsure whether you are able to measure real energy consumption/generation, or just power, meaning you would have to calculate the amount of energy (and ultimately cost) yourself, based on the timeframe in question.

Thanks for the replies, I should have given a bit more information.

What I am using is "realish" measurements from Meross MS310 plugs; they are probably only 90% accurate to be honest, but good enough for what I was trying to achieve.

So this started with trying to reduce consumption and record usage as I started to implement a bit of DIY solar; I feed all this information into Home Assistant for automation.

Once you have the numbers coming in for real-time usage, there is a Home Assistant component called Utility Meter which will do the math for you hourly, daily, weekly, etc. Funnily enough, I just realised it does multiple tariffs, so that will be helpful once I sort the math out.

That is actually my goal as well. I have zero export configured with a limiter on my solar, so the more I can use during the day the better. I have batteries in the system to help catch anything additional produced during the day.

I am doing the same thing: if the dishwasher starts a cycle whilst we have no solar, or we are within a few hours of the cheap tariff, it will switch the smart plug off and inform the house via the Google Home hub that the washer is paused until 8:30pm.

If I can get just the real output ie the fridge using 250w but the solar is covering 50w I can easily feed this into the utility meter component and it will do all the math for me.

Consumption only. I see the total power as basically the power of all my smart plugs in real time added together, which gives me a total figure I can use. Then the real power is just total - solar production.

So if washer and dishwasher are on for example and we are effectively using say 3kw but we have 1kw of solar then 2kw is coming from the grid. The 2kw is what I class as my real power usage.

I am with octopus energy on the go faster tariff so 15.18p 01:30am to 8:30pm then 5.5p 8:30pm till 01:30am.

As long as I end up with the live power figure I feed that into the home assistant utility meter component and it will do the monthly energy kwh for me.

I do have all of this set up already, measuring all my devices and feeding into the utility meter component, so I know how many kwh each device is using, then do a simple monthly kwh x 0.155, which was my old tariff.

But for me to properly understand what devices are costing me money I need to take into account the tariffs but more importantly the solar.

Does the math actually make sense? Example:

Devices using in total = 2000w (Total Power) - this is all my plugs fed into one sensor
Solar producing 1000w - this is actually on one of the same plugs, they read both ways
Total Power - Solar = 1000w (Real Power) - this is a sensor already configured
Fridge is using 250w

So fridge (250w) / total power (2000w) = 0.125 (fridgepercent)
Real Power (1000w) * fridgepercent (0.125) = 125w real usage

I then do that for each device, then feed them into the utility meter component.
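That per-device step could be sketched in one pass over all the devices (the names and numbers here are just the ones from the worked example, not real sensors):

```javascript
// Sketch of the apportioning above, done for several devices in one pass
// (device names and figures are just the ones from the worked example).
function apportionGridPower(devices, totalWatts, solarWatts) {
    const realPower = Math.max(totalWatts - solarWatts, 0); // grid import, floored at 0
    const result = {};
    for (const [name, watts] of Object.entries(devices)) {
        result[name] = totalWatts > 0 ? realPower * (watts / totalWatts) : 0;
    }
    return result;
}

const real = apportionGridPower({ fridge: 250, washer: 750 }, 2000, 1000);
console.log(real.fridge); // 125, matching the example above
```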

So it looks something like this

A good example here is my hot tub in the picture; it is currently 30.86kwh x 0.15 (my old tariff) = £4.63.

Now the filter pump is using around 50w 24x7, so 6-8hrs of that is likely covered by solar, and the heater automatically comes on 8:30pm till 01:30am during the cheap tariff. So 99% of the 30.86kwh is likely x 0.055 = £1.70, so there is a huge difference in total cost.
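The tariff arithmetic from the hot tub example, as a quick sketch (the rates are the Go Faster ones quoted above; splitting the kwh into the two bands is assumed to happen elsewhere, e.g. in the utility meter component):

```javascript
// Sketch of the tariff arithmetic using the Go Faster rates quoted above
// (15.18p peak, 5.5p off-peak). The kwh split per band comes from elsewhere.
function monthlyCost(peakKwh, offPeakKwh) {
    const PEAK_RATE = 0.1518;   // GBP per kwh, 01:30am to 8:30pm
    const OFFPEAK_RATE = 0.055; // GBP per kwh, 8:30pm to 01:30am
    return peakKwh * PEAK_RATE + offPeakKwh * OFFPEAK_RATE;
}

// Hot tub example: if effectively all 30.86kwh lands in the cheap band
console.log(monthlyCost(0, 30.86).toFixed(2)); // "1.70"
```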

I can easily do the tariff side of things, it is just the math and getting the nodered piece right for the "solar taken into account real energy" sensors I need.


Thank you for your input, that makes more sense now.

I actually don't think it does.
It makes sense in maths, but not in reality.

Imagine the following:
Devices using in total: 900W
Solar production: 1000W
Total real power: ?? in your case = 0, in reality = -100W (you are exporting)
Fridge is using 250W
So fridge (250W) / total power (900W) = 0.28; in reality, solar is covering 100% of it
Real power (-100W) * 0.28 = -28W ??? In reality: 0W real usage

You are also missing the aspect of the measurement interval (time) and the concurrency within it: in a 15min interval, the fridge and a hairdryer may run at the same time, but the hairdryer only for 5mins of that interval, while solar fluctuates between 200W and 1000W across it. You will never know which "energy packets" were used from solar and which from the grid.

I'd actually feed all consumption and solar measurements (the raw data) into a time-series database, each on its own topic/series.
Then apply a query to that database, covering now() minus the time since the last query, and collect each dataset at a standard, defined interval (like PT5M, i.e. 5 minutes, or 10 or 15 minutes).
This will get you:

  • devices total, average power consumption per 5min interval
  • solar generation power, average per 5min interval
  • device X (i.e. fridge) power consumption per 5min interval
    ...for each interval; since you know the interval length, you can calculate the energy (power * 5mins) for each value.

...then you can do your maths with these "measurements" again and feed this into Home Assistant.
I think the result will be more accurate/closer to reality.
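A sketch of that interval-based approach, assuming 5-minute average power values have already been queried back from the database (the names and interval length are illustrative):

```javascript
// Sketch of the interval approach: average power per slot -> energy per
// slot -> apportion the grid-imported energy to a device.
const INTERVAL_MIN = 5;

function intervalEnergyWh(avgWatts) {
    // average power over the interval -> energy in watt-hours
    return avgWatts * INTERVAL_MIN / 60;
}

function deviceGridEnergyWh(deviceAvgW, totalAvgW, solarAvgW) {
    const gridAvgW = Math.max(totalAvgW - solarAvgW, 0); // import only
    if (totalAvgW <= 0) return 0;
    return intervalEnergyWh(gridAvgW) * (deviceAvgW / totalAvgW);
}

// Fridge averaging 250W in a slot where the house averaged 2000W and solar 1000W
console.log(deviceGridEnergyWh(250, 2000, 1000)); // ~10.4 Wh from the grid
```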

So my plan for this was to have an if/else.

I have a function I am playing with at the moment that looks like this;


var realpower = context.get("realpower") || 0;
var fridgepercent = context.get("fridgepercent") || 0;

if (realpower <= 0) {
    msg.payload = 0;
} else {
    msg.payload = realpower * fridgepercent;
}
msg.realpower = realpower;
msg.fridgepercent = fridgepercent;
return msg;

So I would basically just output 0 for all devices if I produce more than I consume, but since I have zero export with the limiter, I shouldn't ever go over apart from now and again.

I was planning on having all the devices poll every 30 seconds or so, so it would grab a snapshot in time; the utility meter component should take care of the rest.

The current one for the hot tub looks like this;

Can only add two pics


I think I might have made something work.

Percent Calc Function;


var totalpower = context.get("totalpower") || 0;
var fridgewatts = context.get("fridgewatts") || 0;

// guard against dividing by zero before any reading has arrived
msg.payload = totalpower > 0 ? fridgewatts / totalpower : 0;
return msg;

Real Calc function;


var realpower = context.get("realpower") || 0;
var fridgepercent = context.get("fridgepercent") || 0;

if (realpower <= 0) {
    msg.payload = 0;
} else {
    msg.payload = realpower * fridgepercent;
}
return msg;

Just realised I missed one of the components: in Home Assistant there is a component called Integration which will do the sums; you feed power in watts into it and it will do the calcs to create kwh for the devices.

Yes, that does look more promising.
I'd focus on the integral values and calculate deltas and parts/percentages based on energy, rather than on live power graphs.
This will give you more accurate results, I think, especially around the edge cases discussed before, where there is a sudden jump/spike around zero.
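Working from cumulative energy counters rather than live power could be sketched like this (reading names are illustrative; the guard assumes counters only ever count up):

```javascript
// Sketch: diff successive cumulative kwh readings per device, then do the
// apportioning on energy rather than instantaneous power.
function energyDeltas(prevKwh, currKwh) {
    const deltas = {};
    for (const name of Object.keys(currKwh)) {
        // counters only count up; guard against meter resets going negative
        deltas[name] = Math.max(currKwh[name] - (prevKwh[name] || 0), 0);
    }
    return deltas;
}

const prev = { fridge: 100.0, solar: 50.0 };
const curr = { fridge: 100.3, solar: 50.8 };
console.log(energyDeltas(prev, curr)); // fridge ~0.3 kwh, solar ~0.8 kwh
```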

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.