Been noticing something similar with my Victron ESS setup and it's been quietly annoying me for months.
The system is consistently pulling slightly more from the grid than the setpoint I've configured — we're talking maybe 30-50W of "unexplained" draw even when solar and battery state look perfectly healthy. Not a disaster, but when you're trying to run genuinely close to zero net import, it adds up.
My working theory is that the ESS algorithm calculates its power targets from ideal power flows, without properly accounting for inverter inefficiency and conversion losses in the chain. So when I ask for a 0W grid setpoint, the system is essentially promising the grid "I'll cover everything", but the maths doesn't factor in the ~5-8% that gets eaten by the MultiPlus doing its job.
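To put rough numbers on the theory, here's a back-of-envelope sketch. The 92% conversion efficiency is purely my assumption (not a figure from Victron's docs), and the model deliberately ignores the control loop, so treat it as illustration only:

```python
# Back-of-envelope model of the residual grid draw I'm describing.
# The 0.92 efficiency is my assumption, not a published Victron figure;
# tweak it for your own hardware.

def residual_grid_import(load_w: float, efficiency: float = 0.92) -> float:
    """If ESS sized the battery discharge to match the load using ideal
    flows, the AC side would fall short by the conversion loss, and the
    grid would quietly make up the difference."""
    dc_discharge_w = load_w               # what an 'ideal flows' calc would request
    ac_delivered_w = dc_discharge_w * efficiency
    return load_w - ac_delivered_w        # shortfall covered by the grid

for load in (300, 500, 1000):
    print(f"{load:>5} W load -> ~{residual_grid_import(load):.0f} W unexplained import")
```

At a typical 400-600W overnight baseload that lands right in the 30-50W band I'm seeing, which is part of why the theory feels plausible to me.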
A few things I've checked already:
- MPPT and battery figures look accurate in VRM
- Fronius AC-coupled PV readings seem consistent
- ESS assistant is on the latest version
- Fogstar lithium bank reporting SOC correctly via CAN
The discrepancy is small but consistent, which makes me think it's a calculation quirk rather than a sensor issue.
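For anyone who wants to quantify the same residual on their own GX device, here's a rough logging sketch over D-Bus. The paths on com.victronenergy.system are what I believe Venus OS exposes, but treat them as assumptions and confirm with dbus-spy first (single-phase L1 assumed):

```python
# Rough residual logger for Venus OS. The com.victronenergy.system paths
# below are what I believe are standard, but verify them with dbus-spy
# on your own GX device before trusting the numbers. Ctrl-C to stop.
import time
import dbus

bus = dbus.SystemBus()

def read_value(path: str) -> float:
    """Read a numeric value from the com.victronenergy.system service."""
    obj = bus.get_object('com.victronenergy.system', path)
    return float(obj.GetValue(dbus_interface='com.victronenergy.BusItem'))

while True:
    grid_w = read_value('/Ac/Grid/L1/Power')    # net grid import (+) / export (-)
    batt_w = read_value('/Dc/Battery/Power')    # battery charge (+) / discharge (-)
    pv_w = read_value('/Ac/PvOnGrid/L1/Power')  # AC-coupled PV (the Fronius here)
    print(f"grid={grid_w:6.0f} W  battery={batt_w:6.0f} W  pv={pv_w:6.0f} W")
    time.sleep(5)
```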
Has anyone else spotted this, particularly on larger systems where the inefficiency losses become more noticeable in absolute watts? I'd be curious whether tweaking the grid setpoint to something like -30W (telling the system to slightly export) effectively acts as a workaround — compensating for what the algorithm isn't accounting for.
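If I do test the negative-setpoint idea, the plan would be something like the sketch below. As far as I know /Settings/CGwacs/AcPowerSetPoint is the D-Bus setting behind the ESS grid setpoint, but confirm on your own system (or just change it in the Remote Console) before scripting anything:

```python
# Nudge the ESS grid setpoint slightly negative to soak up the offset.
# /Settings/CGwacs/AcPowerSetPoint is what I understand to be the setting
# behind the ESS grid setpoint on Venus OS; verify before use.
import dbus

bus = dbus.SystemBus()
setpoint = bus.get_object('com.victronenergy.settings',
                          '/Settings/CGwacs/AcPowerSetPoint')
iface = dbus.Interface(setpoint, 'com.victronenergy.BusItem')

print('current setpoint:', iface.GetValue())
iface.SetValue(dbus.Int32(-30))   # ask ESS to hold roughly 30 W of export
print('new setpoint:', iface.GetValue())
```

That's obviously treating the symptom rather than the cause, and a fixed offset will over-export whenever the house load, and therefore the absolute loss, is low.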
Would love to know if this rings any bells or whether I'm barking up the wrong tree entirely. Anyone dug into the ESS assistant code or raised it with Victron directly?