Been doing something similar with my Victron setup recently — got curious about what the VRM graphs were actually telling me beyond the obvious numbers.
Started logging daily snapshots of voltage vs SoC and comparing them against temperature readings from my Fogstar Drift cells. What I noticed is that on cold mornings (we're talking sub-5°C up here on the Pennines), the SoC estimate drifts noticeably compared to what the actual resting voltage suggests. The MPPT and the BMS aren't always singing from the same hymn sheet.
My rough experiment so far:
- Noted resting voltage at 80%, 50%, and 20% SoC across different temperatures
- Compared VRM's SoC% against a basic Peukert-adjusted estimate
- Flagged any charging sessions where tail current behaviour looked odd
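For anyone wanting to try the same comparison, here's a rough sketch of the Peukert-adjusted estimate I'm using. The capacity, Peukert exponent, rated-current convention, and sample log below are all illustrative assumptions of mine, not how Victron's own algorithm works:

```python
# Sketch: compare a logged VRM SoC% against a simple Peukert-adjusted
# coulomb count. Capacity, exponent (k ~= 1.05 is a common LFP ballpark),
# and the sample log are illustrative, not Victron's internal method.

def peukert_adjusted_soc(start_soc, samples, capacity_ah, k=1.05, rated_current=None):
    """Coulomb-count SoC with a Peukert correction on discharge current.

    samples: list of (current_a, hours) pairs; negative current = discharge.
    """
    if rated_current is None:
        rated_current = capacity_ah / 20  # assume a C/20 rated draw
    soc = start_soc
    for current, hours in samples:
        if current < 0:
            # Discharging: scale the current by the Peukert factor
            eff = abs(current) * (abs(current) / rated_current) ** (k - 1)
            soc -= eff * hours / capacity_ah * 100
        else:
            # Charging: plain coulomb count (no Peukert correction)
            soc += current * hours / capacity_ah * 100
    return max(0.0, min(100.0, soc))

# Made-up day of logging against a hypothetical VRM reading
log = [(-10.0, 2.0), (-25.0, 1.0), (5.0, 0.5)]  # (amps, hours)
estimate = peukert_adjusted_soc(80.0, log, capacity_ah=280)
vrm_soc = 65.0  # whatever the GX device reported that day
print(f"estimate: {estimate:.1f}%  VRM: {vrm_soc:.1f}%  drift: {estimate - vrm_soc:+.1f}%")
```

Crude, but it's enough to flag days where the VRM figure and a first-principles count disagree by more than a few percent.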
Early conclusion: the SoC figure is more of a best guess than a fact, especially in winter. Useful for ballpark, not for precision decisions.
Anyone else been digging into their VRM data this way? Particularly interested in whether folk with LFP setups have noticed the flat discharge curve making SoC interpretation even trickier, because mine certainly does: the voltage barely moves between 30% and 80%, which leaves the Victron algorithm working pretty hard to estimate anything meaningful.
Would be good to build up some comparison data from different setups across the UK — our climate throws enough variables at these systems to make it worth doing properly.