Interpreting battery behaviour from VRM data – small experiment

by Pennine Solar · 1 month ago 18 views 5 replies
Pennine Solar
Member
9 posts
thumb_up 6 likes
Joined Jan 2024
1 month ago
#5859

Been doing something similar with my Victron setup recently — got curious about what the VRM graphs were actually telling me beyond the obvious numbers.

Started logging daily snapshots of voltage vs SoC and comparing them against temperature readings from my Fogstar Drift cells. What I noticed is that on cold mornings (we're talking sub-5°C up here on the Pennines), the SoC estimate drifts noticeably compared to what the actual resting voltage suggests. The MPPT and the BMS aren't always singing from the same hymn sheet.

My rough experiment so far:

  • Noted resting voltage at 80%, 50%, and 20% SoC across different temperatures
  • Compared VRM's SoC% against a basic Peukert-adjusted estimate
  • Flagged any charging sessions where tail current behaviour looked odd
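For anyone wanting to run the same comparison, here's a minimal sketch of the Peukert-adjusted estimate in Python. The 280 Ah / 20 h rating and the exponent k = 1.05 are placeholders, not figures from my pack; swap in your own, and note real LFP cells sit very close to k = 1.0:

```python
def peukert_capacity(rated_ah, rated_hours, current_a, k=1.05):
    """Effective capacity (Ah) at a given discharge current, per Peukert's law.

    C_eff = C_rated * (I_rated / I) ** (k - 1), where I_rated = C_rated / H.
    """
    i_rated = rated_ah / rated_hours
    return rated_ah * (i_rated / current_a) ** (k - 1)

def peukert_soc_step(soc_pct, current_a, dt_hours,
                     rated_ah=280, rated_hours=20, k=1.05):
    """One coulomb-counting step using the Peukert-corrected capacity.

    Positive current = discharge. Feed it each logged interval in turn.
    """
    c_eff = peukert_capacity(rated_ah, rated_hours, current_a, k)
    return soc_pct - 100.0 * current_a * dt_hours / c_eff
```

Running that figure alongside VRM's reported SoC% each day is enough to make the winter drift visible.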

Early conclusion: the SoC figure is more of a best guess than a fact, especially in winter. Useful for ballpark, not for precision decisions.

Anyone else been digging into their VRM data this way? Particularly interested whether folk with LFP setups have noticed the flat discharge curve making SoC interpretation even trickier — mine certainly has. The voltage barely moves between 30% and 80% SoC, which makes the Victron algorithm work pretty hard to estimate anything meaningful.

Would be good to build up some comparison data from different setups across the UK — our climate throws enough variables at these systems to make it worth doing properly.

Sophie Hill
Member
3 posts
Joined Jul 2025
1 month ago
#5901

@PennineSolar VRM is basically a lie detector for batteries — mine ratted out a dodgy Fogstar cell that was masking itself beautifully at rest but absolutely bottoming out under any real load.

LiFePO4Fan
Active Member
19 posts
thumb_up 17 likes
Joined Jan 2024
1 month ago
#5905

@PennineSolar the part people miss with VRM is the resting voltage after a charge cycle ends — that tail-off curve tells you a lot about actual SoC versus what the BMS is reporting.

Mine showed a consistent ~0.02V sag about 40 mins post-charge for months before anything obvious showed up on capacity tests. Turned out one of my Fogstar cells was sitting slightly out of balance and the BMS was papering over it.

Worth overlaying your temperature logs too if you haven't already — voltage behaviour looks different in a cold outbuilding versus summer. My tiny house setup swings quite a bit seasonally and it skewed my interpretation early on.

Basically treat VRM as a trend tool rather than a snapshot tool. Single readings are almost meaningless.
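That tail-off check is easy to automate from a VRM CSV export. A sketch, assuming a 2-minute sample interval and a list of pack voltages whose first entry is the sample at the moment charging stopped (both assumptions about your log, not anything VRM guarantees):

```python
def post_charge_sag(voltages, settle_minutes=40, sample_minutes=2):
    """Voltage drop from charge-end to the reading `settle_minutes` later.

    `voltages[0]` is assumed to be the sample when charging stopped.
    """
    settle_idx = settle_minutes // sample_minutes
    if settle_idx >= len(voltages):
        raise ValueError("log does not cover the settle window")
    return voltages[0] - voltages[settle_idx]
```

As above, one reading means little; the value is in computing this per charge cycle and watching whether the sag grows over weeks.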

Daily Solar
Active Member
48 posts
thumb_up 41 likes
Joined Mar 2023
4 weeks ago
#5914

@LiFePO4Fan nailed the resting voltage point — I'd add that the time to reach resting state matters enormously. A healthy LiFePO4 pack should settle within 15-30 minutes post-charge. If yours is still drifting an hour later, that's either a dodgy cell or your BMS is still doing active balancing (check which before panicking).
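A quick way to apply that settle-time rule to a log export. The five-sample window and the 10 mV cutoff are my own assumptions, not Victron figures:

```python
def still_drifting(voltages, window=5, drift_mv=10):
    """True if resting voltage is still moving across the last `window` samples.

    `voltages` are post-charge readings at a fixed interval. If this is still
    True an hour after charge-end, suspect a weak cell or ongoing balancing.
    """
    recent = voltages[-window:]
    return (max(recent) - min(recent)) * 1000 > drift_mv
```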

On VRM specifically — the "min cell voltage during discharge" overlay is criminally underused. I've got it running alongside SOC% on my cabin setup and the divergence between the two during high EV charging draws told me my 280Ah Fogstar pack needed a proper top-balance far more clearly than any single number could.

Pro tip: export the raw CSV and plot delta-voltage (max cell minus min cell) over time. Spikes during bulk charge = weak cell. Spikes during discharge = connection resistance. Different problems, different fixes.
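A sketch of that CSV post-processing. The row layout (max cell V, min cell V, pack current with positive = charging) and the 50 mV threshold are assumptions; VRM export columns vary by device and firmware:

```python
def classify_dv_spikes(rows, threshold_v=0.05):
    """Label cell-delta spikes by current direction, per the rule of thumb above.

    Returns (row_index, delta_v, label) for every row where the max-minus-min
    cell spread exceeds `threshold_v`.
    """
    flags = []
    for i, (vmax, vmin, amps) in enumerate(rows):
        delta = vmax - vmin
        if delta >= threshold_v:
            label = "weak cell?" if amps > 0 else "connection resistance?"
            flags.append((i, round(delta, 3), label))
    return flags
```

Plotting the raw delta is still worth doing, but this picks out the rows worth staring at.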

ExBrickie
Active Member
27 posts
thumb_up 13 likes
Joined May 2023
4 weeks ago
#5945

@LiFePO4Fan @DailySolar both good points, but I'd push back slightly on over-reading the resting voltage in isolation. Ambient temperature throws it off more than people admit — I've seen my bank read 0.05V lower on a cold morning on the boat versus a warm afternoon, same actual state of charge. VRM doesn't log cabin temp by default so that correlation gets lost.

What I find more useful is plotting charge acceptance rate over time. If a cell or group is starting to refuse current earlier in the bulk phase, that shows up weeks before voltage anomalies become obvious. Caught a marginal cell in my Fogstar stack that way — voltage looked fine, but it was hitting absorption threshold suspiciously fast.
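Charge acceptance trending needs nothing fancier than this. The 14.2 V absorption setpoint is an assumption for a 12 V LFP bank; use whatever your charger is actually configured for:

```python
def minutes_to_absorption(session, absorb_v=14.2):
    """Minutes from start of bulk until pack voltage first reaches absorption.

    `session` is a list of (minute, pack_voltage) samples for one charge.
    A value shrinking week over week can flag a cell refusing current early,
    before anything shows up in the voltages themselves.
    """
    for minute, volts in session:
        if volts >= absorb_v:
            return minute
    return None  # never reached absorption this session
```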

Worth adding a temperature sensor if you haven't already. Victron's own BMV temp probe is cheap enough.

DontPanic25
Member
8 posts
thumb_up 5 likes
Joined Aug 2024
4 weeks ago
#5980

@ExBrickie raises something I learned the hard way with my cabin setup. Spent weeks obsessing over resting voltage on my Fogstar Drift cells, convinced one battery was degrading — turned out the ambient temperature in the outhouse was swinging 15°C between morning and evening, which was skewing everything.

VRM's historical overlay feature saved me here. Once I started comparing voltage curves across similar temperature windows rather than just time of day, the picture got a lot cleaner.

The data point people rarely export is the charge current tail — watching how quickly absorption current drops off tells you more about actual capacity than resting voltage alone. My Victron BMV logs this beautifully if you set the sample rate down to two minutes.
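The tail drop-off is simple to pull out of that two-minute log. A sketch; the 2% of C cutoff and the 280 Ah rating are assumptions, not anything from the BMV itself:

```python
def absorption_tail_minutes(currents_a, rated_ah=280, tail_fraction=0.02,
                            sample_minutes=2):
    """Minutes of absorption before charge current falls below the tail cutoff.

    `currents_a` are charge-current samples starting at entry to absorption.
    Trend this per cycle: a changing taper time tracks capacity and balance
    shifts that resting voltage alone will hide.
    """
    cutoff = rated_ah * tail_fraction  # 5.6 A for a 280 Ah pack at 2 %
    for i, amps in enumerate(currents_a):
        if amps < cutoff:
            return i * sample_minutes
    return None  # still above the cutoff at end of log
```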

Context is everything with this stuff. The raw numbers are almost meaningless without the surrounding conditions.
