Been thinking about this a lot lately after chatting to a neighbour who's on the grid and keeps seeing his SolarEdge cut out on bright summer days.
His street voltage was sitting at 257V last July; I measured it myself with a clamp meter out of curiosity. That's above the EN 50160 upper limit of 253V (230V nominal +10%), and sure enough his grid-tie inverter was tripping out precisely when he should have been generating the most.
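If you want to sanity-check your own readings, the arithmetic is just the nominal 230V ±10% band. A quick Python sketch (the example readings are made up, not logged data):

```python
# EN 50160 steady-state voltage band for 230V LV networks: 230V +/-10%
NOMINAL_V = 230.0
TOLERANCE = 0.10

LOWER_LIMIT = NOMINAL_V * (1 - TOLERANCE)  # 207.0V
UPPER_LIMIT = NOMINAL_V * (1 + TOLERANCE)  # 253.0V

def within_en50160_band(voltage: float) -> bool:
    """True if a reading sits inside the 207-253V band."""
    return LOWER_LIMIT <= voltage <= UPPER_LIMIT

# Example readings (invented); 257V like my neighbour's is out of band
for v in (242.0, 253.0, 257.0):
    print(f"{v:.0f}V -> {'OK' if within_en50160_band(v) else 'OUT OF BAND'}")
```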
It got me wondering: is this becoming more common in rural UK areas as more neighbours add solar? The DNO (Distribution Network Operator) infrastructure clearly wasn't designed for this level of distributed generation.
For those of us who've gone full off-grid (or hybrid), this is actually one of the unsung benefits nobody talks about. My Victron MultiPlus-II on the static caravan doesn't care what the grid is doing; it just cracks on with whatever the Pylontechs and the panels are providing. No voltage trip-outs, no lost generation windows.
If you're running a hybrid setup though, it's worth knowing your inverter-charger's tolerance thresholds. Victron lets you adjust the accepted AC input voltage window (via VEConfigure), which is handy if you want to stay grid-connected but only on your terms.
Curious whether anyone here has actually measured their grid voltage and found it creeping high during summer? Particularly interested if you're in a rural area with lots of solar installations nearby — I suspect the problem is worse than people realise.
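For anyone who does log their supply, it's worth knowing EN 50160 judges steady-state compliance on 10-minute mean rms values: 95% of them over a week are supposed to sit inside the ±10% band. A rough sketch of that check, assuming you've already reduced your log to a list of 10-minute means (the numbers here are invented):

```python
# EN 50160 compliance check: 95% of 10-minute mean rms values over a
# week should sit within 230V +/-10% (207-253V).
def en50160_weekly_fraction(ten_min_means: list[float]) -> float:
    """Return the fraction of 10-minute means inside the 207-253V band."""
    in_band = sum(1 for v in ten_min_means if 207.0 <= v <= 253.0)
    return in_band / len(ten_min_means)

# Invented example: a week of mostly-fine readings plus midday excursions.
# 7 days of 10-minute slots = 1008 values.
week = [241.0] * 950 + [255.0] * 58
fraction = en50160_weekly_fraction(week)
verdict = "within the 95% criterion" if fraction >= 0.95 else "outside the 95% criterion"
print(f"{fraction:.1%} in band -> {verdict}")
```

Even a modest run of sunny-lunchtime excursions can tip a week outside the 95% criterion, which is exactly the pattern I'd expect on a solar-heavy rural feeder.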
Worth considering for anyone still on the fence about cutting the cord entirely. Sometimes the grid isn't the reliable backstop it's cracked up to be. 🔌