Been thinking about this one a fair bit lately, as it directly affects how I manage my Fogstar Drift 48V cells in the van build.
The issue is absorption/tail-current management: specifically, whether your BMS or charger does any intelligent tapering as you approach 100% SoC. With LiFePO4, that final 5-10% is where you can do the most long-term harm, because time spent at high cell voltage is one of the main ageing drivers, and hammering full charge current right up to the cutoff point maximises exactly that.
My setup uses a Victron MultiPlus-II, which handles the CV phase reasonably well and lets you configure the tail-current threshold (I run mine at around 2% of capacity). But the BMS itself, a JK BMS in my case, doesn't do any active current limiting of its own: it just hard-disconnects if a cell hits HVC. That's brutal compared to a gradual taper.
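To illustrate what the tail-current setting actually terminates, here's a toy model of the CV phase. The 280Ah capacity, starting current, and decay rate are made-up illustrative numbers, not measurements from my bank:

```python
# Toy model: during CV hold the current decays roughly exponentially,
# and absorption ends when it drops below the tail-current threshold.
CAPACITY_AH = 280              # illustrative bank size
TAIL_FRACTION = 0.02           # my Victron tail-current setting (~2%)
tail_cutoff_a = CAPACITY_AH * TAIL_FRACTION   # 5.6 A on this example bank

current_a = 50.0               # current at the start of absorption
minutes = 0
while current_a > tail_cutoff_a:
    current_a *= 0.93          # crude per-minute decay during CV hold
    minutes += 1

print(f"Absorption ends after ~{minutes} min at {current_a:.1f} A")
```

The point being: a charger with a tail-current setting ends the charge gently on its own, whereas a bare HVC disconnect slams the door at full current.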
A few things worth considering:
- Can your charger/inverter-charger be configured to reduce charge current progressively during absorption? Victron's DVCC (via a GX device) is quite powerful for this, since it can impose a system-wide charge current limit.
- Is there a case for simply capping max SoC at 90-95% for daily cycling and only going to 100% occasionally for balancing?
- If the BMS has no native current limiting, could external automation (Cerbo GX, Node-RED, etc.) handle it? See the sketch after this list.
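To make that last bullet concrete, here's the sort of thing I'm imagining running on the Cerbo GX. Big caveats: this is an untested sketch, the battery service name and the bank/voltage numbers are placeholders (list your actual services with dbus-spy), and I'm assuming DVCC's "Limit charge current" setting is writable at `/Settings/SystemSetup/MaxChargeCurrent` on `com.victronenergy.settings`, which I believe it is on recent Venus OS builds, but verify on your firmware before pointing this at a live bank.

```python
#!/usr/bin/env python3
# Sketch of a charge-taper loop for Venus OS (Cerbo GX).
# Assumes dbus-python is present (it ships with Venus OS).
import time
import dbus

BATTERY_SERVICE = 'com.victronenergy.battery.socketcan_can0'  # hypothetical; check dbus-spy
BANK_AH = 280            # illustrative, not my actual bank
MAX_CURRENT_A = 100.0    # normal bulk charge limit
TAPER_START_V = 3.45     # per-cell voltage where tapering begins (assumption)
HVC_V = 3.65             # typical JK BMS per-cell HVC default
TAIL_A = BANK_AH * 0.02  # ~2% tail-current floor

bus = dbus.SystemBus()

def read_value(service, path):
    # Victron items expose GetValue/SetValue on the BusItem interface.
    obj = bus.get_object(service, path)
    return obj.GetValue(dbus_interface='com.victronenergy.BusItem')

def set_charge_limit(amps):
    # DVCC "Limit charge current": -1 removes the limit, any other
    # value caps system-wide charge current. Path is an assumption; verify.
    obj = bus.get_object('com.victronenergy.settings',
                         '/Settings/SystemSetup/MaxChargeCurrent')
    obj.SetValue(dbus.Double(amps), dbus_interface='com.victronenergy.BusItem')

while True:
    v = float(read_value(BATTERY_SERVICE, '/System/MaxCellVoltage'))
    if v <= TAPER_START_V:
        set_charge_limit(-1.0)  # below the taper band: no limit
    else:
        # Linear ramp: full current at TAPER_START_V, tail floor at HVC,
        # so the BMS hard-cut should never actually fire.
        frac = max(0.0, (HVC_V - v) / (HVC_V - TAPER_START_V))
        set_charge_limit(max(TAIL_A, frac * MAX_CURRENT_A))
    time.sleep(10)
```

If Python on the GX feels too hacky, the same logic should map onto Node-RED with the official Victron nodes, though I haven't tried that route.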
I've seen people run large LiFePO4 banks for years without any taper logic and report fine capacity retention, but equally I'd rather not find out the hard way.
Interested whether anyone here has practical experience integrating proper charge tapering on systems where the BMS is fairly basic. What's your approach?