I'm trying to design a circuit to be used with some 12V LiFePO4 battery packs which don't have an integrated heater, so I can add thermal protection.
The circuit would sit between a 65W solar charge controller and the battery's existing BMS.
It would sense temperature, and during detected freezing conditions it would:
1. disable charging using an N-channel MOSFET
2. enable a heating pad when charge current is available
3. prevent the battery from discharging into the heating pad
4. not prevent the battery from discharging into the load
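To make sure I'm stating the requirements consistently, here is the intended behavior as a truth-table sketch (my interpretation; "charging" means charge power is available from the controller, and the hysteresis/edge-case details below are omitted):

```python
# Control logic sketch for the four requirements above.
# Inputs:  freezing  - temperature below the protection threshold
#          charging  - charge power available from the controller
# Outputs: charge_fet - charge-path MOSFET enabled
#          heater     - heating pad powered
# Note: detecting "charge power available" while the charge FET is
# open is the tricky part (the edge case discussed below).

def control(freezing: bool, charging: bool) -> dict:
    charge_fet = not freezing          # 1: block charging when freezing
    heater = freezing and charging     # 2: heat only when charge power exists
    # 3: heater is off whenever charging is False, so the battery
    #    never discharges into the pad
    # 4: the load path is never switched, so discharge to the load
    #    is unaffected
    return {"charge_fet": charge_fet, "heater": heater}

for freezing in (False, True):
    for charging in (False, True):
        print(freezing, charging, control(freezing, charging))
```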
I won't have access to the load terminals so my circuit sees the charger and load together. The load is about 2W.
I'm stuck on how to achieve #2 and #3 together without disconnecting the load. I need some way to sense that there is charge current.
I'm currently exploring a strategy that compares the battery negative voltage to the drain voltage of the charge-control MOSFET. This seems promising because, during charging, the drain voltage should sit below the battery negative terminal.
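For a sense of scale, here's a back-of-envelope estimate of that drain-to-negative voltage. The Rds(on) and nominal pack voltage below are assumptions for illustration, not my actual parts:

```python
# Back-of-envelope: sense voltage developed across the charge-control
# MOSFET's Rds(on). All component values are assumptions.

RDS_ON = 0.005          # 5 mOhm, plausible for a modern low-voltage power MOSFET
CHARGER_POWER = 65.0    # W, solar charge controller rating
V_BATT = 12.8           # V, nominal 12V LiFePO4 pack voltage

i_max = CHARGER_POWER / V_BATT   # max charge current, roughly 5 A
for frac in (1.0, 0.5, 0.1, 0.01):
    i = i_max * frac
    v_sense = i * RDS_ON
    print(f"{i:5.2f} A charge current -> {v_sense * 1000:6.2f} mV across Rds(on)")
```

At full current this is only on the order of 25 mV, and at light charge currents it drops well below the input offset of common comparators, which suggests I may need a dedicated shunt, a low-offset comparator, or a current-sense amplifier rather than relying on Rds(on) alone.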
Falstad simulation here of charging scenario.
This approach works in the simulation, but:
- My models of the battery and charger are very crude and possibly wrong. How should I model these?
- The Rds(on) of the MOSFETs in the simulation is unrealistically high; with realistic values, the potential difference I'd need to sense would be much smaller, possibly too small to detect reliably.
- I'm unsure what would happen at edge cases where there's just a bit of charge current available. In particular, some kind of hysteresis is probably needed for the current sensing, but I'm not sure how to add it.
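On the hysteresis point, my understanding is that the standard fix is a Schmitt trigger: a feedback resistor from the comparator output back to its non-inverting input, which shifts the trip point depending on the output state. A sketch of the threshold math, with assumed (not chosen) resistor and supply values:

```python
# Comparator hysteresis via positive feedback (non-inverting Schmitt
# trigger). R_FB from the comparator output to the + input makes the
# effective trip point depend on the output state.
# All values are assumptions for illustration.

V_OUT_HI = 5.0      # V, comparator output high (push-pull, 5 V supply)
V_OUT_LO = 0.0      # V, comparator output low
V_REF = 0.010       # V, 10 mV nominal sense threshold on the - input
R_SRC = 1_000       # ohms, source resistance feeding the + input
R_FB = 1_000_000    # ohms, feedback resistor

def threshold(v_out):
    # Sense voltage at which the + input (a weighted average of the
    # sense voltage and the comparator output) equals V_REF.
    return V_REF + (V_REF - v_out) * R_SRC / R_FB

v_rise = threshold(V_OUT_LO)   # sense must rise past this to trip high
v_fall = threshold(V_OUT_HI)   # and fall below this to trip low
print(f"trip high at {v_rise * 1000:.2f} mV, trip low at {v_fall * 1000:.2f} mV")
print(f"hysteresis band: {(v_rise - v_fall) * 1000:.2f} mV")
```

The band works out to roughly (V_OUT_HI - V_OUT_LO) * R_SRC / R_FB, so the two resistors set how much "just a bit of charge current" is ignored before the heater switches. Whether these particular values make sense depends on the real sense voltage, which is part of what I'm asking.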
Many BMSes advertise built-in thermal protection, so I assume what I want is attainable. Is my general approach sensible, or are there better alternatives?