Forum Discussion
jrnymn7
Jan 16, 2015
Explorer
Niner,
From what I've read so far, there are widely varying maximum charging voltages, ranging from about 14.4v to 15.3v. These are maximums, not the norm. Just like 16v for EQ'ing is not the norm when 14.8v is the recommended charge voltage.
The cell monitor will keep each individual cell from overcharging, i.e., from going over a preset voltage. If any given cell does reach the upper setpoint, the entire charging process is stopped, and the cell is actually forcibly discharged, or "drained" of some of its power. Then charging resumes. It reminds me of the on/off shunt solar controllers that work within a particular set of voltages.
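That stop-charge-then-bleed behavior can be sketched in a few lines. This is just an illustration of the logic described above, not any particular BMS; the 3.9v bleed setpoint and the 0.05v resume margin are assumptions for the example.

```python
# Illustrative per-cell protection logic, NOT a real BMS implementation.
# Setpoints below are assumed values for the sake of the example.
BLEED_SETPOINT = 3.90   # volts: any cell here halts charging and gets bled
RESUME_MARGIN = 0.05    # volts: resume once the high cell drops this far

def bms_step(cell_volts, charging):
    """One monitoring pass: return (charging_allowed, cells_to_bleed)."""
    high_cells = [i for i, v in enumerate(cell_volts) if v >= BLEED_SETPOINT]
    if high_cells:
        # One hot cell stops the whole charge, and that cell is drained down.
        return False, high_cells
    if not charging:
        # Resume only after every cell has fallen below the margin.
        if all(v < BLEED_SETPOINT - RESUME_MARGIN for v in cell_volts):
            return True, []
        return False, []
    return True, []
```

So with one cell at 3.91v the charger is cut off and that cell is bled, even though the other three are well below the setpoint, much like a shunt controller cycling on and off around its window.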
So, if the upper setpoint is 3.9v, that is 15.6v for four cells. But if the max voltage is said to be 14.6v (3.65v per cell), that leaves plenty of room for imbalance. So it appears the 4 cells could be way off and yet charging would continue. I imagine Full Charge would be when all 4 cells have reached that 3.65v mark? And I'm still not sure what turns charging off, once that 14.6v overall battery voltage is reached?
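The arithmetic behind that imbalance worry is easy to check: two very different packs can show the same 14.6v at the terminals. The skewed cell voltages below are hypothetical numbers picked to illustrate the point.

```python
# Two packs with the same 14.6 V terminal voltage.
# The skewed values are hypothetical, chosen only to show the imbalance.
cells_balanced = [3.65, 3.65, 3.65, 3.65]
cells_skewed   = [3.90, 3.60, 3.55, 3.55]  # one cell already at a 3.9 V setpoint

print(round(sum(cells_balanced), 2))  # pack voltage, balanced
print(round(sum(cells_skewed), 2))    # same pack voltage, badly imbalanced
```

Both sum to 14.6v, which is why a charger watching only overall pack voltage can't see that one cell is already at its limit; only the per-cell monitor catches it.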
During battery use/discharge, a lower per cell voltage limit is imposed. I'm guessing these upper and lower limits (setpoints) vary from BMS to BMS.
Your 350 watt MW has plenty of voltage, but perhaps lacks amperage. I have read everything from 5C down to .5C as an appropriate charge rate. I've even come across the phrase "for fast charging, use ..." But on a 200Ah Li (320Ah fla), .5C is still 100 amps; way more than most folks have.
I think the coolest thing about Lithium may be the faster charging in the 80-100% SOC range, due to higher acceptance rates. Talk about knocking charge times down! In my case, where doing a morning charge with the gennie, before turning things over to solar, is just not feasible at times, I will be able to gen charge in the evening, without wasting a lot of energy. This changes the game completely.