In short, you are right: batteries can discharge faster than they can charge. But in terms of efficiency of use, the matter is not that simple.
The question, as initially asked, does not consider some very important topics, since a battery is essentially a (rechargeable) source of electrical energy. These are some relevant points to consider:
- What is the charging current of a lead-acid battery?
- What is the maximum continuous discharge current of a lead-acid battery?
- What is the maximum intermittent discharge current?
- How long can the battery deliver a given discharge current?
- How efficiently can the energy be converted during a discharge?
- Addendum: How is a car battery charged by an alternator-based charging system?
I have already answered most of these questions in this post #1, but I will present some discussion here, using your data as a hypothetical example:
Battery = 12V.
Capacity = 50Ah.
Cold Cranking Amps: CCA = 250A.
A safe charging current is limited to 15%~20% of C during the bulk charging stage, as I illustrated in the first picture of this other post #2.
So, I.ch < 10A (20% of 50Ah).
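As a quick sanity check, here is a minimal Python sketch of that rule of thumb; the 15%~20%-of-C figure is the assumption quoted above, not a value from a specific datasheet:

```python
# Safe bulk-charging current from the 15%~20%-of-C rule of thumb (an assumption,
# not a datasheet value); capacity taken from the hypothetical 50Ah battery.
capacity_ah = 50.0
i_charge_low = 0.15 * capacity_ah    # 7.5 A
i_charge_high = 0.20 * capacity_ah   # 10 A
print(f"Safe bulk charging current: {i_charge_low:.1f} A to {i_charge_high:.1f} A")
```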
For an AGM SLA battery, a safe continuous discharge current is about 3C. The AGM data I found in that post listed a maximum intermittent (5s) current of 15C, with the CCA (discharge for 30s) at about 55% of that absolute maximum, i.e. CCA = 8C, while the maximum continuous current was 3C.
Your (hypothetical) battery, on the other hand, has CCA = 5C, so I would guess it is a deep-discharge battery with no further data available.
In this case, my guess for the maximum continuous discharge current would be (3C / 8C) x 1C ≈ 0.38C, i.e. 18~20A.
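For what it is worth, the same rough extrapolation written out as a small Python sketch; the 3C/8C ratio comes from the AGM data in that post, and applying it to 1C is only my guess for a deep-discharge battery:

```python
# Rough guess of the maximum continuous discharge current, extrapolating the
# continuous (3C) to CCA (8C) ratio from that AGM data; treat this as a guess only.
capacity_ah = 50.0
continuous_fraction = (3.0 / 8.0) * 1.0        # (3C / 8C) x 1C ~= 0.375C
i_cont_max = continuous_fraction * capacity_ah
print(f"Estimated max continuous discharge: {i_cont_max:.0f} A")   # ~19 A -> 18~20 A
```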
Obviously, you could discharge at CCA = 250A, but only for up to 30s, and the voltage would drop as low as 7.2V.
That can be useful for a cranking motor running for 10~20s, but if you use it to power an inverter, the electronics would probably disconnect sooner, at around 10V. Please check that first link (#1) for further details.
An important concept not yet mentioned is the amount of energy: Energy = Voltage x Current x Time, or
Energy (Wh) = Voltage (V) x Capacity (Ah).
The stored energy as “V x C” in your case is 12V x 50Ah = 600Wh. This value is valid for a discharge rate of 0.05C, i.e. a full discharge over 20 hours.
- Discharging an AGM at C/20 = 2.5A <-> you get 100% of C.
- Discharging an AGM at C/1 = 50A <-> you get 70% of C.
- Discharging an AGM at its maximum of 3C = 150A <-> you get 42% of C.
In your case, the battery’s maximum continuous discharge was estimated at 20A, and assuming only that worst-case 42% of C is usable, time = (42% x 50Ah) / 20A = 1.05h, i.e. just about 1 hour!
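Here is a minimal Python sketch of that runtime estimate, assuming the derated usable capacity quoted above applies:

```python
# Runtime estimate with a derated usable capacity (42% of C, as quoted above from
# the AGM data); the 20 A figure is the estimated maximum continuous current.
def runtime_hours(discharge_a, capacity_ah=50.0, usable_fraction=0.42):
    """Hours of runtime when only usable_fraction of the rated Ah is delivered."""
    return (usable_fraction * capacity_ah) / discharge_a

print(f"{runtime_hours(20.0):.2f} h")   # 1.05 h -> just about one hour
```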
When does the discharge rate become less efficient?
From the graph in Figure 2 of post #2, it seems that the discharge rate should be limited to C/5, or the battery will lose actual capacity (“%C”) more drastically.
Again, this extrapolates that manufacturer’s data to other batteries; until we have more exact data, it is the best guess we have.
A curious observation from the available data:
Both continuous current limits - the maximum charge current and the maximum discharge current (for efficient use) - are the same: 10A = C/5.
Manufacturers declare intermittent discharge current limits, such as that 15C maximum or the CCA, but they do not provide information (at the consumer level) about intermittent charging current limits (corresponding to the mentioned 30s or 5s limits).
I could speculate that some manufacturers design their batteries to accept intermittent charging excursions higher than C/5, such as the ones caused by alternator charging under variable engine speed.
Another point is:
How much energy is used, and how deeply is the battery discharged, when cranking an engine for as long as 30s?
Let’s assume the efficiency at CCA rates is even lower than 42%, say roughly half of it, at 20%:
Energy = (V x A x Time) / Efficiency
E = 12V x 250A x (30/3600)h / 20% = 125Wh.
This 125Wh is about 20% of the rated energy of 600Wh.
So, after this intense cranking effort, the battery is discharged to about 80% State of Charge, and the battery voltage will not drop too much.
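The same cranking-energy estimate as a small Python sketch, where the 20% efficiency is the deliberately pessimistic assumption made above:

```python
# Energy drawn from the battery during a worst-case 30 s crank at CCA,
# with an assumed (pessimistic) 20% conversion efficiency.
v_nominal = 12.0       # V
i_cca = 250.0          # A
t_crank_s = 30.0       # s
efficiency = 0.20      # assumed

energy_drawn_wh = (v_nominal * i_cca * t_crank_s / 3600.0) / efficiency  # 125 Wh
rated_energy_wh = v_nominal * 50.0                                       # 600 Wh
soc_after = 1.0 - energy_drawn_wh / rated_energy_wh
print(f"{energy_drawn_wh:.0f} Wh drawn, SoC drops to about {soc_after:.0%}")  # ~80%
```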
Recharging by the car’s alternator:
Since the battery is presumably in a car, the alternator will recharge it promptly, topping it off with 10A~20A (or more) over several minutes (or hours on a trip), with variable charging capability mostly due to the engine’s variable speed/RPM, until the voltage reaches the nominal value of about 14.4V.
As a general estimate, for every hour with I.ch = 10A, the alternator delivers “12V” x 10A x 1h = 120Wh nominal, roughly replacing in 1h @ 10A the energy used in that long 30s engine cranking. The complete charging time may be shorter if the maximum initial charging current is higher and if the actual cranking time is as short as the usual 5~10s.
The actual charging voltage is indeed higher (13.x up to 14.4V), but here I assumed that extra voltage is used to compensate for the electrochemical inefficiencies; that is why I used the same nominal 12V.
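And the corresponding recharge-time estimate as a sketch, with the assumptions stated above (nominal 12V absorbing the losses, and an assumed average 10A from the alternator):

```python
# Time to put back the cranking energy at an assumed average 10 A, using the
# nominal 12 V (the extra 13.x~14.4 V is assumed to cover electrochemical losses).
energy_to_replace_wh = 125.0   # from the 30 s cranking example above
v_nominal = 12.0               # V
i_charge = 10.0                # A, assumed average alternator current into the battery

recharge_time_h = energy_to_replace_wh / (v_nominal * i_charge)
print(f"About {recharge_time_h:.1f} h at {i_charge:.0f} A")   # ~1 h
```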
The exact charging voltage is compensated according to the ambient temperature - this alternator regulator IC has a graph illustrating the charging voltage versus operating temperature:

Addendum:
How is a car battery charged by an alternator-based charging system?
I found an interesting article discussing several details of the charging process in more realistic situations, such as variable rotation speed (RPM). Here is a collage of some figures from this article, where the colored curves show how the charging-system response varies significantly across different engine speeds: 750 rpm, 1500 rpm, and 3000 rpm.
