70mA times 5 seconds is 350mC (milli-Coulombs). Let's assume your LED strips can drop from the normal 13.2V down to 11.2V (a 2V drop) and still be acceptably bright for you. Then, since C = Q/ΔV, the capacitance needed is 350mC divided by 2V = 175mF (milli-Farads).
A Google search tells me that supercapacitors are available in this range, but not with a 12V rating. You can install several in series to increase the voltage rating, noting that this decreases the capacitance proportionally. E.g. 3x 1F 5.5V supercapacitors in series will give you 1/3F (333mF) with a maximum voltage of 16.5V.
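If you want to play with the numbers yourself, here's a quick sketch of the arithmetic above (plain Python; the values are the ones assumed in this answer):

```python
# Back-of-the-envelope hold-up sizing, using the figures from this answer:
# 70 mA of LED current held up for 5 s with a 2 V allowable droop.

i_load = 0.070      # LED strip current (A)
t_hold = 5.0        # hold-up time (s)
dv     = 2.0        # allowable droop (V): 13.2 V down to 11.2 V

charge   = i_load * t_hold   # Q = I * t  -> 0.35 C (350 mC)
c_needed = charge / dv       # C = Q / dV -> 0.175 F (175 mF)
print(f"charge drawn: {charge*1000:.0f} mC, capacitance needed: {c_needed*1000:.0f} mF")

# Series stack of supercapacitors: capacitance divides, voltage rating adds.
n, c_each, v_each = 3, 1.0, 5.5   # e.g. three 1 F / 5.5 V supercaps
print(f"stack: {c_each/n*1000:.0f} mF rated for {n*v_each:.1f} V")
```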
Note it will have to be a supercapacitor or ultracapacitor; boring old normal capacitors don't store this much energy (practical electrolytics top out around the tens-of-millifarads mark, and get bulky and expensive well before that).
The amount of capacitance needed is of course lower if you can tolerate the voltage dropping further (dimmer LEDs), since C = Q/ΔV.
There's no reason to add a resistor between the capacitor and the LEDs. The LED strip has its own resistors where needed.
There is a reason to add a resistor between the capacitor and the car battery (on either side of the diode; it doesn't matter which): to limit the "in-rush current" when the car is turned on and the battery starts charging the empty capacitor very quickly. Probably only a few ohms are needed.
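Here's a rough feel for what a few ohms buys you. The 3.3 ohm value below is just an assumption for illustration, combined with the 175mF and 13.2V figures from above:

```python
# Rough inrush check for the charging resistor, assuming a 3.3 ohm resistor
# (pick whatever suits your parts bin), a 13.2 V supply, and the 175 mF
# capacitor sized above.

v_supply = 13.2   # battery/alternator voltage (V)
r        = 3.3    # series charging resistor (ohms), assumed value
c        = 0.175  # capacitance (F)
i_led    = 0.070  # steady-state LED current (A)

i_peak = v_supply / r       # worst case, capacitor fully discharged: ~4 A
tau    = r * c              # RC time constant, ~0.58 s; ~5*tau to full charge
v_drop = i_led * r          # steady-state drop across the resistor, ~0.23 V

print(f"peak inrush: {i_peak:.1f} A, full charge in ~{5*tau:.1f} s, running drop {v_drop:.2f} V")
```

Note that the steady LED current flows through this resistor too, but at 70mA the fraction-of-a-volt drop won't matter.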
Is it worth it? Up to you I suppose. Tony Stewart EE75 doesn't think so, but it's your car.
This answer does not address the "load dump" concern. I am not at all familiar with this subject area, but I have heard there are requirements for automotive systems to withstand "load dump" voltage spikes of up to 100V for 0.6 seconds. Conceivably you could size the resistor and capacitor to absorb the excess energy, but my quick estimate didn't give good numbers, so you might need a protection circuit to block it instead.
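For what it's worth, here's the kind of quick estimate I mean, reusing the assumed values from above; it suggests why the numbers come out badly:

```python
# A very rough look at a 100 V / 0.6 s load dump hitting the R-C above.
# Same assumed values: R = 3.3 ohm, C = 175 mF, stack rated 16.5 V,
# capacitor sitting at 13.2 V when the spike arrives.

import math

r, c = 3.3, 0.175
v_start, v_dump, t_dump, v_rating = 13.2, 100.0, 0.6, 16.5

tau = r * c  # ~0.58 s, comparable to the dump duration, so the cap charges a lot
v_cap_end = v_dump + (v_start - v_dump) * math.exp(-t_dump / tau)
print(f"capacitor reaches ~{v_cap_end:.0f} V (rating {v_rating} V)")  # ~69 V: destroys the stack

# Even ignoring the capacitor rating, the resistor's share of the energy
# (equal to 0.5*C*(dV)^2 if the cap charged fully) is huge for sub-second dissipation.
e_resistor = 0.5 * c * (v_dump - v_start) ** 2
print(f"resistor would have to burn off ~{e_resistor:.0f} J")
```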
Hey, I believe they sell capacitors marketed for car audio amplifiers, for basically this exact circuit. They go in parallel with the battery so the amplifier can draw more power than the battery can supply for short periods. The diode and resistor probably aren't included.