There is no free lunch. The less average current you feed a given number of display segments, the dimmer they will look, regardless (to a close approximation) of whether they're operated at 10% duty cycle or 100% duty cycle, provided only that the PWM frequency is above the human eye's flicker-fusion rate. Persistence of vision does not buy you anything, because the eye's physiology effectively time-averages the light intensity. Modern LEDs are more-or-less linear: give one more current and you get more-or-less the same increment of light (somewhat less so at higher currents, so pulsing hard actually loses a little).
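To make that concrete, here is a minimal back-of-envelope sketch (plain Python; the 2 mA / 20 mA figures are illustrative assumptions, not from any datasheet) comparing continuous drive against multiplexed drive at the same average current:

    # Back-of-envelope comparison of DC drive vs. multiplexed (PWM) drive.
    # Assumes an ideally linear LED (light output proportional to current)
    # and that the eye time-averages anything above flicker-fusion rate.

    def average_current(peak_ma: float, duty: float) -> float:
        """Average current drawn by one segment at a given duty cycle."""
        return peak_ma * duty

    def perceived_brightness(peak_ma: float, duty: float) -> float:
        """Time-averaged light output (arbitrary units), linear LED."""
        return peak_ma * duty  # eye averages: brightness ~ average current

    # 100% duty at 2 mA vs. 10% duty at 20 mA peak (e.g. 1-of-10 multiplex):
    print(average_current(2.0, 1.0), perceived_brightness(2.0, 1.0))    # 2.0 2.0
    print(average_current(20.0, 0.1), perceived_brightness(20.0, 0.1))  # 2.0 2.0

Both cases burn the same 2 mA average and look (to first order) equally bright; a real LED's efficiency droop at high peak current makes the multiplexed case slightly worse, not better.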
Historically, old, badly made LEDs required a threshold current to get going, and some improvement from pulsing was reported back then, but that hasn't been true for decades. Microamperes can produce faintly visible light from a modern LED.
You have three choices if you want segments that appear steadily illuminated on much less current:
1. Use plenty of average current, supplied from a cheap SMPS adapter (i.e. fugetaboutit).

2. Live with very dim displays.

3. Buy more expensive displays that offer more brightness at a given current (using better LED dice inside). They will probably have a bit more forward voltage, and the colors may be a bit different (more orange-red than red, for example). Not all mcd claims on datasheets are reliable, in my experience, though the well-known manufacturers are mostly believable.
That's about it. Incremental improvements can be had in power (not current) consumption by reducing the supply voltage a bit (at the expense of other things, such as uniformity). Driver losses can be trimmed a bit by using more expensive drivers (e.g. MOSFETs that drop millivolts) instead of the relatively wasteful (but dirt-cheap) Darlington drivers I suggested.
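For a sense of scale, here is a rough sketch of the driver loss per segment (the voltage-drop, on-resistance, and current figures are illustrative assumptions, not from any particular part's datasheet):

    # Rough driver-loss comparison (illustrative numbers, not from a datasheet).
    # A saturated Darlington drops roughly 0.7-1 V; a logic-level MOSFET
    # drops only I * Rds(on), often just millivolts.

    segment_current_a = 0.020    # 20 mA through one driven segment (assumed)

    darlington_vce_sat = 0.9     # ~0.9 V saturation drop (assumed)
    mosfet_rds_on = 0.1          # 100 milliohm on-resistance (assumed)

    p_darlington = segment_current_a * darlington_vce_sat    # P = I * V
    p_mosfet = segment_current_a**2 * mosfet_rds_on          # P = I^2 * R

    print(f"Darlington loss: {p_darlington*1000:.1f} mW per segment")  # 18.0 mW
    print(f"MOSFET loss:     {p_mosfet*1000:.3f} mW per segment")      # 0.040 mW

The MOSFET also lets you shave the supply voltage by nearly the full saturation drop, which is where the real saving in the whole string comes from.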
I must applaud you for your concern about power consumption; most companies care only about such mundane things as commercial success and short-term profits. As such, they'd rather put another $1 into power-supply capacity than put $20 into higher-performance circuitry. Hard to believe, isn't it?