TL;DR - it might be fine, it might not, so you should contact the manufacturer and ask them.
From what I can see in various schematics of grid-tie sync circuits, the inverter creates one phase synced to one line of the grid, and it creates another phase that's the inverse of the first one.
This doesn't sound 100% right in terms of how grid tie inverters work.
Typically the way a grid-tie inverter works is that you have an IGBT push-pull bank per phase, with the gates being individually driven by the controller. The controller monitors the grid voltage, frequency, and phase, and aligns each inverter phase's output phase perfectly to the grid by separately controlling the IGBT push-pull drivers.
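As a rough structural sketch of that per-phase arrangement (the names and values are mine, not from any particular inverter), each leg gets its own reference generator that follows whatever angle the controller measured on its grid line:

```python
import math

class LegController:
    """One of these per output leg; each follows its own measured
    grid angle independently (illustrative sketch, not a real design)."""

    def __init__(self, v_rms: float = 120.0):
        self.v_peak = v_rms * math.sqrt(2.0)

    def reference(self, measured_grid_angle: float) -> float:
        # The instantaneous voltage the IGBT gate-drive / PWM stage
        # then tries to reproduce on this leg.
        return self.v_peak * math.sin(measured_grid_angle)

# Two legs, driven separately, each locked to its own grid line:
leg1, leg2 = LegController(), LegController()
```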
However, when the mains goes down, you don't have any phases to reference your generated waveform to, so you have to generate your own. In this case, generating a single reference clock and then phase-shifting it by 180° to drive the second leg of the split-phase output makes sense.
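In island mode that might look something like this (a minimal sketch with illustrative values):

```python
import math

F_NOMINAL = 60.0  # Hz - free-running internal clock, no grid to follow

def island_references(t: float, v_peak: float = 170.0):
    """Both split-phase legs derived from one internal reference;
    leg 2 is just leg 1 shifted by 180 degrees (i.e. inverted)."""
    theta = 2.0 * math.pi * F_NOMINAL * t
    return v_peak * math.sin(theta), v_peak * math.sin(theta + math.pi)
```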
When the grid comes back online, the inverter must then nudge its output frequency up or down in order to catch up with, or fall back to, the phase of the grid. Once it locks phase, it re-matches its frequency to the grid so that the output continues to track the grid in phase. It can then finally adjust the voltages. The exact mechanism for doing all of this is implementation-specific.
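One common-sense way to do the catch-up, purely as an illustration (the gain and limits are assumptions, not the manufacturer's algorithm):

```python
import math

def resync_step(theta_inv: float, theta_grid: float, dt: float,
                f_nom: float = 60.0, max_dev: float = 0.5) -> float:
    """One control tick: steer the inverter frequency up or down
    (bounded to +/- max_dev Hz around nominal) until the phase error
    closes; once the error is ~0 the output settles back at f_nom."""
    # Wrap the phase error into +/- pi so we chase the shorter way round:
    err = math.atan2(math.sin(theta_grid - theta_inv),
                     math.cos(theta_grid - theta_inv))
    f = f_nom + max(-max_dev, min(max_dev, 0.2 * err))
    return theta_inv + 2.0 * math.pi * f * dt
```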
Is this bad?
Whether or not it will work depends on how your specific inverter is designed internally. The manual does not make a clear statement one way or another.
Will the neutral balance out the voltage?
No, absolutely not. The neutral is just a return conductor - it carries the imbalance current between the two legs, but it has no mechanism for correcting the voltage or phase relationship between them.
Will this lower the voltage or waste energy? Or is this acceptable?
Your inverter will either work just fine, refuse to tie onto the grid, behave temperamentally and trip phase-error or undervoltage alarms, or trip an internal breaker. I cannot foresee a situation in which a UL-approved and IEEE 1547-compliant grid-tie inverter would catastrophically fail due to phase imbalance, because such a product is guaranteed to contain safety features that disconnect and shut down the inverter in the event of improper connection.
A grid-tie inverter works by carefully matching the frequency, voltage, and phase of the mains waveform before it attaches to the grid, so that the inverter's generated voltage waveform perfectly matches up with the grid waveform. It must do this because a current flows whenever there is a voltage delta between connected nodes, and that current is inversely proportional to the resistance between them, as Ohm's law (I = ΔV/R) tells us. Since the connection to the grid is very low resistance, any small voltage imbalance will create a massive current surge and overload the inverter, possibly leading to catastrophic failure.
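A quick back-of-the-envelope with illustrative numbers shows why:

```python
delta_v = 3.0   # volts of mismatch between inverter and grid
r_tie = 0.05    # ohms - illustrative; a grid tie is very low impedance
print(delta_v / r_tie)  # 60 A from just 3 V of error - Ohm's law at work
```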
I am sceptical that your inverter always utilises an artificially shifted reference clock to generate the second phase, even when connected to mains.
Such a system would work by phase-locking to the first phase, matching that phase's frequency, then generating the second phase 180° shifted from the first. There's no need to account for phase sequencing, since swapping L1 and L2 makes no difference. However, there's a pretty major design flaw here: the phase angle tolerance of the grid is not zero.
The grid does not have an absolutely perfect guarantee of the angles between phases. I couldn't find the official spec for North America, but the UK specifies ±1% phase angle for three phase. On a North American system this phase angle error would result in as much as a ±3.26V 60Hz AC voltage delta between an artificially generated 180°-shifted waveform and the real second leg of the grid. That varying voltage delta is a catastrophic problem if it isn't accounted for - the tracking accuracy needs to be more like 100mV or less, and 3V is far too wide a margin.
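For the geometry behind numbers like that: the difference between two equal-amplitude sinusoids offset by a small angle δ has magnitude 2·V·sin(δ/2), so even a degree or two of error lands you in the volts range. A quick sketch (the exact figure depends on the actual tolerance):

```python
import math

def leg_delta_rms(v_rms: float, phase_error_deg: float) -> float:
    """RMS difference between two equal-amplitude sinusoids that are
    phase_error_deg apart: |V at 0 - V at delta| = 2 * V * sin(delta/2)."""
    return 2.0 * v_rms * math.sin(math.radians(phase_error_deg) / 2.0)

print(leg_delta_rms(120.0, 1.0))  # ~2.1 V
print(leg_delta_rms(120.0, 1.6))  # ~3.4 V - same ballpark as the figure above
```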
It is possible that this could be corrected for by allowing a certain margin of current imbalance and then adjusting the inverter voltages in real time using a control loop fed by the per-phase current sensors. This would cause the inverter to constantly re-adjust its output voltage to match the offset caused by the phase mismatch, essentially generating a 3V "wobble" on top of the artificially generated 180°-shifted output so that the voltage closely matches the real signal.
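If such a correction loop existed, its crudest form might look like this (entirely hypothetical - the gain, names, and structure are mine):

```python
class WobbleCompensator:
    """Hypothetical sketch: integrate the per-leg current imbalance
    into a voltage correction riding on top of the 180-degree leg."""

    def __init__(self, ki: float = 0.01):
        self.ki = ki
        self.v_correction = 0.0  # volts added to the generated leg

    def update(self, i_leg1: float, i_leg2: float) -> float:
        # The correction chases the offset caused by the phase mismatch;
        # keeping a loop like this stable is exactly the hard part.
        self.v_correction += self.ki * (i_leg1 - i_leg2)
        return self.v_correction
```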
However, I find it hard to believe that this would work in practice. For one, you'd need to match the output voltages to the grid before you connected to it, which means you can't sense the current imbalance ahead of time as part of the control loop. It'd also be very difficult to keep the control loop stable. On top of all that, it doesn't make any sense from a control standpoint.
The more common design approach is to have a separate phase-locked loop (PLL) for each mains phase, so that it can perfectly match the phase of each line independently to account for slight phase errors. The IGBT inverter banks are separately driven anyway - they need to be so they can handle per-phase voltage imbalance - so it makes absolutely no sense to try to drive one phase by referencing the other. Every 3-phase grid tie inverter I've ever looked at uses separate PLLs for tracking the phases. From a board manufacturing perspective it even makes sense to use this approach, because you can start with an existing 3-phase design and remove one of the phases.
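A toy single-phase PLL, one instance per leg, just to show the shape of the idea (the gains and the sloppy filtering are illustrative, not from any real product):

```python
import math

class SimplePLL:
    """Minimal single-phase PLL: v * cos(theta) acts as a phase
    detector (its DC term is ~0.5*sin(theta_grid - theta)); a PI
    loop steers the local oscillator frequency to null it out.
    Assumes the input sample is normalised to +/- 1."""

    def __init__(self, f_nom: float = 60.0, kp: float = 10.0, ki: float = 50.0):
        self.f_nom, self.kp, self.ki = f_nom, kp, ki
        self.theta = 0.0
        self.integral = 0.0

    def step(self, v_sample: float, dt: float) -> float:
        err = v_sample * math.cos(self.theta)
        self.integral += err * dt
        freq = self.f_nom + self.kp * err + self.ki * self.integral
        self.theta = (self.theta + 2.0 * math.pi * freq * dt) % (2.0 * math.pi)
        return self.theta

# One PLL per mains leg - neither references the other:
pll_l1, pll_l2 = SimplePLL(), SimplePLL()
```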
The main potential problem here is the transition point when the grid feed is lost or restored.
In the case of your mains feed dropping, the inverter was already tracking the phase of the mains feed, so you won't get a sudden phase discontinuity when the inverter takes over - the output is following the PLL and control loop, which is already matched in voltage, phase, and frequency because it was tied to the grid. If the waveform generation suddenly switched to an artificially generated source with no regard for phase or frequency, you'd get a sudden discontinuity. What tends to happen internally is that the inverter "drifts" the output from whatever parameters were being tracked on the mains, back to its own internally generated reference. It is entirely possible that the 120° angle you started off with, from the mains, will slowly drift to a 180° nominal angle over the period of a few seconds, without any trouble. It depends on the implementation.
The inverse happens when the mains returns - the control loop slowly shifts the two phases in frequency until they catch up with, or slow down to, the target. The main problem here is that the inverter might not be able to do so without violating output specs and triggering under/over alarms on voltage and frequency. Again, it's possible that the inverter might just quietly drift both phases back to the 120° separation without problems. If your equipment was already happily working on the unusual mains feed, it shouldn't care about the inverter shifting things about a bit more, with some potential exceptions for synchronous motors.
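A slew-limited handover between the two nominal angles might look like this (the rate and names are illustrative assumptions):

```python
import math

MAX_STEP = math.radians(0.2)  # per 60 Hz control tick -> ~12 deg/s, so a
                              # few seconds to walk between 120 and 180 deg

def drift_split_angle(current: float, target: float) -> float:
    """Move the leg-to-leg angle a little per tick, so the handover
    between the grid's 120 deg and the island-mode 180 deg (in either
    direction) is a slow drift rather than a step discontinuity."""
    step = max(-MAX_STEP, min(MAX_STEP, target - current))
    return current + step
```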
Most of this isn't actually a question of the hardware design. Two split-phase inverters that worked with 120° and 180° phase angles could use identical hardware, and just have a different nominal phase shift programmed into the controller firmware.
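That difference could plausibly come down to a single constant (hypothetical, but it illustrates the point):

```python
# Identical hardware and control code - only the target angle differs:
SPLIT_ANGLE_DEG = 180.0    # standard North American split phase
# SPLIT_ANGLE_DEG = 120.0  # two legs taken from a three-phase network
```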
Given the documentation and spec sheets, I don't think there's enough information to say whether or not it will actually work. All I can say is that it isn't impossible, even though it's unusual. You should contact the manufacturer and ask.