As with any radio receiver, handling a higher data rate usually means using a wider RF bandwidth, and this inevitably means more received background noise: a wider bandwidth lets in more noise, so you need a higher received signal level to maintain a decent SNR (signal-to-noise ratio).
WiFi is therefore at a significant disadvantage: it normally uses a wider RF bandwidth than LTE, so it needs a higher signal level to operate at a decent bit error rate (BER). This is captured by the following empirical but commonly quoted relationship:
Power (dBm) needed by a receiver = -154 dBm + \$10\log_{10}\$(data rate in bit/s)
For example, if the WiFi data rate is ten times your LTE data rate, you need 10 dB more signal power to operate at the same SNR. Put another way, every time you double the RF bandwidth you "collect" 3 dB more noise. This means that WiFi, running at its higher data rates, is usually the first to suffer as signal levels drop.
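As a rough illustration, here is a minimal Python sketch of that rule of thumb; the 100 Mb/s and 10 Mb/s rates below are just example figures, not measurements from any particular system:

```python
import math

def required_power_dbm(data_rate_bps: float) -> float:
    """Empirical minimum receive power: -154 dBm + 10*log10(bit rate in bit/s)."""
    return -154 + 10 * math.log10(data_rate_bps)

# Example figures only: a 100 Mb/s WiFi link versus a 10 Mb/s LTE link
for label, rate in [("WiFi 100 Mb/s", 100e6), ("LTE 10 Mb/s", 10e6)]:
    print(f"{label}: needs about {required_power_dbm(rate):.0f} dBm at the receiver")

# WiFi 100 Mb/s: needs about -74 dBm at the receiver
# LTE  10 Mb/s: needs about -84 dBm at the receiver
```

The 10 dB gap between the two printed values is exactly the "ten times the data rate costs 10 dB" point made above.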
Why does WiFi have a shorter range than LTE?
This is related to the Friis transmission equation but, more simply, you can think about the same effect with light bulbs. Consider a 1000 watt lamp and the distance from which you could see it at night: you would probably see it fairly clearly from 10 km away and, if you walked a further 100 metres, it wouldn't look significantly dimmer.
Compare that with a small 1 watt lamp: you might see it glowing at 100 metres but, if you walked a further 100 metres away, it would be noticeably dimmer.
There are a bunch of other factors too, such as operating frequency. WiFi can operate at a higher carrier frequency, and the Friis transmission equation tells you that path loss increases as frequency rises:
Path loss (dB) = 32.45 + \$20\log_{10}\$(F in MHz) + \$20\log_{10}\$(D in kilometres)
In other words, at ten times the frequency the path loss increases by 20 dB.
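Here is a short Python sketch of that path-loss formula; the 2400 MHz WiFi and 800 MHz LTE carriers, and the 1 km distance, are just example assumptions to show the size of the effect:

```python
import math

def free_space_path_loss_db(freq_mhz: float, dist_km: float) -> float:
    """Free-space path loss: 32.45 + 20*log10(F in MHz) + 20*log10(D in km)."""
    return 32.45 + 20 * math.log10(freq_mhz) + 20 * math.log10(dist_km)

# Example assumptions: 2.4 GHz WiFi versus 800 MHz LTE, both at 1 km
wifi = free_space_path_loss_db(2400, 1.0)
lte = free_space_path_loss_db(800, 1.0)
print(f"WiFi 2400 MHz @ 1 km: {wifi:.1f} dB")       # about 100.1 dB
print(f"LTE   800 MHz @ 1 km: {lte:.1f} dB")        # about 90.5 dB
print(f"Extra loss for WiFi:  {wifi - lte:.1f} dB")  # 20*log10(3) ≈ 9.5 dB
```

So, with these example carriers, WiFi starts roughly 9.5 dB behind LTE on path loss alone, before the wider-bandwidth noise penalty above is even counted.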