I am developing a client-server application where the client will run on Windows and the server will probably run on Linux. I may later port the client to Mac and Linux, but not yet.
All common home computers these days are little-endian. I googled for a while, but I couldn't really find a list of devices that run big-endian. As far as I know, some Motorola chips still use big-endian, and maybe some phones (I do not plan on porting the app to smartphones, so this doesn't matter to me). So why would I rearrange the bytes of every integer, short, float, double, and so on, on every read and write, when I already know that both server and client run on little-endian?
That seems like unnecessary work. So my question is: can I safely ignore endianness and just send little-endian data? What are the disadvantages?