Let's say I'd like to save battery power when sending data to remote devices connected wirelessly, e.g., in sensor networks. One idea is to compress the data to reduce the number of bytes to send, but the compression process itself consumes power. This compression idea is only valid when the battery power saved on wireless communication exceeds the power spent on compression.
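To make the break-even condition concrete, here is a minimal back-of-the-envelope sketch in C. The energy constants are placeholder assumptions, not measured values; the real numbers depend on the particular radio and microcontroller:

```c
/* Break-even check: compression pays off only if the radio energy saved
 * on the removed bytes exceeds the CPU energy spent compressing.
 * The constants below are assumptions for illustration, not measurements. */
#include <stdbool.h>
#include <stddef.h>

#define E_TX_PER_BYTE_NJ  2000.0  /* assumed radio energy per byte sent, nJ */
#define E_CPU_PER_CYCLE_NJ   1.0  /* assumed CPU energy per clock cycle, nJ */

/* Assumes compressed_bytes <= raw_bytes. */
bool compression_pays_off(size_t raw_bytes, size_t compressed_bytes,
                          double compression_cycles)
{
    double radio_saved = E_TX_PER_BYTE_NJ * (double)(raw_bytes - compressed_bytes);
    double cpu_spent   = E_CPU_PER_CYCLE_NJ * compression_cycles;
    return radio_saved > cpu_spent;
}
```

Whether that inequality actually holds on real hardware is exactly what I am trying to estimate.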
My guess is that the energy cost of computation keeps dropping, following Moore's law, while communication still requires a good amount of power.
- Do you know of research papers, technical reports, books, or application notes that discuss this tradeoff between communication power and computation power? Or any real-world usage?
- Compared to the energy consumed to execute one instruction, how much larger is the energy needed to send one symbol or one byte of data? Hundreds or thousands of times? Or just tens?
- What practical ways are available for measuring the power consumption of communication and computation on small devices (microcontrollers)? (One firmware-side instrumentation idea is sketched below.)
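On the firmware side, the kind of hook I can imagine is bracketing each phase with a spare GPIO, so that an external current probe (e.g., a shunt resistor in the supply line watched on an oscilloscope) can attribute energy to computation versus transmission. The `gpio_set()`/`gpio_clear()` calls, pin numbers, and `my_algorithm()`/`radio_send()` functions below are hypothetical placeholders for whatever the target platform provides:

```c
/* Sketch: mark each phase on a separate GPIO so a scope probing a shunt
 * resistor can attribute measured current to computation vs. transmission.
 * All names below are hypothetical; substitute the real platform APIs. */
#include <stddef.h>
#include <stdint.h>

#define COMPUTE_PIN 5                     /* any spare output pin */
#define TX_PIN      6                     /* any spare output pin */

extern void gpio_set(int pin);            /* assumed platform API */
extern void gpio_clear(int pin);          /* assumed platform API */
extern size_t my_algorithm(const uint8_t *in, size_t len, uint8_t *out);
extern void radio_send(const uint8_t *buf, size_t len);

void send_with_markers(const uint8_t *in, size_t len, uint8_t *scratch)
{
    gpio_set(COMPUTE_PIN);                /* computation phase begins */
    size_t out_len = my_algorithm(in, len, scratch);
    gpio_clear(COMPUTE_PIN);              /* computation phase ends */

    gpio_set(TX_PIN);                     /* transmission phase begins */
    radio_send(scratch, out_len);
    gpio_clear(TX_PIN);                   /* transmission phase ends */
}
```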
Added
I used compression as an example, but I'm actually working on a more sophisticated algorithm that reduces a certain kind of information dramatically. I would like to know whether the effort is worth it; in other words, I need to know how expensive it is to send a symbol (or whatever unit the physical layer sends) compared to the cost of running the sophisticated algorithm. I don't care about latency, at least for now. I also don't care about the physical layer: I have an abstraction layer with a buffer that I simply use to send and receive information.
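For context, my send path looks roughly like the following sketch, where `radio_send()` is a hypothetical stand-in for whatever my buffer layer ultimately calls; counting the bytes handed to it would let me multiply by a measured energy-per-byte figure later:

```c
/* Count bytes pushed through the abstraction layer so total radio cost
 * can be estimated as g_bytes_sent * (measured energy per byte).
 * radio_send() is a hypothetical stand-in for the real buffer API. */
#include <stddef.h>
#include <stdint.h>

extern void radio_send(const uint8_t *buf, size_t len);  /* assumed API */

static uint64_t g_bytes_sent;

void counted_send(const uint8_t *buf, size_t len)
{
    g_bytes_sent += len;
    radio_send(buf, len);
}
```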