16-bit microcontrollers are more powerful than 8-bit ones, and 32-bit ones are on a whole different level. That said, while we can decide what peripherals an application needs and then look for the cheapest (and easiest to use, i.e. with a decent toolchain) part, how does one know whether an application calls for an 8-bit, a 16-bit, or a 32-bit microcontroller?
I do understand the differences between them, so let me put the question a different way.
Let's say I have an application with an LCD used to show a bitmap image, a 16-key keypad, a buzzer, a temperature and humidity sensor, and a mouse input. A high-end 8-bit microcontroller can easily handle all of this in real time. But if I switch to a colour LCD, I may need a 32-bit microcontroller that has more memory and can update the colour display quickly enough. The trouble is that I might only find out my high-end 8-bit microcontroller is too weak after I have already tried it in the project.
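For what it's worth, the colour-LCD case can at least be estimated on paper before buying anything. Here is a rough back-of-envelope sketch (all numbers are my own assumptions, not from any datasheet): a QVGA panel in RGB565 at a modest refresh rate already demands a sustained data rate that a typical 8-bit part cannot reach.

```python
# Back-of-envelope display bandwidth estimate.
# Assumed values (hypothetical, adjust for the real panel):
WIDTH, HEIGHT = 320, 240     # assumed QVGA colour panel
BYTES_PER_PIXEL = 2          # assumed RGB565 pixel format
FPS = 20                     # assumed minimum acceptable full-screen refresh

bytes_per_frame = WIDTH * HEIGHT * BYTES_PER_PIXEL
bytes_per_second = bytes_per_frame * FPS

print(f"{bytes_per_frame} bytes per frame")            # one full-screen update
print(f"{bytes_per_second / 1e6:.2f} MB/s sustained")  # just to refresh the screen

# An 8-bit MCU clocked at, say, 16 MHz that needs several instruction
# cycles per byte transferred tops out well below this rate, so the
# arithmetic alone can rule it out before any prototyping is done.
```

This kind of worst-case throughput and memory budgeting (frame buffer size vs. on-chip RAM, bytes/s vs. bus speed) is one way to answer the sizing question without building the project first.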
So, before we start working on a project, how do we know what word size and how powerful a microcontroller it needs?