Floating point units are standard on CPUs today, and even ordinary desktop applications make use of them (e.g., for 3D effects). However, I wonder which applications initially drove the development and mass adoption of floating point units.
Ten years ago, I think most uses of floating point arithmetic fell into one of two categories:
- Engineering and Science applications
- 3D graphics in computer games
For any other application where non-integer numbers appeared at the time, I think fixed point arithmetic was sufficient (2D graphics) or even preferable (finance), so plain integer hardware would have been enough.
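
To illustrate what I mean by fixed point being sufficient, here is a minimal sketch in C. Non-integer values are simply stored as integers scaled by a constant factor, so only integer hardware is needed. The 16.16 format and the helper names are arbitrary choices for this example, not taken from any particular system.

```c
#include <stdio.h>
#include <stdint.h>

typedef int32_t fixed_t;          /* 16.16 fixed point: 16 integer bits, 16 fractional bits */
#define FIXED_SHIFT 16
#define FIXED_ONE   (1 << FIXED_SHIFT)

static fixed_t fixed_from_double(double x) { return (fixed_t)(x * FIXED_ONE); }
static double  fixed_to_double(fixed_t x)  { return (double)x / FIXED_ONE; }

static fixed_t fixed_mul(fixed_t a, fixed_t b)
{
    /* Widen to 64 bits so the intermediate product does not overflow,
     * then shift back down to the 16.16 scale. */
    return (fixed_t)(((int64_t)a * b) >> FIXED_SHIFT);
}

int main(void)
{
    /* e.g., scaling a 2D coordinate by 1.5 using only integer operations */
    fixed_t x     = fixed_from_double(10.25);
    fixed_t scale = fixed_from_double(1.5);
    fixed_t y     = fixed_mul(x, scale);

    printf("10.25 * 1.5 = %f\n", fixed_to_double(y));   /* prints 15.375000 */
    return 0;
}
```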
I think these two applications were the major motivations for establishing floating point arithmetic as a standard hardware feature. Can you name others, or is there a compelling reason to disagree?