For a school project I'm trying to implement an equation, for example:
B = ((A + 2) * |A - 10|) / (c * c)
All values are unsigned binary, and absolute values are always taken. The equation must be evaluated 57,600 times per second for a 240x240-pixel image (240 × 240 = 57,600, i.e., once per pixel at one frame per second).
I don't know where to start. Would it be better to implement a small MIPS processor and load the program into it as a list of assembly instructions?
Or should I take a direct approach in HDL? If so, what methodology should I follow: should I build an FSM? Should I use a clock?
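To make the question concrete, here is a rough, untested sketch of what I imagine a clocked, multi-cycle version might look like (the module name, widths, and handshake signals are all made up), reusing a single multiplier across states so the synthesizer doesn't have to build two:

```
// Hypothetical multi-cycle version: one shared multiplier, sequenced by a
// small FSM. Module name, widths, and handshake are invented for this sketch.
module eq_fsm #(
    parameter W = 16                    // data width (a guess)
) (
    input  wire         clk,
    input  wire         rst,
    input  wire         start,          // pulse to begin one evaluation
    input  wire [W-1:0] A,
    input  wire [W-1:0] c,              // assumed nonzero
    output reg  [W-1:0] B,
    output reg          done            // pulses when B is valid
);
    localparam IDLE = 2'd0, MUL1 = 2'd1, MUL2 = 2'd2, DIV = 2'd3;

    reg [1:0]     state;
    reg [2*W-1:0] num, den;             // (A+2)*|A-10| and c*c

    // Unsigned "absolute value": subtract in whichever order stays positive
    wire [W-1:0] abs_a10 = (A >= 10) ? (A - 10) : (10 - A);

    // The single shared multiplier
    reg  [W-1:0]   mul_x, mul_y;
    wire [2*W-1:0] mul_p = mul_x * mul_y;

    always @(posedge clk) begin
        if (rst) begin
            state <= IDLE;
            done  <= 1'b0;
        end else begin
            case (state)
                IDLE: begin
                    done <= 1'b0;
                    if (start) begin
                        mul_x <= A + 2;     // first product: (A+2)*|A-10|
                        mul_y <= abs_a10;
                        state <= MUL1;
                    end
                end
                MUL1: begin
                    num   <= mul_p;         // capture (A+2)*|A-10|
                    mul_x <= c;             // second product: c*c
                    mul_y <= c;
                    state <= MUL2;
                end
                MUL2: begin
                    den   <= mul_p;         // capture c*c
                    state <= DIV;
                end
                DIV: begin
                    B     <= num / den;     // still a big combinational divider;
                                            // a serial divider would shrink it
                    done  <= 1'b1;
                    state <= IDLE;
                end
            endcase
        end
    end
endmodule
```

The idea would be to pay with clock cycles instead of ALMs, since time is not a constraint here, but I don't know if this resource-sharing approach is the right methodology.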
What I have tried so far is a purely combinational implementation (just assign statements, roughly the sketch below), and it works, but it uses almost 80% of the available ALMs. I don't think that's the best way; I want to minimize hardware usage as much as possible, and time is not a constraint. I'm using Quartus II and Verilog.
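For reference, my combinational attempt is essentially the following (widths here are illustrative, not my real ones); I suspect the combinational divider alone accounts for most of the ALM usage:

```
// Roughly my current purely combinational version (illustrative widths)
module eq_comb #(
    parameter W = 16
) (
    input  wire [W-1:0] A,
    input  wire [W-1:0] c,              // assumed nonzero
    output wire [W-1:0] B               // truncated quotient
);
    wire [W-1:0]   abs_a10 = (A >= 10) ? (A - 10) : (10 - A);
    wire [2*W-1:0] num     = (A + 2) * abs_a10;  // one full multiplier
    wire [2*W-1:0] den     = c * c;              // a second multiplier
    assign B = num / den;   // and a full combinational divider: very expensive
endmodule
```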