I'm curious about the trade-offs made in implementing different kinds of arithmetic on different computer designs, so I'm trying to understand what the costs of the various options would be.
Say you want to implement base-100 arithmetic, using 7 bits to store each digit, a value from 0 to 99. So you want an adder circuit that takes a pair of 7-bit inputs and produces a 7-bit output (still in the range 0-99) plus a carry flag.
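To be concrete about the behavior I have in mind, here is a sketch in Python of what the circuit should compute (just the arithmetic, not a gate-level model):

```python
def base100_add(a, b):
    """Add two base-100 digits (each 0-99); return (sum digit, carry flag)."""
    assert 0 <= a <= 99 and 0 <= b <= 99
    s = a + b
    if s >= 100:
        # Result overflowed the digit: wrap around and signal a carry out.
        return s - 100, 1
    return s, 0
```

For example, `base100_add(70, 45)` gives `(15, 1)`: the sum digit 15 with a carry into the next base-100 place.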
Roughly how many logic gates would this take, compared to the obvious alternatives of either a plain 7-bit binary adder or an 8-bit BCD (two-digit) adder?