A basic calculator can perform a wide variety of operations. How does the calculator 'get' the concept of adding two numbers in the same way a human does? I don't think the people who make calculators store every possible result of every possible addition on the chip before shipping it. Somehow the chip needs to 'learn' how to add.
I mean, if I were to start with a blank slate (chip) and 'teach' the chip to perform calculations (additions), how would I go about it? Do I need to learn artificial neural networks first?
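
For what it's worth, here is my rough guess at what might be going on, written as a Python sketch rather than actual hardware: each bit of the two numbers goes through simple logic operations (XOR/AND/OR), with a carry passed along to the next bit. The function names and the 4-bit width are just mine for illustration. Is this roughly the idea, or is it something else entirely?

```python
# My rough mental model (a software sketch, not actual calculator hardware):
# each pair of bits is combined with simple logic operations, and the carry
# "ripples" along to the next bit position.

def full_adder(a, b, carry_in):
    """Add two bits plus a carry bit using only XOR/AND/OR logic."""
    s = a ^ b ^ carry_in                          # sum bit for this position
    carry_out = (a & b) | (carry_in & (a ^ b))    # carry into the next position
    return s, carry_out

def add_4bit(x, y):
    """Add two 4-bit numbers one bit at a time (a 'ripple-carry' style loop)."""
    carry = 0
    result = 0
    for i in range(4):
        bit_x = (x >> i) & 1
        bit_y = (y >> i) & 1
        s, carry = full_adder(bit_x, bit_y, carry)
        result |= s << i
    return result

print(add_4bit(5, 9))  # prints 14
```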