On my quest to better understand how computers work at a deep level, I have come to the question of why, exactly, silicon is used in microchips. I had always naively assumed that silicon had a very high electrical resistance, which made it a good material in which to sandwich materials with low electrical resistance (e.g. gold), and that this was how microchips were made.
After actually doing some research, I see that I was wrong and that silicon is a 'semiconductor'. To keep this short, I'll skip ahead and simply say that I don't understand what a semiconductor is or why it's good for making microchips. I've seen several explanations, and they either confused me or flatly contradicted each other, but the basic gist is that a semiconductor is somewhere in between a conductor and an insulator. Why is that useful for making integrated circuits?