Input bias current can flow into or out of an op-amp's inputs, and in some input topologies it may do either. The consequence is that you need to design your circuit to account for this current: if your source impedance is 100 kΩ and the input bias current is 1 pA, a non-inverting buffer will have a voltage error of 100 kΩ × 1 pA = 100 nV.
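A quick back-of-the-envelope sketch of that calculation (the values are the illustrative ones from the example above, not from any particular op-amp datasheet):

```python
# Offset error caused by input bias current flowing through the source impedance.
R_source = 100e3   # source impedance, ohms (100 kΩ, example value)
I_bias   = 1e-12   # input bias current, amps (1 pA, example value)

V_error = R_source * I_bias
print(f"Input-referred offset error: {V_error * 1e9:.1f} nV")  # -> 100.0 nV
```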
As you can see, input bias current has a negligible effect for low-impedance sources and can have a very significant effect for high-impedance sources. This is also why you cannot capacitively couple an op-amp input with no DC return path and expect it to work; the input must be able to sink or source this tiny current for the input stage to function.
Edit: To answer your question more directly, input bias current is not something you need to "provide" explicitly; rather, think of your circuit as sourcing (or sinking) this current, and in the process an error term is generated. Another way to look at it is that the op-amp has a finite input impedance, and your circuit needs to account for that. If your source impedance is very high and the op-amp's input impedance is too low (which is equivalent to the op-amp having a relatively large input bias current), then the ratio of the voltage divider formed by the source impedance and the op-amp input impedance will be further from unity, which results in a voltage error.
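To illustrate the divider view, here is a short sketch with assumed numbers (the 1 GΩ effective input impedance is purely illustrative, not a datasheet value):

```python
# Gain/offset error from the divider formed by the source impedance and the
# op-amp's effective input impedance. An ideal op-amp would give a ratio of 1.
R_source = 100e3   # source impedance, ohms (example value)
R_input  = 1e9     # assumed effective op-amp input impedance, ohms

ratio = R_input / (R_source + R_input)
print(f"Divider ratio: {ratio:.6f}")                # ~0.999900
print(f"Error from unity: {(1 - ratio) * 1e6:.0f} ppm")  # ~100 ppm
```

The further that ratio falls below 1, the larger the error seen at the op-amp input, which is the same conclusion reached above in terms of bias current times source impedance.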