I am not sure if this is the best place to ask, but I was looking at a Wikipedia article and noticed it had a lot of edits. Since you can view every past revision of the article, I figured the storage for all those pages would add up: there are millions of articles, and Wikipedia doesn't seem to discard vandalized edits. Rather than storing every revision in full, I was thinking Wikipedia could keep the base article and store the edit history as instructions for how to transform it into each revision.
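To illustrate what I mean by "instructions", here is a toy sketch in Python using the standard-library difflib module (the article text is made up, and I have no idea whether Wikipedia actually works this way):

```python
import difflib

# Two revisions of the same (made-up) article, as lists of lines.
old = ["Paris is the capital of France.",
       "It has 2 million people."]
new = ["Paris is the capital of France.",
       "It has 2.1 million people.",
       "It hosts the Louvre."]

# An "edit instruction" (delta) describing how old differs from new.
delta = list(difflib.ndiff(old, new))
for line in delta:
    print(line)

# Either revision can be reconstructed from the stored delta alone.
assert list(difflib.restore(delta, 1)) == old
assert list(difflib.restore(delta, 2)) == new
```

So instead of keeping both full revisions, you could keep one revision plus a chain of deltas like this.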
My question is, as a person with a limited background in programming: how does a site like Wikipedia store previous edits of pages? Does it store each revision as a full page, is each edit a set of instructions for modifying a base article, or is there some other concept in use?
Or, I guess, is the storage involved so minimal that this isn't even an issue?