Although Wikipedia today has fewer than 6 million articles (plus millions of redirects), the total number of revisions is nearly 918 million. At the time of writing, the article count is 5,954,717, with 917,409,211 total revisions, giving an average of 154.06 revisions per article.
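The average quoted above is simply the revision total divided by the article count; a quick sketch of the calculation, using the figures from this paragraph:

```python
# Figures quoted above (a snapshot; the live numbers change constantly)
articles = 5_954_717
revisions = 917_409_211

average = revisions / articles
print(f"{average:.2f} revisions per article")  # → 154.06 revisions per article
```

Note that the average hides a heavy skew: a handful of high-traffic pages account for a disproportionate share of the revision total.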
Sometimes people worry that the number of articles or edits is a problem. It is not, but it is considerate to your fellow editors to make article histories easy to use productively. One way to do that is to avoid an unnecessarily large number of revisions for a single change; another is to use more revisions than strictly required, so that your edit summaries can clearly say what you are doing and why.
The revision count is not a technical or cost problem. The Wikimedia servers combine old versions into large batches and then compress them; because so much text is shared between revisions, the compression produces a huge reduction in storage space. That storage is almost free: since late 2004 and early 2005, Wikimedia has used SATA hard disks on some of the web servers for this work. The SATA disks are extremely cheap, and because those servers are already used for page building, the additional cost there is insignificant. As with the main database servers, several copies of each set are kept so that the failure of one machine will not cause trouble.
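The claim that near-identical revisions compress dramatically is easy to demonstrate. A minimal sketch using Python's standard `zlib` module on synthetic "revisions" (the article text and edit pattern here are invented for illustration, not Wikimedia's actual storage scheme):

```python
import zlib

# 100 lightly edited "revisions" of the same synthetic article text
base = "Lorem ipsum dolor sit amet, consectetur adipiscing elit. " * 40
revisions = [base + f"Edit number {i}." for i in range(100)]

# Batch the revisions together, then compress the whole batch
raw = "".join(revisions).encode()
packed = zlib.compress(raw, level=9)

ratio = len(raw) / len(packed)
print(f"raw: {len(raw)} bytes, compressed: {len(packed)} bytes, "
      f"ratio: {ratio:.0f}x")
```

Because each revision repeats almost all of the previous one, the compressor replaces the repeated text with short back-references, and the batch shrinks by well over an order of magnitude. Compressing each revision separately would give nowhere near this saving, which is why batching matters.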
Some people mistakenly believe that deleting articles or revisions saves space somewhere. Wikipedia keeps all old versions of pages, including those of deleted articles, so no space is saved by deleting. Editors with only basic rights cannot see these versions and might wrongly believe they are gone. Administrators can see most of them; those with the Oversight permission can also see the few that are hidden for reasons requiring concealment even from most administrators.
There is a small cost for each edit. A tiny amount of storage is used for metadata and summary information, and the edit also has to be sent to the slave database servers. At extreme edit rates this replication could sometimes cause short delays before the most recent revision became available, but today the servers are fast enough and the processes streamlined enough that this is not the problem it occasionally was back in 2004–05. These issues have essentially been engineered out of the MediaWiki design as they were encountered.
The licenses used by Wikipedia require that every revision be saved. If revisions were completely removed wholesale, instead of merely hidden, the article would become a copyright infringement and would need to be reverted to a blank page and rewritten. Schemes that involve copying article text and then deleting the original are impractical for this reason. Don't do it; complying with copyright law is something that is taken very seriously here.
While there are no technical issues with having many revisions, there are human issues with the careless creation of many unnecessary ones.
An edit history clogged with experimental or "junk" edits can confuse people trying to work out what happened and when. An editor who makes multiple edits to an article while working toward a final version may view the intermediate edits as temporary revisions that will not remain for long. Sometimes it is not easy, or even possible, to make the planned change in a single edit; this can be the case when the edit contains a huge amount of information, or when it is difficult to enter all the text at once. While Wikipedia is a work in progress and there is no deadline for completion, it is good to make life easier for those reading the history to work out what was done in each edit and why.
If sensible, consider the following steps, none of which is required, but which might sometimes make life easier for others: