Many web frameworks, such as Flask and Django, use SQLite as their default database. SQLite is compelling because it ships with Python and has very little administrative overhead.
However, most high-traffic public production sites end up on a heavier database: MySQL, Oracle, or PostgreSQL.
The questions:
Assume:
- Site traffic is moderate, and concurrent read/write access to the database will happen
- We will use SQLAlchemy and rely on SQLite's write locking (although this comment makes me a little nervous); a rough sketch of the intended setup follows this list
- The database will contain perhaps 60,000 records
- Data structures do not require advanced features found in heavier databases
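To make the second assumption concrete, here is a minimal sketch of how I expect to configure the engine. The database path `app.db` and the 15-second timeout are placeholders, not decisions I've committed to; the WAL pragma is something I've seen recommended for concurrent readers but have not benchmarked myself.

```python
from sqlalchemy import create_engine, event

# Hypothetical database file; the real app would point at its own path.
engine = create_engine(
    "sqlite:///app.db",
    connect_args={"timeout": 15},  # seconds a writer waits on a locked database
)

@event.listens_for(engine, "connect")
def set_sqlite_pragma(dbapi_connection, connection_record):
    # WAL mode lets readers keep working while a single writer holds the lock.
    cursor = dbapi_connection.cursor()
    cursor.execute("PRAGMA journal_mode=WAL")
    cursor.close()
```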
Is there ever a compelling case against using SQLite for a website that serves as a moderate-traffic internal corporate tool? If so, under what conditions will SQLite run into concurrency problems?
I'm looking for specific, known root causes rather than general fear or unsubstantiated finger-pointing.
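For reference, the access pattern I'm worried about looks roughly like the sketch below: several threads each opening their own connection and writing at once. The table name and row counts are purely illustrative; the question is whether this kind of workload is where "database is locked" errors start to appear.

```python
import sqlite3
import threading

# Throwaway schema for illustration only (not the real app's schema).
setup = sqlite3.connect("app.db")
setup.execute("CREATE TABLE IF NOT EXISTS records (payload TEXT)")
setup.commit()
setup.close()

def worker(n):
    # Each thread uses its own connection; timeout controls how long a
    # write waits when another connection holds the write lock.
    conn = sqlite3.connect("app.db", timeout=5)
    try:
        conn.execute("INSERT INTO records (payload) VALUES (?)", (f"row {n}",))
        conn.commit()
    except sqlite3.OperationalError as exc:
        # "database is locked" is raised if the timeout expires.
        print(f"thread {n}: {exc}")
    finally:
        conn.close()

threads = [threading.Thread(target=worker, args=(i,)) for i in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```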