03 Jan

Odoo data size limits

Over the past years of Odoo implementation, we have had the opportunity to consult with many large clients whose data has now grown beyond expectation, and we have had to handle Odoo data size limits.
From the Odoo perspective, database growth almost always hits the same tables in Sales, Stock, Invoicing and Accounting. This is aggravated by the following facts:

  • All detail lines, including in Accounting, are per product.
  • Perpetual costing implies a large number of stock moves, since every transaction is also tracked per product.

Depending on the business size and the volume of transactions, the database size can become an issue within a few years, or, in a few cases, within a few months:

  • Slow access.
  • Slow transactions.
  • Odoo time-out errors.

From that moment onwards, one needs to understand the following in order to deal with Odoo data size limits:

  • PostgreSQL settings and infrastructure are not the answer to your problems.
  • Compensating with more server / cloud resources will not help much in the long run. You will shortly end up with the same issues, except that having more data “trapped” inside your Odoo database will make modifications / upgrades / changes more complicated and time-consuming.
  • The problems and issues you are experiencing were there from the start; the small amount of data simply did not let you see them.

From the technical point of view, you will end up with the following:

  • Index bloat consuming large amounts of IOPS / memory. Yet removing indexes would take you back to sequential scans, which perform just as badly over large portions of data.
  • The ORM is much slower than PostgreSQL itself at handling queries over large amounts of data.
  • All domain and domain-related queries pulling data through the Odoo ORM become unusable over large amounts of data.
  • Dedicated memory per worker falls short when facing the large result sets retrieved from PostgreSQL.
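To make the ORM-versus-SQL point concrete, here is a minimal, self-contained sketch. It uses Python's built-in sqlite3 as a stand-in for PostgreSQL, and the table and column names are illustrative (loosely modelled on Odoo's account_move_line), not actual Odoo API calls. It contrasts pulling every detail row into Python and aggregating there, roughly what iterating over recordsets does, with letting the database return the aggregate directly.

```python
import sqlite3

# Stand-in for a large Odoo detail table (sqlite3 used here so the example
# is self-contained; the same point holds, amplified, on PostgreSQL).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE account_move_line "
    "(id INTEGER PRIMARY KEY, product_id INTEGER, debit REAL)"
)
conn.executemany(
    "INSERT INTO account_move_line (product_id, debit) VALUES (?, ?)",
    [(i % 100, float(i)) for i in range(10_000)],
)

# ORM-style access: fetch every record into Python, then aggregate there.
rows = conn.execute("SELECT product_id, debit FROM account_move_line").fetchall()
totals_orm = {}
for product_id, debit in rows:
    totals_orm[product_id] = totals_orm.get(product_id, 0.0) + debit

# Set-based access: the database aggregates and returns 100 rows
# instead of 10,000. On millions of rows the gap becomes enormous.
totals_sql = dict(conn.execute(
    "SELECT product_id, SUM(debit) FROM account_move_line GROUP BY product_id"
))

assert totals_orm == totals_sql
```

Both paths produce the same totals; the difference is how many rows cross the database boundary and how much per-worker memory the Python side has to hold.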

We have a tremendous number of clients, implementers and consultants asking us to “fix” their implementations. The truth is: fixing means refactoring both the database and the modules without losing the data. BUT, since the database schema and structure are interdependent with the modules, any modification made to the modules and addons has consequences inside the database. This makes it VERY complicated and carries a heavy responsibility for the client’s data, which, as you know, includes transactions and most likely full accounting.
On our side, this literally means hundreds of hours refactoring systems, and ending up on D-day with a very short window to move from the “old” system to the “fixed” one. Now, try doing that with millions of records across several tables inside a large database, with an operational need to be UP and running the next day, and you can see how delicate this is.

To avoid these scenarios, I keep repeating the following guidelines:

  • Assess data volume from the beginning. Large clients already have quite a precise idea of how many transactions they run on a daily / weekly / monthly basis.
  • Each client is specific. Tables do not grow evenly across Odoo systems; they follow the logic of your data flows, and that is the logic you ought to follow.
  • Modules, database structure, scaling, clustering, partitioning and indexing must be used according to data growth logic.
  • DO NOT use standard community modules just because you think they will save you time. Most of the modules I see can do the trick on small to medium-size implementations, but their API versioning and coding will become bottlenecks that end up ruining the implementation and making the system unusable in the long run.
  • Store computed data (in Odoo terms, compute fields declared with store=True). DO NOT rely on non-stored function fields: these will kill you with large data.
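As a concrete illustration of the last guideline, here is a minimal sketch of the underlying idea: write the aggregate once, at write time, so reads never re-aggregate the detail lines. This is plain Python with sqlite3, not Odoo code; the table names (sale_order, sale_order_line) and the add_line helper are illustrative. In Odoo itself this corresponds to computed fields declared with store=True rather than non-stored function fields recomputed on every read.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE sale_order (id INTEGER PRIMARY KEY, amount_total REAL DEFAULT 0)"
)
conn.execute(
    "CREATE TABLE sale_order_line (id INTEGER PRIMARY KEY, order_id INTEGER, price REAL)"
)
conn.execute("INSERT INTO sale_order (id) VALUES (1)")

def add_line(order_id, price):
    # Write path: insert the line AND refresh the stored total in one step,
    # so the aggregation cost is paid once, when the data changes.
    conn.execute(
        "INSERT INTO sale_order_line (order_id, price) VALUES (?, ?)",
        (order_id, price),
    )
    conn.execute(
        "UPDATE sale_order SET amount_total = "
        "(SELECT SUM(price) FROM sale_order_line WHERE order_id = ?) "
        "WHERE id = ?",
        (order_id, order_id),
    )

for price in (10.0, 25.5, 4.5):
    add_line(1, price)

# Read path: a plain column fetch, no aggregation over detail lines,
# no matter how many lines the order accumulates over the years.
(total,) = conn.execute(
    "SELECT amount_total FROM sale_order WHERE id = 1"
).fetchone()
assert total == 40.0
```

A non-stored function field is the opposite trade-off: every read re-runs the aggregation over the detail table, which is exactly what becomes unbearable once those tables hold millions of rows.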