We have a requirement to periodically synchronise our test environment with the production environment.
The strategy we use follows the documentation procedure here:
The goal is to automate that. The problem we are facing is that the data coming from prod is huge (which is expected): the backup of production is very large, and the composition of the tables makes it difficult to do in one go.
Is there a way to fragment or segment the backup of an installation? I know there are several tables and some fields are quite big; is there any advice on how to approach this?
Creating a backup of your database is not a Bloomreach Experience issue. You'll have to look up the specifics of your DBMS, but modern databases allow for backups on a running system and should take care of any traffic that occurs during that time, though this capability is often tied to the Enterprise version of the product.
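For MySQL specifically, a consistent dump of a running system can be taken with `mysqldump`. Below is a minimal sketch; the host, user, and database names are placeholders, not values from this thread:

```shell
#!/bin/sh
# Sketch: build a mysqldump invocation for a consistent dump of a running
# MySQL/InnoDB database. Host/user/db names below are placeholders.
DB_HOST="prod-db-host"
DB_USER="backup_user"
DB_NAME="production_db"

# --single-transaction takes a consistent snapshot without locking tables
# (InnoDB only); --quick streams rows instead of buffering them in memory,
# which matters for large tables.
CMD="mysqldump --single-transaction --quick --routines --triggers -h $DB_HOST -u $DB_USER -p $DB_NAME"

# In a real run you would redirect this to a file, e.g.:
#   $CMD > production_db.sql
echo "$CMD"
```

Note that `--single-transaction` only gives a consistent snapshot for transactional engines such as InnoDB; for MyISAM tables the dump would still need locking.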
Thanks, Jasper. The question was not “Hippo specific” (at least directly).
In our case (we use MySQL), what we are doing is excluding the content of the following tables from the initial backup dump of the source environment (live, in our case):
We didn't find a way to fragment the content of the other tables, but this helped us a lot, because those tables hold a lot of content and are truncated anyway as part of the synchronisation process.
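The approach above can be sketched with `mysqldump`'s `--ignore-table` and `--no-data` options: dump everything except the data of the large tables, then dump those tables schema-only so they can be recreated empty on the target. The table names below are hypothetical stand-ins; substitute the large tables from your own repository schema:

```shell
#!/bin/sh
# Sketch: split the dump so that large tables whose content gets truncated
# during synchronisation are exported schema-only. Table names are
# hypothetical examples, not taken from the thread.
DB_NAME="production_db"
BIG_TABLES="default_binval default_refs"

# Build one --ignore-table=db.table flag per excluded table.
IGNORE_FLAGS=""
for t in $BIG_TABLES; do
  IGNORE_FLAGS="$IGNORE_FLAGS --ignore-table=$DB_NAME.$t"
done

# 1) Dump everything except the big tables' data.
DATA_CMD="mysqldump --single-transaction --quick$IGNORE_FLAGS $DB_NAME"
# 2) Dump the big tables schema-only (--no-data) so they exist, empty, on test.
SCHEMA_CMD="mysqldump --no-data $DB_NAME $BIG_TABLES"

# In a real run, redirect each command to its own .sql file and restore both.
echo "$DATA_CMD"
echo "$SCHEMA_CMD"
```

Restoring the schema-only dump alongside the main dump leaves the excluded tables in place but empty, which matches the case where they would be truncated by the synchronisation anyway.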