When we commit changes that affect many pages Google has already indexed (results pages, for instance), Google will detect those changes, but re-indexing them can take a long time.
During this process we can watch Google de-index our site's pages and subsequently re-index them. Making continuous changes to the site can cause the crawler to reduce its crawling speed, so it will take much longer for our changes to take effect.
To avoid this, one solution is to group changes into batches, and then hold off on new ones until Google has re-indexed all the pages that were indexed before the batch was deployed. We can then study the results and queue future changes into a new batch.
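One practical way to decide when a batch has been fully re-crawled is to watch Googlebot activity in the server access logs and wait for crawl volume to settle back to its baseline. The sketch below is a minimal, hypothetical example: it assumes combined-format access log lines (the sample data and the idea of reading from a log file are illustrative assumptions, not part of the original text) and simply counts Googlebot requests per day.

```python
import re
from collections import Counter

# Hypothetical sample of combined-format access log lines; in practice
# you would read these from your web server's log file (path varies by setup).
LOG_LINES = [
    '66.249.66.1 - - [10/May/2024:06:25:04 +0000] "GET /results/1 HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [10/May/2024:06:27:11 +0000] "GET /results/2 HTTP/1.1" 200 498 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.7 - - [10/May/2024:06:28:00 +0000] "GET /about HTTP/1.1" 200 1024 "-" "Mozilla/5.0"',
    '66.249.66.1 - - [11/May/2024:07:01:30 +0000] "GET /results/1 HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
]

# Extracts the date portion (e.g. "10/May/2024") from a combined-format log line.
DATE_RE = re.compile(r'\[(\d{2}/\w{3}/\d{4})')

def googlebot_hits_per_day(lines):
    """Count Googlebot requests per day, so we can see when crawl
    activity settles after deploying a batch of changes."""
    counts = Counter()
    for line in lines:
        if "Googlebot" not in line:
            continue
        match = DATE_RE.search(line)
        if match:
            counts[match.group(1)] += 1
    return dict(counts)

print(googlebot_hits_per_day(LOG_LINES))
# → {'10/May/2024': 2, '11/May/2024': 1}
```

A sustained drop back to the pre-deployment hit rate is a reasonable (if rough) signal that the crawler has worked through the changed pages; the Coverage report in Google Search Console gives a more authoritative picture.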