@ben_campbell - Recognizing that this thread has been dead for two decades, I stumbled upon it today and think I can add to the mind-boggling stories.
The place where I work is a very large business. One of the core databases we use requires us to create individual jobs with individual fields filled out for a variety of purposes. In the past we could make nearly as many changes as we wanted before saving, at the risk of losing some work if something unexpected happened. That program was recently replaced with a new web-based version, and the new version runs much slower. We have actually documented that a single action in the new version has more latency than it took to complete the entire task in the legacy version.

Yesterday, one of my employees documented that using a less efficient workflow actually resulted in a more efficient overall process. Instead of completing the 44 individual changes in sequence, which took 11 minutes, she closed the working window after every 10 changes, which allowed her to complete the task in 7 minutes. We literally have a program that forces us to hit save after every single change before moving to another cell, and it also has no cache, so you can click back and forth all day between two cells and the latency will be the same, yet it somehow clogs itself up if you do too many individual changes in a row. Keep in mind that before a recent update the task took 18 minutes, and in the legacy version we estimate it would have taken 3 minutes. It's a painful time!