Inertia-Driven Programming

Ovid on 2009-02-16T21:46:28

I had so much fun recalling this incident in a reply to Schwern, I thought it deserved a top-level post.

Years ago I worked at a company where we made revenue projections for a particular industry. Industry executives could log on and, when reviewing a projection, enter a "weighting" factor that adjusted the numbers. For the sake of argument, we'll say that number defaulted to 12; based on their knowledge of what they were offering, executives would adjust it up or down.

After six months of hard labor, one programmer had revamped the system, and the weighting factor now defaulted to 15. Our customers were livid. They accused us of "cooking the books". Even though our new numbers were more accurate, somehow the executives thought we were cheating and demanded that we change the default back to 12. Our better revenue projections, ironically, became a PR disaster.
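Just to make the mechanics concrete: here's a minimal Python sketch of a multiplicative weighting like the one described. Everything in it (the names, the idea of scaling against the default) is my own hypothetical reconstruction, not the actual system:

    # Hypothetical sketch of a multiplicative weighting factor.
    # DEFAULT_WEIGHT, project_revenue, and the scaling scheme are
    # illustrations of the idea, not the company's real code.

    DEFAULT_WEIGHT = 12  # the default the executives were used to

    def project_revenue(base_projection, weight=DEFAULT_WEIGHT):
        """Scale a base revenue projection by the executive's weighting factor.

        Relative to the default, weight=12 leaves the projection unchanged,
        while a new default of 15 inflates every number by 25%.
        """
        return base_projection * (weight / DEFAULT_WEIGHT)

    if __name__ == "__main__":
        base = 1_000_000.0
        print(project_revenue(base))             # 1000000.0 with the old default of 12
        print(project_revenue(base, weight=15))  # 1250000.0 with the new default

With the old default of 12 the projection passes through unchanged; bump the default to 15 and every number silently inflates by 25%, which is exactly the kind of jump that reads as "cooking the books" to a customer.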

Another programmer and I were called into a meeting with a vice president, who asked us to change the number. The other programmer went to the whiteboard and started sketching. He explained how the resources worked, what weights were assigned to them, how revenue was multiplied by those weights, how six months of intensive regression analysis had reshaped our statistical model, blah, blah, blah, with circles and arrows and a paragraph on the back of each one.

Fifteen minutes later, the programmer finished. The vice president looked at him and said, "Yeah, now can you change the f***ing number to 12?"

Inertia is a terrible thing :)


Interesting anecdote!

tsee on 2009-02-16T22:23:44

That's a very interesting anecdote. Let me add another.

I'm in science. Fundamental research. That means, normally, considerations like the vice president's in your story don't apply at all. You just try hard to get it right, or at least as right as possible. When you're done, you have to work even harder to get an idea of how correct your result is.

Just recently, there was an update to a measurement database I need for my analysis. It changed my results WAY outside of the uncertainty bounds I had calculated. It has taken me weeks to months to understand why. Now, with much help from others, I think I understand the change. It's a bug fix. Can I now publish my results using this new, known-to-be-more-correct database of measurements? Should I extrapolate from the past and increase the estimate of my systematic uncertainty? Wait for another update to see whether my dependencies stabilize? Suddenly it's no longer a simple "get it right" thing. I have to take into account that if I publish this and another update invalidates it, that's hugely embarrassing. Not publishing means a lot of time was wasted and other people's confidence in my method and work will decline.

This is to say: even with no economic issues at hand, and working as scientifically as you humanly can, such procedures, decisions and results will be influenced by non-scientific (or non-technical) reasoning.

(Let's not even talk about the politics of large scientific collaborations.)

Re:Interesting anecdote!

Ovid on 2009-02-16T23:13:00

You mention "no economic issues", but economics is just the study of the efficient allocation of scarce resources, not of money. Whoever is "first to market" often wins. Plus, even though people tend to assume that economics doesn't take people into account, it often does (institutional economics certainly does). I don't think our anecdotes are that far off.

Side note: if you've any interest in economics, I highly recommend reading up about institutional economics. Even if you don't end up agreeing with it (I do, to be honest), it's very thought-provoking. And I once got a wonderful date because I knew who Thorstein Veblen was. That has to be a first :)