Talk:Cycle time variation
This article has not yet been rated on Wikipedia's content assessment scale.
[Untitled]
Once a person with an education in manufacturing engineering and design knows and understands the definition of service level, they will come to the conclusion that variation in cycle time is the single factor that generates waste.
Suppose one cannot meet the allowed lead time. Setting aside requests to expedite, there is no doubt that large variations in cycle time require the quoted lead time to be padded so that it can actually be met; additional time must be allowed. Once the lead time grows, one must increase one or more of the three buffers (inventory, manpower, overtime). Inventory comes first.
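To make the padding concrete, here is a minimal sketch of how the quoted lead time must grow with the target service level. It assumes normally distributed cycle times, and the mean and standard deviation figures are invented for illustration, not taken from any real process:

```python
from statistics import NormalDist

# Assumed figures: cycle time modeled as normal with
# mean 10 days and standard deviation 3 days.
mean_ct = 10.0
sigma_ct = 3.0

for service_level in (0.50, 0.90, 0.99):
    # z-score needed to cover this fraction of cycle-time outcomes
    z = NormalDist().inv_cdf(service_level)
    quoted_lead_time = mean_ct + z * sigma_ct
    print(f"{service_level:.0%} service level -> quote {quoted_lead_time:.1f} days")
```

With these numbers, moving from a 50% to a 99% service level pushes the quote from 10 days to roughly 17; the entire difference is padding driven by the standard deviation, which is the waste the original comment describes.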
Deming wrote that "All management is prediction." With this knowledge and an understanding of statistics, one can see that a large cycle time variation makes management difficult and, at some point, impossible. If one cannot predict, one cannot manage; one must pad or perish. The question, then, is where the problem of cycle time variation comes from. The answer is not conspicuous until one understands how all of the processes take place. Investigate, and one will discover that the higher the variation, the more likely it is that no two people perform an identical task the same way.
Why? Because there is no prescribed method. That's it; no more whys. The issue is that anyone who can do the task does it the only way they know how. So, if one wishes to reduce cycle time variation, one must inevitably implement a program of standard methods and of training to those methods, everywhere. All of the buffer management and throughput accounting in the world cannot beat that plan, though they can still make a positive difference in the predictability of the lead time. After all, that is what buffers do, and it is easier than retraining the entire workforce, which is what lean is about.
Consider also that Mr. Deming wrote, "Management creates quality." Taken as "axioms," his two statements work well together. Look at the numbers presented to the market and to investors by any company of interest: the quality of those numbers is always in question. The reason, as one can deduce from the axioms, is that quality and predictability derive from the capability of the management. Thus, if one finds the numbers to be of poor quality, one might have little confidence in the company and move on.
But what is the difference, one might say; after all, they are estimates and so must be wrong. True as that is, the numbers should not be off by too much, which means they should meet a significant level of quality (or expectation). Even if the management presents an expected profit and the actual turns out to be far higher, that will not last. Why? Because chances are that the next time it will be far lower, and that is risky to an investor. Should a CEO desire to please the majority of his investors, or potential investors, he would need to give higher-quality numbers (forecasts).
For this article I suggest relating cycle time variation to the service level equation. Its effect on Little's Law and on the equations of queueing analysis, needed to comprehend the actual wastes involved, should also be included.
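One way to show that connection is Kingman's (VUT) approximation for the mean queue wait at a single G/G/1 station, combined with Little's Law to turn the resulting cycle time into work-in-process. The sketch below is illustrative only; the arrival rate, service time, and coefficients of variation are assumed numbers, not data from the article:

```python
def kingman_wait(utilization, ca, cs, mean_service):
    """Approximate mean wait in queue as V * U * T (Kingman's formula)."""
    variability = (ca ** 2 + cs ** 2) / 2              # V: variability term
    utilization_term = utilization / (1 - utilization)  # U: utilization term
    return variability * utilization_term * mean_service  # T: mean service time

arrival_rate = 0.85   # jobs per hour (assumed)
mean_service = 1.0    # hours per job (assumed)
utilization = arrival_rate * mean_service

# Sweep the service-time coefficient of variation to show how
# variation alone inflates cycle time and, via Little's Law, WIP.
for cs in (0.25, 1.0, 2.0):
    wq = kingman_wait(utilization, ca=1.0, cs=cs, mean_service=mean_service)
    cycle_time = wq + mean_service        # wait plus service
    wip = arrival_rate * cycle_time       # Little's Law: L = lambda * W
    print(f"cs={cs}: wait={wq:.2f} h, cycle time={cycle_time:.2f} h, WIP={wip:.2f} jobs")
```

Holding utilization fixed at 0.85 and only raising the coefficient of variation from 0.25 to 2.0 multiplies the queue wait severalfold, and the extra WIP that Little's Law attaches to that wait is exactly the inventory buffer discussed above.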