Corporate Services

Posted Oct. 1, 2009

I Have Been Thinking About Data Analysis

By Robert D. Skillman

It seems that I do that a lot; perhaps that is what makes me a Black Belt, or perhaps being a Black Belt caused it. The more I think about data analysis and talk to others about it, the more I realize that outside the Six Sigma community, very few people are very good at it.

We go to college, get degrees and good jobs, and we are well trained in how to add, subtract, multiply and divide, but we are not trained in how to think about data. Some of us get engineering degrees and are well trained in calculus and trigonometry, and perhaps we even endured a statistics class, but we still haven't been taught how to really think about data. The MBAs run the businesses and have had business statistics training, but my observation is that they have not been properly trained in how to think about data either.

All this disparaging talk; where did I get such notions? I got them the scientific way, by questioning and observation. During my thirty-five years working for large, bureaucratic companies, I was called upon hundreds of times to explain trends, patterns, shifts and jumps in data. As I talk to my students today, nothing has changed. I wish I knew then what I know now.

What I know now is that most of the statistical signals my bosses were reacting to were not signals at all, but rather noise. When required to explain why production scrap this July is up four percent from last July, I would now ask, "What makes last July a benchmark?" We must let the data interpret the data. Once we have a couple of years of time series data with natural behavior limits calculated, we can begin to separate the signals from the noise. If the boss wants an explanation for why scrap has been climbing for the last three months, there may be no signal at all: if the climb falls within the behavior limits, then no statistical signal is evident.

My point is that management generally is inadequately trained in “how to think about data.” This results in demands for explanations and changes in behavior when no signals are present. When time is spent reacting to non-signals, true signals can be missed. All this inefficiency helps our companies to maximize their costs.

Think about this. In 1927 Babe Ruth hit a record 60 home runs in one season. This record held until 1961, when Roger Maris hit 61. Since then the record has been broken several times, but let's just think about the Babe and Maris for now. What does the data really say?

I collected some time series data. It's easy to just Google your way to the statistics; it's all there. The Babe Ruth data compiled is home runs per season from 1918 through 1934, and for Maris from 1957 through 1968. It would have been better to consider home runs per at-bat, but this is what was available. When we study this from a time series perspective and generate natural behavior limits, what do you think about the data?

Did you notice that the natural behavior limits suggest the 60 home runs Babe hit were just normal behavior, nothing special, while Roger's 61 was very definitely a special cause? If the Babe had hit 60 home runs again, or even as many as 70, it would still be statistically normal behavior based on his career data. With Roger, the 61 was quite exceptional and most likely would not repeat. If the boss asked, "What was so extraordinary about Babe's 1927 season?" the answer is "nothing." But if the question were posed about Roger's 1961 season, it would demand an explanation, because the result was beyond the statistical tests for normal random behavior and therefore special cause was present. There was no statistical signal regarding Babe, whereas quite the opposite is true regarding Roger.
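The article doesn't show the calculation behind its natural behavior limits, but a minimal sketch of one common way to compute them, an individuals (XmR) chart using the standard 2.66 scaling constant on the average moving range, applied to Ruth's published season totals (1918 through 1934), looks like this:

```python
# Natural behavior limits (XmR / individuals chart) for Babe Ruth's
# home runs per season, 1918-1934 (season totals are public record).
# The constant 2.66 converts the average moving range into
# three-sigma-equivalent limits for individual values.

ruth_hr = [11, 29, 54, 59, 35, 41, 46, 25, 47, 60,
           54, 46, 49, 46, 41, 34, 22]  # 1918 .. 1934

mean = sum(ruth_hr) / len(ruth_hr)

# Moving ranges: absolute differences between consecutive seasons.
moving_ranges = [abs(b - a) for a, b in zip(ruth_hr, ruth_hr[1:])]
avg_mr = sum(moving_ranges) / len(moving_ranges)

upper_limit = mean + 2.66 * avg_mr   # upper natural process limit
lower_limit = mean - 2.66 * avg_mr   # lower natural process limit

print(f"mean = {mean:.1f}, limits = [{lower_limit:.1f}, {upper_limit:.1f}]")
# prints: mean = 41.1, limits = [10.7, 71.5]

# Every season, including the record 60 of 1927, falls inside the limits:
print(all(lower_limit <= hr <= upper_limit for hr in ruth_hr))  # True
```

Since the upper limit works out to roughly 71, the 60-home-run season, and even a hypothetical 70, sits inside Ruth's natural behavior limits, which is exactly the article's point. Repeating the same calculation on Maris's career data is left, as the author suggests, as an exercise.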

I intend this simple example to get you all thinking. Companies spend far too much time reacting to statistical signals where there are none, and they frequently miss the important signals. Time series studies with properly calculated behavior limits will certainly help in identifying true signals and categorizing the noise.

For my students, the homework is to add the following players' data: Sammy Sosa, Barry Bonds and Mark McGwire. Tell me what you think then. What questions do you have?

Lean Six Sigma in New Places – The Application to CPA Firms

By Dustin Hostetler

"Trusted Advisor." It is the name that partners at CPA firms strive to have their clients call them. It was this line of thinking that brought the firm I work for, Rea & Associates, Inc., to explore and then commit to the principles of Lean Six Sigma three years ago.

We had been hearing more and more of our manufacturing clients talking about Lean Manufacturing and Six Sigma. To truly be their trusted advisors, we needed to ramp up our education and training in Lean Six Sigma principles, which is exactly what we did. It has led us to a point of competitive differentiation from other local and regional CPA firms across the state of Ohio.

Interestingly, what started off as a plan to better understand and help our manufacturing clients has also morphed into a unique application of Lean Six Sigma inside our own firm, and into a consulting practice for other CPA firms that we call LeanCPA. We have found the principles that are effective on the shop floor to be just as effective in our service and transactional processes. It all starts with understanding the voice of the client (customer) and making sure we have processes designed to serve our clients accurately and on time, with superior client service and added services.

Sounds simple, right? Well, it is, and it's unfortunate that many CPA firms are only now coming to understand that most of what they do involves processes. There's more to process than just implementing a new software program. Some simple examples of how we've looked at issues from a service perspective include:

• Just as a manufacturer tries to identify defects and control the process against them, CPA firms should be collecting statistics on the common mistakes (review notes) being made and designing correction programs around them. Oftentimes, the 80/20 rule applies.
• Overproduction in manufacturing means making product in large batches and being unresponsive to changing customer demands. Overproduction in CPA firms is continuously starting new projects and client work without enough focus on finishing work in process.
• Excess transportation and motion on the shop floor stem from departmentalized manufacturing, poor workplace setup and poor organization. Eliminating the needless walking of folders and reports around an office when electronic means can handle the task, and organizing work teams closer together, are a few improvement ideas in the office.

So the next time someone outside of manufacturing says “Lean Six Sigma won’t work for us,” tell them to think again.

Dustin Hostetler is a Lean Six Sigma Master Black Belt for Lean CPA, LLC, a division of Rea & Associates, Inc. He can be reached at dustin.hostetler@reacpa.com or 330-264-0791.

Kent State Announces New Lean Six Sigma Associate Facilitator

Stephen Skillman is a Lean Six Sigma Master Black Belt, currently serving as an Associate Facilitator, training Lean Six Sigma Black Belts at Kent State University. Skillman received his bachelor’s degree in Business Administration from Lake Erie College. He has worked in the automotive, government ballistics and electrical insulation industries. Skillman has held Production Scheduler, Purchasing Manager, Shipping Manager and Quality and Continuous Improvement Manager positions with Nescor Plastics and Iten Industries.