
Hello new and faithful readers alike!

Unfortunately our July profile featuring Patricia McNeill has been delayed. We expect it to be published in the next few days, and we’ll notify you as soon as it becomes available.

In the meantime, check out Bill Kappele’s new article for Quality Magazine, available here:

http://www.qualitymag.com/articles/94097-the-cost-of-a-poor-measurement-system?v=preview

The article describes the costs and drawbacks of poor measurement systems and provides fantastic information for those seeking to improve, or ensure the excellence of, their current systems.


Success Story #4 – Jason Wiggins

“I’ve learned the most from circumstances that don’t fit the mold.” – Jason Wiggins

Welcome back to Objective Experiments Success Stories. This month we spoke with Jason Wiggins from US Synthetic. Like our previous interviewees, Jason had only positive things to say about his company’s willingness to embrace DOE.

 

“It’s become cultural in our company, probably quicker than some others.”

 

US Synthetic’s open-minded, problem-solving culture has led to a chain of fantastic successes. One such success was a series of Designed Experiments and Measurement System Analyses for the development of an industry-recognized destructive test. The test was used by US Synthetic to develop polycrystalline diamond cutters (PDC) for oil and gas drilling. Tools like Design of Experiments and Measurement Systems Analysis helped the company become the market leader for PDC and earn the Shingo Prize for Lean Manufacturing in 2011.

“DOE is the most efficient method out there,” Jason said, with the caveat that users should be comfortable compromising and making trade-offs to fit their particular budgets. “I use it pretty much exclusively.”

Jason has been described as a “Make It Happen Person” by Objective Experiments President Bill Kappele, and his advice for new or aspiring DOE users fits that description:

 

“Don’t be afraid to struggle. I’ve learned the most from circumstances that don’t fit the mold.”

 

On behalf of all of our readers, thank you, Jason, for your time, energy, optimism, and experience! Please keep an eye out for our next success story on July 13th. Until then, keep celebrating your success!

Do you have a success story of your own? Do you want to? We’re always happy to answer questions, provide guidance, or hear about your success stories; call us toll-free at 1-866-683-6173. Likewise, leave us a comment and we’ll be in touch within 24 hours.

Success Story #3 – Chad Ellis

“DOE is an unbiased survey of a phenomenon. The downside is you need to know when to use it. The view must be worth the climb.” – Chad Ellis

Welcome back to the third installment of Objective Experiments Success Stories! This month we picked the brain of Chad Ellis, a research scientist and former chromatography team leader at Phillips 66.

While our previous interviewees shared their wisdom regarding their individual processes of thought and action with regard to DOE, Chad’s leadership experience soon became the topic of our conversation.

In a recent project Chad had to work with his coworkers to identify the best model for application in the DOE. There wasn’t an obvious choice, but after discussing the pros and cons of two different approaches they chose one that seemed to have the most potential.

“We just did it,” Chad said.

Indeed, taking action proved to be the correct decision. One of the beautiful elements of DOE is the ability to gain valuable and objective knowledge with an imperfect model, and Chad’s understanding of that intricacy guaranteed a successful finish to his six-month project.

We also discussed ideas that would be of benefit to new users of DOE and those who’ve yet to learn the method.

“The best way to go into the training is to have at least two specific projects in mind, and to use the training course as a time to evaluate the application of DOE. If you can, talk with your supervisor about how it could fit into your goals.”

In a world so fond of precedent, it’s easy to be discouraged from trying new methods. Establishing goals that self-impose a pressure to innovate can help justify your actions when they appear unconventional. Having a backup plan is prudent both at work and for ensuring you gain the maximum benefit from training. Finally, don’t be afraid to just take action.

On behalf of Objective Experiments we’d like to thank Chad for his time, wisdom, and leadership. Our next success story will be published on June 15th. Until then, keep celebrating your success!

Chad’s advice to form specific goals, get buy-in from superiors and decision makers, and speak to your instructors or experienced coworkers is consistent with the habits of successful people in all fields. We’re always happy to answer questions, provide guidance, or hear about your success stories toll-free at 1-888-764-3958.

Success Story #2 – Lindsay Patton

Welcome back to OESS! This month we spoke to Lindsay Patton who shared her innovative adventures with Design of Experiments.

Lindsay is a mechanical engineer at US Synthetic, a certified Objective Experiments Problem Solver, and an active user of DOE for the past five years. While some practitioners use DOE only as a start-to-finish process, Lindsay also uses DOE to help with individual stages of existing experiments.

“It guides my thinking,” says Lindsay.

For instance, when considering which factors might be important to solving a problem, Lindsay might use the analysis tools found in JMP software to identify the most important factors from previous experiments. This data-mining approach can save time and money, using existing information instead of (or in addition to) a full five-step Designed Experiment. After all, why use valuable resources when your previous successes may already hold a solution?
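To make the idea concrete, here is a minimal sketch of screening factors from existing data. This is purely illustrative: the data values are invented, and this is plain Python rather than JMP or anything from Lindsay’s actual workflow. It fits a linear model to past runs with coded factor levels and ranks the factors by the size of their estimated effects:

```python
import numpy as np

# Hypothetical past-experiment data: three coded factors (-1/+1) per run
# and a measured response. Values are made up for illustration.
X = np.array([
    [-1, -1, -1],
    [ 1, -1, -1],
    [-1,  1, -1],
    [ 1,  1, -1],
    [-1, -1,  1],
    [ 1, -1,  1],
    [-1,  1,  1],
    [ 1,  1,  1],
], dtype=float)
y = np.array([10.1, 14.2, 10.4, 14.0, 11.9, 16.1, 12.2, 15.8])

# Fit y = b0 + b1*A + b2*B + b3*C by least squares,
# then rank factors by the magnitude of their coefficients.
design = np.column_stack([np.ones(len(y)), X])
coef, *_ = np.linalg.lstsq(design, y, rcond=None)
effects = dict(zip("ABC", np.abs(coef[1:])))
ranked = sorted(effects, key=effects.get, reverse=True)
print(ranked)  # factors ordered by estimated importance
```

With this made-up data, factor A dominates and B barely matters, so a follow-up Designed Experiment could concentrate its runs on A and C.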

Like Roland Ruprecht from success story #1, Lindsay hasn’t experienced much resistance from coworkers or management for using DOE.

“I am fortunate to work with a group that is very supportive and very familiar with DOE. If there is any push-back it’s usually a disagreement about the design space.”

When issues do arise, such as a lack of consensus about how widely to vary factor levels, Lindsay emphasized that her group was willing to set aside biases and agree to objectively examine the data before coming to conclusions about a project’s value.

“All models are wrong, but some are useful,” said Lindsay, quoting George Box. With that philosophy in mind, Lindsay and her peers know that they can gain valuable information even if a model happens to be imperfect.

When asked what advice she would give to a new user of DOE, Lindsay replied, “Understand the limits of DOE, but also one-factor-at-a-time methods. It’s a trade-off between speed, cost, and the quality of your data.” She went on to say that while one-factor-at-a-time experiments are usually fast and inexpensive, the data derived from them is generally only useful once. Designed Experiments, while more costly in terms of time and resources, yield more robust, reliable data that can remain useful indefinitely, and might even provide the tools to answer questions yet to be asked.
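The trade-off Lindsay describes can be counted directly. A full factorial design runs every combination of factor levels, while a one-factor-at-a-time (OFAT) plan varies each factor alone from a baseline run. The sketch below is illustrative only (not from the interview); it shows that for three two-level factors OFAT needs half the runs, but it can never estimate interactions between factors, which is exactly the data-quality cost Lindsay mentions:

```python
from itertools import product

def full_factorial(levels_per_factor):
    """Every combination of factor levels: a full factorial design."""
    return list(product(*levels_per_factor))

def ofat(levels_per_factor, baseline_index=0):
    """One-factor-at-a-time: vary each factor alone from a baseline run."""
    baseline = [levels[baseline_index] for levels in levels_per_factor]
    runs = [tuple(baseline)]
    for i, levels in enumerate(levels_per_factor):
        for level in levels:
            if level != baseline[i]:
                run = list(baseline)
                run[i] = level
                runs.append(tuple(run))
    return runs

# Three two-level factors: the factorial needs 8 runs, OFAT only 4,
# but the OFAT runs cannot reveal any interaction between factors.
factors = [(-1, 1)] * 3
print(len(full_factorial(factors)), len(ofat(factors)))  # 8 4
```

With more factors the gap widens (the factorial grows as 2^k), which is why practitioners often reach for fractional factorial designs as a middle ground.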

On behalf of Objective Experiments and the readers of this blog, thank you, Lindsay, for inspiring us to keep innovating! Our next profile will be published on May 18th. Until then, keep celebrating your success.

Have a success story or DOE advice of your own? Feel free to share it in the comments or give us a call toll-free at 1-866-683-6173.