Welcome back to OESS! This month we spoke to Lindsay Patton who shared her innovative adventures with Design of Experiments.
Lindsay is a mechanical engineer at US Synthetic, a certified Objective Experiments Problem Solver, and an active user of DOE for the past five years. While some practitioners use DOE only as a start-to-finish process, Lindsay also uses DOE to help with individual stages of existing experiments.
“It guides my thinking,” says Lindsay.
For instance, when considering which factors might be important to solving a problem, Lindsay might use the analysis tools in JMP software to identify the most influential factors from previous experiments. This data-mining approach can save time and money by drawing on existing information instead of (or in addition to) a full five-step Designed Experiment. After all, why spend valuable resources when your previous successes may already hold a solution?
Like Roland Ruprecht from success story #1, Lindsay hasn’t experienced much resistance from coworkers or management to using DOE.
“I am fortunate to work with a group that is very supportive and very familiar with DOE. If there is any push-back it’s usually a disagreement about the design space.”
When issues do arise, such as a lack of consensus about how widely to vary factor levels, Lindsay emphasized that her group was willing to set aside biases and agree to objectively examine the data before coming to conclusions about a project’s value.
“All models are wrong, but some are useful,” said Lindsay, quoting George Box. With that philosophy in mind, Lindsay and her peers know that they can gain valuable information even if a model happens to be imperfect.
When asked what advice she would give to a new user of DOE, Lindsay replied, “Understand the limits of DOE, but also of one-factor-at-a-time methods. It’s a trade-off between speed, cost, and the quality of your data.” She went on to say that while one-factor-at-a-time experiments are usually fast and inexpensive, the data they produce is generally only useful once. Designed Experiments, while more costly in time and resources, yield more robust, reliable data that can remain useful perhaps indefinitely. They might even provide the tools to answer questions yet to be asked.
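To make that trade-off concrete, here is a minimal sketch in Python (the factor names are purely illustrative, not from Lindsay’s actual work) comparing the run count of a one-factor-at-a-time plan with a two-level full factorial design. The one-factor-at-a-time plan is cheaper, but only the factorial design varies factors together, which is what lets a Designed Experiment estimate interactions:

```python
from itertools import product

# Hypothetical two-level factors (names are for illustration only)
factors = {
    "temperature": ["low", "high"],
    "pressure": ["low", "high"],
    "feed_rate": ["low", "high"],
}

# One-factor-at-a-time: a baseline run, then one run per factor
# changing only that factor from the baseline.
baseline = {name: levels[0] for name, levels in factors.items()}
ofat_runs = [baseline] + [
    {**baseline, name: levels[1]} for name, levels in factors.items()
]

# Two-level full factorial: every combination of factor levels.
factorial_runs = [
    dict(zip(factors, combo)) for combo in product(*factors.values())
]

print(len(ofat_runs))       # 4 runs: fast and cheap, but no interaction information
print(len(factorial_runs))  # 8 runs: costlier, but supports interaction estimates
```

The gap widens quickly: with k two-level factors, one-factor-at-a-time needs k + 1 runs while a full factorial needs 2^k, which is exactly why fractional designs and the analysis tools Lindsay mentions matter in practice.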
On behalf of Objective Experiments and the readers of this blog, thank you Lindsay for inspiring us to keep innovating! Our next profile will be published on May 18th. Until then, keep celebrating your success.
Have a success story or DOE advice of your own? Feel free to share it in the comments, or give us a call toll-free at 1-866-683-6173.