cause analysis,  information technology,  lifestyle

The CDC’s COVID-19 Test Design Failure as a Lesson for IT Pros

David Willman, writing for The Washington Post on December 26, 2020, in an article titled "The CDC's failed race against covid-19: A threat underestimated and a test overcomplicated":

“We at the CDC also have the ability to do that today, but we are working on a more specific diagnostic,” Messonnier said, indicating that the agency was seeking a more sophisticated test.

“We had a conversation with [Stephen Lindstrom] and [Julie M. Villanueva] and asked, specifically: ‘Lots of members are asking if we can drop N3 and just keep N1 and N2,’ ” recalled Kelly Wroblewski, director of infectious diseases for the professional association, based in Silver Spring, Md.

“And their response at that point was: ‘FDA isn’t going to go for that.’ Both of them were like, it’s a non-starter.”

Government officials later told The Post that the FDA would have considered proposals to remove N3.

Some CDC scientists also were questioning among themselves the need for N3.

“Why are we trying so hard? . . . We know there’s a problem with it,” one of them recalled asking.

Instead of dropping N3, the CDC set about trying to manufacture a new batch of reagents in hopes of eradicating possible contamination that had caused the false positives.

What I find most interesting about this article is the CDC's insistence on keeping the N3 component as part of the protocol for testing whether a patient is COVID-19 positive. As I kept reading, I was thinking about my own blind spots and biases when working on complicated information technology solutions. Can IT's insistence on a complex solution, such as a PowerShell script, C# code, or a SQL stored procedure, cause a CDC-style 46-day delay when a more streamlined, WHO-style solution is readily available? Can technologists, like the scientists in Willman's article, get caught up in the "we are going to solve this" tide? When we get swept up in a fast-moving production issue, do we have the wherewithal to take a step back and decide that "good enough" is indeed good enough to resolve it? Or do we take extra time to dive deep into the issue and develop an overly complex solution?

As information technology professionals, we all need to evaluate our unique production environments. What tools are available to us? How much time do we have to diagnose the problem and identify the most probable cause? How much time will it take to test and evaluate the performance of the fix? Oftentimes, there is no one-size-fits-all solution. The key takeaway from this article for IT professionals is to be mindful of the problem we are trying to solve and to balance "good enough" versus "perfection" when developing technology solutions.