In the field of cybersecurity, a distinction is made between the "blue team" task of building a secure system, and the "red team" task of locating vulnerabilities in such systems. The blue team is more obviously necessary to create the desired product; but the red team is just as essential, given the damage that can result from deploying insecure systems.
The natures of these teams mirror each other; mathematicians would call them "dual". The output of a blue team is only as strong as its weakest link: a security system consisting of a strong component and a weak component (e.g., a house with a securely locked door but an open window) will be insecure (and in fact worse, because the strong component may convey a false sense of security). Dually, the contributions to a red team are often additive: a red team report that contains both a serious vulnerability and a more trivial one is more useful than a report that only contains the serious issue, as it is valuable for the blue team to address both vulnerabilities. (But excessive low-quality reports can dilute attention from critical issues.)
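A minimal sketch of this min/sum duality (the numbers and function names here are purely illustrative, not from the post):

```python
# Toy model of the blue/red duality: a blue-team system is only as strong
# as its weakest component (a minimum), while red-team findings accumulate
# across distinct reports (a sum).

def blue_strength(component_strengths):
    # A securely locked door (0.9) plus an open window (0.1): the system
    # is only as secure as the window.
    return min(component_strengths)

def red_value(finding_severities):
    # A report with a serious finding (5) and a trivial one (1) is worth
    # more than a report with the serious finding alone.
    return sum(finding_severities)

assert blue_strength([0.9, 0.1]) == 0.1
assert red_value([5, 1]) > red_value([5])
```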
Because of this, unreliable contributors may be more useful on the "red team" side of a project than on the "blue team" side, though the blue team can still accommodate such contributors provided that the red team is competent enough to catch almost all of the errors they might introduce. Also, unreliable red team contributions only add value if they _augment_ the output of more reliable members of that team, rather than _replace_ it, and if their output can be effectively filtered or triaged by more experienced red team members. (1/3)
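A toy sketch of the triage point (hypothetical names and severity scale; just one way such filtering could look):

```python
from dataclasses import dataclass

# Hypothetical model of red-team triage: findings from contributors of
# varying reliability are forwarded to the blue team only after an
# experienced reviewer confirms them, so that noisy contributions augment
# rather than replace the reliable output.

@dataclass
class Finding:
    description: str
    severity: int  # 1 (trivial) .. 5 (critical), as claimed by the reporter

def triage(findings, confirm):
    """Keep only reviewer-confirmed findings, most severe first."""
    confirmed = [f for f in findings if confirm(f)]
    return sorted(confirmed, key=lambda f: f.severity, reverse=True)

reports = [
    Finding("open window on ground floor", severity=5),
    Finding("doorbell label lists owner's name", severity=1),
    Finding("the house is haunted", severity=5),  # spurious report
]
# An experienced reviewer weeds out the spurious finding before it
# reaches the blue team.
vetted = triage(reports, confirm=lambda f: "haunted" not in f.description)
print([f.description for f in vetted])
```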
The blue team / red team distinction extends beyond cybersecurity to many other disciplines. In software engineering, for instance, "blue teaming" might correspond to the generation of new computer code, while "red teaming" would consist of such tasks as quality assurance and testing of that code. In mathematics, "blue teaming" could involve coming up with speculative ideas to solve a math problem, while "red teaming" checks the arguments for formal errors and raises heuristic objections to the viability of a blue team approach. (See also my discussion of "local" and "global" errors in mathematics at terrytao.wordpress.com/advice-… .)
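A toy illustration of this division of labor in software engineering (a hypothetical function and test, chosen only for illustration):

```python
# "Blue team": a straightforward implementation.
def mean(xs):
    return sum(xs) / len(xs)

# "Red team": adversarial testing that probes for inputs the author may
# not have considered, rather than confirming the happy path.
def red_team_mean():
    try:
        mean([])  # edge case: empty input
    except ZeroDivisionError:
        return "finding: mean([]) raises ZeroDivisionError"
    return "no issue found"

print(red_team_mean())  # finding: mean([]) raises ZeroDivisionError
```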
I like to refer to these two teams in mathematics as the "optimists" and "pessimists"; in my experience, the strongest collaborations arise when there is a roughly equal split between optimists and pessimists. (Depending on the collaboration, I myself have sometimes played the optimist, sometimes the pessimist, and sometimes a mixture of both.) (2/3)