The world is made up of systems. Our body is a system, or in fact a system of systems. What we call “society” is another system of systems, as is the natural environment. And all these meta-systems are themselves elements in even more encompassing systems that interconnect them.
But these systems are very complex, difficult to explain or predict. One successful strategy, which has had a revolutionary impact on how we live, is analysis. This approach segments complex entities into smaller parts in order to study them individually. Medical researchers don’t study the body as such, but perhaps kidney function or particular blood cells. Social scientists may specialize in the effect of lobbyists on legislation, labor market patterns among immigrant communities or changing child-rearing norms. Natural scientists study the viability of artificial wetlands, upwelling cycles in certain coastal zones or changes in the carbon balance of a set of tropical forest plots. By biting off chewable portions of a much larger world, science makes it possible to achieve progress in our understanding of how things work: testable hypotheses, demonstrable evidence, causal explanation. Analysis is the art of taking things apart, studying the pieces, and then putting them back together.
But this approach, for all its benefits, fails to capture most of the interactive effects that make a system a system. It leads us to overstate the separateness of the things we study and observe and to understate their connectedness. This is not an argument against thinking analytically, but for staying alert to what this kind of thinking fails to see, so that we can at least partly compensate for its shortcomings.
I’d like to introduce the term “analytical bias” to refer to this tendency to overlook the interconnectedness of things. Of course, many thinkers, going back centuries, have recognized this problem; it’s the guilty conscience of analysis itself. I’m just giving it a name.
4 comments:
Well, Peter, I agree that is an analytical bias, but it is probably not the only one out there; a variety of well-known fallacies that people are subject to in thinking about things probably qualify as well.
Objectives subject to constraints... effective and/or efficient.
"Systems engineering" tries to build a forest out of a bunch of trees. The challenge is "integration", which is what you called "interactions".
50% of software products fail, and a large percentage of the successes don't integrate well.
Yep, gonna use your term from now on.
It's kind of like the highbrow version of the problem of media and social echo chambers.
Nonetheless, the prediction that what bankers perceive as safe is bound to become more dangerous to banking systems than what they perceive as risky should be able to pass the test of time.
https://subprimeregulations.blogspot.com/2016/03/decreed-inequality.html