"All models are wrong, but some are useful"

The first time I examined a patient on my own, I was struck not by what I knew but by how little of what I knew seemed to apply. Medical school had given me a framework. I could take a history, identify symptoms, construct a differential diagnosis. But the person sitting across from me did not present as a textbook case. She was anxious and vague. She contradicted herself. She mentioned her back pain almost as an afterthought, buried under a longer story about her landlord and a dispute over mould. The model I had been trained in told me to isolate the clinical problem. The reality in front of me was that the clinical problem could not be isolated from the life that contained it. I treated her as best I could, knowing that my understanding was partial, that I was working from a simplification, and that waiting for a complete picture was not an option because she needed help now.

That experience has repeated itself in every context I have worked in since. In research, in venture capital, in building a company. The feeling is always the same. You are standing in front of a situation that is more complex than your tools can fully capture, and you must act anyway. Not because you are reckless or overconfident, but because the cost of waiting for perfect information is almost always higher than the cost of acting on imperfect information and adjusting as you go.

George Box, the British statistician, put it best. "All models are wrong", he said, "but some are useful". He was talking about mathematical representations of physical systems, but the observation reaches much further than statistics. Every framework you use to make sense of the world, whether it is a clinical diagnosis, a business plan, a thesis about where a market is heading, or simply your understanding of another person, is a compression of something too complex to hold in full. That compression is where both the value and the danger live. The value is that it lets you act. The danger is that you forget it is a compression at all.

I studied anthropology alongside medicine precisely because I wanted a language for this gap. Anthropology begins with a confession that most other disciplines resist, which is that the observer's model of a situation is never the situation itself. The map is not the territory. The diagnosis is not the patient. The business plan is not the business. This is not a reason to abandon models. It is a reason to hold them with the right kind of grip, firm enough to act on, loose enough to revise when reality talks back.

The temptation, especially for people who are well trained and analytically sharp, is to believe that more information will eventually close the gap. That if you research long enough, interview enough people, run enough analyses, you will arrive at a model so refined that it is effectively true. I have never seen this happen. Not in a clinic, not in a boardroom, not in my own company. What I have seen is people paralyse themselves in pursuit of a certainty that does not exist, while the problem they are studying continues to evolve without them. The physician who orders one more test before making a decision. The founder who runs one more round of customer interviews before committing to a product direction. The investor who waits for one more data point before writing a cheque. At some point, the additional information yields diminishing returns and the act of gathering it becomes a way of avoiding the harder thing, which is to commit to a course of action knowing it might be wrong.

I think about this constantly while building Taxo. Every product decision we make is based on a model of how healthcare clinics operate, and that model is incomplete. It was built from hundreds of hours spent inside practices, watching how front desk teams actually work, listening to the improvisational logic they use to manage chaos that no system was designed to handle. I trust that model more than most because it was built from observation rather than assumption. But I do not pretend it is complete. There are things happening in clinics that I have not yet seen, failure modes I have not anticipated, patient needs I have not accounted for. If I waited until I had accounted for all of them, I would never ship anything, and the people our product is meant to help would go on being failed by the systems already in place.

The real discipline is not in building the perfect model before you move. It is in moving with a clear awareness of what your model does not capture, and then going back to check. You act, you observe the consequences, you revise your understanding, and you act again. It is not a straight line from ignorance to knowledge. It is a loop, and the loop never closes, because the world you are trying to understand keeps changing while you study it. Staff turn over. Regulations shift. Patients present with problems nobody anticipated. The model you built last quarter is already slightly wrong. The question is whether you noticed.

There is a particular kind of courage in this that I think gets undervalued. Not the dramatic kind. Not the willingness to take a wild bet or make a contrarian call. Something quieter. The willingness to make a decision you know is based on incomplete information, to own that incompleteness honestly, and to stay close enough to the consequences that you can correct course when the gaps reveal themselves. It is the opposite of both recklessness and paralysis. It is a commitment to moving through uncertainty rather than waiting for it to resolve, because in most of the situations that matter, it never fully does.

Box's phrase has survived for decades because it captures something that most people recognise but few want to say out loud. That you will never have the full picture. That everything you build, diagnose, decide, or believe is an approximation. And that this is not a flaw in your thinking. It is the condition under which all serious thinking happens. The wrong response is to pretend your model is right. The other wrong response is to refuse to act until it is. The useful space is in between, where you move forward with open eyes and the willingness to be corrected by what you find.