Here is something that should bother you more than it does. Your company probably hired smart people. It might have even fought to recruit them, offered competitive salaries, and bragged about its talent pipeline. Then it placed those smart people inside a system that makes it nearly impossible for them to act smartly.
This is not a bug. According to Herbert Simon, one of the most important thinkers of the twentieth century, this is the entire architecture.
Simon won a Nobel Prize in Economics in 1978, though calling him an economist is like calling Leonardo da Vinci a painter. He was a political scientist, cognitive psychologist, computer scientist, and organizational theorist. He spent decades studying how humans actually make decisions inside institutions, and what he found was quietly devastating. The problem is not that organizations hire the wrong people. The problem is that organizations are structurally designed to prevent the right people from making good decisions.
Let that settle for a moment.
The Myth of the Rational Actor
Classical economics built its cathedral on a beautiful assumption: that people are rational. They gather all available information, weigh every option, calculate the optimal outcome, and choose accordingly. It is an elegant model. It is also, as Simon pointed out with academic politeness that barely concealed his frustration, completely fictional.
Simon introduced a concept he called bounded rationality. The idea is deceptively simple. Human beings do not have infinite time, infinite information, or infinite cognitive capacity. We cannot process every variable. We cannot predict every outcome. We are not supercomputers wrapped in skin. We are biological creatures making decisions under pressure, with incomplete data, limited attention, and a strong preference for lunch.
This does not mean people are stupid. It means people are human. And the difference between those two things matters enormously when you start designing organizations.
What Simon observed is that instead of optimizing, people do something he called “satisficing,” a word he invented by combining satisfy and suffice. We do not look for the best possible option. We look for the first option that is good enough. You do this when you pick a restaurant. You do this when you hire a contractor. You do this when you choose which emails to answer first. And your CEO does this when making decisions that affect thousands of people, no matter how many McKinsey slides are involved.
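The difference between the two strategies is easy to make concrete. Here is a minimal sketch (the functions, options, and ratings are invented for illustration, not drawn from Simon's work) contrasting an optimizer, which examines every option, with a satisficer, which stops at the first option that clears an aspiration threshold:

```python
# Illustrative sketch: optimizing scans every option; satisficing
# stops at the first option that is "good enough."
# (All names and numbers here are hypothetical.)

def optimize(options, score):
    """Examine all options and return the best one."""
    return max(options, key=score)

def satisfice(options, score, aspiration):
    """Return the first option whose score meets the aspiration level."""
    for option in options:
        if score(option) >= aspiration:
            return option
    return None  # nothing met the threshold

restaurants = ["diner", "bistro", "food truck", "omakase"]
ratings = {"diner": 3.1, "bistro": 4.2, "food truck": 3.9, "omakase": 4.9}

best = optimize(restaurants, ratings.get)                          # scans all four
good_enough = satisfice(restaurants, ratings.get, aspiration=4.0)  # stops at "bistro"
```

Note that the satisficer never even looks at the best option; it quits searching the moment the threshold is met, which is precisely what makes it cheap.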
The Organization as a Decision Machine
Here is where Simon gets truly interesting and where most people stop reading him too early.
Simon did not just study individual decision making. He studied the structures we build to make decisions collectively. An organization, in his view, is essentially a machine for processing decisions. It takes in information, routes it through channels, filters it through hierarchies, and produces outputs. The question is whether that machine is well designed.
Spoiler: it usually is not.
Think about how information flows in a typical company. Data enters at the edges, where employees interact with customers, products, and markets. Those employees understand context, nuance, and the messy reality of what is actually happening. But decisions get made at the center, by people who are the furthest removed from that reality. The information has to travel upward through layers of management, and at each layer, it gets compressed, sanitized, and politically adjusted.
By the time it reaches the executive suite, the signal has been so thoroughly processed that it often bears little resemblance to the original truth. This is not corruption. It is architecture. Simon understood that every layer of hierarchy is essentially a filter, and every filter loses information. The organization is not lying to its leaders. It is structurally incapable of telling them the whole truth.
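The compounding effect of stacked filters can be shown with back-of-the-envelope arithmetic (the 80 percent retention rate per layer is an illustrative assumption, not a figure from Simon): even modest loss at each layer decays the signal geometrically with depth.

```python
# Toy arithmetic: if each management layer preserves, say, 80% of the
# signal it receives, fidelity decays geometrically with depth.
# (The 0.8 retention rate is a hypothetical assumption for illustration.)

retention_per_layer = 0.8

for layers in range(7):
    fidelity = retention_per_layer ** layers
    print(f"{layers} layers: {fidelity:.0%} of the original signal")
```

By six layers, less than a third of the original signal survives, and no individual filter in the chain did anything unreasonable.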
Why Smart People Make Dumb Decisions at Work
Consider a scenario you have probably witnessed. A talented engineer sees a flaw in the product. She knows it will cause problems in six months. She raises the issue. Her manager agrees it is a concern but explains that the current quarterly targets take priority. The VP above him has already committed to a timeline. The board expects certain numbers. So the flaw gets logged in a tracking system where it will live peacefully until it becomes a crisis, at which point everyone will wonder why nobody saw it coming.
Everybody in this chain was acting rationally within their constraints. The engineer was rational. The manager was rational. The VP was rational. And collectively, they produced an irrational outcome. Simon would recognize this immediately. The system is not failing despite its design. It is failing because of its design.
This is the core paradox. Organizations exist to extend human cognitive capacity, to let groups accomplish what individuals cannot. But the very structures that enable coordination also constrain intelligence. Rules, procedures, reporting lines, approval processes: these are all tools for managing bounded rationality at scale. They simplify decisions so that no single person needs to understand everything. But in doing so, they also prevent any single person from acting on what they do understand.
You have essentially traded individual insight for organizational consistency. And most of the time, consistency wins, even when insight would have been more valuable.
The Attention Economy Before It Was Cool
Decades before tech companies started competing for your eyeballs, Simon identified attention as the critical bottleneck. In a 1971 paper, he wrote something that now reads like prophecy. He argued that in an information-rich world, a wealth of information creates a poverty of attention. Read that again. He said this in 1971. Before the internet. Before smartphones. Before your inbox became a second job.
Simon understood that the scarce resource in any organization is not money, talent, or even time. It is attention. Leaders can only focus on a finite number of issues. Employees can only process a finite number of directives. The organization has to decide, explicitly or implicitly, what gets attention and what does not. And those attention allocation decisions shape outcomes far more than strategy documents or mission statements ever will.
This is why companies can have brilliant strategies and still fail. The strategy might be perfect on paper. But if the organizational structure directs attention toward the wrong things, toward internal politics, toward irrelevant metrics, toward whatever the loudest person in the room is shouting about, then the strategy is just expensive wallpaper.
There is a direct line from Simon’s insight to what we now see in the attention economy of the internet. Social media platforms, news outlets, and content creators all compete for the same scarce resource Simon identified. The currency is not information. The currency is the mental bandwidth to process it. Organizations that understand this principle have an enormous advantage, and most do not understand it at all.
The Ant on the Beach
Simon offered one of the most elegant metaphors in all of social science. Imagine an ant walking across a beach. Its path is complex, winding, seemingly erratic. You might think the ant has a sophisticated navigation system. But Simon argued that the complexity of the ant’s path reflects the complexity of the beach, not the complexity of the ant. The ant’s behavior is actually quite simple. It is the environment that is complicated.
Now apply this to people in organizations. When an employee’s behavior seems irrational, confusing, or counterproductive, the instinct is to blame the employee. They need more training. They need better incentives. They need to be replaced. But Simon’s ant tells a different story. Maybe the behavior you are observing reflects the complexity and contradictions of the organizational environment, not the inadequacy of the person navigating it.
This is a profoundly humane insight, and a deeply inconvenient one for managers who prefer simple explanations. If the problem is a bad employee, you can fire them. If the problem is a bad system, you have to redesign it. One of those is easy. The other is an existential project.
What Would Simon Actually Fix?
If you took Simon’s ideas seriously, and most organizations do not, you would redesign from the ground up. Not the people. The architecture.
First, you would flatten information pathways. Not in the trendy Silicon Valley way where everyone pretends hierarchy does not exist while it obviously does. You would create genuine channels for unfiltered information to reach decision makers. You would make it structurally safe for the engineer to flag the product flaw without it being domesticated by four layers of management.
Second, you would design for satisficing instead of pretending to optimize. Most strategic planning processes are theater. Everyone knows the five-year plan will be obsolete in eighteen months. Simon would say: stop pretending you can optimize in an uncertain world. Build systems that are good at finding good enough solutions quickly and then adapting when reality shifts.
Third, you would treat attention as your most precious resource and audit it ruthlessly. Where are your leaders actually spending their cognitive energy? If the answer is “in meetings about meetings” or “responding to emails that should never have been sent,” then your organization is burning its scarcest fuel on the organizational equivalent of keeping the lights on in an empty building.
Fourth, you would stop blaming the ants and start redesigning the beach. When patterns of failure emerge, resist the urge to find individual culprits. Look at the environment. Look at the incentives. Look at the information architecture. The path the ant walks tells you about the terrain, not the ant.
The Bottom Line
Herbert Simon’s work is not comforting. It does not offer a quick fix or a five-step framework. What it offers is something more valuable and more unsettling: clarity about why organizations fail in predictable, repeated, almost elegant ways.
The failure is not random. It is designed in. Every hierarchy that filters information, every incentive structure that rewards short-term thinking, every meeting that consumes attention without producing decisions: these are features, not bugs, of the organizational machinery we have built.
And the deepest irony is that we keep building them the same way. We know, thanks to Simon and decades of subsequent research, that these structures produce suboptimal outcomes. We know that bounded rationality is real, that attention is scarce, that information degrades as it travels through hierarchies. And yet we keep designing organizations as if none of this were true. As if the next strategic plan, the next reorg, the next leadership offsite will somehow overcome the fundamental architecture of the system.
Simon would not have been surprised by this either. After all, the decision to keep designing bad systems is itself made by people operating within bad systems, with bounded rationality, limited attention, and a strong preference for the familiar.
The smart people are not the problem. They never were. The system is the problem. And until we take that seriously, we will keep producing the same elegant failures, staffed by brilliant people, wondering why nothing ever seems to work the way it should.