Occam’s Razor helps us choose between two or more explanations of a problem, and it provides a useful mental model for problem-solving. A razor is a principle or rule of thumb that allows one to eliminate unlikely explanations for a phenomenon, or avoid unnecessary actions.
One popular definition of Occam’s Razor is:
If we face two possible explanations which make the same predictions, the one based on the least number of unproven assumptions is preferable, until more evidence comes along.
This mental model helps us look for the least complicated explanation. That does not mean the explanation has to be so easy that anyone can grasp it with limited effort; it means the explanation can be reasoned through logically without making too many assumptions.
The idea is attributed to William of Ockham (c. 1287–1347), who was an English Franciscan friar, scholastic philosopher, and theologian. Occam’s Razor has been applied in a number of fields that include science, biology, medicine, probability theory and statistics.
Occam’s Razor has been given multiple definitions by different people in different contexts. Some of these are mentioned below.
- The simplest solution is almost always the best
- Entities must not be multiplied beyond necessity
- Other things being equal, we should prefer a demonstration which derives from fewer postulates or hypotheses
- We consider it a good principle to explain the phenomena by the simplest hypothesis possible
- Whenever possible, substitute constructions out of known entities for inferences to unknown entities
Applications of Occam’s Razor in Software Engineering
I am a software engineer, so I look for places where mental models can help me become a better engineer.
Debugging
Occam’s Razor is a useful heuristic for effective debugging. When faced with an error, look for the simplest reason that could have caused it.
I will share a couple of situations where Occam’s Razor helped me debug an issue faster.
Debugging OutOfMemoryError
A colleague recently shared the stack trace of an error they encountered when they deployed their application in a production environment. The error log clearly mentioned OutOfMemoryError. This had never happened in any of the pre-production environments, so it was surprising that it happened in production. In this situation, we had to find the explanation for why the OutOfMemoryError happened. We came up with two explanations:
Explanation 1
In the production environment we are working with a much larger dataset, so it is possible that our JVM heap settings were not configured properly.
We can fix the error by increasing the heap size of the JVM.
In the above we made the following assumptions:
- The first assumption is that in the production environment we are working with a larger dataset. This assumption was correct. We were working with 3 times more data.
- The second assumption was that the JVM heap settings were not configured properly. We found that we had not configured the heap for this application; it was running with the default heap settings. When you don’t specify a heap setting, the JVM uses 1/4 of the available RAM as the maximum heap. In our case, our container had 4GB RAM, but because we had not configured the heap, it was using the default of 1/4 of 4GB, i.e. 1GB.
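That default is easy to verify from inside the application itself, since the JVM can report its effective maximum heap. A minimal sketch (the `HeapCheck` class name is made up for illustration):

```java
public class HeapCheck {
    // Returns the JVM's effective maximum heap size in megabytes.
    static long maxHeapMb() {
        return Runtime.getRuntime().maxMemory() / (1024 * 1024);
    }

    public static void main(String[] args) {
        // With no -Xmx flag, this is roughly 1/4 of the available RAM.
        System.out.println("Max heap: " + maxHeapMb() + " MB");
    }
}
```

Running the application with an explicit flag such as `java -Xmx3g ...` raises the limit, which is the kind of one-line fix this explanation suggests.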
Explanation 2
The stack trace pointed to a line in the code where we were fetching records from the database and collecting them in a List. The explanation was that as we are working with a larger dataset, we are fetching a much bigger result set from the database, and that is causing the OutOfMemoryError.
We can fix the error by fetching records from the database in a paginated manner or as a stream so that we don’t have to collect them in a List.
In the above we made the following assumptions:
- The error is caused by the line where we are collecting records from the database in a list. For this to be true, the SQL query fired at that line should return a large dataset. Also, we didn’t take other code into account, so we might be looking at the wrong place.
- The second assumption we made was that if we fetch the data paginated or as a stream, we will not require any more changes in the application code. As it turned out, if we fetch the data in a paginated manner or as a stream, the subsequent code didn’t work; it relied on the data being collected in a single list. Fixing this would require a much bigger refactoring exercise.
- The third and final assumption we made was that the JVM heap settings were tuned. As mentioned in the previous explanation, this was not correct.
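For illustration, here is a sketch of the paginated approach from Explanation 2. The real code queried a database; this version substitutes an in-memory list and a made-up `fetchPage` helper, purely to show the shape of processing one page at a time instead of collecting everything into a single List:

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class PagedProcessing {
    // Stand-in for the database table: here just ids 1..10 in memory.
    static final List<Integer> TABLE =
            IntStream.rangeClosed(1, 10).boxed().collect(Collectors.toList());

    // Hypothetical paginated query: returns one page of records.
    static List<Integer> fetchPage(int offset, int pageSize) {
        int end = Math.min(offset + pageSize, TABLE.size());
        return offset >= end ? List.of() : TABLE.subList(offset, end);
    }

    // Processes all records page by page; only one page is held at a time.
    static long processAll(int pageSize) {
        long sum = 0;
        for (int offset = 0; ; offset += pageSize) {
            List<Integer> page = fetchPage(offset, pageSize);
            if (page.isEmpty()) break;
            for (int record : page) sum += record; // process without collecting
        }
        return sum;
    }

    public static void main(String[] args) {
        System.out.println(processAll(3)); // sums 1..10 page by page
    }
}
```

Note that `processAll` never materializes the whole dataset; the catch, as described above, is that downstream code written against a fully collected List must be refactored to this shape.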
What should you do?
The first question that comes to mind is: is either of the above explanations incorrect? I don’t think that is the case; they both could be right. Occam’s Razor does not help you prove that one explanation is correct and the other is wrong. It says that you should start with the explanation that makes fewer assumptions; if those assumptions turn out to be false, then go with the next explanation. In our case, we picked the first explanation because the assumption we made about the JVM heap configuration was correct, so tuning it was a reasonable solution. If that assumption had been incorrect, it would have made sense to look more deeply into explanation 2.
Occam’s Razor can help you find a simple solution to the problem.
System design
One of the useful statements of Occam’s Razor used by scientists is:
When you have two competing theories that make exactly the same predictions, the simpler one is the better.
As someone who designs systems, I find Occam’s Razor can be used as a guiding principle that informs our design decisions and helps us avoid creating a complex solution when a simpler one is better. Occam’s Razor goes well with the KISS principle: Keep it simple, stupid. When you build a system, think about the responsibility of each and every component. Is each doing one thing? Can I reason about them?
Another design principle that goes well with Occam’s Razor is YAGNI: You aren’t going to need it. You should ask yourself: is there a component I can remove that will simplify my design?
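The “doing one thing” question can be made concrete with a small, hypothetical example. Compare a single service that both formats and sends reports with two focused components; each of the two can be reasoned about (and tested) in isolation. All names here are made up for illustration:

```java
import java.util.ArrayList;
import java.util.List;

// Each class has one responsibility, so each is easy to reason about.
class ReportFormatter {
    String format(String title, String body) {
        return "== " + title + " ==\n" + body;
    }
}

class ReportSender {
    // Stand-in for a mail or queue client; here it just records what was sent.
    final List<String> sent = new ArrayList<>();

    void send(String report) {
        sent.add(report);
    }
}

public class ReportPipeline {
    public static void main(String[] args) {
        ReportFormatter formatter = new ReportFormatter();
        ReportSender sender = new ReportSender();
        sender.send(formatter.format("Daily", "All systems nominal"));
        System.out.println(sender.sent.get(0));
    }
}
```

Splitting responsibilities this way also makes YAGNI easier to apply: if reports no longer need to be emailed, `ReportSender` can be deleted without touching the formatting logic.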
When designing you should think about essential and accidental complexity. Essential complexity is a property of the problem you are trying to solve; for example, the domain of the problem. Accidental complexity relates to problems which engineers create and can fix; for example, using a low-level language like C or C++ where a higher-level language would be more productive.
Simple design should be your ultimate goal. You should strive for the simplest solution and only introduce complexity when needed.
Another way you can ensure your design remains simple is by holding periodic design review sessions with your team. Evaluate all components and see if they are still the simplest solution to the problem. Many times our understanding of the system improves with time, so we should strive to simplify systems when an opportunity arises.
Refactoring
Refactoring is one of the extreme programming practices that I learnt early in my professional life. The main aim of refactoring is to simplify the existing code without changing its external behaviour. This is usually done by restructuring and removing duplicated code. Refactoring leads to simpler code that is readable and less complex.
Perfection is achieved not when there is nothing more to add, but when there is nothing left to take away. — Antoine de Saint-Exupéry
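A tiny, made-up example of the kind of duplication-removing refactoring meant here: a guard clause that was repeated in two methods is extracted into one helper, leaving the external behaviour unchanged (all names are hypothetical):

```java
public class PriceCalculator {
    // Before refactoring, both public methods repeated this same guard
    // clause; extracting it removes the duplication without changing
    // what callers observe.
    private static void requireNonNegative(double amount) {
        if (amount < 0) throw new IllegalArgumentException("negative amount");
    }

    static double withTax(double amount, double taxRate) {
        requireNonNegative(amount);
        return amount * (1 + taxRate);
    }

    static double withDiscount(double amount, double discount) {
        requireNonNegative(amount);
        return amount * (1 - discount);
    }

    public static void main(String[] args) {
        System.out.println(withTax(100, 0.2));      // 120.0
        System.out.println(withDiscount(100, 0.1)); // 90.0
    }
}
```

The behaviour is identical before and after; only the structure is simpler, which is exactly the Occam-style outcome refactoring aims for.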
UX Design
Occam’s Razor is also used by UX designers to build simple and usable solutions. The Laws of UX website summarizes it well:
Analyze each element and remove as many as possible, without compromising the overall function.
Jon Yablonski suggests two ways designers can apply Occam’s Razor to keep a UI simple yet effective:
- Constantly ask yourself: what is the minimum amount of UI that will allow content to be found and effectively communicated to the user?
- Aggressively edit your work. This is refactoring applied to UX design.
If you are a designer, you can also read a good article on Occam’s Razor written by the folks at WebDesignerDepot.