Technology needs more friction. Though removing friction was central to technology’s growth over the past decade, reintroducing it will be imperative in combating a consequence that isn’t always beneficial: abundance.
Abundance is desirable in areas like social media or ridesharing, where an application’s utility grows with its number of users. But that’s not the case in areas dependent on human decision-making. There, abundance amplifies the “paradox of choice” effect, which can be paralyzing.
The essence of the paradox of choice is that more choice isn’t always better. It’s famously illustrated by the jam experiment run by psychologists Sheena Iyengar and Mark Lepper. Researchers set up two tasting tables, one offering 24 varieties of jam and one offering 6. The expectation was that more choice is more attractive, so the 24-jam table would yield higher sales. That’s not what happened. Because more choice meant more cognitive load at the moment of decision, the 24-jam table converted roughly a tenth as many buyers as the 6-jam one did! This decision paralysis is precisely where removing friction and creating overabundance can lead, and we’re already starting to see it in the recruitment space.
In 2012, companies like Lever and Greenhouse transformed the clunky, tedious job application process by stripping out friction with design-driven, intuitive workflows. But these frictionless applications attracted floods of passive applicants, so application volumes skyrocketed and the candidate-to-hire ratio plummeted. While rising volumes helped Lever and Greenhouse grow, they worked against the core problem both set out to solve: smoothing the hiring process for applicants and recruiters.
Higher application volumes demand more cognitive effort in candidate selection. Even with basic automated filters, a steep rise in volume can cause decision paralysis and missed opportunities. That’s where friction comes in: by reintroducing it into these processes, the drift toward decision paralysis can not only be slowed but reversed.
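To make that concrete, here’s a minimal sketch of the kind of basic automated filter a hiring team might run. It’s a toy example in Python; the field names, keywords, and thresholds are all invented for illustration, not drawn from any real applicant-tracking system.

```python
# Toy keyword filter over incoming applications. The schema
# ("skills", "years_experience") and the thresholds are hypothetical.

REQUIRED_KEYWORDS = {"python", "sql"}
MIN_YEARS = 2

def passes_filter(application: dict) -> bool:
    """Keep an application only if it lists every required keyword
    and clears a minimum experience bar."""
    skills = {s.lower() for s in application.get("skills", [])}
    return (REQUIRED_KEYWORDS <= skills
            and application.get("years_experience", 0) >= MIN_YEARS)

applications = [
    {"name": "A", "skills": ["Python", "SQL"], "years_experience": 3},
    {"name": "B", "skills": ["Python"], "years_experience": 5},
    {"name": "C", "skills": ["python", "sql", "go"], "years_experience": 1},
]

shortlist = [a for a in applications if passes_filter(a)]
print(f"{len(shortlist)} of {len(applications)} applications survive the filter")
```

The catch is that a filter like this only shrinks the pile by a roughly constant factor: when application volume grows 10x, the shortlist a human has to agonize over grows 10x too.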
One way to quickly introduce friction into the application process is to extend the number of required steps, whether by adding mandatory open-ended questions, aptitude puzzles, or any other means of shifting cognitive load back to the applicant (a sketch of what this could look like follows below). Application volumes would drop, but the higher average quality of applicants would make it easier to identify genuinely viable candidates. Ultimately, this trade should help hiring teams evade the paradox of choice, accelerate their processes, and make sounder hiring decisions.
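As a hedged illustration, here’s a small Python sketch of a submission gate that blocks an application until a few mandatory open-ended questions get substantive answers. The questions and the 50-word minimum are invented for this example, not a description of any real product.

```python
# Hypothetical submission gate that adds friction to an application flow.
# The questions and the 50-word minimum are illustrative assumptions.

MANDATORY_QUESTIONS = [
    "Why do you want this role specifically?",
    "Describe a project you're proud of and your exact contribution.",
]
MIN_WORDS = 50

def can_submit(answers: dict) -> tuple:
    """Return (ok, unanswered): whether the application may be submitted,
    plus any questions still lacking a substantive answer."""
    unanswered = [
        q for q in MANDATORY_QUESTIONS
        if len(answers.get(q, "").split()) < MIN_WORDS
    ]
    return (not unanswered, unanswered)

# A drive-by applicant who skips the questions is stopped at the gate,
# shifting cognitive load from the recruiter back to the applicant.
ok, missing = can_submit({MANDATORY_QUESTIONS[0]: "Seems cool."})
print(ok)  # False: both questions still need real answers
```

The design choice worth noting is that the cost is paid before submission, by the applicant, rather than after submission, by the recruiter; that’s exactly the load shift described above.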
As we continue down the path of removing friction, more decision-making areas will experience the challenges of overabundance (choosing what to read, for example). Advances like machine learning may eventually eliminate the need for friction, but for the foreseeable future, its measured reintroduction remains the fastest path to countering overabundance. It must be applied carefully, though - too much friction can kill a product, while too little does nothing. Whether or not friction regains prominence, its removal has taught us a key lesson: sometimes, less is more.
Thanks for reading!
Aqil
P.S. Please feel free to hit reply to this email with your feedback and thoughts; my goal is to keep making these more interesting and useful, so any feedback is welcome!
------------------------------------------------------------------------------------------------------
A few highlights from the week:
As costs of living continue to rise and the combination of connectivity and design makes it easier to form networks, more “Airbnb-like” companies are emerging to capitalize on these trends. Neighbor is interesting because, unlike many of its competitors, it transfers the capital-intensive parts of storage (transporting goods and renting space) to its customers, which could position it strongly on both scale and pricing.
While large companies have the capital and talent to generate innovation, third-party developers are critical to sustaining it. This is an interesting article on how the iPad didn’t reach the scale it could have because it failed to adequately compensate the third-party developers who built applications for it. IBM seems to be pursuing this framework with its Red Hat acquisition too. Though the core reason for acquiring Red Hat was to position itself as the “orchestrator” of the multi-cloud world, IBM may also have been drawn to Red Hat’s access to third-party developers (Red Hat’s software is open source).
As traditional chip design approaches physical limits that threaten Moore’s Law (the observation that the number of transistors on a chip doubles roughly every two years), there’s been increasing focus on novel approaches to computing. Quantum computing, which applies concepts from quantum physics to computation, has drawn the most attention, but the same push has produced even more dramatic innovations such as neuromorphic computing and storing data in DNA. Many of these may not scale, but it’s a fun area to watch as the boundaries between the sciences slowly merge.