The following is a research paper submitted for my Management Theory course as part of my MBA curriculum at Bradley University.
Introduction
In the modern world, ground-breaking technology is emerging, seemingly every day, to simplify our processes and speed up our lives. In many ways, our quality of life has improved significantly because technology and Internet access have become so ubiquitous. Indeed, advances in information and communication technology have resulted in the accelerated production and speedier distribution of information (Eppler & Mengis, 2004). This explosion of information has manifested itself in the modern organization in the form of Big Data (BD) and Big Data Analytics (BDA).
Big Data can be defined as “datasets that are both big and high in variety and velocity, which makes them difficult to handle using traditional tools and techniques” (Janssen, van der Voort, & Wahyudi, 2017). In other words, Big Data requires more complex information systems in order to sort through the sheer volume of data points. Consider Amazon and its founder, Jeff Bezos, for example. Bezos has risen through the historical ranks of scientific management, surpassing even Frederick W. Taylor and Henry Ford with an approach that some are calling Bezosism. With a mixture of surveillance, algorithm-driven goals, and constant progress-tracking, Amazon has become a tech powerhouse that can deliver a package to your house the day after you order it (Mims, 2021).
This certainly echoes Taylor’s program of scientific management, which, as Gareth Morgan (1998) notes in Images of Organization, aimed to analyze and standardize work procedures. Certainly, we now have the tools to achieve this goal. But is this kind of reliance on technology and data without consequence? In his book The Management Myth, Matthew Stewart (2009) writes, “In the context of complex decisions with uncertain outcomes and no obvious right answer, the managerial mind inevitably longs for some handrails to grasp amid the smoke and flames.” In today’s tumultuous workplace, Big Data and Big Data Analytics represent these metaphorical handrails. The more data, it is said, the better. Better products. Better processes. Better insights. Better decisions. But this perspective might be too superficial.
The purpose of this paper is not to criticize Big Data or deny its usefulness in the workplace. Indeed, in many cases, when we automate production processes for a certain good or service, prices decrease, demand for the product goes up, and this rise in demand subsequently creates more specialized jobs in the industry (Friedman, 2016). As with all things shiny and new, though, we must overcome our initial awe and clearly define the limitations and shortcomings of the new system. As such, this paper will address the following question: At what point does data exceed the capacity of the individual to process it, and how do we respond? This research will focus specifically on the context of organizational work environments. It will also focus on systems used to process information and retrieve data in order to make organizational decisions, more so than the overall flow of various sources of information in an office. As such, this research will focus primarily on the concept of “pulled” forms of technology, such as management information systems, customer relationship management systems, and decision support systems, as opposed to “pushed” technology (e.g., e-mail, Slack, office memos, Google Chat) (Kirsh, 2000).
To conduct this analysis, I will draw on ideas such as Moore’s Law, various types of overload, and the law of diminishing returns. I hope to, first, provide some context about why adding new information systems to the workplace is not always beneficial for organizational decision making, and, secondly, offer a few best practices for maximizing the utility of data analysis in the workplace.
Moore’s Law
In 1957, Gordon E. Moore and several other engineers cofounded Fairchild Semiconductor; Moore later went on to cofound Intel, the now multibillion-dollar technology corporation. While designing microchips for the U.S. Defense Department, Moore made an important observation: not only was the microchip decreasing in size on an annual basis and, therefore, becoming cheaper to produce, but its computing power and speed were also skyrocketing. Because of this, in 1965, he coined his eponymous law, hypothesizing that the computing power and speed of microchips would roughly double every two years, an exponential rate of growth. Moore’s Law has held true for roughly half a century (Friedman, 2016).
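To make that pace concrete, the two-year doubling can be written as a simple growth formula; the twenty-year horizon below is an illustrative assumption of mine, not a figure taken from Friedman.

```latex
% P_0 = computing power at an arbitrary starting year; t = years elapsed.
% One doubling every two years implies:
P(t) = P_0 \cdot 2^{t/2}
\qquad\Longrightarrow\qquad
P(20) = P_0 \cdot 2^{10} \approx 1000\,P_0
```

In other words, over a single twenty-year career, the hardware an employee works with becomes roughly a thousand times more capable, a pace the next paragraphs contrast with the much slower rate at which people adapt.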
Exponential growth usually carries a positive connotation, but the context in which it is examined is important to consider. In the case of Moore’s Law, the context of concern is that of human cognition. If we refer to the figure below, we see an illustration of this dilemma (Friedman, 2016). The graph shown was drawn for journalist Thomas Friedman by Eric “Astro” Teller, CEO of Google X’s research and development lab. Teller explains that, in the past, when a technology was introduced that caused fundamental and “uncomfortable” changes in society, it might be one hundred to two hundred years before we experienced technology of the same magnitude. Nowadays, the introduction of these kinds of technologies may only be separated by five to seven years.
However, Teller notes, it still takes humans ten to fifteen years to fully adapt to paradigm-shifting technology (Friedman, 2016). This creates an obvious gap between the ability of the user and the capability of the technology. Humans cannot keep pace with the current rate of change in technology. Obviously, this is a larger cultural and societal problem, but it has very specific implications in the modern workplace. In the next section, I will explore information overload, cognitive overload, and technology overload, which are three types of overload that stem from the oversaturation of information technology within organizations.
Information Overload, Cognitive Overload, and Technology Overload
The first type of overload, information overload, happens when the information processing demands of a task exceed the ability of the individual to process that information (Karr-Wisniewski & Lu, 2010). It can also be thought of as a situation in which information processing requirements are greater than information processing capacity (Eppler & Mengis, 2004). This type of overload is frequently discussed and debated in today’s society, probably because of the general and all-encompassing nature of the term “information.” A study of 108 MBA students from the Fox School of Business and Management at Temple University and the Waikato Management School at the University of Waikato in New Zealand showed that 90.1% of the group experienced “moderate,” “intense,” or “very intense” levels of perceived information overload. Interestingly, among the participants, those who focused more on the big picture were more likely to claim they had experienced information overload (Buchanan & Kock, 2001). In addition, a meta-analysis of information overload in the areas of organization science, marketing, management information systems, and accounting found that managers are more likely to experience overload as they take on an increasing number of concurrent tasks, something that happens to managers quite frequently (Eppler & Mengis, 2004). Therefore, organizational managers, who are big-picture thinkers and decision-makers by nature, are actually more prone to this condition than their employees. Eppler and Mengis (2004) also noted that information overload causes negative side effects such as stress, confusion, and the inability to make decisions.
Information overload runs parallel to the psychological concept of cognitive overload, which can be described as “the situation in which the demands placed on a person by mental work (the cognitive load) are greater than the person’s mental abilities can cope with” (APA Dictionary of Psychology, n.d.). Kirsh (2000) states that one of the main components of this type of overload is an excessive supply of information. Therefore, we can infer that information overload and cognitive overload are closely connected. If we consult Moore’s Law and consider the fact that technology is doubling in speed and power every two years, and that the result is more information being brought into the world at an increased speed, then it makes sense that humans, responsible for processing this new information, can experience cognitive strain. There is also evidence to suggest that the amount of low-quality information available is outpacing the amount of high-quality information (Kirsh, 2000). This means that as information supply increases, more and more of the data points being considered by business managers are either incorrect, strongly biased, or irrelevant. Data quality, then, is also an important consideration when making organizational decisions (Kirsh, 2000). Not only must one be able to process the data; one must also be able to discern which slices of information are even relevant to the problem being addressed. With the ever-expanding ubiquity of information and data, this is not always easy.
It is also prudent to consider, more specifically, the phenomenon of technology overload. This phenomenon “occurs at the point in which a marginal addition of new technology reaches the point of diminishing marginal returns” (Karr-Wisniewski & Lu, 2010). All three types of overload imply that there is a point at which additional information or technology ceases to benefit the individual cognitively. The definition of technology overload presents us with the concepts of diminishing returns and marginal utility, which help us to understand, quite crucially, that there can be too much of a good thing.
Marginal Utility and the Law of Diminishing Returns
To express the law of diminishing marginal utility in layman’s terms: if an individual is very hungry, then eating a piece of chocolate cake will be of great use (utility) and produce a high marginal return for that person, meaning they will be very satisfied. However, if this individual decides to indulge in a second piece of cake, it will not be of the same benefit to them, as they are not as hungry after eating the first piece. Therefore, the marginal return for the second piece will not be as great, and the same goes for the third piece of cake, the fourth piece of cake, and so on. This incremental decrease in benefit illustrates the law of diminishing returns. The marginal utility of each additional unit of cake decreases until, at some point, perhaps by the tenth piece of cake, eating another piece produces a marginal utility of zero. After this point, eating more cake creates a negative rate of return, meaning that each additional unit consumed is detrimental to the individual instead of carrying any benefit at all.
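As a concrete illustration of this pattern, the short sketch below assigns assumed satisfaction scores to successive pieces of cake and computes the marginal utility of each additional piece. The numbers are invented purely for illustration and compress the progression into a few steps; they are not drawn from any cited study.

```python
# Hypothetical total-satisfaction scores after eating 0, 1, 2, ... pieces of cake.
# The values are invented purely to illustrate diminishing (and eventually
# negative) marginal utility.
total_utility = [0, 10, 17, 21, 23, 23, 21]  # total utility after n pieces

# Marginal utility of piece n = change in total utility from piece n-1 to piece n.
marginal_utility = [
    total_utility[n] - total_utility[n - 1]
    for n in range(1, len(total_utility))
]

print(marginal_utility)  # [10, 7, 4, 2, 0, -2]: shrinking, then zero, then negative
```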
When graphed, the shape of a total utility curve is an inverted U. Interestingly, Eppler and Mengis’s (2004) meta-analysis confirms that an inverted U-shaped relationship exists between information load and quality of decision-making. This same relationship can be found between information load and productivity (Karr-Wisniewski & Lu, 2010). By extension, we can infer that an organization will experience diminishing returns in both decision-making ability and productivity as a result of the introduction of more information technology into the workplace. The previously mentioned meta-analysis also states that “researchers across various disciplines have found that the performance (i.e., the quality of decisions or reasoning in general) of an individual correlates positively with the amount of information he or she receives – up to a certain point” (Eppler & Mengis, 2004).
The figure above illustrates this relationship (Karr-Wisniewski & Lu, 2010). The positive slope represents the increase in organizational productivity and decision-making that occurs from initially adding technology and information systems to the workplace. The apex of the curve represents the point at which the marginal return is zero, beyond which the next unit of technology begins to produce a marginal loss. This implies that, at first, introducing information and data systems into the workplace produces a marginal benefit when making decisions. However, as the volume of information and information systems increases, an organization eventually reaches a point at which the overflow of data and technology negatively affects both productivity and the quality of decisions that are made.
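One simple way to formalize the inverted U described above is to model total benefit as a concave quadratic in information load. The functional form and symbols here are illustrative assumptions of mine, not a model taken from the cited studies.

```latex
% Illustrative model: x = information/technology load, U(x) = total benefit.
U(x) = a x - b x^{2}, \qquad a, b > 0
% Marginal benefit is the derivative; the apex is the point where it reaches zero.
U'(x) = a - 2bx = 0 \quad\Longrightarrow\quad x^{*} = \frac{a}{2b}
% Below x*, each added unit of information helps; beyond x*, each added unit hurts.
```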
Conclusion
I believe the best way to draw this paper to a close is with an anecdote that illustrates, quite effectively, the points I have attempted to make. In July and August of 2002, as the United States was in the early stages of the war in Afghanistan and on the verge of another war with Iraq, a military unit known as JFCOM (Joint Forces Command) began testing a new war game it had begun developing two years earlier: Millennium Challenge 2002. The simulation was designed to develop new, technologically advanced war strategies. The enemy of the war game was a fictitious military commander in the Persian Gulf who had large religious and ethnic followings, and who was also harboring several different terrorist organizations (completely hypothetical, of course!) (Gladwell, 2005).
Enter Paul Van Riper, a retired Marine Corps Lieutenant General. Van Riper had served two tours in Vietnam and experienced great success in his storied military career. He was asked to play the part of the rogue enemy military commander, the leader of Team Red, using limited information, his previous military experience, and intuition. Conversely, the JFCOM team, Team Blue, was equipped with all the best in military decision-making technology, utilizing advanced software with names like Operational Net Assessment, Effects-Based Operations, and Common Relevant Operational Picture. They were also given access to a plethora of government information and intelligence (Gladwell, 2005). Team Blue had all the tools at their disposal to make all the right decisions and swiftly defeat Team Red.
On Day 1 of Millennium Challenge 2002, Team Blue’s troops sailed into the Persian Gulf with an aircraft carrier, followed by a fleet of ships carrying tens of thousands of troops, and cut off all of Team Red’s satellite and cellular communication. Team Blue then gave Van Riper an ultimatum, believing that they had the technology to squash any counterattacks from the rogue military state. On Day 2 of the simulation, however, Van Riper snuck a fleet of small boats into the gulf, undetected, and destroyed sixteen of Team Blue’s naval ships in only an hour. This proved to be devastating, as it accounted for about half of the ships that the Navy had at its disposal (Gladwell, 2005). As Malcolm Gladwell (2005) notes, if the war game had been real, “twenty thousand American servicemen and women would have been killed before their own army had even fired a shot.”
Using the theories, laws, and research explored throughout this paper, we can infer what happened here. JFCOM, with its wide array of technology, believed it had the necessary tools to analyze any and all situations. They believed they could standardize life-or-death decisions. In the extremely unpredictable and time-sensitive reality of war, the sheer volume of data points caused Team Blue to experience information and technology overload. While they were trying to make sense of all the data, Van Riper attacked with split-second decision making. The demonstration pitted exhaustive data against experience and intuition – experience and intuition won.
Byron Auguste, a former economic adviser in the Obama administration, notes that “in today’s knowledge-human economy it will be human capital – talent, skills, tacit know-how, empathy, and creativity” that will be the most valuable asset in the workplace, not technology (Friedman, 2016). Gareth Morgan (1998) also notes that more and more management circles are recognizing the benefits of a “sociotechnical” approach, in other words, a healthy blend of technical and human assets in an organization. If management is, as Mary Parker Follett posits, “the art of getting things done through people,” then organizations need to prioritize the optimal productivity of their human workers over that of their technology (Stewart, 2009).
Considerations for Effective Data Management
1. Create a specialized data team – As mentioned earlier, both quantity and quality of data should be of concern to an organization. Not everyone is an expert data analyst, in the same way not everyone is an expert accountant, salesperson, marketing director, etc. Therefore, data analysis and distribution must be delegated to a dedicated data department whose members, with their expertise, can effectively comb through data, recognize critical nuances, and omit extraneous details (Buchanan & Kock, 2001). A recent study of the Dutch Tax Organization also shows that by establishing a new data department within the organization, management was “able to attract new, highly skilled workforce which had the right capabilities instead of having to rely on the capabilities of the existing workforce” (Janssen, van der Voort, & Wahyudi, 2017). In other words, acquiring data specialists for a company takes that load off the shoulders of employees who have other specializations on which to focus.
There is another significant benefit to creating a team of professionals that solely focuses on data analytics. Not only is it important to properly mine data and verify its quality; the communication of data is equally important (Janssen, van der Voort, & Wahyudi, 2017). Clearly conveying which information to use and how to use it, as well as consistency with these communication processes, can help to reduce information processing requirements and increase information processing capacity (Eppler & Mengis, 2004). Data specialists are more adept at contextualizing and communicating the relevant data to the entire workforce. For this reason, these professionals must possess a combination of technical and interpersonal skills.
2. Find your apex – One of the best ways to fight the oversaturation of data in the workplace is, ironically, with more data. Many companies already regularly track and measure their productivity. When new information technology is introduced, changes in productivity can be noted and analyzed for their marginal return. Over time, this analysis would give management teams valuable insight into the impact different information systems have on their respective organizations in terms of productivity and decision-making. Companies could begin to graph their own inverted-U curves and prevent technology overload in their work environments (a rough sketch of this approach appears at the end of this consideration). There may be some challenges in controlling for other variables that influence productivity and decision-making, but it is worth exploring.
There is some overlap between considerations 1 and 2. Some research has indicated that data which is “visualized, compressed, and aggregated” can prevent information overload (Eppler & Mengis, 2004). Therefore, managers must not only compress the relevant data provided to employees, as mentioned in consideration 1, but must also pair it with data visualization (e.g., graphs, charts, tables), such as their own inverted-U curve, to increase the effectiveness of their communication.
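To make consideration 2 concrete, the sketch below fits an inverted-U (quadratic) curve to tracked productivity measurements and solves for the apex, the point at which the marginal return is zero. All figures are invented placeholders, and, as noted above, a real analysis would need to control for other variables that influence productivity and decision-making.

```python
# Illustrative sketch: fit an inverted-U (quadratic) to tracked productivity data
# and estimate the "apex" level of information technology. The numbers below are
# hypothetical placeholders, not real measurements.
import numpy as np

tech_units = np.array([1, 2, 3, 4, 5, 6, 7, 8])            # e.g., information systems in use
productivity = np.array([52, 61, 68, 72, 73, 71, 66, 58])  # hypothetical productivity index

# Fit productivity = c2*x^2 + c1*x + c0 (an inverted U implies c2 < 0).
c2, c1, c0 = np.polyfit(tech_units, productivity, deg=2)

# The vertex of the parabola is where the marginal return equals zero.
apex = -c1 / (2 * c2)
print(f"Estimated apex at roughly {apex:.1f} units of technology")
```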
3. There are no “perfect” decisions – Again, the prevailing belief is that with the enhanced perspectives that more and more data provide, management teams can make near-perfect decisions for their organization. However, this ignores the fact that time is always a factor. It is extremely difficult, if not impossible, to name an occupation that does not include some type of deadline. Even with enhanced information processing technology, information is flooding the workplace at an accelerated pace, meaning data analysts will face challenges with time when combing through the data. With this in mind, “perfect” decisions are not possible and should not be sought after.
There will always be one more source of information that would have helped to make a better decision. For this reason, organizations should focus on satisficing decisions (i.e., satisfactory decisions) instead of chasing perfection (Karr-Wisniewski & Lu, 2010). It is better for managers to make the best decision they can at the time, with the data that is available, and move on. Using the research and considerations above, managers can begin to improve the function of Big Data within their organization.
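As a concrete illustration of satisficing rather than optimizing, the minimal sketch below accepts the first option that clears a “good enough” threshold instead of exhaustively comparing every option. The scoring function, threshold, and vendor data are hypothetical stand-ins of mine, not drawn from the cited research.

```python
# Minimal sketch of satisficing: accept the first option that meets a
# "good enough" threshold instead of searching for the single best option.
from typing import Callable, Iterable, Optional, TypeVar

T = TypeVar("T")

def satisfice(options: Iterable[T], score: Callable[[T], float], threshold: float) -> Optional[T]:
    """Return the first option whose score clears the threshold, or None if none does."""
    for option in options:
        if score(option) >= threshold:
            return option  # stop searching: this decision is satisfactory
    return None

# Hypothetical usage: pick the first vendor quote that meets a minimum score.
quotes = [{"vendor": "A", "score": 6.1}, {"vendor": "B", "score": 7.8}, {"vendor": "C", "score": 9.2}]
chosen = satisfice(quotes, score=lambda q: q["score"], threshold=7.5)
print(chosen)  # {'vendor': 'B', 'score': 7.8}: good enough, without exhaustive comparison
```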
References
Buchanan, J., & Kock, N. (2001). Information overload: A decision making perspective. In Multiple criteria decision making in the new millennium (pp. 49-58). Springer, Berlin, Heidelberg.
Cognitive overload. (n.d.). In APA Dictionary of Psychology. Retrieved from https://dictionary.apa.org/cognitive-overload
Eppler, M. J., & Mengis, J. (2004). The concept of information overload: A review of literature from organization science, accounting, marketing, MIS, and related disciplines. The Information Society, 20(5), 325-344.
Friedman, T. L. (2016). Thank you for being late: An optimist’s guide to thriving in the age of accelerations. New York: Farrar, Straus, and Giroux.
Gladwell, M. (2005). Blink: The power of thinking without thinking. New York: Little, Brown and Co.
Janssen, M., van der Voort, H., & Wahyudi, A. (2017). Factors influencing big data decision-making quality. Journal of Business Research, 70, 338-345.
Karr-Wisniewski, P., & Lu, Y. (2010). When more is too much: Operationalizing technology overload and exploring its impact on knowledge worker productivity. Computers in Human Behavior, 26(5), 1061-1072.
Kirsh, D. (2000). A few thoughts on cognitive overload. Intellectica, 1(30).
Mims, C. (2021, September 11). The way Amazon uses tech to squeeze performance out of workers deserves its own name: Bezosism. The Wall Street Journal. Retrieved from https://www.wsj.com/articles/the-way-amazon-uses-tech-to-squeeze-performance-out-of-workers-deserves-its-own-name-bezosism-11631332821
Morgan, G. (1998). Images of organization: The executive edition. Thousand Oaks, CA: Sage Publications.
Stewart, M. (2009). The management myth: Debunking modern business philosophy. New York: W. W. Norton & Company.