Several years ago, while I was visiting a museum dedicated to the Civilian Conservation Corps (CCC), a docent remarked, “Today’s young people wouldn’t be interested in doing this kind of work.” The CCC, established in 1933, enlisted nearly three million unemployed young men to plant trees, construct trails and bridges, and build lodges and recreational facilities. Many of them came from farms where they’d labored from sunup to sundown. Working just 40 hours a week for the CCC, which also provided housing, clothing, medical care, and three meals a day, was considered laid-back.
Fast forward 85 years: While most CCC projects are still in use, what constitutes “work” has evolved dramatically, with technology playing a greater role in automating manual tasks and complementing uniquely human traits like creativity, intuition, flexibility, and judgment. This “man-with-machines” evolution is known as Worker 4.0.
With any drastic change comes trepidation. Worker 4.0, however, isn’t all gloom and doom. Let’s first examine previous industrial revolutions, then explore how tomorrow’s work will depend more on human intelligence and innovation than on brawn and repetition.
Industrial revolutions transform work
The rise of Worker 4.0 is tied to the Fourth Industrial Revolution, commonly known as Industry 4.0 or 4IR. At the end of the 18th century, the introduction of mechanical machines powered by water and steam ushered in the First Industrial Revolution. Electricity was the trigger that set off the Second Industrial Revolution, which focused on mass production.
The advent of electronics and information technology (IT), starting in the 1960s, inaugurated the Third Industrial Revolution and laid the groundwork for today’s cyber-physical systems, the underpinnings of Industry 4.0. Over the course of these revolutions, work became easier, with manual strength increasingly replaced by machines and the tedium of assembly lines morphing into decoupled, fully flexible, and highly integrated manufacturing systems.
Industry 4.0 connects the “internet of things” (IoT) with manufacturing techniques that enable systems to share, analyze, and visualize information in order to guide intelligent actions. Instead of machines simply doing a single process in a regimented manner, they can employ cognitive technologies, such as artificial intelligence (AI), to independently guide their functionality.
Uniquely human traits aren’t easily replicated
In November 2016, Sun Microsystems co-founder Vinod Khosla boldly asserted that 80 percent of jobs in an IT department could be replaced by AI-type systems. While it’s easy to conclude that machines will eventually replace humans, the reality is that machines are built and trained by humans and lack uniquely human traits such as intuition, imagination, determination, and judgment.
A grocery self-checkout terminal illustrates the limitations of machines. The terminal works flawlessly when a bar code or price look-up (PLU) code is scanned. However, today’s terminals can’t even discern that an unmarked item is an orange, let alone whether it’s organic, a blood orange, or a mandarin. A person, by contrast, can easily look up and enter the correct code.
While large volumes of data can be used to teach machines via AI, humans are better able to piece together unrelated information and use judgment to more rapidly arrive at astute conclusions. For example, in the summer of 2007, after its mortgage business had suffered 10 days of losses, Goldman Sachs questioned the sophisticated risk-assessment models used by the foremost financial institutions. The firm assembled a team to examine the details of its business and concluded that mortgage-backed securities exposed it to too much risk. Goldman took corrective actions that ran counter to the computer-generated models, thereby avoiding catastrophic losses during the 2008 financial crisis.
Humans will continue to have the upper hand
When contemplating the impact of automation and AI-based systems, it’s essential to keep in mind that humans are the fulcrum, determining how machines will be used and what lessons are fed into their decision-making processes and knowledge bases.
Consider robotic paint sprayers in car manufacturing plants. While they can consistently mask off and paint cars, they can’t independently maintain themselves or swap crimson red for electric red when new exterior colors are introduced.
Technology is both transforming work and creating new types of workers. Today’s kindergarteners are the first generation to grow up teaching themselves to use digital devices, usually before they can read or write. To them, technology is the norm, to be embraced rather than questioned.
As they enter the workforce, they will seek jobs that offer the creativity and freedom to craft outcomes, influence decisions across organizations, and collaborate in large ecosystems. Fluent in mobile and social platforms, they will bring expectations of immediate results into the workplace. Machines and AI systems will be designed not to replace workers but to foster efficiency and further support the development and personalization of products and services.
What’s manufactured, placed on shelves, and conceptualized won’t be driven by predictions, but by real-time demand. Just-in-time technologies like 3D printing enable the creation of one-off products, which are designed, planned, scheduled, and managed by people.
The future holds higher value and safer jobs
Because repetitive and traditional manual activities will be automated or digitized, work will become safer. The use of wearables will enable real-time tracking of a worker’s vitals and location to speed up responses to accidents and health emergencies like heart attacks or diabetic shock. Wearables can also send real-time alerts to encourage workers to modify their activities when their bodies are stressed, inactive too long, or experiencing a health issue.
Jobs will become more flexible, with the potential to work from anywhere at any time and to collaborate with workers in different locations and countries. According to the American Community Survey (ACS), regular work-at-home has grown 140 percent since 2005. Not only are more people working from home, but there’s also an increase in self-employment, which grew 43 percent between 2005 and 2016. Compared to five years ago, 40 percent more U.S. employers offer flexible workplace options.
The ability to work from home, coupled with evolving work roles, has ushered in the on-demand, or gig, economy. It lets workers contract with multiple employers to complete short-term, temporary engagements while enjoying the autonomy to set their own hours, strengthen their skills, and pursue personal interests. Intuit predicts that by 2020, 40 percent of American workers will be independent contractors.
The attributes that make contractors and freelancers appealing to employers – creativity, agility, resourcefulness, flexibility, and, most of all, experience in a breadth of industries and applications – are the same traits that are shaping Worker 4.0.
Man must leverage and complement technology
Tomorrow’s workers will be more connected and empowered to make informed decisions based on data gathered from IoT-enabled devices. Decision-making will be more streamlined, with increased visibility of real-time operations.
The growth of IoT has the potential to drive an increased demand for jobs focused on developing, maintaining, analyzing, and reporting insights from connected devices.
In the same vein, new jobs will emerge that require a human touch and can’t easily be automated. While an AI-enabled machine can identify cancer cells, it can neither comfort a patient nor visualize next-generation services based on changing consumer attitudes. It can’t patiently teach a child to read, design an innovative building, write and design an annual report, or formulate corporate policies.
The key to matching Worker 4.0 jobs with today’s and tomorrow’s workers lies in developing education and training initiatives that provide the skills and pathways necessary to fully realize each person’s potential. It lies in letting go of the perception that machines are replacing humans and in recognizing the need for lifelong learning that allows workers to evolve with technology.