Workplace Culture

Workforce Management Looks Back at Workplace History (1920s-1970s)

By Staff Report

Jun. 28, 2012

In many respects, we take today’s focus on workplace safety for granted. Sophisticated computer models test machines and equipment before they’re ever placed on an assembly line or put into service. High-tech sensors monitor and gauge working conditions and prevent industrial accidents. And a 24-7 news and information stream ensures that the public knows about any major accident or problem. There’s never been a greater emphasis on occupational safety and health.

But it hasn’t always been this way. In the 1920s, workplace injuries and deaths were common and, in many cases, labor conditions were nothing less than grueling. Movies played up unsafe conditions, including silent-film comedian Harold Lloyd’s iconic 1923 picture Safety Last!—where a worker is seen dangling perilously from the hands of a large clock near the top of a 12-story building. Government regulations were nearly nonexistent, workers’ compensation was still largely voluntary and labor unions hadn’t yet emerged as a significant force. “It was a different era with entirely different thinking,” says Tom Leamon, adjunct professor of occupational safety at the Harvard University School of Public Health.

Over the past century, changes in occupational safety have benefited society by radically reducing accidents and deaths. In 1913, the U.S. Bureau of Labor Statistics documented approximately 23,000 industrial deaths among a workforce of 38 million—a rate of about 61 deaths per 100,000 workers. Although the reporting system has changed over the years, the figure dropped to 37 deaths per 100,000 workers by 1933 and 3.5 per 100,000 full-time-equivalent workers in 2010.

Injuries and deaths continue to decline across many categories, including transportation-related accidents, fatal falls, and incidents at warehouses, industrial sites and construction sites. “As a society, we have become a lot more conscious and focused on safe working environments,” says Jack Glass, principal consultant at J. Tyler Scientific Co., a Tabernacle, New Jersey-based environmental consulting and services firm. “We also witnessed remarkable advances in ergonomics and achieved a far greater understanding of what safety is and how to achieve it.”

“Although there’s no way to reduce risk to zero and there are still steps that industry can take to continue to make workplaces safer, the last century has brought enormous changes in thinking as well as our … ability to design work environments that minimize risks,” Leamon says.—S.G.


To understand the 1930s and the origins of the Great Depression, you have to look back to the 1920s. In the Roaring ’20s, the United States went through a massive economic boom led by technology as a vehicle to bring conveniences to everyday life — from the emergence of radios that brought entertainment and news to people’s living rooms to the mass production of automobiles that allowed them to travel more efficiently or take a leisurely Sunday drive. Urbanization took hold and skyscrapers were built to show off the country’s strength and prosperity.

Some private companies and the public sector offered pension plans to help employees keep paying the bills after retirement, but who could be jazzed about retirement planning with The Charleston playing on the phonograph? With a prevailing carpe diem attitude and a soaring stock market, it must have felt like the party would never end.

For the U.S. and much of the world, the stock market crash in October 1929 ended the good times and brought about a Great Depression that would plague the country throughout the 1930s and lead to an unemployment rate of 37 percent. But for those people lucky enough to be employed in the 1930s, private pension plans, while still rare, survived. From 1929 to 1932, only 3 percent of workers with company pensions saw their plans discontinued, and, somewhat surprisingly, the number of company pension plans increased by 15 percent.

During this period, most elderly people had little means of support. Millions of Americans who saw their life savings swept away during the Depression were becoming more aware of a need to provide for their future economic security.

It’s no wonder that U.S. workers became increasingly interested in defined benefit retirement plans. “It’s hard for us to imagine the 1930s today, because what happened then would never happen now, because we have institutional safety nets,” says Randall Holcombe, DeVoe Moore professor of economics at Florida State University in Tallahassee. “If you lost your job, you had to rely on your family, and there was no guarantee of their financial security.”

Today, “Defined benefit plans are dead,” says Bob Pearson, CEO of Pearson Partners International in Dallas. “No company I know offers them even as a means to attract senior executives.”

Pearson, who works with small employers on their retirement plan offerings, says that today’s mainstream retirement device, the 401(k) plan, is not the answer. “For now, we can use stock options, grants, equity and cash, but once the war for talent heats up again … and it will … we will see improvements in long-term benefits from the current anemic state,” Pearson says.—L.B.


For veterans returning from the devastation they had witnessed during World War II to the jubilance and “normalcy” that awaited them at home, the world must have felt like their oyster. Soldiers came back to heroes’ welcomes and ticker-tape parades, as if they had conquered the world.

What might not have been top-of-mind for those veterans was the job market that awaited them. Much like today’s military personnel who leave behind the wars in Iraq and Afghanistan, World War II vets returned home to financial uncertainty. That economic anxiety grew out of not-so-distant memories of the Great Depression, just as today’s economy is shadowed by the ghost of the Great Recession. In both the ’40s and today, the return of military personnel from service has created challenges for employers, policymakers and the soldiers themselves.

Assimilating veterans back into society has always been a challenge. Over the past several decades, members of the armed forces have returned from a variety of wars and conflicts. Along the way, they have faced different political, social and economic situations.

“Each generation of veterans is defined by the era in which they served,” says Glenn Altschuler, Litwin Professor of American Studies at Cornell University. However, no period had a greater influence on the military and society than World War II.

Although World War II revved up the U.S. economy and lowered unemployment, it also raised concerns about what would happen to the 15.7 million veterans after the war ended. “The sheer magnitude of returning servicemen prompted concerns about the impact on the economy and the possibility of another depression,” Altschuler says.

It was clear that some type of government program was required, and a vigorous debate followed between Democrats and Republicans about how best to address America’s changing needs. On June 22, 1944, President Franklin Delano Roosevelt signed into law the Servicemen’s Readjustment Act, better known today as the GI Bill. The Veterans Administration, as it was known then, was charged with carrying out the law’s key provisions.

In 2012, as the 2.8 million men and women who have served since 2001 leave military service and return to civilian life, the United States is again facing important decisions about how to get veterans back to work. To ease the transition, President Barack Obama has proposed a number of changes for veterans returning from service. He supports incentives for hiring veterans as police officers and firefighters as well as putting veterans to work restoring land and resources through a Veterans Job Corps program.

Make no mistake, veterans have played a key role in shaping the workplace and serving as a backbone for the U.S. economy. Meanwhile, the GI Bill—in all of its iterations—has not only put military personnel back to work but also profoundly affected the direction of the country. Peter Drucker once described the GI Bill as “the most important event of the 20th century.” He believed that it “signaled a shift to the knowledge society.”—S.G.


In 1958, Mel Bloom started working for the CBS owned-and-operated television station in Chicago. The young journalist was eager to work in the fast-growing medium. After all, almost 90 percent of U.S. households owned a television by then, a tenfold increase since the decade’s dawn.

Bloom and others his age became known as the conformist “company men.” Bloom’s peers have largely left the workforce by now. But their experience is a reminder of the way each American generation finds its way into the world of work.

Dubbed the “Silent Generation” by Time magazine, those born between 1925 and 1942 had their entrance into careers eased by a thriving economy. Consumers were not only buying televisions but also cars and other big-ticket items unavailable during wartime shortages. Fueling spending, the postwar population was growing at a historic pace. Builders raced to meet the demand for houses as new parents left cities for the plush parks and new schools of suburbs.

Some of the older workers in Bloom’s office grumbled about the new guys. After all, many of the newcomers weren’t old enough to remember the Great Depression and hadn’t fought in World War II.

“They talked about our generation not knowing what it was like to fight and struggle,” Bloom says. They were critical that “we never seemed to get excited or upset.”

In 2012, another large generation—this time, the millennials—is entering the labor market in a time vastly different from the one their grandparents faced during the Eisenhower era. And unlike the baby boomers, who pride themselves on working independently, millennials often ask first and make decisions later. Today, questions surrounding the aging of the workforce remain pressing. And four generations can be found in the labor force—more than ever before.

Meanwhile, retirement has moved out of reach for their baby boomer parents. Boomers increasingly gauge when to retire not by age but by personal savings. And collectively, they haven’t saved enough.

With boomers postponing their retirement, the talent pipeline has become clogged. Boomers are staying put in senior leadership posts, and Generation X sees little chance of advancing, at least anytime soon.

Forward-thinking companies are responding to the situation by exploring everything from special projects for Generation X to lateral moves for baby boomers.

Looking at just two of the four generations in today’s workplace—Generation X and the millennials—shows some of the differences that could give rise to these tensions. Generation Xers didn’t receive a lot of grooming and mentoring, says David Stillman, co-founder of generational consultancy BridgeWorks. “It was sink or swim,” he says. “Now they’re managing millennials who want endless collaboration, lots of group activities and tons of feedback. They’re clashing.”

The younger generation also seeks a different type of employer, says Neil Howe, president of the consulting firm LifeCourse Associates, who is credited with naming the millennial generation. They’re looking for “the perfect employer who will be their ally and take care of them.”—T.H.


Just before noon on Aug. 28, 1963, a quarter of a million people began slowly moving toward the Lincoln Memorial. They were mostly African-American, but they represented all creeds and colors of U.S. citizens. The March on Washington for Jobs and Freedom was the largest demonstration ever staged in the nation’s capital.

The Rev. Martin Luther King Jr. was the last speaker of the day. His speech laid emphasis on freedom, the freedom he dreamed would someday “ring from every village and every hamlet, from every state and every city.”

Unspoken, but not forgotten, was the march’s emphasis on jobs. As the civil rights bill languished in Congress, the marchers called for a major public works program to provide jobs to African-Americans, the passage of a law prohibiting racial discrimination in public and private hiring, and a $2 minimum wage. The dramatic photos and newsreels of the demonstration had the desired effect. Congress finally acted to approve the bill and, the following year, the Civil Rights Act of 1964 became law.

Despite that milestone, racial disparities and tensions continue to plague the workplace nearly 50 years later. To some observers, rules designed to prevent job discrimination have gone far enough if not too far. But others say America remains far from a color-blind economy in which all people have an equal chance to thrive. And some advocates see this as a perfect occasion for companies to lead the way to the final fulfillment of King’s dream.

Lawsuits and U.S. Supreme Court decisions have changed the employment landscape, particularly in Southern states, such as Alabama and Mississippi, which drew the nation’s attention during the civil rights movement. Segregated workplaces, particularly in government agencies, eventually became integrated, according to Morris Dees, co-founder and chief trial attorney of the Southern Poverty Law Center in Montgomery, Alabama.

“In the workplace across the South today, you’ve seen an enormous number of lawsuits integrate public employment,” Dees says. “Some of the better jobs that African-Americans hold, especially throughout the Southeast, are government jobs.”

But as far as we’ve come since those tumultuous times in the ’60s, there’s still a long way to go.

“We need to understand how difference as a negative is ingrained in us,” says Georgette Norman, director of the Rosa Parks Library & Museum in Montgomery.—S.G.H.


A new social movement took center stage in the 1970s. It followed the lead of the civil rights movement, as well as the mounting protests against the Vietnam War. In this volatile era, the women of the nation were determined that their voices be heard above the din of discontent.

“A woman needs a man like a fish needs a bicycle” was a popular slogan frequently used by activist Gloria Steinem. The phrase suggests an independence and stature for women that, four decades later, is still not fully realized.

“We take five steps forward and 10 steps back, but we try to keep moving forward and not get too discouraged,” says Nancy Kaufman, CEO of the National Council of Jewish Women, which supports social and economic justice for all women. “We really try to be advocates, and that’s what the women’s movement has been all about.”

Outspoken leaders of the women’s liberation movement, like Steinem and Betty Friedan, aimed to raise women up from home and work situations that they considered subjugation. And both forward-thinking college students and working women organized marches and protests for equal rights in the workforce. One of the more noteworthy rallies was the Women’s Strike for Equality, in which an estimated 150,000 women marched across the country in August 1970 to mark the 50th anniversary of the 19th Amendment, which gave U.S. women the right to vote.

“You’ve come a long way, baby,” was another popular saying of the 1970s; it originated in cigarette advertisements that acknowledged the giant strides of the women’s movement.

The women’s movement of the ’70s was in part a reaction against the happy-homemaker image often portrayed in television sitcoms of previous decades.

Today, women make up nearly half of the U.S. labor force. While 70 percent of families in 1960 had a stay-at-home parent, now 70 percent of families have either both parents working or a single parent who works. In two-thirds of all households, women are either the main breadwinner or a co-breadwinner, according to the Center for American Progress. In 40 percent of all households, women are the only wage earners. Yet on average, women in the workplace earn 20 percent less than men doing comparable jobs.

As for the disparity between wages earned by men and women, there was slow but steady improvement in closing that gap after the 1963 Equal Pay Act became law. But once women’s earnings passed 70 cents for every dollar earned by men in 1990, progress began to sputter. The issue has not lacked attention. Indeed, it has its own unofficial holiday, April 17, which marks how far into the new year a woman must keep working to earn what a man earned by the end of the previous year.—S.G.H.


