Chapter 1: The How and Why of Generations
In the Bay of Bengal between India and Myanmar lies North Sentinel, an island about the size of Manhattan.
In 2018, a 26-year-old American paid a group of fishermen to take him there. He was never seen again.
North Sentinel is the home of one of the last groups of humans isolated from the rest of the world. Outsiders have visited over the centuries, including a group of anthropologists between the 1960s and 1990s, but the tribe has made it clear they want to be left alone. Boats and helicopters that get too close are greeted by tribesmen waving spears and bows, and the few lone outsiders who have ventured there have been killed, leading India to ban boats from traveling within a three-mile radius of the island. Although the tribe uses metal from shipwrecks for their weapons, they have no modern technology. Their day-to-day lives today are, in all likelihood, barely different from how they were two hundred years ago.
As a result, parents on North Sentinel are not shooing their kids off video games and telling them to go outside and play. Parents are not worrying that their teenage children are spending too much time on TikTok. They are hunting, gathering, and cooking over an open fire instead of picking the best Amazon Fresh delivery window. With no birth control, young women on the island have children at about the same age that their mothers, grandmothers, and great-grandmothers did. We can guess that cultural values have changed little; the North Sentinelese likely follow the same rules for communal living as their ancestors.
Not so in most of the rest of the world. New technologies have reshaped social interaction and leisure time, value systems have shifted from rigid rules and strict social roles to individual expression and an embrace of diversity, and the milestones of adolescence and adulthood are now reached much later than they were seventy years ago. A time traveler from 1950 would be shocked that same-sex marriage was legal—and then they’d probably faint after seeing a smartphone.
The breakneck speed of cultural change means that growing up today is a completely different experience from growing up in the 1950s or the 1980s—or even the 2000s. These changes have an impact: The era when you were born has a substantial influence on your behaviors, attitudes, values, and personality traits. In fact, when you were born has a larger effect on your personality and attitudes than the family who raised you does.
These differences based on birth year are most easily understood as differences among generations. Traditionally, the word generation has been used to describe family relationships—for example, a three-generation household includes grandparents, parents, and children. The word generation is now more commonly used to refer to social generations: those born around the same time who experienced roughly the same culture growing up.
The United States is currently populated by six generations: Silents (born 1925–1945), Boomers (1946–1964), Generation X (1965–1979), Millennials (1980–1994), Generation Z (aka iGen or Zoomers, 1995–2012), and an as-yet-unnamed generation born in 2013 or later (I call them Polars; some marketers have called them Alphas). Generations aren’t just an American phenomenon; most other countries have similar generational divisions, though with their own cultural twists.
Not that long ago, it was difficult to determine whether and how generations differed from each other, even on average. More than one pundit has complained that musings on generations occasionally resemble horoscopes. They have a point: Many books and articles on generational differences are long on subjective observations but short on hard data. Others poll a small segment of people and attempt to draw broad conclusions. With the age of Big Data upon us, that no longer needs to be the case. In these pages, you’ll find the results of generational analyses spanning twenty-four datasets including thirty-nine million people—nearly as many people as live in California, the most populous state in the U.S. With so much data, it’s possible to get a better understanding of generational differences than ever before.
Appreciating generational differences is crucial for understanding family relationships (Why is my teen always on her phone? Why do my parents not know what nonbinary is?), the workplace (Why are younger employees so different? Why does my boss think that way?), mental health (Which generations are more likely to be depressed, and why?), politics (How will each generation vote as they grow older?), economic policy (Are Millennials actually poor?), marketing (What does each generation value?), and public discourse (Why are more young people so negative about the country? Is putting your pronouns in your email signature just a fad?). These questions capture just a few of the reasons why generations are endlessly discussed online. At a time when generational conflict—from work attitudes to cancel culture to “OK, Boomer”—is at a level not seen since the 1960s, separating the myths from the reality of generations is more important than ever.
Studying the ebb and flow of generations is also a unique way to understand history. Events such as wars, economic downturns, and pandemics are often experienced differently depending on your age. Having Dad at home because he was laid off during the recession might be fun for the kids but terrifying for Dad. However, history is not just a series of events; it’s also the ebb and flow of a culture and all that entails: technology, attitudes, beliefs, behavioral norms, diversity, prejudice, time use, education, family size, divorce. What your grandmother called “living in sin” is today’s accepted unmarried partnership. What a teenager now considers entertaining (Instagram scrolling) is very different from what her parents considered entertaining when they were teens (driving around with their friends).
Generational differences also provide a glimpse into the future. Where will we be in ten years? Twenty? Because some traits and attitudes change little with age or change in predictable ways, the data—especially on younger people—can show us where we are going as well as where we are. Although people continue to change throughout their lives, our fundamental views of the world are often shaped during adolescence and young adulthood, making the younger generations a crystal ball for what is to come.
I’ve spent my entire academic career—more than thirty years—studying generational differences. It all began when I noticed something odd while working on my college honors thesis in 1992: College women in the 1990s scored as significantly more assertive and independent on a common personality test than their counterparts in the 1970s. But this was at the University of Chicago, where everyone is a little weird, so I thought it might just be a fluke. After getting the same result the next year with undergraduates at the University of Michigan (who were considerably less weird), I realized there might be something more systemic going on. A few months of library work later, I’d found a steady rise in college women’s self-reported assertiveness and independence across 98 psychology studies from 1973 to 1994—a result that made perfect sense given the shift in women’s career aspirations over that time. I’d documented my first generational difference.
Over the coming years, I would gather studies from scientific journals ensconced on dusty shelves, finding generational differences in personality traits, self-views, and attitudes. By the mid-2000s, large, nationally representative datasets became accessible online, including the results of huge surveys of young people conducted across the country since the 1960s. Other sources of data, like the Social Security Administration database of baby names and Google’s huge database on language use in books, both of which draw from data going back to the 1800s, appeared online as well, giving additional glimpses into how the culture was changing.
Seeing big shifts in self-confidence, expectations, and attitudes around equality, I wrote a book on Millennials, called Generation Me, in 2006. When optimism plummeted and teen depression rose during the smartphone era, I wrote a book on Generation Z, called iGen, in 2017. But as I traveled the country giving talks about iGen, managers, parents, and college faculty would ask, “But hasn’t new technology affected all of us?” Or they’d want to know, “Do other generations also look different now from before?” This book is the answer to those questions—and to many others about Silents, Boomers, Gen X, Millennials, Gen Z, and Polars.
To begin, let’s consider two broader questions. First, what causes generational differences? And second, how can we discover the actual differences among generations?
What Causes Generational Differences?
Unlike the more static culture of a place like North Sentinel Island, modern societies are always changing. Cultural change leads to generational change as each generation effectively grows up in a different culture. But which specific cultural changes are the most responsible for generational differences?
The classic theories of generational change focus almost exclusively on just one aspect of cultural change: major events. In the 1920s, Karl Mannheim wrote that “generation units” who experienced the same events while they were young were bonded by common experiences. In the 1970s, sociologist Glen Elder found that people who experienced the Great Depression as children were different from those who experienced it as adults. In the 1990s, William Strauss and Neil Howe theorized that American generations cycled through four different types, with each type in a particular life stage when the country was seized by major events, such as the Civil War or World War II; for example, the GI or “Greatest” generation born 1901–1924 was the “civic” type, perfect for rising adults leading the country through war. Many presentations and books on generations start with a list of the events each generation experienced when they were young, like the Vietnam War for Boomers; fears of nuclear war with Russia for Gen X; September 11, 2001, for Millennials; and the COVID-19 pandemic for Gen Z.
Major events can certainly shape a generation’s worldview. Those who lived through the Great Depression, for example, were often frugal for the rest of their lives. However, this view of generations as shaped by cycles of events misses the rest of cultural change—all the ways in which life today is so different from life twenty years ago, fifty years ago, or one hundred years ago. A hundred years ago, household tasks like laundry and cooking took so much time and effort that much of the population could do little else. As recently as the 1990s, publicly sharing an opinion on politics meant physically attending a protest or writing a letter to the editor and hoping it got printed; it now involves a few keystrokes on a smartphone to create a post on social media. In much of the U.S. in the mid-20th century, Whites accepted racial segregation as normal, while today it is considered morally repugnant. The average woman born in 1930 ended her education with high school, married at 20, and had two kids by 25, while the average woman born in 1990 went to college and was unmarried with no children at 25. These cultural changes were not caused solely by major events—for one thing, they are linear, moving in roughly the same direction year after year, rather than cycling in and out like recessions or pandemics.
So what is the root cause of these cultural changes—and thus the root cause of generational differences? It should be something that keeps progressing year after year, and something with a big impact on day-to-day life. The strongest candidate is technology.
Technology has completely changed the way we live—and the way we think, behave, and relate to each other. Unlike the ebb and flow of wars, pandemics, and economic cycles, technological change is linear. The mode may change (say, from TV sets to streaming video), but technology keeps moving in roughly the same direction: easier, faster, more convenient, more entertaining. Technology and its aftereffects—on culture, behavior, and attitudes—have broken the old cycles of generations to form something novel. This model—let’s call it the Technology Model of Generations—is a new theory of generations for the modern world.
Technology isn’t just tablets or phones. The first humans to make controlled fire, invent the wheel, plant crops, or use written symbols were using technology (defined as “science or knowledge put into practical use to solve problems or invent useful tools”). Today, technology includes everything that makes our modern lives possible, from medical care to washing machines to multistory buildings. Large cities, with many people living close to each other, are not sustainable without modern architecture, sanitation, and transportation, all things made possible by technology. Our lives are strikingly different from the lives of those in decades past, primarily due to the technology we rely on. That’s why it’s reasonable to guess that the culture on North Sentinel Island is much the same now as it was a hundred years ago: the people of North Sentinel have experienced very little technological change.
On the surface, many cultural changes don’t seem related to technology at all. What does same-sex marriage have to do with technology? Or the shift from formal to casual clothing in the workplace? Or the trend toward having children later in life? In fact, each of these cultural changes is, ultimately, due to technology—via a few other intervening causes (we’ll come back to these questions later).
Technological change isn’t just about stuff; it’s about how we live, which influences how we think, feel, and behave. As just one example, the technological change of agriculture about ten thousand years ago completely transformed the way humans lived, with downstream effects on cultural attitudes and beliefs. With more stable homesteads, personal property became more important and societies of more people became possible, resulting in a more collective mindset and more emphasis on following rules. While hunter-gatherers lived in small groups, agriculture led to larger towns and eventually complex societies that required more structure and cooperation. In more recent times, certain technological developments have ultimately led to behavioral and attitude changes far beyond the device itself (see Figure 1.1).
Figure 1.1: Examples of the wide-reaching effects of technological advancements
Television: Immediate experience of events; exposure to other regions and cultures; decline of reading; materialism
Home appliances (microwaves, washing machines, refrigerators): Ability to live alone; women pursuing careers; increase in leisure time
Air-conditioning: Population growth in the U.S. South and West; fewer people socializing outside
The birth control pill: More premarital sex; lower birth rate; women pursuing careers
Computers: Increase in skills and education necessary for many jobs; rise in work productivity
The internet: Instant access to information; decline of newspapers; ability to filter news to preferences
Social media: Ability to reach large social network; decline in face-to-face social interaction; political polarization
Technology also contributes to many of the major events prized in classic generational theories. Consider airplanes, a key technological development of the 20th century. Airplanes played a role in at least four major events of the last one hundred years: World War II (where planes were used in combat, including dropping the first nuclear bomb), 9/11 (where planes were used as weapons), and the AIDS and COVID-19 pandemics (where both viruses spread via airplane travel).
A classic anecdote relates the story of an anthropologist gathering origin stories from hunter-gatherer tribes. One elder says the earth rests on the back of a giant turtle. “But what does the turtle rest on?” the anthropologist asks. “Oh,” says the elder. “It’s turtles all the way down.” The story evokes the image of a chain of turtles, with the smallest at the top and each turtle below a little bigger as the chain fades down into infinity. Although meant to illustrate the limitations of origin stories, the idea of turtles resting on progressively larger turtles has always reminded me of the search for ultimate causes of phenomena: Each cause leads to another below it, in an endless chain of turtles, making it difficult to see what is really causing things to change.
Sometimes, though, the chain does have an ultimate origin. For generational differences, that origin is technology. Technology does not always cause generational differences directly—there are intervening causes as well, which we can think of as daughter turtles resting on the back of the big mother turtle of technology. Two of these intervening causes are individualism (more focus on the individual self) and a slower life trajectory (taking longer to grow to adulthood, and longer to age). A modern theory of generations can be modeled this way (see Figure 1.2), with technology as the root cause of the intervening forces of individualism and a slower life and a side role for major events. Technological change is the mother turtle, individualism and a slower life are the daughters, and major events are friends of the family that show up every once in a while.
This model is not completely comprehensive—there are certainly some causes of generational differences not included here, like income inequality—but it captures the strongest influences. Along with the direct impacts of technology, individualism and a slower life trajectory are the key trends that define the generations of the 20th and 21st centuries.
Figure 1.2: The Technology Model of Generations
Notes: Major events include wars, terrorist attacks, economic cycles, pandemics, natural disasters, crime waves, impactful people, and other factors.
Daughter Turtle 1: Individualism. Individualism, a worldview that places more emphasis on the individual self, is often discussed in the context of world cultures. Individualistic cultures such as the U.S. value freedom, independence, and equality, while more collectivistic cultures such as South Korea instead value group harmony and rule-following.
Levels of individualism also vary over time. Two hundred years ago—say, in early 1800s Regency era England, when Jane Austen’s novels take place—behaviors and life choices were heavily constrained. Gender, race, and class were destiny. Many boys entered the same profession as their fathers. Nearly all upper-class women married by age 25 and had children; those of the lower classes married or became servants. Lower-class men and all women could not vote, and slavery was legal. There were some individual freedoms, particularly for upper-class men, but even those men were required to follow strict rules for dress, speech, and behavior. The culture strongly promoted the idea that individuals should sacrifice for the greater good, with, for example, young men expected to fight in the military if they were asked.
Over the decades, these social rules began to fall. By the 1960s and 1970s the highly individualistic world we know today had begun to emerge in many countries around the world: Personal choice was paramount, the U.S. military became an all-volunteer force, and “do your own thing” became a mantra. Sacrificing for the greater good was less prized. Treating people as individuals means setting aside the idea of group membership as destiny, which gave rise to movements for individual rights based on gender, race, and class, enshrining equality as a core value of the culture.
With so much reliance on the self, it was important that people feel good about themselves, so viewing the self positively received more emphasis. Between 1980 and 2019, individualistic phrases promoting self-expression and positivity became steadily more common in the 25 million books scanned in by Google (see Figure 1.3; you can try this database yourself by googling “ngram viewer”). Assuming spoken language mirrored written language, Boomers growing up in the 1950s were only rarely told “just be yourself” or “you’re special,” but Millennials and Gen Z’ers heard these phrases much more often. Writing “I love me” would have garnered questions about tautology and perhaps onanism in 1955, but was an accepted expression of high self-esteem by the 2000s.
Figure 1.3: Use of individualistic phrases in American books, 1950–2019
Source: Google Books database
Notes: Shows the percentage of each phrase in all books published in that year. Percentages are smoothed across three years. The scale has been adjusted for some phrases by factors of 10 so they can appear in the same figure; the phrases are not actually equally common.
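The three-year smoothing mentioned in the notes is just a centered moving average. Here is a minimal sketch in Python; the yearly frequencies are hypothetical placeholders, not Google’s actual counts:

```python
def smooth_three_year(values):
    """Centered three-year moving average; the first and last
    years average only the two years available to them."""
    smoothed = []
    for i in range(len(values)):
        window = values[max(0, i - 1): i + 2]  # up to 3 neighboring years
        smoothed.append(sum(window) / len(window))
    return smoothed

# Hypothetical frequencies (% of all words) of one phrase, 2015-2019
raw = [0.002, 0.004, 0.003, 0.005, 0.006]
smoothed = smooth_three_year(raw)
```

Smoothing like this evens out year-to-year noise in the counts, which is why curves in ngram-style figures look less jagged than the raw data.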
Two important caveats are worth mentioning. First, neither individualism nor collectivism is all good or all bad. They both involve trade-offs, and whether one judges the outcomes of either system as good or bad is heavily influenced by which system you were raised in. For example, is it good or bad that Western societies have become more accepting of single parents? Your answer partially depends on whether you lean toward individualism or collectivism. In general, individualism has the advantage of more individual freedom and choice, and the downside of more social disconnection; collectivism offers less choice but tighter social connections.
Second, it’s important not to conflate individualism and collectivism with political ideologies—they are not the same. Conservatism embraces some aspects of individualism (favoring light regulation of the individual by government) and some aspects of collectivism (emphasizing family and religion). Liberalism prizes individualism’s insistence that race, gender, and sexual orientation should not restrict rights or opportunities, but also supports collectivistic social policies such as government-funded health care. Thus, it’s best to think of individualism and collectivism as cultural systems, not political ideologies. The one possible exception is libertarianism, a political philosophy that takes some views from the liberal basket and others from the conservative one and overlaps with individualism to a good degree. But individualism and collectivism are not proxies for Democrats and Republicans.
In the Technology Model of Generations, individualism is caused by technology. How? Technology makes individualism possible. Until well into the 20th century, it was difficult to live alone or to find the time to contemplate being special, given the time and effort involved in simply existing. There was no refrigeration, no running water, no central heating, and no washing machines. Modern grocery stores didn’t exist, and cooking involved burning wood. Those who could afford it hired servants to do the enormous amount of work involved, but the poor did it all themselves (or were the servants doing it for someone else). Daily living in those eras was a collective experience.
In contrast, modern citizens have the time to focus on themselves and their own needs and desires because technology has relieved us of the drudgery of life. Being able to hit the drive-thru at McDonald’s and get a hot meal in under five minutes is not an unmitigated good, but it’s a prime example of the amazing convenience of modern life and the flexibility it allows the individual. Or consider laundry: Instead of slaving over a hot cauldron for an entire day, often with a group of other people, you throw your clothes in a machine and go watch TV for forty minutes. Then you put your clothes in the dryer and watch more TV. Electric washing machines were not widely used until the late 1940s, and clothes dryers were not common until the 1960s. In 1940s rural Minnesota, my grandparents and their neighbors used outside clotheslines for drying. If there was an unexpected cold snap, the clothes would freeze solid.
Technology also made the middle class possible. With labor-saving devices decreasing the need for servants and farmworkers, more people could do other types of work, and most of that work paid better and allowed for more freedom. One of the great success stories of 20th century America was the emergence of a stable middle class. A society where most people consider themselves middle class (70% of Americans did so in 2017) is a fertile breeding ground for individualism, which posits that everyone is equal. That belief is easier to hold when daily chores require less time and thus less division of labor based on gender, race, and class.
Overall, technological progress shifted economies away from agricultural and household work, which required many people to work together collectively, to information and service work, which are often performed more independently. People still work together, but family farms and family businesses are less common. Large cities, which promote individualism as they allow people to live fairly anonymously without their behavior being monitored by everyone else (as is common in small towns), are made possible by technology. Technology also favors paid work that relies more on verbal and social abilities and less on physical strength, which brings more women into the workplace, promoting more gender equality.
More recent technological progress has also gone hand in hand with individualism. When people first bought TVs, they had one per family. TVs were so big they were often styled with wood on the outside like a piece of furniture. Then it became popular to have more than one TV in a house so members of the family could watch different things. Now each family member has their own phone or tablet, complete with streaming video and earbuds, so each person can watch exactly what they want to when they want to.
Technological change doesn’t always result in uniformly high individualism—for example, Japan is a collectivistic country immersed in technology. But individualism can’t exist without modern technology. Every individualistic country in the world is an industrialized nation, although not every industrialized nation is individualistic.
Let’s return to two of the questions we posed earlier: What does same-sex marriage have to do with technology? What about the shift from formal to casual clothing in the workplace? Both of these changes are rooted in technology’s daughter, individualism. Individualistic countries were the first to embrace equal rights for lesbian, gay, and bisexual (LGB) people, while more collectivistic countries still have not. Same-sex marriage is legal in the Netherlands and Canada but not in China or Saudi Arabia. The link between individualism and LGB rights also holds over time. As cultures grow more individualistic, they place more emphasis on individual choice and less on everyone being the same. For most of the 20th century, Western cultures shunned same-sex relationships because they were different, with these beliefs often intertwined with collectivistic religious tenets. Same-sex relationships also challenge the traditional social structure of male-female marriage and family-building that forms the basis of collectivistic societies. When families come in many shapes and sizes in an individualistic culture, however, LGB relationships are just another variation. LGB family-building is also directly impacted by technology, with assisted reproductive technology enabling gay and lesbian couples to have genetic children via intrauterine insemination, egg donation, and surrogacy.
Individualism also promotes equal treatment on the basis of gender, race, ethnicity, and transgender status. Individualism is at the root of the civil rights movement, of Black Lives Matter, of the feminist movement, of the gay rights movement, and of the transgender rights movement. It says: You are who you are, and you should be treated equally. The charming novel Nine Ladies, by Heather Moll, imagines the aristocratic Mr. Darcy from Jane Austen’s Pride and Prejudice time-traveling from 1812, when race, gender, and class were destiny, to 2012. He’s of course amazed by smartphones, airplanes, and restaurants, but the advice the born-in-1987 version of Elizabeth Bennet gives him most often is, “Remember, treat everyone equally.” Equality is one of the unifying themes of cultural change over the last one hundred years, making it one of the unifying themes of generational change.
The ascendance of casual clothing is a more trivial but tangible result of individualism. In the early 20th century, leaving the house usually meant a suit and hat for men and a dress and gloves for women—and often a tight girdle. People dressed this way even in their time off. Pictures of the crowd at baseball games in the 1950s reveal a sea of men wearing formal suits, ties, and hats—fedoras, not baseball caps. Tennis shoes are called that because people once wore them only to play tennis. During this era, the goal of clothing was to communicate status. Being respectable meant dressing a certain way to be presentable to others.
Individualism turns this around: The goal of clothing is for the person wearing it to be comfortable. It’s a material example of the individualistic advice that “you shouldn’t care about what other people think of you.” We still do, of course—otherwise we might go to the office naked, or wearing pajamas—but the balance between individual comfort and other-focused status-signaling has definitely started to tilt toward comfort.
Daughter Turtle 2: A slower life. Technology also leads to another cultural trend that’s had an enormous impact on how we live: taking longer to grow up, and longer to grow older. This trend isn’t about the pace of our everyday lives, which has clearly gotten faster, but about when people reach milestones of adolescence, adulthood, and old age, like getting a driver’s license, getting married, and retiring.
In my daughter’s desk drawer, there’s a picture of my maternal grandparents and four of their eight children, taken in the late 1950s. They stand outside their farmhouse in rural Minnesota. My grandmother wears a white-and-blue dress, my grandfather wears a suit and a beige fedora hat, and my mother and her siblings wear their Easter best, including small hats for my mother and aunt Marilyn, a bow tie for my uncle Mark, and a blue suit coat and movie-star pompadour for my uncle Bud.
Their lives, from childhood to old age, followed a different trajectory from today. My grandmother, born in 1911, went to school only until the 8th grade and married at 19. She had eight children over eighteen years (the youngest was born on the day of the eldest’s high school prom). In the picture, my grandmother is 47, but she looks like she’s in her mid-fifties or early sixties. My grandfather, born in 1904, went to school only through the 6th grade before he left to work on his family’s farm. He’s in his mid-fifties in the picture, but looks closer to retirement age.
Their children, born between 1932 and 1950, grew up doing work around the farm—milking the cows, mucking out the stalls, feeding the chickens, making the meals. They also had the run of the neighborhood. One of Uncle Bud’s favorite stories is about the time he and his brothers went skinny-dipping in the river and the neighborhood girls stole their clothes. I asked him how old he was when that happened and was surprised when he said, “eight or nine”—it’s hard to imagine many American kids with that kind of freedom now. It wasn’t just farm kids—my father, who grew up in a medium-sized city in the same era, described roaming the neighborhood with his friends when he was still in grade school, playing baseball in the summer and ice-skating in the winter. This was childhood in the mid-20th century: You had responsibilities, but you also had freedom. Mothers told their children to play outside as long as they were home by dinner; parents considered it normal for 8-year-olds to be gone, unsupervised by adults, for the entire day. In more recent decades, however, few children have this much independence. Even teens have their every move tracked by their parents via smartphone apps.
What changed? A model called life history theory gives some insight. Life history theory observes that parents have a choice: They can have many children and expect them to grow up quickly (a fast life strategy) or they can have fewer children and expect them to grow up more slowly (a slow life strategy).
The fast life strategy is more common when the risk of death is higher both for babies and for adults, and when children are necessary for farm labor. Under those conditions, it is best to have more children (to increase the chances that some will survive) and to have those children early (to make sure the children are old enough to take care of themselves before one or both parents dies).
In the late 1800s, an incredible 1 out of 6 babies died in their first year—so for every six women who had a baby, one would lose the child within a year. Infant mortality declined precipitously during the 20th century, but 1 out of 14 babies still died in their first year when the first of the Silent generation were born in 1925. When the first Boomers were born in 1946, 1 out of 30 babies died before reaching their 1st birthday (see Figure 1.4). Infant mortality did not dip below 1 out of 100 until 1988; in 2020, it had decreased to 1 out of 200.
Figure 1.4: Infant mortality rate, U.S. and Massachusetts, 1850–2020
Source: National Vital Statistics (CDC), Statistical Abstract of the United States, Colonial Times to 1957
Notes: Rate is out of 1,000 live births. Infant mortality refers to death in the first year of life. Massachusetts data shown for earlier years as records are available beginning in 1850 for that state, when national data is not available.
Child mortality was also higher.
At the beginning of the 20th century, 1 out of 10 children who reached their 1st birthday did not reach their 15th. By 2007, however, only 1 out of 300 Americans died in childhood. Deaths of children 5 to 14 plummeted more than 80% between 1950 and 2019. My mother’s family experienced this firsthand: My grandparents’ fifth child and first girl, Joyce, died at age 13 in 1954 of a kidney infection that would not have been fatal today.
The environment of the past was different for other reasons as well. Education took fewer years and lives were shorter, so development happened faster at each life stage. That meant more independence for young children; more working and dating for teens; marriage, children, and jobs for those in their late teens and early 20s; feeling old by 45; and death in one’s 60s. Average life expectancy in the U.S. did not consistently top 60 until 1931, did not reach 70 until 1961, and did not reach 75 until 1989 (see Figure 1.5; the huge downturn in 1918 was due to the double impact of the influenza pandemic and World War I, both of which killed many young people; the decline in 2020–2021 is due to the COVID-19 pandemic).
In the 21st century, infant and child mortality is lower, education takes longer, and people live longer and healthier lives. In this environment, the risk of death is lower, but the danger of falling behind economically is higher in an age of income inequality, so parents choose to have fewer children and nurture them more extensively. As an academic paper put it, “When competition for resources is high in stable environments, selection favors greater parental investment and a reduced number of offspring.” This is a good description of the U.S. in the 21st century: It is a stable (low-death-rate) environment, but also one with considerable competition for resources due to income inequality and other factors.
Figure 1.5: Life expectancy in years, U.S., 1900–2021
Source: National Center for Health Statistics
The result is a slow-life strategy, with lower birth rates, slower development, and more resources and care put into each child. Thus, children do fewer things on their own (fewer walk to school by themselves or stay at home alone), teens are less independent (fewer get their driver’s license or date), young adults postpone adult milestones (marrying and having children later than earlier generations), life stages once considered middle-aged tilt younger (“fifty is the new forty”), staying healthy past retirement age is the rule rather than the exception, and life expectancies stretch toward 80. The entire developmental trajectory has slowed down, from childhood to older adulthood.
These slower life trajectories are all ultimately caused by technology, including modern medical care (which lengthens life spans), birth control (allowing people to have fewer children), labor-saving devices (which slow aging), and a knowledge-based economy (which requires more years of education). Especially at older ages, the slowing is actually biologically quantifiable.
A recent study using eight biomarkers of aging found that 60- to 79-year-old Americans in 2007–2010 were biologically 4 years younger than the same age group in 1988–1994, and 40- to 59-year-olds were biologically 2 to 3 years younger.
An important note: Neither the slow- nor fast-life strategy is necessarily good or bad. Both are adaptations to a particular place and time, and both have advantages and disadvantages. The same is true of individualism, which also has upsides and downsides. This is a good caveat to keep in mind for the rest of the book: Just because something has changed over the generations does not make it bad (or good). Often, it just is.
The breakdown of generational cycles. These three influences—technology and its daughters individualism and a slower life—have fundamentally changed the culture and shaped each generation. Especially since World War II, these linear influences have been strong enough to overpower the previous generational cycles.
In their 1991 book, Generations, Strauss and Howe argued that major events caused generations to cycle through four different types (Idealist, Reactive, Civic, and Adaptive), with each type suited for their age during the event. For example, they predicted that Millennials, the young adults during the next big event, would resemble the Greatest (or GI) generation born 1900–1924, the Civic-type generation who were the young soldiers, officers, and factory workers during World War II. Using that same model, Gen Z would resemble the Adaptive-type Silent generation, who were kids and teens during the war and young marrieds in the postwar era.
Although some of Strauss and Howe’s predictions were eerily prescient—for example, they forecasted a major event would occur around 2020—the generations did not behave as predicted. If Millennials resembled the Greatest generation, for example, they would have come together collectively as one to face the challenge of the pandemic, relying on their strong sense of patriotic duty and rule-following. Primarily due to individualism, that’s not what happened. Instead, patriotism declined and rule-following was controversial. As for Gen Z resembling the Silent generation, Silents embraced traditional gender roles and married young. So far, Gen Z has done exactly the opposite. The strong influence of technology since the middle of the last century has seemingly broken the previous pattern of generational cycles.
Strauss and Howe are correct that American history goes through somewhat predictable cycles of stability followed by conflict; for example, their theory predicted that the late 2010s and early 2020s would be an unsettled time. If recent technological change has thrown off the generational types, however, the current generations may be ill-suited to the crisis at hand. Strauss and Howe argued that during previous crises, each generation had the traits they needed to lead the country through the calamity and out the other side. If generational personalities are now misaligned with the temperaments needed to help the country triumph over adversity, that may spell trouble for the coming years.
How Can We Discover the Actual Differences among Generations?
Technology has not only shaped generations but has made it possible to study them in more depth. Not that long ago, authors of books on generations described the events and demographics that impacted each group but were then forced to guess about what those events might mean for each generation’s attitudes, traits, and behaviors, often relying on anecdotes alone. One-time polls and surveys could assess people of different generations, but it was impossible to tell which differences were due to generation and which were due to age.
Now, however, we live in the era of Big Data, and a much sharper and more definitive picture is beginning to emerge. With large national surveys conducted across many decades, we can reach back in time to see the viewpoints of decades past, follow generations as they age, and compare young people in one era with those in another. We can see how generations really differ—based not on guesses, but on solid data collected in real time.
This book’s conclusions about generational differences are based on twenty-four datasets, some of which go back to the 1940s. They assess children, adolescents, and adults and include a staggering total of 39 million people (see Figure 1.6), considerably more than the combined population of the ten largest cities in the U.S. This is a significant upgrade from my previous book, iGen, which relied on four datasets including approximately 11 million people. These datasets allow us to hear each generation’s story through the voices of its members. That fulfills the primary goal of this book: To separate the myths from the realities of generational differences so we can understand each other better.
Nearly all of the datasets are nationally representative, meaning that respondents resemble the whole population in terms of gender, race/ethnicity, age, socioeconomic status, and region of the country. Most of the data is from the U.S., but other datasets were collected in countries around the world.
| Dataset | Population surveyed | Administered by | Number of People Included |
| --- | --- | --- | --- |
| National Health and Nutrition Examination Survey (NHANES) | Ages 2 and up | | |
| National Survey on Drug Use and Health (NSDUH) | Ages 12 and up | U.S. Dept. of Health and Human Services | |
| Monitoring the Future | 8th and 10th graders (ages 13–16) | University of Michigan; funded by National Institutes of Health | |
| Monitoring the Future | 12th graders (ages 17–18) | University of Michigan; funded by National Institutes of Health | |
| Youth Risk Behavior Surveillance System and Adolescent Behavior and Experiences Survey | 9th–12th graders (ages 14–18) | | |
| Health Behaviour in School-Aged Children (international) | 13- to 15-year-olds | World Health Organization | |
| Millennium Cohort Study (UK) | | University College, London | |
| Programme for International Student Assessment (international) | 15- and 16-year-olds | Organisation for Economic Co-operation and Development | |
| World Values Survey (international) | Ages 15 and up | World Values Survey Association | |
| American Time Use Survey | Ages 15 and up | Bureau of Labor Statistics | |
| Current Population Survey, Annual Social and Economic Supplement | Ages 15 and up | U.S. Census Bureau, U.S. Bureau of Labor Statistics | |
| American National Election Studies | | Stanford University and University of Michigan | |
| American Freshman Survey | Incoming college students (most ages 18–19) | | |
| Panel Study of Income Dynamics | | University of Michigan | |
| General Social Survey | | NORC, University of Chicago | |
| Behavior Risk Factor Surveillance System | | | |
| National Health Interview Survey | | | |
| Cooperative Election Study | | Harvard and YouGov | |
| Pew Research Center polls | | Pew Charitable Trusts | |
| CIVIQs polling company | | | |
| Household Pulse Survey | | U.S. Census Bureau | |
| Total respondents, all surveys | | | |
Figure 1.6: Source datasets
Notes: Numbers of people includes all those participating in the years used; exact sample sizes vary by question. Some datasets have earlier years or other age groups not used here. Most datasets were analyzed at the individual level; some (such as the Current Population Survey and the American Freshman Survey) were analyzed at the group (average) level.
Most of these datasets do not yield their secrets easily. Getting at the data involves downloading the datafiles, scouring them for variables of interest, merging them across years, recoding variables, running analyses, and a whole array of the data-analysis equivalent of sausage-making. Fortunately, crunching data is what I do for a living. With a few exceptions, you’re not going to be able to find the graphs in these chapters in a Google search or on a government web page; the analyses behind them are unique to this book.
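For readers curious what that "sausage-making" looks like in practice, here is a minimal sketch in pandas. Everything in it is invented for illustration: the wave names, the column names, and the codings are hypothetical stand-ins, and real survey files are read from disk and are far larger and messier.

```python
# Hypothetical sketch: merging two survey waves whose happiness item was
# coded differently across years, then comparing same-age respondents.
# All dataset names, variable names, and codings are invented.
import pandas as pd

# In a real project each wave is downloaded and read from a datafile;
# here two tiny in-memory "waves" stand in for them.
wave_1990 = pd.DataFrame({
    "year": 1990,
    "age": [18, 19, 45, 50],
    "happy": [1, 2, 3, 2],      # 1 = very happy ... 3 = not too happy
})
wave_2020 = pd.DataFrame({
    "year": 2020,
    "age": [18, 19, 45, 50],
    "HAPPY_R": [3, 2, 1, 2],    # same item, reverse-coded in this wave
})

# Recode the later wave onto the earlier wave's scale, then merge.
wave_2020 = wave_2020.rename(columns={"HAPPY_R": "happy"})
wave_2020["happy"] = 4 - wave_2020["happy"]   # reverse: 3->1, 2->2, 1->3
merged = pd.concat([wave_1990, wave_2020], ignore_index=True)

# Compare the same age group (18- to 19-year-olds) across survey years.
young = merged[merged["age"] <= 19]
print(young.groupby("year")["happy"].mean())
```

Harmonizing the coding before merging is the crucial (and tedious) step: without it, a shift in how a question was scored would masquerade as a generational change.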
What generational differences are fair game? Just about everything. These datasets cover sexuality, birth rates, political affiliation, income, time use, views about gender, life goals, drug and alcohol use, age at marriage, divorce, leadership roles, education, obesity, self-confidence, and desires for material things. They also delve into mental health and happiness. In his book Sapiens: A Brief History of Humankind, Yuval Noah Harari noted that historians rarely consider how technological progress impacts people’s happiness and well-being. That should end now: We need to understand not just how things have changed but the impact on the generations’ mental health.
Before we dive into what the data say about each generation, we should consider a few frequently asked questions about generations, and a few misconceptions.
Am I Still a Millennial If I Don’t Feel Like One?
We all belong to a generation. For some, it fits like a familiar glove, enclosing us in the warm, fleece-like protection of solidarity with our birth mates. For others, a generation is more like an itchy sweater, annoying in its overgeneralizations and misunderstandings of who we feel we are. For many, it is both: wonderful in its sense of common experience and support, but not as wonderful when it’s weaponized as an insult, as in the derisive use of “OK, Boomer,” the labeling of Gen X’ers as slackers, or the accusation that Millennials can’t afford houses because they spent too much money on brunch plates of avocado toast.
No one has a choice in the year they were born. Thus we belong to a generation whether we like it or not. As writer Landon Jones puts it, “A generation is something that happens to people; it is like a social class or an ethnic group they are born into; it does not depend on the agreement of its members.” Someone doesn’t have to know or care that they are a Millennial to have been influenced by the technology and culture present when the generation was growing up.
So even if you don’t feel like a Millennial, if you were born between 1980 and 1994, you are one. It’s true that these birth-year cutoffs are somewhat arbitrary—if you were born between, say, 1978 and 1982, you could argue that you are either a Gen X’er or a Millennial and have a point. In fact, some people born in this span have taken to calling themselves Xennials, a combination of Gen X and Millennials. Even though the cutoffs aren’t exact, it is clear that people have different experiences depending on the year they were born; it’s just a question of where you draw the line.
What if you don’t feel like a Millennial (or a Silent, Boomer, Gen X’er, or Gen Z’er) because your generation’s traits are not similar to yours? Not everyone will be a typical member of their generation, just as not all women are typical members of their gender and not all New Yorkers are typical New Yorkers. Like all group differences, generational differences are based on averages. For example, the average Gen Z teen spends more time online than the average Millennial teen did in 2005. Of course, some Gen Z teens spend little time online, and some Millennial teens spent a lot of time—there is considerable overlap between the two groups.
Just because there is an average difference doesn’t mean that everyone in the generation is exactly the same. When someone says, “But I’m a Gen Z’er and I don’t spend much time online—so I don’t think there’s really a generational difference,” they are committing what some call the “NAXALT” fallacy, for “Not All [X] Are Like That.” The NAXALT fallacy is the mistaken belief that because someone in the group lies at the extreme, the average does not exist. It’s like someone saying, “Seat belts save lives,” and her friend arguing back, “You’re overgeneralizing. I know somebody who got strangled by their seat belt.” Maybe so, but seat belts have saved many more lives than they have taken. The rare counterexample does nothing to disprove the average result, which is a much lower risk of death when wearing a seat belt than when not. Groups differ within themselves as well as between each other, but the group differences still exist.
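The averages-versus-individuals point can be made concrete with toy numbers (all invented): the two groups overlap, counterexamples exist in both directions, and the average difference is real anyway.

```python
# Invented daily online hours for a handful of people in each group,
# chosen so the distributions overlap heavily.
gen_z_hours = [2, 4, 5, 6, 8, 9]
millennial_hours = [1, 2, 3, 4, 5, 7]

def avg(xs):
    return sum(xs) / len(xs)

# The group averages differ...
print(avg(gen_z_hours), avg(millennial_hours))

# ...even though individual counterexamples exist: the least-online
# Gen Z'er is online less than the most-online Millennial.
print(min(gen_z_hours) < max(millennial_hours))   # NAXALT in action
```

The single `True` on the last line is the entire NAXALT fallacy: a real counterexample that does nothing to erase the gap between the two averages.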
Some people have argued that generational differences are just stereotypes. If someone is guessing about how one generation differs from another, yes, that’s a stereotype. But if generations really do differ from each other—say, in their average age at marriage, or how religious they are, or how self-confident they feel—it is not stereotyping to conclude that there are generational differences (that, for example, Millennials get married later, are less religious, and are more self-confident than previous generations). This isn’t stereotyping—it’s comparing groups using a scientific method. Plus, it’s interesting that people tend to cry “stereotyping” if the generational difference is seen as negative, but are more than willing to embrace it if it’s positive.
Even if the generational differences are verifiable, stereotyping can still occur if someone assumes that any individual person must be representative of his or her group. Someone who assumes every Millennial they meet got married in their 30s, is less religious, and is highly self-confident is stereotyping, because they are assuming every individual fits the average. However, such stereotyping is an error in interpretation, and not in the studies themselves. Finding a generational difference does not mean that everyone in the generation is the same, nor does it imply that other characteristics (like gender, race, or religion) don’t matter. They do, because people vary in many ways. Thus it’s not a valid criticism of generational studies to say that they “overgeneralize.” If a study finds, say, that Gen X’ers are more materialistic on average, that doesn’t mean all Gen X’ers are highly materialistic. Someone who assumes so is overgeneralizing, but the study itself is not.
Still, many people don’t feel like a member of their generation because they are not typical of the generation in their attitudes, traits, or behaviors. Even so, they are still influenced by their generation’s place in history. Consider this scenario. Ethan is 21, is not religious, and goes to college in a large city on the East Coast. He decides that he’d like to get married in the next year and start having children soon afterward.
If it were 1961, Ethan would have little trouble finding a young woman in his social circle who wanted to get married as soon as they graduated from college. His family and friends would be happy for him, and his choice would be considered normal. But if it were instead 2023, few young women around Ethan’s age in his social network will be thinking about getting married or having children in their early 20s. His friends and family will think he’s getting married too young and will try to talk him out of it. His desire to get married at his age would be considered odd, and he might not be able to find a partner to marry that young. Ethan has different desires than the typical Gen Z’er, but he is still impacted by being born in the 2000s.
Thus, generational change is not just about individual people changing; it’s about cultural norms shifting. Most Westerners have been trained to think of choices as stemming from personal preferences alone, and our behavior as impacting only ourselves. But we are all interconnected.
That’s important to keep in mind for a number of reasons. First, it means generational trends have an impact even if you’re the exception, even if you dislike the trends, and even if you’re not a member of the group advocating for change. Even if Ethan does find a young woman who wants to get married at 22 and have children at 23, the couple will probably be the only people in their peer group who have kids, making their experience different from young couples in the 1950s, who were surrounded by like-minded peers. The social equality movements of the last seventy years are another example. The feminist movement didn’t just bring more opportunities for women who marched or filed court cases—it changed the lives of women and men in future generations, most of whom did not consider themselves feminists but who work and parent very differently than their parents and grandparents did.
Second, our interconnected relationships mean the causes of generational changes aren’t centered just on individual behaviors but on group-level dynamics. The smartphone, introduced in 2007 and owned by a majority of Americans by 2013, is a good example. Smartphones are communication devices—they don’t impact just the individual user but their whole social network. As smartphones and social media became the pervasive norm, everyone was affected whether they used them or not. The whole social dynamic changed as communication shifted online and away from in-person meetings and phone calls. In-person interactions were interrupted by people looking at their phones. Spending a lot of time on social media meant you could see what everyone else was doing without you, but not using it at all meant you felt excluded from certain interactions. As a college first-year once told me, “You’re left out if you don’t use social media, and left out if you do.” Everyone is affected by the shift in the mode of social interaction whether they use these technologies or not. Similarly, everyone is a member of a generation whether they want to be or not.
Do Generations Exist at All? And Whose Fault Is It, Anyway?
The concept of generations has taken some hits recently. Several academics and writers have argued that generations “aren’t real” or are “just in your head.” In most cases, these writers are not saying that people live the same way now that they did fifty years ago. Instead, they take issue with the way generations divide people using birth-year cutoffs (say, that someone born in 1964 is a Boomer, but someone born in 1965 is a Gen X’er) and how books and articles on generations make broad generalizations about a heterogeneous group.
It is true that any generational cutoff is arbitrary—there is no exact science or official consensus to determine which birth years belong to which generation. Still, as you’ll see later in the book, there are often transitions around the birth year cutoffs, though they rarely take place instantly. That’s because people born right before and right after the cutoff have experienced essentially the same culture. But the line has to be drawn somewhere. It’s also true that generations are sometimes too broad: those born ten years apart but within the same generation have experienced a different culture. Still, too many micro-generations would be confusing and would make it harder to discern broad generational trends. I’ve tried to take a compromise position: Although the chapters are organized by generations, most of the graphs in this book are line graphs showing all of the years instead of bar graphs averaging everyone in the generation together—the transitions between generations and within generations also tell important stories.
Generational groupings are not alone in facing challenges. Just like city boundaries, the demarcation of 18 as legal adulthood, and personality types, the birth-year cutoffs draw bright lines when fuzzy ones are closer to the truth. Generational groupings are not perfect, and valid arguments can be made for doing them differently, but they persist because they are useful. It’s much more concise to use the label Millennials than “people born in the 1980s and early 1990s,” and easier to group people based on birth years instead of examining each birth year separately.
Another argument is that generations have lost meaning because they are getting shorter—for example, that the Millennial generation (1980–1994) is only fifteen years long, while the Silent generation (1925–1945) was twenty-one years long. However, that’s not a coincidence—generations are turning over faster because the pace of technological change has sped up. It took decades after the introduction of the landline telephone for half of the country to have one, but the smartphone went from introduction to more than 50% ownership in just five and a half years, the fastest adoption of any technology in human history. Some have questioned why the numbers of years in the defined generations are getting smaller as people are having children later, thus lengthening reproductive generations. The answer is straightforward: The generations we label and discuss publicly, like Boomers and Gen Z, are social generations, a different concept than reproductive generations.
Then there’s the issue of separating generational differences from those due to age or to time period (meaning it affected everyone of all generations). For one-time polls or surveys, the differences could be due to age instead. However, most of the data you’ll see in this book has been collected over decades. That means we can compare different generations at the same age, so age can’t be the cause of any differences. If more Gen Z young adults are depressed than older Gen X adults in any given year, that could be due to either age or generation. But if the number of 18- to 25-year-olds who are depressed has increased over the years, that’s not due to age—it means something is different for this generation of young adults.
It’s more difficult to eliminate the possibility that the differences are due to a particular time period where all generations are affected in exactly the same way. In most cases, time period effects and generational effects work together. For example, support for same-sex marriage increased among all generations between 2000 and 2015 (a time period effect), but Millennials were more likely to support same-sex marriage than Silents in all years (a generational effect, if we can assume that support for same-sex marriage doesn’t decline with age). As another example, social media changed the lives of people of all ages after it became popular after 2010, but it had a bigger impact on younger people since they were still building their social lives and communication skills. Although older people began to use social media, too, they had already developed their social ties and honed their communication skills in an earlier, less technology-saturated time. Sometimes we can be confident a generational effect is occurring if a change impacts only people of a certain age. Millennials are marrying much later than Silents did, for example; since first marriages tend to occur when people are younger, that’s clearly a generational difference and not a time period one. Overall, though, many of the trends covered in the chapters on each generation reverberate across the generations, even if one generation began the trend or was most impacted by it.
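The hold-age-constant logic can be sketched with a toy model (every rate here is invented): if a trait depends only on birth cohort, a one-time poll still shows an age gap, and only comparing the same age group across survey years reveals that the difference is generational.

```python
# Toy model with invented numbers: a "depression rate" that depends
# only on birth cohort, never on age or survey year.

def depression_rate(birth_year):
    # Hypothetical: rates are higher for cohorts born in 1990 or later.
    return 0.10 if birth_year < 1990 else 0.25

def rate_for(age, survey_year):
    return depression_rate(survey_year - age)

# A one-time poll in 2020: the young and old differ, but this snapshot
# alone cannot tell us whether age or generation is responsible.
snapshot = {age: rate_for(age, 2020) for age in (20, 50)}
print(snapshot)

# The same age group (20-year-olds) compared across survey years: the
# rate shifts even though age is held constant, so the difference must
# be generational, not an effect of age.
over_time = {year: rate_for(20, year) for year in (2000, 2010, 2020)}
print(over_time)
```

Separating a pure time-period effect would take one more step (checking whether all ages shifted together in the same years), which is exactly why period and cohort are harder to untangle than age and cohort.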
What about the idea that older people have “always” complained about younger generations? This is often used to argue that generational differences don’t actually exist—how can younger generations be “too soft” when people said the same thing fifty years ago?
It might be because they were always right. With technology making life progressively less physically taxing for each generation, each generation is softer than the one before it. Just because something has been said before doesn’t make it wrong, especially if the change keeps going in the same direction. The first humans to use fire probably said to their children, “You have no idea how good you have it” in the same tone that Gen X’ers complain to their Gen Z children, with nostalgia-tinged wistfulness, about the library card catalog, landline phones in the kitchen, and other inconveniences of the 20th century. With technology ever-progressing, both sets of parents are right, thousands of years apart. The survey data also help address this question: They rely on young people’s own reports, not the complaints of their elders. Studying generational differences is about understanding, not about criticism.
When I give talks on generations, I’m inevitably asked some version of the “blame question.” “Whose fault is it that young people are so entitled?” someone will ask. Or, alternatively, “Don’t blame us—the Boomers were the ones who messed everything up.” These are also extremely common questions when generational differences are discussed online or in books. For example, Millennial Jill Filipovic writes, “‘OK Boomer’ is more than just an imperious insult; it’s frustrated Millennial shorthand for the ways the same people who created so many of our problems now pin the blame on us.”
This way of thinking has, to put it mildly, some issues. First, not all generational changes are negative; many are positive or neutral. If everything is the Boomers’ “fault,” does that also mean they should take credit for the good trends? In addition, generational changes are caused by many factors, mostly large cultural changes (like in technology) that can’t be laid at the door of a single generation. Trying to decide whom to “blame” is counterproductive, leaving us carping about fault rather than understanding the trends, both good and bad. These arguments make the generations seem like squabbling siblings, arguing over “who started it” when everyone is getting punched. The analogy to a family works fairly well in the 2020s: Silents and Boomers are the powerful older siblings, Millennials and Gen Z are the energetic but misunderstood younger siblings, and Gen X—the middle child—is often forgotten.
Another question is whether these generational differences apply to countries outside the U.S. This book focuses primarily on American generations, but many of the cultural changes identified here have appeared around the world. For example, smartphones were adopted around the same time in most industrialized countries. Given that, if a generational difference is caused by smartphones, we’d expect to see roughly the same pattern of change in countries that adopted the technology around the same time (though there will of course be other cultural influences as well). In the Gen Z chapter I’ll share some international data that reveals what the smartphone age meant for teens around the world.
What about the impact of the COVID-19 pandemic? Surprisingly, most attitudes and behaviors do not show unprecedented changes between 2019 and 2020–2022. That might be because so many trends of the 2010s, from declining face-to-face interaction to increasing political polarization, were heightened by the pandemic, not reversed by it.
As historian Kyle Harper wrote in 2021, pandemics “seem to… find and expose all of our other social pathologies.… [L]ike a radioactive tracer, this COVID pandemic has given us a view into our own faults and failings and the cultural polarization that makes it impossible to achieve societal consensus.” In other words, the pandemic amplified what was already there, instead of changing it into something different. With virtual communication already increasing in the late 2010s, we were in dress rehearsal for the pandemic we didn’t know was coming.
Where We Go from Here
The chapters that follow feature the generations with a quorum of living members in the 2020s: Silents, Boomers, Gen X, Millennials, Gen Z, and Polars. (Each chapter builds on the previous ones, but you’re forgiven—and will be fine—if you flip to your generation’s chapter first… or the one on your kids’ generation.) After the introduction, each chapter includes a box with the generation’s birth years, population, and the generation of their typical parents, children, and grandchildren. There’s also a rough racial breakdown, with multiracial and multiethnic people included with their non-White identification (the U.S. Census found that 10% of Americans were multiracial in 2020). A note on language: I will use the U.S. Census names for racial and ethnic groups, capitalizing all to avoid confusion; I sometimes refer to people as Whites, Blacks, Hispanics, Asians, and so on for brevity and do not mean to imply that race is the whole of identity.
After this background, there’s a list of the most popular first names of the generation, drawn from the amazing Social Security names database of all Americans with a Social Security card. The list includes all names that ever cracked the top five for popularity during the generation’s birth years (because girls’ names cycle in and out of popularity more frequently, this list is usually longer for the girls’ names than the boys’).
That’s followed by a list of some of the generation’s famous members from entertainment, politics, sports, and business. Given the focus on U.S. generational trends, I’ve limited the list almost exclusively to Americans (both native-born and immigrant), so your favorite actor, singer, or soccer player might not be there if they are, for example, British or Portuguese. (I’ve made an exception for some Canadians who gained fame on American TV, usually via comedy. TV isn’t very funny without Canadians.) Some of these luminaries are still famous, while others were well-known in decades past and have since faded, so might provide a pleasant surge of nostalgia if you remember them when.
Listing people by generation provides a new perspective outside of the usual prototypical representatives of a generation. Most people know that Kurt Cobain of Nirvana was a Gen X’er—but so are Jimmy Fallon, Kanye West, Blake Shelton, Julia Roberts, Elon Musk, and Jennifer Lopez. Mark Zuckerberg is a quintessential Millennial, but the generation also includes Beyoncé, Michael Phelps, and Lady Gaga. You may find a few surprises: Until I made these lists, I didn’t realize that Melania Trump was a Gen X’er. There are also some intriguing parallels: Bill Gates and Steve Jobs were born in the same year, 1955.
After that, it’s off to the races with the generational trends, including in marriage, sexuality, birth rates, drugs and alcohol, equal rights movements, pop culture, technology, income, education, politics, religion, gender identity, mental health, happiness, and everything in between. Each generation is unique in its character and experiences, so each chapter is structured differently. You’ll also notice that there’s a lot more data—and thus more book pages—on the middle four generations (Boomers, Gen X’ers, Millennials, and Gen Z) than on Silents (who were already well into adulthood when many of the large national surveys began) and Polars (who are mostly still too young to participate in surveys).
Many technologies and events impacted more than one generation, but to avoid repetition I don’t cover them in each chapter. I’ve placed the trends with the generation most affected, or with the generation of their leaders or exemplars. For example, changes in religion were the most pronounced among Millennials, so trends in religious commitment are found in that chapter. Some topics were tough calls; the fight to legalize same-sex marriage, for example, was led mostly by Gen X but had (and will have) the biggest impact on Millennials and Gen Z. I ended up putting it in Gen X given the huge change over their lifetimes, and because Jim Obergefell, the lead plaintiff in the 2015 Supreme Court case, is a Gen X’er. Events that had a broad impact across many generations are interspersed between the chapters; September 11, 2001, for example, doesn’t just belong to one generation, or even two, but to all.
With pop culture and technology, the emphasis is on media that reflect the generation’s ethos, experiences, and innovators. You’ll notice a mix of the obvious and the less well-known; your favorite pop culture snack of the time might not be there—but many of them will be. By the time Millennials and especially Gen Z were coming of age, pop culture fractured across so many modalities that it became more difficult to summarize.
The last chapter explores what generational differences are likely to mean for the future in various realms, including the workplace, politics, and consumption. These trends portend fundamental changes to American society in the next few decades. Predicting the future is not an easy task, but with data from the very young, the view becomes less murky. Generations are a way of understanding the past, but they also can help us understand the future. As the generations go, so goes the world.