On the Mind: What Science Says About Digital Natives

April 10, 2017
Author: Carolyn Crist

This column, On the Mind, is a series about the latest in cognitive science and neuroscience-related research that applies to our everyday lives. This biweekly series is for those interested in cutting-edge findings about the practical side of habits, memories, multitasking and the human-brain interface. What are the recent studies, and what is the context? See what science says and how you can apply it to your life.

For about a decade, scientists have wondered whether “digital natives,” or those who grew up using technology, operate differently from generations before. They’re considered “native speakers” of the language of electronic media and often multitask with technology, favor interactivity during learning and naturally adapt to change in the digital landscape. The older crew — called “digital immigrants” — weren’t raised in a digital environment and may be less adept when using technology. The differences can create a rift between the two groups, but does the divide really exist? Some researchers and bloggers have questioned the terms in recent years.

Studies Say

The term was coined by education consultant Marc Prensky in 2001, when he argued that the digital gap has profound implications for education and could mean schools aren’t serving today’s students. In the years that followed, other researchers applied the idea to decision-making, shopping and seeking health information online. These generalizations also expanded to the “millennial” and “Google generation” labels, designating a generational divide between technology users.

Since 2011, other researchers have looked at “digital nativeness” based on which digital activities are being done and who is doing them, putting less focus on age. Experience and education are just as important, if not more so, they said. And in the most recent years, new studies have rejected the idea of digital nativity altogether.

Key Takeaways

Overall, the Internet has reshaped the way we think and search for information, brain scientists say. But how does it shape the way we use that technology?

1. The terms “digital native” and “millennial” aren’t synonyms.

It’s tempting to lump them into one group, but scientists are now making a distinction. Millennials were born between roughly 1980 and 2000 and are around 17 to 37 now. Those born since 2000 are considered Generation Z, and those kids and teens are digital natives, too. At the same time, not all millennials or Gen Z kids are necessarily digital natives. Those who grew up in poverty or in restrictive households may have had limited access to technology.

In fact, in recent years, researchers in the United States and United Kingdom have compared different generations of digital natives, those born after 1980 and those born after 1993, to determine how Internet use, Internet anxiety and Internet identification differ. Predictably, the younger group had more positive attitudes toward the Internet, lower anxiety scores about the Internet, and higher web, e-mail and social media usage.

2. Digital natives’ brains may operate differently.

In 2008, Gary Small of the University of California at Los Angeles’ Semel Institute for Neuroscience and Human Behavior promoted the idea that digital natives’ brains are hardwired differently because they were exposed to digital products early. This sparked alarm among readers who took the difference as a negative one, and his work is sometimes criticized as fueling a frenzy over whether parents are damaging today’s kids.

Since then, other studies have backed up the idea in specific ways. Data from the same institute, for example, show that digital natives’ brains were more actively engaged while scrolling through a webpage than while reading printed text. Social interactions, friendships and civic activities may operate differently in the brain, too, researchers at the University of Minnesota say.

3. We have inherent biases about the “digital natives” label.

Scientists are also interested in how we, digital natives and immigrants alike, view this group of savvy technologists. They’re finding that our brains accept certain myths that are hard to break, including beliefs that natives are better at multitasking or have natural instincts for fixing computers, tablets or phones. Studies at the University of California at Irvine and California State University show that younger generations are more likely than older generations to multitask by using different media at the same time. But researchers are still identifying who the best multitaskers are and what makes them good at it.

4. In fact, many digital natives still need technology training.

Although younger generations often use their phones and computers for basic schoolwork or social purposes, they may not have the skills required for technology-based careers in computer science, marketing, business and health care. Students often have high digital confidence but low digital competence, an international team of researchers reported in 2015. For example, a digital native may know how to use social media for social interaction but not how to produce content for a media organization or build a brand for a company.

“We need to move away from this fetish of insisting in naming this generation the Digital/Net/Google Generation because those terms don’t describe them and have the potential of keeping this group of students from realizing personal growth by assuming that they’ve already grown in areas that they so clearly have not,” said Apostolos Koutropoulos in 2011 in a 10-year review of the “digital native” label.

“Learners don’t know what they don’t know, but if they come to the table from a position of superiority, like they are better than the so-called digital immigrants, they lose an opportunity to learn something that they don’t know that they don’t know,” he said.

At Stanford University School of Medicine, for example, medical residents are among the first “digital natives” being trained to incorporate technology into their interactions with patients. As part of the process, they start learning to use telemedicine during medical school to talk to patients in other places.

5. We’re still learning about digital natives’ brains.

Even though brain scans show differences in active engagement and multitasking, researchers still aren’t quite sure why that is or what it means. Rather than defining “digital natives” by their age or a label, scientists are broadening their views to understand how technology is adopted. Studying different cultures, or “digital tribes,” may be the next step for research.

In France, for example, the Digital Natives Assessment Scale asks 15 questions to better gauge how people identify with technology, across generations and cultures.

In the next decade of “digital native” labeling, there’s more work to be done. After all, Deloitte and the Harvard Business Review are thinking about it, too.

Image: Juan Cristóbal Cobo, Flickr, CC-BY

Carolyn Crist is a freelance health and science journalist for regional and national publications. She writes the Escape Artist column for Paste Travel, the On the Mind column for Paste Science and the Stress Test column for Paste Health.