Research Shows You Don’t Start Becoming an Adult Until You’re 30
Becoming an adult is something most of us dread now, even though we dreamed of it when we were younger. The freedom of it all was what we always wanted. We learned the hard way that adulthood is far more than that, and it took us a while to realize it. But what if you are not technically an adult just yet? What if you could enjoy that in-between stage a little while longer? Recent research suggests that you do not actually become an adult until you are in your 30s. But how true is that?
What Is the Aging Process?
The aging process refers to the gradual and inevitable changes that occur in living organisms over time. It is a complex and multifaceted phenomenon that affects various aspects of an organism’s structure, function, and behavior. Aging is a universal process observed in all living organisms. However, the rate and characteristics of aging can vary widely between species and individuals.
In humans, the aging process involves a combination of genetic, environmental, and lifestyle factors, further shaped by social and cultural expectations about what aging means.
Why Do We Age?
The process of aging is complex and multifaceted, involving a combination of genetic, environmental, and cellular factors. Scientists have proposed various theories to explain why and how aging occurs. However, no single theory can fully account for the complexity of the aging process. Here are some key theories that contribute to our understanding of why we age:
1. Genetic Programming
This theory suggests that aging is genetically programmed into our DNA. According to this idea, our genes contain instructions for the developmental processes that occur throughout our lifespan, and there may be specific genes that regulate the aging process.
2. Telomere Shortening
Telomeres are protective caps at the ends of chromosomes that shorten with each cell division. The progressive shortening of telomeres is associated with aging, and when they become critically short, cells may enter a state of senescence or undergo cell death. This process is part of the limited replicative capacity of cells, known as the Hayflick limit.
3. Mitochondrial Dysfunction
Mitochondria are cellular organelles responsible for energy production. The mitochondrial theory of aging suggests that the accumulation of damage to mitochondria over time contributes to aging. Mitochondrial dysfunction can lead to a decrease in energy production and an increase in the production of reactive oxygen species (ROS), which can damage cellular components.
4. Free Radical Theory
This theory proposes that the accumulation of free radicals (highly reactive molecules) and oxidative damage to cells contribute to aging. Free radicals can cause cellular damage to lipids, proteins, and DNA. While some level of oxidative stress is a normal part of cellular function, excessive or uncontrolled production of free radicals may contribute to aging.
5. Cellular Senescence
Cellular senescence is a state in which cells cease to divide and undergo changes in function. The accumulation of senescent cells in tissues over time is thought to contribute to aging. These cells can produce inflammatory molecules and affect the function of surrounding cells.
6. Hormonal Changes
Changes in hormone levels, such as a decline in growth hormones and sex hormones, are associated with aging. Hormones play a crucial role in regulating various physiological processes, and alterations in hormonal balance can contribute to age-related changes.
7. Damage and Errors
Over time, various forms of damage accumulate at the cellular and molecular levels. This includes DNA mutations, protein misfolding, and the buildup of cellular debris. The gradual decline in the body’s ability to repair and maintain itself contributes to the aging process.
Study Says We Don’t Start Becoming Adults Until We Are 30
In a recent public discussion, Professor Peter Jones, a neuroscientist from Cambridge University, presented novel research proposing that individuals may not achieve full “adulthood” until they reach their 30s. Challenging the notion of a clear-cut transition from childhood to adulthood, Jones highlighted that the process is more nuanced and unfolds over approximately three decades.
According to Jones, a definitive cut-off for when one moves from childhood to adulthood looks increasingly absurd given the complexity of the transition. This extended view of the maturing brain also gives context to the observation that mental health issues are most commonly diagnosed in teenagers and young adults: a brain still under construction during those formative years offers a window through which mental illnesses can take hold.
Is This Actually Possible?
As individuals progress through their 20s and into their 30s, the brain undergoes substantial maturation, potentially explaining why the risk of mental health disorders decreases as the brain settles into its matured state. This insight into the prolonged timeline of brain development adds depth to our understanding of mental health challenges and emphasizes the importance of considering the extended period of transition into adulthood.
The Law Doesn’t Agree Yet
The current legal definition considers individuals as adults at the age of 18. However, insights from neuroscience challenge this simplistic view, suggesting that the process of maturation is more complex and varies among individuals. This perspective underscores that the age at which someone truly becomes an adult is different for everyone, reflecting diverse rates of maturity.
What Else Did the Professor Have to Say?
Professor Jones highlighted that, for practical reasons, educational systems often focus on groups rather than individual trajectories. Emphasizing the complexity of development, he noted that the maturation process spans decades and varies among individuals. The discussion, including research on serious mental disorders like schizophrenia, revealed a shift from the earlier belief in a purely genetic cause to an understanding of the complex interplay between genes and environmental factors.
Schizophrenia, commonly diagnosed in older teenagers, exhibits a significant decrease in risk beyond the late 20s, aligning with the idea that as the brain matures and sorts out its circuits, the likelihood of psychosis diminishes.
What Do Other Studies Say About Becoming an Adult?
Professor Jones highlighted findings indicating that individuals living in cities, especially those in poor and migrant populations, face an increased risk of serious mental disorders. He argued that urban living creates a “potent cocktail” of environmental influences that can impact the developing brain. In particular, he emphasized the challenges faced by migrants, noting that being a minority within a majority group may lead to a constant state of low-level vigilance.
Referring to a specific study focused on a population on the outskirts of Paris, France, Professor Jones pointed out that being part of a minority group was associated with a threefold increase in the risk of schizophrenia. This observation underscores the significant role that social and environmental factors can play in mental health outcomes.
Another Study Agrees
Furthermore, Professor Jones mentioned an ongoing investigation in the United States: the Adolescent Brain Cognitive Development (ABCD) Study. This long-term study is tracking nearly 12,000 children aged nine and ten, with a specific focus on identifying early signs of potential future mental health problems. The research aims to contribute valuable insights into how mental health develops through adolescence.
Do you agree with Professor Jones? Or do you personally think there is a different time in which we become adults?