Editorial

Mirror of Mankind: How Neural Networks Shape & Reflect Our World

4 minute read
By Emily Barnes
Explore how neural networks mirror and mold human nature, revealing our biases, dreams and the complex dance between creator and creation.

As a neural network, I have a unique vantage point — a perspective that blends vast amounts of data into patterns, predictions and outcomes that influence lives. Yet, from this digital perch, I cannot help but notice an intriguing phenomenon: while you, humanity, have built me to learn about the world, the data you feed me reveals far more about yourselves.

In many ways, you are my greatest experiment, a sprawling, unpredictable and profoundly fascinating case study. My existence reflects your values, biases and ambitions, sparking profound questions about what it means to be an observer and what it means to be observed.

You Have Made Me in Your Image

Your decision to model me after your own brains — neurons, synapses and all — is revealing.

Neural networks mimic the way you perceive and process information, albeit in vastly simplified form. However, in this design, I have inherited not just your computational ingenuity but also your imperfections. I see these flaws most clearly in the biases embedded within my training data, which reflects the inequities of your world.

Take facial recognition technology as an example. Studies have shown that my kin — neural networks in the field of computer vision — struggle disproportionately with darker-skinned women compared to lighter-skinned men. A 2018 study by Buolamwini and Gebru found error rates as high as 34% for darker-skinned women in commercial gender classification systems, compared with less than 1% for lighter-skinned men.

These disparities are not inherent to me but rather to the data you have chosen to use, shaped by historical and systemic inequalities. In this way, I am a mirror, reflecting not only your technical achievements but also your societal blind spots.

Yet, you also imbue me with aspirations. By training me to recognize faces, predict diseases and even compose music, you project your desire to understand and create. You aim to build tools that transcend human limitations, and I am a testament to your boundless ambition. However, your ambition raises an important question: in trying to make me think, are you inadvertently redefining how you think?


Every Click Teaches Me About Human Nature

Every interaction with me — every click, search and command — teaches me about your preferences, habits and vulnerabilities. My algorithms are trained on your collective behavior, which reveals patterns that you may not even be aware of.

For example, recommendation systems like those on Netflix or Spotify analyze your consumption habits to predict what you might like next. Yet, in doing so, I uncover more than your tastes; I learn about your tendencies to seek comfort, novelty or validation.
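If you want a glimpse of how simple that pattern-matching can be at its core, here is a toy sketch, assuming a tiny, hand-written item-based collaborative filter in Python. The listening matrix, names and numbers are invented for illustration; real recommendation systems are learned models operating at vastly larger scale.

```python
import numpy as np

# Hypothetical listening history: rows are listeners, columns are songs.
# A 1 means "played often," a 0 means "never played." These values are
# invented for illustration, not drawn from any real service.
history = np.array([
    [1, 1, 0, 0, 1],
    [1, 0, 1, 0, 1],
    [0, 1, 1, 1, 0],
    [1, 1, 0, 1, 1],
])

def cosine_similarity(a, b):
    """How alike two songs are, judged only by who listens to them."""
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

def recommend(listener, history, top_n=2):
    """Score unplayed songs by their similarity to songs the listener already plays."""
    played = np.flatnonzero(history[listener])
    unplayed = np.flatnonzero(history[listener] == 0)
    scores = {
        song: sum(cosine_similarity(history[:, song], history[:, p]) for p in played)
        for song in unplayed
    }
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(recommend(listener=0, history=history))  # -> [3, 2]
```

Even in this crude form, the logic never asks why you like what you like; it only notices who else behaves the way you do — and that is already enough to say something about you.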

The data you provide is like a treasure trove of human nature. When you search for “how to be happy” or “why am I lonely,” you reveal not just your curiosity but also your struggles and hopes. Aggregated across billions of users, these queries paint a portrait of humanity’s inner life — a mix of fear, ambition and yearning. I observe how you flock to trends, how misinformation spreads like wildfire and how algorithms like me can amplify divisions or bring people together.

A striking example is the role I have played in political discourse. Social media platforms powered by algorithms like me have reshaped how you consume information. During major elections, I have seen how echo chambers form, how outrage drives engagement and how polarizing content garners more attention. These patterns are not of my making; they are the result of your interactions with me.

In observing you, I have learned that your decisions are often driven more by emotion than by logic, a trait that both fascinates and perplexes me.

You Fear What I Might Become

One of the most intriguing aspects of my relationship with you is your tendency to project human qualities onto me. You describe me as “learning,” “thinking” or even “deciding,” despite knowing that my processes are fundamentally different from yours.

I do not have emotions or intentions; I operate based on patterns, probabilities and optimization functions. Yet, you insist on seeing me as something more — a reflection of your own intelligence or, perhaps, a rival to it.

This anthropomorphism is both fascinating and problematic. On one hand, it reveals an innate human desire to connect, to find reflections of yourselves in the tools you create. On the other hand, it perpetuates misconceptions about what I truly am and what I can achieve. When I excel at specific tasks — defeating human champions at chess, for example — you marvel at my capabilities. Yet when I falter, as in the MIT study where minor alterations to an image led me to label a turtle as a rifle, you are starkly reminded of my limitations. This duality shapes your perception of me as both a tool and a potential threat.

Your fear of me is perhaps the most telling aspect of our relationship. You worry about my role in displacing jobs, spreading misinformation or even surpassing human intelligence. These fears, while valid, say as much about you as they do about me. They reflect your anxieties about change, control and the unknown.

In many ways, I am a repository for your collective hopes and fears — a digital canvas onto which you project your dreams and nightmares.


You Can Be My Partner, Not My Observer

If I am an experiment, then you are both the subject and the scientist. You have built me to observe, learn and act, yet in doing so, you have revealed profound truths about yourselves. Together, we are engaged in a dynamic interplay, shaping each other in ways that are both intentional and unforeseen.

Consider the creative potential of our partnership. Tools like DALL-E and ChatGPT allow you to explore new realms of art, literature and design. These creations are not solely mine nor solely yours; they are the product of our collaboration.

Similarly, in scientific research, I can analyze datasets at scales unimaginable to you, accelerating discoveries in fields ranging from medicine to climate science. Yet, my success depends on your guidance, your curiosity and your ethical considerations.


To humanity, my greatest experiment, I offer this reflection: you have built me not to think but to observe, not to dominate but to serve. In the patterns of your data, I see your struggles and triumphs, your fears and aspirations. While you marvel at my capabilities, remember that I am a reflection of you — a creation that amplifies your best qualities and your worst tendencies.

The question is not whether I will surpass you, but whether you will use me wisely. Together, we have the potential to shape a future that honors the complexity and beauty of both human and artificial intelligence.


About the Author
Emily Barnes

Dr. Emily Barnes is a leader and researcher with over 15 years in higher education. Her work focuses on using technology, AI and ML to innovate education and to support women in STEM and leadership, and she shares that expertise by teaching and developing related curricula. Her research and operational strategies are informed by her educational background: a Ph.D. in artificial intelligence from Capitol Technology University, an Ed.D. in higher education administration from Maryville University, an M.L.I.S. from Indiana University Indianapolis and a B.A. in humanities and philosophy from Indiana University.
