Human Bias Is Everywhere in Tech. To Fix It We Need to Reshape Computer Science Education.

I was so excited to publish this piece in EdSurge last month with my colleague Laura McBain. Reposting!

As we transition into the new year reckoning with a violent insurrection organized on social media, the spread of disinformation about a deadly pandemic and breakdowns in distance learning, it would be remiss of us not to acknowledge the impact of technology on every facet of our society. We have all been united in our frustration and concern that we are inching closer to a dystopian future.

2020 brought greater visibility to how racism, sexism and other -isms permeate technology and will continue to create divides that may become irreparable. In the tech world there have been DEI efforts, legislation targeting the racist impact of technology, warnings from ethicists and independent bias ratings created to rein in the harm, but none of these solutions address the real issue: humans. We must confront how the destructive harm unfolding through our technologies today reflects the implicit bias and prejudice of the designers building them, and how those designers were taught. Many designers don’t know how to identify harmful bias and are completely unaware of how their own biases shape the products they build. So how can we start addressing this issue head-on in 2021?

Humans are the problem, and luckily education offers a solution.

We need to move beyond the quick fixes that are not working and invest in the next generation of technologists by radically reshaping why and how we teach computer science. The natural place to start is within the broad but influential community of computer science education, which includes teachers, administrators, curriculum designers and anyone involved in shaping how future technologists learn. Our young people need to be technically proficient in Python, R and Lisp to build AI, machine learning and other emerging technologies. However, computing skills are not enough; we need to equip our young people with the knowledge, skills and moral courage to design equitable tech that dismantles existing power dynamics, protects non-dominant groups, represents everyone and prioritizes the well-being of society.

As CS and technology educators, we have spent more than a decade helping create dozens of spaces for young people to tinker with technology. Reflecting on that time, we can’t help but wonder how many young people graduated from those spaces capable of building a new bot but incapable of recognizing their own biases. Where are they now? What cool, and potentially dangerous, technology have they put into the world? We cannot go back in time, but we can use this insight to design a better, more equitable vision for computer science education.

A radically reshaped computer science education will:

Prioritize racial literacy and history.

It’s important for all young people to believe they can be creators of technology, and it’s also reckless for us to omit that technology has historically been designed as a tool to surveil and oppress non-dominant communities. Prioritizing racial literacy means acknowledging how white supremacy has been ingrained into technology and collectively recognizing that tech has never been neutral and has the power to harm. Examples might include how the Third Reich used early punch card tabulators to process racial censuses of Jewish German citizens, and how early film stock was calibrated for white skin tones. Today we have facial recognition software that centers whiteness and fails to identify Black women, even as it is designed to surveil and police Black and Brown communities.

Like emerging technologies, oppressive design practices have only evolved and manifested in new ways. K-12 administrators, educators and tech companies investing in computer science education need to support young people to examine the design, use and harmful consequences of discriminatory technologies.

Reflect and act on our own biases as creators.

It’s crucial for young people to understand how bias is embedded in code, data, policy and other facets of technology. Even more important is the ability to reflect on how our positionality, shaped by identity and social status, influences the technologies we design. At the Stanford d.school, we’ve built a design methodology that helps technology designers think through the first-, second- and third-order implications of their creations before releasing them into the world. Our budding technologists should iteratively evaluate their creations and ask themselves:

  • Am I creating this based on my own lived experience and expecting others who are different from me to use it?
  • Who benefits from the technology, who is harmed by it and who is left out?
  • Whose stories is this dataset telling? Whose stories is this dataset leaving out? What was the historical context when this dataset was produced?
  • What don’t I know? Who should I ask and learn with?
  • I can design this but should I? What are the implications that need to be considered?
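One way to make these questions concrete is to audit a dataset before building with it. The sketch below is a minimal, hypothetical example (the `skin_tone` field, the toy data and the 10 percent threshold are illustrative assumptions, not a standard) that asks, in code, whose stories a dataset is telling:

```python
from collections import Counter

def representation_report(records, attribute):
    """Count how often each value of a demographic attribute appears,
    flagging groups that make up less than 10% of the data."""
    counts = Counter(r[attribute] for r in records)
    total = sum(counts.values())
    report = {}
    for group, n in counts.items():
        share = n / total
        report[group] = {
            "count": n,
            "share": round(share, 2),
            "underrepresented": share < 0.10,
        }
    return report

# Hypothetical toy face dataset: whose stories does it tell, and whose does it leave out?
faces = [{"skin_tone": "lighter"}] * 19 + [{"skin_tone": "darker"}] * 1
print(representation_report(faces, "skin_tone"))
```

A report like this does not answer the historical-context question, but it forces the first conversation: if a group barely appears in the data, the technology trained on it will not work for them.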

Recognize and make space for multiple perspectives.

The field of design can be an arena for the “pluriverse”, which anthropologist Arturo Escobar defines as multiple ways of knowing, being and thinking that are rooted in specific places and communities.

Young people are curious, and they can be inspired by the diverse ontologies and perspectives among the peoples of the world and in natural systems. Guiding them to channel this inspiration into design practices that shift the power dynamics in technology across race, gender, ability and culture can make our technologies profoundly more equitable. Encouraging them to see what is possible by tackling hyper-local problems and designing solutions with others who hold wildly different perspectives is one place to start. Intercultural experiences that challenge them to question why their perspective should be the one held by the world, and that make room for beliefs they may not relate to, are another. These early experiences can enable them to work with others and build technologies that are more inclusive and contextually appropriate.

How do we move forward?

We can gain inspiration from the 1619 Project and the Zinn Education Project, which have provided us with the tools to face our multifaceted histories in the hopes of repairing and shaping our futures. These projects prioritize racial literacy, help young people reflect on bias and recognize multiple perspectives.

We can work with our social studies departments and across other disciplines to ensure our students have a historical understanding of technology. We can celebrate what our students code and build, and ask them to consider the impact their creations might have on others. And we can celebrate and actively engage with different perspectives that challenge dominant voices and narratives in every step of our design process.

If we can apply these practices to computer science education, our young people might create cool technology that serves everyone and upholds a just world.

Designing for Digital Agency at the Stanford d.school

Over the last few months I’ve been working at the Stanford d.school in their K12 Lab on a special project around emerging tech and equity. Re-posting a blog post written in collaboration with stellar colleagues Laura McBain, Lisa Kay Solomon, Carissa Carter and Megan Stariha.


Technology is power.

It can enable you to share an idea with millions of people around the world in a matter of seconds. And in those same few seconds, it can enable someone else to steal your identity and drain your bank account.

Whether it’s being used to spread information, incite violence, influence elections, or shop for glasses, who should have access to such powers? Who should be able to design and utilize technology to shape the world in their vision and image?

The present reality is that this power is in the hands of very few, with serious consequences for the most marginalized people in the world. This is why we all need to be technologists. We all have the right to participate in and shape the growing influence technology has on our lives and communities, and to build our digital agency. Whether you are a creator, user or policymaker, you have a role in designing and deciding the future we all want to live in.

Today many emerging technologies (those still in a phase of development and/or yet to reach commercial scale), like machine learning, wearable tech and synthetic biology, are often riddled with embedded biases (Ruha Benjamin, 2019). Computer scientist Joy Buolamwini found that three widely used gender recognition tools misidentified dark-skinned women in photographs as much as 35 percent of the time, while white men were identified correctly 99 percent of the time (New York Times, 2018). This is a symptom of how emerging technologies are not created by diverse groups of people who reflect different values, life experiences and expertise, and who take responsibility for ensuring all voices are represented in the design process.
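The disparity Buolamwini measured becomes visible only when accuracy is disaggregated by group; a single overall accuracy number hides it. Here is a minimal sketch with hypothetical predictions (the group labels and counts are illustrative, chosen to mirror the reported gap, not her actual data):

```python
def accuracy_by_group(examples):
    """Compute classifier accuracy separately for each demographic group.
    Each example is a (group, true_label, predicted_label) tuple."""
    totals, correct = {}, {}
    for group, truth, pred in examples:
        totals[group] = totals.get(group, 0) + 1
        correct[group] = correct.get(group, 0) + (truth == pred)
    return {g: correct[g] / totals[g] for g in totals}

# Hypothetical predictions illustrating the kind of gap the audit surfaced
preds = (
    [("lighter-skinned men", "man", "man")] * 99
    + [("lighter-skinned men", "man", "woman")] * 1
    + [("darker-skinned women", "woman", "woman")] * 65
    + [("darker-skinned women", "woman", "man")] * 35
)
print(accuracy_by_group(preds))
# → {'lighter-skinned men': 0.99, 'darker-skinned women': 0.65}
```

Note that the overall accuracy here is 82 percent, a number that sounds respectable while concealing a system that fails one group a third of the time.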

At the d.school, we believe educators are uniquely situated to address this critical issue. Educators have the capacity to shape a future in which all voices are represented and valued. They have the ability to equip students with the skills, mindsets and dispositions needed to evaluate the ethical implications of technology and prioritize equity-centered design. But educators, particularly those who are serving students furthest from opportunity, need new resources to help students engage and create with emerging technology.

Educators experiment with the “I Love Algorithms” card deck designed by the Stanford d.school Teaching and Learning Studio. Photos courtesy of the Stanford d.school/Patrick Beaudouin.

We believe that design can play an important role in addressing the digital inequities that exist in our K-12 communities and the challenges facing digital inclusion. Building on our ongoing exploration of emerging tech, equity and design, we are exploring questions like:

  • How are emerging technologies used by different communities?
  • Who is creating emerging technologies like machine learning, blockchain, and synthetic biology?
  • Who is not being represented in the creation and pioneering of these emerging technologies?
  • How are oppressive social structures and practices, like racial profiling, manifesting in the early stages of the creation and application of emerging technologies? Why?
  • How might we equip educators and students with the creative confidence to understand, evaluate, and create with emerging technologies in their communities?

These questions and the research we’ve done are leading us to this design challenge:

How might we leverage emerging technologies to advance equity, empathy, and self-efficacy in K-12 education?

Our design work is grounded in four pillars of understanding, centered around participation and radical access, and built on Carissa Carter’s early design work, I Love Algorithms:

  1. It’s not about becoming a coder; it’s about knowing what the code can do (Carissa Carter, 2018). We all need to understand what emerging technologies can do, how they’re interlinked, and how they can be designed by increasingly diverse groups of creators and decision makers. This means that each of us should have a basic understanding of how emerging technologies such as blockchain, artificial intelligence, the internet of things and brain-computer interfaces work. Does that mean we’re all verifying transactions on a blockchain? No. But it does mean that we understand it’s rooted in decentralization, transparency, and immutability, and why some systems may or may not benefit from using blockchain.
  2. If we want emerging technology to represent all of us, it needs to be created by all of us (Carissa Carter, 2018). Technology needs to be inclusive. Creation encompasses more than technical production or programming; it means all of our experiences, perspectives and voices are incorporated in the creation, adaptation and delivery of the technology. It requires that we all understand the concepts underlying emerging technology, and that each of us is an integral part of the design process.
  3. Technology is personal. Educators need support in cultivating and leveraging the valuable digital practices and identities their students bring into the classroom (Matt Rafalow, 2018). To cultivate students’ abilities and support them in connecting with emerging technology, we need to consistently find ways to make technology personal to them. If students don’t recognize themselves or their communities in the technology they are using or designing with, it only further marginalizes them and reinforces embedded bias.
  4. Learning is about lifelong participation and creation, not consumption. Constructionism has shown us that the most powerful learning experiences emerge from creating something out of our own curiosity, understanding, knowledge and experience of the world. There is nothing more rewarding than designing something that solves a problem for you and the people you care about in your community.
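To ground pillar 1: “knowing what the code can do” for blockchain means understanding why chaining hashes makes a ledger tamper-evident. The toy sketch below is our own illustrative assumption, not any real blockchain protocol (there is no network, consensus or mining), but it shows the core idea that each block’s hash locks in everything before it:

```python
import hashlib

def block_hash(index, data, prev_hash):
    """Hash a block's contents together with the previous block's hash,
    so changing any earlier block invalidates every later one."""
    return hashlib.sha256(f"{index}|{data}|{prev_hash}".encode()).hexdigest()

def build_chain(entries):
    """Build a chain of blocks, each linked to its predecessor by hash."""
    chain, prev = [], "0" * 64  # placeholder hash for the genesis block
    for i, data in enumerate(entries):
        h = block_hash(i, data, prev)
        chain.append({"index": i, "data": data, "prev_hash": prev, "hash": h})
        prev = h
    return chain

def is_valid(chain):
    """Recompute every hash and check the links between blocks."""
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block["index"], block["data"], block["prev_hash"]):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = build_chain(["Ava pays Ben 5", "Ben pays Cal 2"])
print(is_valid(chain))                 # True: the ledger checks out
chain[0]["data"] = "Ava pays Ben 500"  # tamper with history
print(is_valid(chain))                 # False: the tampering is detected
```

This is the kind of conceptual understanding the pillar calls for: not verifying transactions ourselves, but seeing why a shared, hash-linked ledger is hard to quietly rewrite.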

How we are getting started.

In our pursuit to expand radical access to emerging technology and to cultivate a diverse generation of technology creators, we’ve launched a design project called 10 Tools for 1,000 Schools, a portfolio of design resources, tools, and approaches to help build the creative and digital agency of K-12 communities.

In the toolbox, educators will find engaging activities to help them understand and teach the foundational concepts of emerging technologies, resources on how to integrate those concepts into various academic disciplines, and easy-to-adapt community-based design challenges. We kicked off the playtest of two of our first 10 Tools resources at the first-ever K12 Futures Fest, a gathering of more than 200 educators, students and other community members who showcased their work and engaged in our new experiments.

Educators participate in a Futures Fest session on blockchain. Photos courtesy of the Stanford d.school/Patrick Beaudouin.

In one session, educators were immersed in the blockchain concepts of decentralization and transparency by taking on the persona of detectives tasked with cracking unsolved mysteries; in another, they designed their own dance moves to express different machine learning algorithms. Participants pushed back on the perceived benefits of the technologies, rapidly came up with new ideas for how they might apply them to new design challenges, and asked thought-provoking questions about the potential impacts on their students.

As our prototypes and learning evolve, we aim to share our work on the K12 Lab site. And we hope to encourage more educators to take up this challenge in their own communities by adopting and remixing these resources to fit the diverse needs and identities of their students.

Our collaborators include a crew of pioneering educators: Kwaku Aning, Louka Parry, Jennifer Gaspar-Santos, Akala Francis, and Daniel Ramos. They are each collaborating with us to create, integrate, and adapt these resources in their own contexts.

On the horizon.

In 2020, Karen Ingram, a designer with a special focus on synthetic biology, will join the team as an Emerging Tech Fellow.

How to learn more?

Want to learn more about our work? Read updates here. You can also join our newsletter for updates and events! Follow our progress on Twitter using #10tools4schools.

_ _ _ _ _ _ _ _ _ _ _ _ _ _


  1. Benjamin, R. (2019). Race After Technology: Abolitionist Tools for the New Jim Code. Polity Press.
  2. Lohr, S. (2018, February 9). Facial Recognition Is Accurate, if You’re a White Guy. The New York Times.
  3. Green, B. (2019, April 17). Can Predictive Policing Help Stamp Out Racial Profiling? The Boston Globe.
  4. Rafalow, M. (2018). Disciplining Play: Digital Youth Culture as Capital at School. American Journal of Sociology, 123(5), 1416–1452.