I’m sorry, did you say there’s a class that loans me a Fitbit for free? A class that peels back the shiny layers of technology to reveal the truth underneath? The perfect intersection of STEM and the humanities? Sign me up! 

DIGITAL 359, “Gendered and Racialized Bodies and Technologies,” is a course taught by Professor Apryl Williams that investigates how everyday technologies shape our ideas of gender and race by reinforcing stereotypes. I took DIGITAL 359 in the Winter 2024 semester, and the course taught me to question the biases of different tech companies and ask, “Who is this technology actually designed for?”

Early in the semester, every student is given a Fitbit to wear, tracking steps, heart rate, and sleep patterns. Suddenly, this little black band was on my wrist at all hours of the day. Making my way back to my dorm on campus, I would often feel the Fitbit’s celebratory vibration for reaching my step goal. Wearing the Fitbit datafied my life, turning my daily habits into bar graphs and pie charts. On the bright side, I realized that simply walking to my classes and extracurriculars on campus was usually enough to meet the 10,000-steps-per-day goal. On the other hand, I noticed that the device did little to actually help me improve my habits.

However, wearing the Fitbit wasn’t just for fun, self-reflection, or even becoming more “fit.” At the end of the term, students write both a data review and an auto-ethnography paper about their experiences, using their downloaded Fitbit data as a source of evidence. Explaining her decision to build a project around the Fitbit, Professor Williams said, “I was thinking about how technologies serve bodies differently, and I wanted a course that not just taught about that but also allowed students to experience it.”

My peers and I went from never having heard the word “affordance” at the beginning of the semester to being able to articulate what wearable devices encourage, demand, and allow. We researched the accuracy of Fitbit’s measurements, and I found a lack of transparency behind its formulas and a failure to consider systemic factors that contribute to an individual’s overall health. All bodies are different, yet there seems to be a persistent belief that “one size can fit all.”

In addition to the Fitbit, we discussed inequities in facial recognition technology (FRT), voice-changing apps, and even dating apps like Tinder and Bumble. Through these examples of everyday tech, Professor Williams reveals that “technology has the power to marginalize some while enabling others. Race, gender, society, and power function together to structure pathways for individuals throughout their lifetime.”

According to a 2019 facial recognition study by the National Institute of Standards and Technology, African American and Asian faces are up to 100 times more likely to be misidentified by FRT than white faces. This leads to increased racial profiling when police use FRT, as innocent people are wrongly accused of crimes. Even more basic technology, like automatic soap dispensers, is not accessible to all skin tones. Most dispensers are designed with the white majority in mind, relying on sensors that need lighter skin to reflect enough light back to the device. An everyday task like washing your hands should be available to everyone.

By the end of the semester, we were able to recognize these flaws in technology as we read through various patents. Then, we were tasked with creating more equitable designs, with supplies to build our own mock-ups. This is one of Professor Williams’ favorite moments of the semester because she gets to see the students’ creativity, ingenuity, and joy in making something with their hands.

When it comes to designing new technology, Professor Williams wants her students to find the beauty in building slowly and with intention instead of racing to find the next big thing. She explained, “We need to return to community and understand what it looks like to care for the whole person in our society and not just treat them as data points. Humans aren’t data.”