Joel Harper '21

While governmental use of technology to monitor and manage citizens may seem like something straight out of an Orwellian novel, Joel Harper ’21 knows that “high-tech tools” are used by governments all around the world.

Through learning about technologies in China, India, and the United States for his summer research project, Harper has taken a comparative approach to analyzing how governments employ new technologies to better oversee a state. “I’m examining … tools, like complex algorithms, biometrics, some surveillance technologies like big data data structures, and I’m examining how these tools are used by different countries around the world in order to increase state capacity,” he said.

In his analysis, Harper examines the effects of these technologies and whom they benefit. One major effect he considers is change in state capacity, or a government’s ability to manage its respective country.


Majors: Government; Hispanic Studies
Hometown: Austin, Texas
High School: Veritas Academy


Each of Harper’s case studies addresses one type of technology being used. So far, he has researched both the social credit system in China and the biometric Aadhaar identity card in India. He has also studied three domestic cases that focus on allocation: a waitlist-sorter at a Los Angeles homeless center, a resource-distributor for welfare recipients in Indiana, and a Child Protective Services family monitor in Allegheny County, Pa.

Part of Harper’s evaluation of these programs concerns the effects these technologies have on marginalized populations. For example, Harper said that the algorithm used by Child Protective Services in Allegheny County had difficulty differentiating between “poor parenting and parenting when you’re poor.” The model used to build the organization’s algorithm would regularly flag impoverished parents’ struggles to support their children as neglect or abuse. “We can see these algorithms as impartial and unbiased but they can actually do the same discrimination that a piece of legislation or policy would produce,” Harper said.

Harper’s idea for this research was inspired in part by a talk by SUNY Albany political science professor Virginia Eubanks. Eubanks, who presented at Hamilton in the spring, discussed her book Automating Inequality, which describes the negative impacts of automated decision-making systems. Her talk helped Harper realize the ubiquity of government-implemented, citizen-managing technology programs and led him to recognize the sanctimony of critiques aimed at specific programs.

He said, “... these tools are being used by lots of different governments and we have a whole lot of hysteria about some of them. I think we’re projecting our own fears on technology onto countries where that is politically convenient, like China.”   

Harper noted that while his research is primarily a product of academic curiosity, it will likely relate to his professional pursuits. “One of the reasons I wanted to do this project is I’m really interested in the intersection of law and technology. I think that’s a burgeoning field, and if I do go to law school, I think I’d be interested in doing stuff related to privacy or information and property.”
