What is AI called in your native language?
Over the past few years, the conversation around emerging technologies such as AI and machine learning has grown significantly. However, this conversation has largely been confined to the research and developer community. The general public, the ultimate recipient of these technologies, is largely excluded. This is mainly because there has been very little effort to give these technologies a cultural and linguistic context. To give an example, most of us might not know what AI is called in our local language or, worse, there may not be a local term for AI to begin with.
Recognizing this problem, Asvatha Babu, a doctoral student in the School of Communication at American University, studies the interplay between technology, media, governance and the public. Analytics India Magazine caught up with Asvatha for a detailed conversation.
Edited excerpts from the interview:
AIM: What are the challenges of translating technological terms and concepts into a local cultural context?
Asvatha Babu: I am a doctoral student at the School of Communication at American University; I did my master’s at the same university, focusing on cybersecurity and technology policy. I worked briefly on blockchain and its use to create social impact (for example, its possible use in the Aadhaar system). A series of events and further research interested me in facial recognition technology (FRT).
My current thesis work focuses on FRT and its use in solving major social and humanitarian problems. As part of my thesis work, I interacted with the police to understand how they use these technologies (currently, in the context of Tamil Nadu). In addition, I am trying to understand the attitude of the police towards this technology and whether it really makes their jobs easier or adds to their burden.
Another aspect of my research is to understand society's general attitude towards FRT – what people think and say about it. When we say FRT, we are culturally bundling several things under that umbrella. To understand the cultural implications of FRT, I study the media coverage around this technology. For example, I realized that even though the police in Tamil Nadu have been using FRT for three years, there isn't really a name for it in the local language (Tamil).
There are many ways to translate the term facial recognition so as to express the purpose of the technology. This is what journalists do when reporting on FRT in the local language. As a result, there is a tendency to talk about FRT in functional terms (what it does and why it is needed). This approach is a double-edged sword. On the one hand, such translation makes FRT appear very contextualized, which is good. But on the other hand, my study shows that this approach also leads the media to cover FRT from the perspective of the police and authorities. This means there is less emphasis on critical analysis of such a tool.
AIM: What can be done to improve understanding of technologies such as FRT, especially through appropriate and contextual translations?
Asvatha Babu: Much more attention needs to be paid to the translation process. There are some very good research organizations, civil society organizations and think tanks working on facial recognition and surveillance in India, algorithmic surveillance, digital rights and data privacy. But there is a lack of focus on translation at the local level. It is important to study how the technology is translated, or how it is constructed locally in a specific language. So, as a first step, I think we need to pay more attention to that.
As a second step, there needs to be more engagement between linguists and academics in language studies, digital rights specialists, people who study the effects of these technologies, and social justice organizations and think tanks on the one hand, and the engineering community on the other. Currently, translation software is created by large organizations like Google and used by authorities like the police. There is a lack of critical voices. We're missing all the perspective that needs to be there for people to understand the technology for what it is and for the media to cover it more fully.
AIM: How good is software like Google Translate when it comes to technical translation at the local level?
Asvatha Babu: I think any future where we have to rely on companies like Google to do this translation work for us is not a future where I see an emphasis on justice- or rights-based translation, because their translation services are built to be used more widely and to be more profitable as a product.
Consider the Māori community in New Zealand, which is fighting to keep its language alive without interference from large corporations like Google. They understand that once the tech giants have access to the language, they (the community) will lose any sense of cultural ownership of it. Companies like Google, Microsoft or IBM don't care about the language as such; they are interested in having that language in their arsenal so that more people come to rely on their products. So the community is now building its own automation process. There are other ways to automate translation that do not depend on these big, profit-driven tech companies.
AIM: Given the surveillance and privacy risks of technologies like FRT, campaigners believe they should be banned altogether. What is your opinion?
Asvatha Babu: Yes, this is a major concern: weighing the benefits against their real costs.
To begin with, surveillance has always existed; people have always watched others, especially in contexts such as public health or prisons. There has always been this notion of having to watch over one's fellows or subordinates for public safety and so on. Today, the availability of information, processing power, and the entry of private tech companies looking to profit from it have made surveillance far more advanced.
It is important to consider two perspectives here. The first is that we already live in a society marked by an asymmetry of power between the authorities and the public. The development of more advanced surveillance tools would further exacerbate this asymmetry. From this view, any amount of surveillance is too much.
The second perspective, often adopted by authorities and those who develop such technologies, is that it makes life easier and enables the police to better serve the public and increase public safety and security.
These technologies cannot simply be eliminated or rolled back. On our side (engineers, developers, media and authorities), we have to make sure that people are sufficiently educated about these technologies and their effects in their own cultural context.