Artificial Intelligence and Human Values
https://twitter.com/alexstamos/status/916697104197627904

This recent tweet by Facebook's security chief Alex Stamos stimulated my thinking about the literary reading habits of Silicon Valley executives and the importance of a humanities education for tech students. Stamos argues that his company should not become a "Ministry of Truth" (the propaganda machine in Orwell's 1984) by running algorithms that decide what is truth and what is propaganda or fake news. An analogy from the Orwellian classic helped Stamos explain a complex, troubling scenario in simple terms.

But is the role of cultural literacy limited to simplifying otherwise complex scenarios? No. If culture is defined by the values of its practitioners, it cannot be separated from anything they create. We create Artificial Intelligence. An algorithm cannot be neutral when it is designed by biased humans, and an undiscriminating AI model trained by a discriminating expert is improbable. We need to find ways to pass our values on to what we set out to create.

Fortunately, tech leaders understand this. Emma Williams, a general manager at Microsoft whose background is in Anglo-Saxon literature, is responsible for ensuring that the personality of Cortana, Microsoft's flagship AI assistant, stays calm and sober. The role of human values in technology has never been more important than it is now. In Cortana's case, for example, who would want to chat with a short-tempered AI?

[Image: Satya Nadella quotes Ghalib at a presentation in New Delhi]

"Hazaaron khwaishein aisi, ke har khwaish pe dum nikle. Bohat nikle mere armaan, fir bhi kam nikle" (Thousands of desires, each worth dying for; many of them were fulfilled, and yet they were too few). People were surprised when Microsoft CEO Satya Nadella quoted the great Rekhta poet Ghalib during a presentation in New Delhi last year. "Yet another geek into poetry," one may wonder. But it perfectly suits the CEO of a tech giant whose products aspire to be as good as humans in some ways. If poetry can ease the complexity of affairs by deploying words creatively, why shouldn't robots use it?

An ill-informed reading of history also undermines innovations and experiments in social and political thought, which I think is a great cause for concern for tech students. For example, civil disobedience may occupy no place in an engineer's mind as a revolutionary approach to fighting oppression. It is hypocrisy to talk about the singularity while denying human culture and values any place in it. If the future is what we build today, we should build it to be good and not evil.

In India, universities are finally moving towards offering students a more diverse learning experience, one where an engineering student can study Shakespeare. Though progress is very slow, the outcomes of the Choice Based Credit System (CBCS) should be positive.

References:
- Facebook Stumbles With Early Effort to Stamp Out Fake News, Sarah Frier, Bloomberg
- The Bing Search Experience: A Practice In Italian Renaissance Art & User Perspective, Amy Gesenhues, Search Engine Land
- Yes, Microsoft CEO Satya Nadella can quote Mirza Ghalib, The Indian Express
- Choice based credit system: the path ahead, M.S. Bhat, The Hindu