Regarding: Joseph Menn Yahoo! Article –
Yahoo!’s newly revealed email-scanning program relates to Islamophobia and should concern everyone, Muslims in particular, given the increasing liberties the American government has persistently taken in breaching the privacy of its citizens.
The revelation adds to fears that big Internet conglomerates like Google, Facebook, and Yahoo! collect, or have access to, large amounts of private information, and that this information can be misused by the companies or the government, enacting a nightmarish Orwellian scenario. For Muslims this threat is far more serious and real: Islamophobia, surveillance, CVE, and Trump make Big Brother seem to be standing at the gates of the Internet.
As the Reuters article points out, these probing policies have escalated from requests for stored data and brief, time-sensitive searches to full-scale, purpose-built programs that scan email in real time for key words. The article notes that the scanning programs mainly search for keywords, which in and of itself does not seem so threatening. But the argument that the programs are sophisticated enough to differentiate between threatening and non-threatening vocabulary is faulty, because words by themselves lack context, and keyword searches without context can lead to wrong conclusions. The de-contextualizing of phrases has put Muslims in harm’s way before: a rising number of Muslim passengers have been removed from planes amid panic and fear over overheard phrases such as “Insha-Allah” and “Allahu-Akbar”, words used very frequently in conversation between Muslims. In other instances, merely overhearing passengers speaking Arabic led to their removal and victimization. If the same modus operandi underlies these scanning programs, then similar unnecessary criminalization of Muslim people will follow.
Others still might argue that automated systems are free from bias and look only at hard (threatening) data and language use. Unfortunately, these programs too are written by ordinary people, people who live in today’s world of fear, hatred, and Islamophobia, and it is not unreasonable to think that these biases get reflected in the algorithms they write. One example is the recent Google Photos app, which used artificial intelligence for image identification and classified African-American people as gorillas (http://www.news.com.au/technology/online/black-friends-furious-after-google-photos-app-tags-them-as-gorillas/news-story/97bc0871bc5858160cb4df601324c172). Google immediately corrected the mistake, but the implications of how such computer programs are designed, and how they can be flawed, were lasting.
In this environment of Islamophobia, such surveillance techniques, if they are designed for use on Muslims or end up being used to target Muslims, can violate citizens’ fundamental rights. Although the article states that Yahoo! complied because it feared losing a court battle against the classified edict, it’s hard to imagine Yahoo! not fighting harder had the edict demanded spying on all Yahoo! users equally rather than singling out a specific target audience. Techniques like Yahoo!’s email scanning could undermine Muslims’ standing as equal citizens in the United States. Are Muslims the victims of today’s Big Brother?