Washington: Google is refusing to censor the rants of Al Qaeda preacher Anwar al-Awlaki, whose sermons reportedly influenced the perpetrators of the 7/7 London bombings.
Al-Awlaki, dubbed the “pied piper of jihad,” left behind hateful teachings that were, according to RT online, easily searchable using Google’s enhanced search function and are believed to have inspired jihadist terrorists and radicalised many more.
When users search al-Awlaki’s name, Google’s autocomplete function – which uses algorithms based on what others have searched before – suggests they view his “quotes” and “lectures,” a media report said.
An easily found downloadable PDF urges: “Martyrdom operations; we must refrain from calling it what the West labels it, ‘suicide bombings,’ since suicide is haram in Islam; and shahada [martyrdom] is not suicide.”
The top hit for “Anwar al-Awlaki quotes” was a website that includes comments from the preacher such as: “Jihad is still flourishing like a blessed tree sprouting through an earth of waste and pollution.”
Al-Awlaki, an American-Yemeni imam implicated in the planning of several Al Qaeda attacks, lived in London for 18 months before moving to Yemen in 2004, where he became a leader of the group.
He was killed in an American drone strike in 2011, but lives on in many YouTube videos.
The prompting of these extra search terms has been described as “incredibly dangerous” by terrorism experts.
Julie Shain of the Counter Extremism Project told the Times: “This incentivizes people to search for these things and suggests it’s normal to do so.
“In combination with the thousands of videos and pages calling for attacks on the West, it’s incredibly dangerous.”
Google recently changed its algorithms to prevent autocomplete suggesting anti-Semitic and misogynist searches. The company has also moved to prevent white supremacist and Holocaust denial websites from appearing in high or top results for certain searches.
Google’s director, Matt Brittin, told Radio 4’s Today programme the company works “very hard” to remove hateful and illegal content, but said autocomplete does save people time.
“It’s algorithmic, and I think people understand that these are suggestions based on what other people are searching for.
“So we can always improve that and we work hard to do that, but I think people are smart and they realize that not everything you find on the internet is accurate and there’s a range of opinions there.”
A YouTube spokesman said: “YouTube has clear policies prohibiting content intended to incite violence and we quickly remove videos violating these policies when flagged by our users.”