
Opinion: Is Autocomplete Evil? Some Women Take A Hard Look at Google!

Discussion in 'Breaking News' started by spnadmin, Nov 7, 2013.

  1. spnadmin

    spnadmin United States
    1947-2014 (Archived)
    SPNer Supporter

    Joined:
    Jun 17, 2004
    Messages:
    14,551
    Likes Received:
    19,200

    Is Autocomplete Evil?


    by Tom Chatfield

    http://www.bbc.com/future/story/20131106-is-google-autocomplete-evil/1

    “Women shouldn’t have rights.” “Women shouldn’t vote.” “Women shouldn’t work.” How prevalent are these beliefs? According to a recent United Nations campaign, such sexism is dispiritingly common, which is why it published these sentiments on a series of posters. The source? These statements were the top suggestions offered by Google’s “instant” search tool when the words “Women shouldn’t…” were typed into its search box.

    Google Instant is an “autocomplete” service – which, as the name suggests, automatically suggests letters and words to complete a query, based on the company’s knowledge of the billions of searches performed across the world each day. If I enter the words “women should,” the number one suggestion on my own screen is “women shoulder bags,” followed by the more depressing “women should be seen and not heard.” If I type “men should”, the enigmatic suggestion “men should weep” pops up.
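    The mechanics described here can be boiled down to frequency-ranked prefix matching: suggest the most common logged queries that begin with what the user has typed so far. Google's actual system is proprietary; the sketch below is a toy version with an invented query log, using only query frequency as the ranking signal.

```python
from collections import Counter

# A toy query log standing in for Google's billions of daily searches.
# All queries and counts here are invented for illustration.
query_log = Counter({
    "women shoulder bags": 120,
    "women should be seen and not heard": 45,
    "women should know their worth": 30,
    "men should weep": 60,
})

def autocomplete(prefix, k=3):
    """Return up to k logged queries starting with prefix, most frequent first."""
    matches = [(q, n) for q, n in query_log.items() if q.startswith(prefix)]
    matches.sort(key=lambda qn: qn[1], reverse=True)
    return [q for q, _ in matches[:k]]

print(autocomplete("women should"))
# "women shoulder bags" ranks first purely because it is the most frequent
# match for the prefix -- exactly the effect the article describes.
```

    Note that nothing in this ranking knows or cares what the suggestions mean; a string about handbags and a string about silencing women are just counts attached to a shared prefix.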

    The argument behind the UN campaign is that this algorithm offers a glimpse into our collective psyche – and a disturbing one at that. Is this really true? Not in the sense that the campaign implies. Autocomplete is biased and deficient in many ways, and there are dangers ahead if we forget that. In fact, there is a good case that you should switch it off entirely.

    Like many of the world’s most successful technologies, the mark of autocomplete’s success is how little we notice it. The better it’s working, the more seamlessly its anticipations fit in with our expectations – to the point where it’s most noticeable when something doesn’t have this feature, or Google suddenly stops anticipating our needs. The irony is that the more effort that’s expended making the results appear so seamless, the more unvarnished and truthful the results feel to users. Knowing what “everyone” thinks about any particular issue or question simply means starting to type, and watching the answer write itself ahead of our tapping fingers.

    Yet, like any other search algorithm, autocomplete blends a secret sauce of data points beneath its effortless interface. Your language, location and timing are all major factors in results, as are measures of impact and engagement – not to mention your own browsing history and the “freshness” of any topic. In other words, what autocomplete feeds you is not the full picture, but what Google anticipates you want. It’s not about mere truth; it’s about “relevance”.

    This is before you get on to censorship. Understandably, Google suppresses terms likely to encourage illegality or materials unsuitable for all users, together with numerous formulations relating to areas like racial and religious hatred. The company’s list of “potentially inappropriate search queries” is constantly updated.
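    Combining the ideas in the two paragraphs above, a suggestion engine blends several signals into one "relevance" score and filters blocked queries before ranking. The signal names, weights, candidate queries, and blocklist below are all invented for illustration; Google's real formula and its "potentially inappropriate" list are secret.

```python
import math

# Hypothetical per-query signals: global popularity, topic freshness,
# and whether the query appears in this user's own history.
candidates = {
    "is google autocomplete a joke":  {"freq": 500, "age_days": 2,  "in_history": False},
    "is google autocomplete useful":  {"freq": 900, "age_days": 90, "in_history": True},
    "is google autocomplete illegal": {"freq": 50,  "age_days": 1,  "in_history": False},
}

# Stand-in for the constantly updated list of suppressed terms.
BLOCKLIST = {"illegal"}

def score(signals):
    # Blend popularity, freshness, and personalization into one number.
    popularity = math.log1p(signals["freq"])          # diminishing returns on raw counts
    freshness = 1.0 / (1.0 + signals["age_days"])     # newer topics score higher
    personal = 2.0 if signals["in_history"] else 0.0  # boost queries the user made before
    return popularity + 3.0 * freshness + personal

def suggest(prefix, k=3):
    # Censorship happens first: blocked queries never reach the ranking step.
    allowed = {q: s for q, s in candidates.items()
               if q.startswith(prefix)
               and not any(word in BLOCKLIST for word in q.split())}
    return sorted(allowed, key=lambda q: score(candidates[q]), reverse=True)[:k]
```

    In this toy, the stale-but-popular query a user has searched before outranks the fresher one, and the blocked query simply never appears. That is the article's point: the output is not "what everyone thinks" but what the weights and filters produce.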

    False premise

    None of this should be news to savvy web users. Yet many reactions to the UN Women campaign suggest, to me, a reliance on algorithmic expertise that borders on blind faith. “The ads are shocking,” explained one of the copywriters behind them, “because they show just how far we still have to go to achieve gender equality.” The implication is that these results are alarming precisely because they are impartial: unambiguous evidence of prejudice on a global scale. Yet, while the aim of the campaign is sound, the evidence from Google is far from straightforward.

    The greatest danger, in fact, is the degree to which an instantaneous answer-generator has the power not only to reflect but also to remould what the world believes – and to do so beneath the level of conscious debate. Autocomplete is coming to be seen as a form of prophecy, complete with a self-fulfilling invitation to click and agree. Yet by letting an algorithm finish our thoughts, we contribute to a feedback loop that potentially reinforces untruths and misconceptions for future searchers.

    Consider the case of a Japanese man who, earlier this year, typed his name into Google and discovered autocomplete associating him with criminal acts. He won a court case compelling the company to modify the results. The Japanese case echoed a previous instance in Australia where, effectively, the autocomplete algorithm was judged to be guilty of libel after it suggested the word “bankrupt” be appended to a doctor’s name. And there are plenty of other examples to pick from.

    So far as Google engineers are concerned, these are mere blips in the data. What they are offering is ever-improving efficiency: a collaboration between humans and machines that saves time, eliminates errors and frustrations, and enriches our lives with its constant trickle of data. All of which is true – and none the less disturbing for all that.

    As the company’s help page puts it, “even when you don’t know exactly what you’re looking for, predictions help guide your search.” Google has built a system that claims to know not only my desires, but humanity itself, better than I could ever manage – and it gently rams the fact down my throat every time I start to type.

    Did you know you can turn autocomplete off just by changing one setting? I’d recommend you give it a try, if only to perform a simple test: does having a computer whispering in your ear change the way you think about the world? Or, of course, you can ask Google itself. For me, typing “is Google autocomplete...” offered the completed phrase “is Google autocomplete a joke?”. Unfortunately, the answer is anything but.
     


    • Like x 2

  3. Ishna

    Ishna
    On hiatus
    Writer SPNer Contributor

    Joined:
    May 9, 2006
    Messages:
    2,942
    Likes Received:
    5,002
    Reminds me of once when I was 15 and I did a Google search for 'teen pagan network' and got a sponsored ad result for "pre-teen l.olitas"...

    15 year old me sent a VERY angry message to Google. I never got a reply. Wow... that was a long time ago.
     
    • Like x 3
  4. spnadmin

    spnadmin United States
    1947-2014 (Archived)
    SPNer Supporter

    Joined:
    Jun 17, 2004
    Messages:
    14,551
    Likes Received:
    19,200
    They don't even reply when you write to them about an outright violation of the law. And I am talking about the law, not about one's personal social or ethical school of thought. Yet Google expects us to believe them when they launch public relations stunts to regain public confidence after playing ball with the NSA.
     
    • Like x 3
  5. Inderjeet Kaur

    Inderjeet Kaur
    Writer SPNer Supporter

    Joined:
    Oct 13, 2011
    Messages:
    774
    Likes Received:
    1,712
    I turned it off a long time ago because it creeped me out to have my computer pretending to read my mind.

    And, anyway, those things usually slow down computers, as well.

    Don't need, don't like, don't want. Buh-bye!
     
    • Like x 1
