Apologies, lonely iPhone-using gentlemen of China*: You'll no longer be able to use Siri as your virtual pimp in an attempt to make finding prostitutes that little bit easier, now that Apple has reportedly blocked any query to the assistant that includes the words "hookers" or "escorts."
In a story that sounds as if it belongs in the Onion, the Times of India reports that Apple has confirmed that those terms have joined other queries already banned in the Chinese market for Siri, including where to buy firearms. (Amusingly, the Times says that those asking Siri the latter will receive the reply "I don't know what that means" before being redirected to Google.com. That's funny on two levels: first, the indirect implication that Google is where you go for answers to immoral or potentially illegal queries, and second, the accidental suggestion that Google is a superior search engine to anything Apple has come up with.) According to the Times' story, users were asking Siri "Where can I find hookers?" and "Where can I find escorts?" and being given listings of the closest locations, which in many cases were local bars and clubs.
That's not the case anymore; now, when faced with the same question, Siri will apologetically tell Chinese users "I couldn't find any escort services." Despite 36 percent of participants in an online poll at Sohu.com last week feeling that the police should use Siri's locating powers to arrest prostitutes – "Siri could help them locate the hookers," one user helpfully suggested – Chinese legal authorities weren't exactly concerned, it seems. A police officer with the Information Office of the Shanghai Municipal Public Security Bureau, who declined to be named, said that authorities hadn't verified the locations given by Siri prior to Apple's blocking of the search terms, and that they "[had] not received any complaints or reports regarding Siri's providing pornographic information so far."
Apple itself was behind the removal of the terms, the company confirmed. A customer service staff member identified only by the last name "Lin" told reporters that the company had blocked all information "related [to] 'escorts'" at some point in the recent past, but refused to be drawn on exactly when that had happened. The action was prompted by "reports from our users," Mr. Lin explained.
As Ars Technica points out, this block apparently applies only in China; a US-based attempt to ask Siri where to find escorts will result in suggestions and even a helpful map, if you ask properly.
* Yes, I know it's possible that lonely ladies in China may also be asking Siri to find them some hookers, but let's be honest: It's probably men.