
Halo’s Cortana set to take on Siri


If you’re a big fan of that glitchy, purplish-blue holographic girl inside your head in Halo, you may like what Microsoft is cooking up for the next big versions of Windows Phone, Windows, and Xbox.

We’re talking about a virtual personal assistant along the lines of Siri and Google Now. Codenamed Cortana after the popular Halo character, it’s expected to take Windows Phone’s current voice-command capabilities to another level – Cortana, just like the AI character, will apparently be able to learn and adapt over time.

Rumors of the existence of a Cortana-based virtual assistant have been doing the rounds ever since an app listed as zCortana was spotted in a Windows Phone 8.1 leak in June. The addition of a ‘z’, which indicates a test build, evidently fooled no one.

And on Thursday, according to ZDNet, sites closely following Windows Phone developments confirmed that a much-improved virtual assistant is indeed on the way.

Revolutionary not evolutionary

This follows a recent CNET interview with Bing director of search Stefan Weitz, in which he said that Microsoft is planning to release a Siri/Google Now competitor, but only when the team behind it is happy that it’s something special.

“We have had internal debates about when to ship something,” Weitz said. “We could come out with something now like them, but it wouldn’t be state of the art. It’s too constrained to be an agent now.”

He added, “We are not shipping until we have something more revolutionary than evolutionary.”

Cortana’s real-time query processing will reportedly be dealt with by Satori, Microsoft’s enormous knowledge repository that also powers Bing.

According to ZDNet’s report, Microsoft’s upcoming virtual assistant will be “more than just an app that lets users interact with their phones more naturally using voice commands,” in that it’ll be “core to the makeover of the entire ‘shell’ – the core services and experience – of the future versions of Windows Phone, Windows and the Xbox One operating systems.”

Windows Phone users keen to get chatting with Cortana will have to be patient, however. Some Microsoft executives have reportedly said the new virtual assistant might not appear for another couple of years, though it will hopefully arrive in 2014 alongside the expected launch of Windows Phone 8.1.

Such voice-activated virtual assistant technology went mainstream with Siri when it launched with Apple’s iPhone 4S in 2011. It was in the news a lot in the early days, as much for its knack for responding with amusing answers, or failing to understand what was being asked of it, as for the technology behind it. Funnily enough, you don’t hear so much about it now, suggesting either that Apple has refined the technology or that iPhone owners have given up using it.

Perhaps purplish-blue Cortana will fare better.

Editors' Recommendations

Trevor Mogg
Contributing Editor
With Cortana in a coma, Microsoft’s smart home ambitions look bleak

Microsoft’s Surface event on Wednesday showed all of us yet again that there’s always one more thing to get excited about. The unveiling of its latest Surface line of tablets and laptops in New York City had all the hallmarks of past Microsoft events, like Panos Panay’s impeccable showtime delivery, but the tease of the company’s upcoming foldables -- the Surface Duo and Neo -- stole the show at the end.

However, there was just one thing that got under my skin during the announcement. And that was the lack of any mention of Cortana, Microsoft's oft-forgotten voice assistant.
No love for Cortana
Microsoft’s virtual assistant got no mention whatsoever during the event, which was otherwise packed with compelling product announcements. Even more puzzling is how Microsoft unveiled its Surface Earbuds, a direct rival to many of the recently announced true wireless earbuds, without any sort of Cortana integration. Considering how much Microsoft has poured into Cortana’s development, it’s shocking that it was left out altogether.

Apple contractors will no longer be able to listen to your Siri recordings

Last week, The Guardian revealed that Apple uses human contractors to review Siri recordings -- and that the contractors often end up hearing everything from drug deals to sexual encounters as a result.
Apple said Friday that it has suspended the use of those contractors while it reviews the process. In a future software update, users will be able to opt out of having their Siri conversations reviewed by a human contractor.
In an email to The Washington Post, an Apple spokesperson said that Apple was committed to user privacy. Following Apple's announcement, Google said it would also temporarily suspend the use of humans to review "OK Google" voice assistant recordings. We've reached out to both Google and Apple for more details and will update this story if we hear back.
The human contractors reviewed conversations with Siri in order to improve the virtual assistant, listening to recordings in which Siri was triggered but didn't provide an answer to determine whether she should have been able to respond in that instance.
The goal behind the human listeners was to better understand situations where Siri might be failing so that she could be improved going forward: reviewers would listen to a recording, determine if the user meant to activate Siri, and grade her response (or lack thereof).

That human element, however, wasn’t made clear to users.
A whistleblower who worked as a Siri reviewer told The Guardian that the contractors would often be able to hear things like sensitive medical information, and that recordings were accompanied by user data showing users' location and some personal details.
Less than 1% of daily Siri activations were sent to human contractors for evaluation.
“User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements,” Apple said in a statement to The Guardian when the issue was originally revealed.
Apple isn't the first company to run into this issue. Earlier this year, Amazon came under fire for similar practices with its personal assistant, Alexa. Much like Apple, Amazon uses humans to analyze recordings of Alexa’s answers. It also retains text data of requests, even when a user deletes a recording.
Amazon now offers an option within Alexa’s settings where customers can opt out of their data being used to improve the service.

Apple contractors listening to Siri requests hear sex, drug deals, and more

Apple contractors routinely hear sensitive things like confidential medical information, couples having sex, and drug deals as part of their work related to quality control for the company’s virtual assistant Siri, The Guardian reports.
The recordings are passed on to contractors who are asked to determine whether the activation of Siri was intentional or accidental and to grade Siri’s responses.
Less than 1% of daily Siri activations are sent on to a human for grading. However, Apple does not expressly tell customers that their recordings might be used in this way. The issue was brought to light by an anonymous whistleblower who spoke to The Guardian. That individual said that the recordings often contain sexual encounters as well as business dealings and that they feel Apple should expressly tell users that Siri content might be reviewed by a human.
“A small portion of Siri requests are analyzed to improve Siri and dictation," Apple told The Guardian in a statement. "User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”
We reached out to Apple for additional details but have yet to receive a response. We'll update this story if we hear back. Siri can sometimes turn on and start listening to you if it thinks it has heard a wake word -- typically "Hey Siri" or something similar -- even if you didn't mean to trigger it.

The human beings who listen to these conversations (or worse) work to determine what the person who was recorded was asking for and if Siri provided it. If not, they determine whether Siri should have realistically been able to answer your question.
If the complaints about Apple sound familiar, it’s likely because Amazon battled a similar issue earlier this year. While Amazon also sends recordings to humans for later analysis and retains text data of requests even when recordings are deleted, the company offers an option within Alexa’s settings where customers can opt out of their data being used for that purpose.
Apple does not currently offer an opt-out option for Siri.
