
Microsoft accidentally released 38TB of private data in a major leak

It’s just been revealed that Microsoft researchers accidentally leaked 38TB of confidential information onto the company’s GitHub page, where potentially anyone could see it. Among the data trove was a backup of two former employees’ workstations, which contained keys, passwords, secrets, and more than 30,000 private Teams messages.

According to cloud security firm Wiz, the leak was published on Microsoft’s artificial intelligence (AI) GitHub repository, accidentally included in a tranche of open-source training data. Because visitors were actively encouraged to download that data, the exposed files could have fallen into the wrong hands again and again.

A large monitor displaying a security hacking breach warning.
Stock Depot / Getty Images

Data breaches can come from all kinds of sources, but it will be particularly embarrassing for Microsoft that this one originated with its own AI researchers. The Wiz report states that Microsoft uploaded the data using Shared Access Signature (SAS) tokens, an Azure feature that lets users share data through Azure Storage accounts.


Visitors to the repository were told to download the training data from a provided URL. However, the web address granted access to much more than just the planned training data, and allowed users to browse files and folders that were not intended to be publicly accessible.


Full control

A person using a laptop with a set of code seen on the display.
Sora Shimazaki / Pexels

It gets worse. The access token that allowed all this was misconfigured to provide full control permissions, Wiz reported, rather than more restrictive read-only permissions. In practice, that meant that anyone who visited the URL could delete and overwrite the files they found, not merely view them.

Wiz explains that this could have had dire consequences. As the repository was full of AI training data, the intention was for users to download it and feed it into a script, thereby improving their own AI models.

Yet because it was open to manipulation thanks to its wrongly configured permissions, “an attacker could have injected malicious code into all the AI models in this storage account, and every user who trusts Microsoft’s GitHub repository would’ve been infected by it,” Wiz explains.

Potential disaster

A digital depiction of a laptop being hacked by a hacker.
Digital Trends

The report also noted that the creation of SAS tokens – which grant access to Azure Storage folders such as this one – does not create any kind of paper trail, meaning “there is no way for an administrator to know this token exists and where it circulates.” When a token has full-access permissions like this one did, the results can be potentially disastrous.
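The lack of a paper trail follows from how SAS tokens are built: a token is essentially an HMAC-SHA256 signature computed client-side from the storage account key, so the service never records which tokens exist and only verifies the signature when a request arrives. The sketch below illustrates that signing scheme in plain Python; the string-to-sign is heavily simplified (the real Azure format covers many more fields), and the account key and dates are made-up placeholders.

```python
import base64
import hashlib
import hmac

def make_sas_token(account_key_b64: str, permissions: str, expiry: str) -> str:
    """Sketch of SAS-style signing: the token is derived from the account
    key on the client, so the service keeps no record of issued tokens."""
    # Simplified string-to-sign; a real Azure SAS also covers the resource,
    # protocol, API version, start time, and more.
    string_to_sign = f"{permissions}\n{expiry}"
    key = base64.b64decode(account_key_b64)
    sig = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    signature = base64.b64encode(sig).decode("utf-8")
    # Query-string style token: permissions (sp), expiry (se), signature (sig).
    return f"sp={permissions}&se={expiry}&sig={signature}"

# Placeholder key for illustration only. A read-only grant ("r") versus the
# full-control style grant ("racwdl": read, add, create, write, delete, list)
# at the heart of this incident differ only in the permissions string signed.
read_only = make_sas_token("c2VjcmV0LWtleQ==", "r", "2024-01-01")
full_control = make_sas_token("c2VjcmV0LWtleQ==", "racwdl", "2024-01-01")
```

Because anyone holding the account key can mint a valid token offline this way, an administrator cannot enumerate or audit outstanding tokens; the only remedies are rotating the account key or, as Microsoft ultimately did, replacing the leaked token entirely.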

Fortunately, Wiz explains that it reported the issue to Microsoft in June 2023. The leaky SAS token was replaced in July, and Microsoft completed its internal investigation in August. The security lapse has only just been disclosed to the public, to allow time for it to be fully fixed.

It’s a reminder that even seemingly innocent actions can potentially lead to data breaches. Luckily, the issue has been fixed, but it’s unknown whether hackers gained access to any of the sensitive data before the token was revoked.

Alex Blake
Alex Blake has been working with Digital Trends since 2019, where he spends most of his time writing about Mac computers…