Google Bard could soon become your new AI life coach

Generative artificial intelligence (AI) tools like ChatGPT have gotten a bad rep recently, but Google is apparently trying to serve up something more positive with its next project: an AI that can offer helpful life advice to people going through tough times.

If a fresh report from The New York Times is to be believed, Google has been testing its AI tech with at least 21 different assignments, including “life advice, ideas, planning instructions and tutoring tips.” The work spans both professional and personal scenarios that users might encounter.

It’s the result of Google merging its DeepMind research lab with its Brain AI team and is “indicative of the urgency of Google’s effort to propel itself to the front of the AI pack,” the report states.

According to one example cited in The Times, Google has been working on how to answer a query from a user who wants to attend a close friend’s wedding but is unable to afford the travel costs to do so.

Aside from that, the AI’s tutoring function could help people improve their skills or learn new ones, while its planning aspect may be able to aid users in creating a financial budget or whipping up a meal plan.

User wellbeing

The move to help users with their most pressing personal challenges marks a stark change in direction for Google. In December 2022 — shortly after rival OpenAI’s ChatGPT was unleashed on the world — an internal Google slide deck cautioned against encouraging people to get too emotionally attached to AI tools, according to the report from The New York Times.

In fact, Google’s own safety experts warned in December that taking life advice from AI could result in “diminished health and well-being” and a “loss of agency,” with the potential for some users to mistakenly believe the AI was sentient and able to understand them the way another human would.

As recently as the Google Bard launch in March 2023, Google said the tool was forbidden from advising users on medical, financial, or legal matters. If the company goes ahead and builds these capabilities into its AI tools, it will mark a striking turnaround — and could raise questions over whether Google is prioritizing primacy in the AI race over users’ wellbeing.

Winning at all costs

A life coach is not the only AI-based tool Google is apparently working on. Among its other projects are tools that can generate scientific and creative writing, help journalists write headlines, and find and extract patterns from text.

Yet even ideas like these were criticized by Google just months ago when the company said there was a risk of “deskilling” creative writers through the use of generative AI.

Whether any of these tools will become a reality remains unclear, but Google seems determined to pull ahead in the AI race. Doing so could come at a cost, though, as its own experts have pointedly argued.
