The company cautions that it may provide inaccurate information at times
Google has released its Bard AI chatbot to users in the US and UK. I tried it out to compare it with Bing AI and ChatGPT. The chat interface looks like Bing AI's, with a wide text box at the bottom and a dialogue-based layout, but there are some differences.
To start a conversation with Bing AI, you have to hit “Chat” or scroll up from search results. With Bard, you land directly in the chat. Bard’s website has a column on the left with options for resetting the chat, viewing activity, and opening the FAQ and help & support. Bing AI, by contrast, has a broom icon to clear the chat.
Bard AI is still a work in progress, and Google warns that it may provide inaccurate information.
When accessing Bard’s website, users are presented with a reminder that Bard is an experiment and may not always provide accurate information. Users are asked to keep this in mind and told that feedback will help Bard improve. Fine print below the input field cautions that Bard may display inaccurate or offensive information that does not reflect Google’s views. When you reset the chat, a message appears reminding you that Bard is still in its experimental phase and that chatting with it will help improve the experience.
Despite the reminders, Bard’s interface is unobtrusive, and there are some functional differences compared to Bing AI. While Bing AI does not offer speech-to-text on a desktop (although the app does), Bard can use a laptop’s microphone to capture spoken queries. The dropdown arrow next to “View other drafts” allows users to see alternative responses within the conversation, a feature not available in Bing AI.
Although the reminders are understandable given Bard’s experimental nature and past blunders, they don’t significantly affect the user experience. Bard’s speech-to-text function is a genuine convenience, and the option to see alternative drafts within the conversation stands out.
If you live in the US or the UK, you may have access to Bard soon. Google wants to remind you that it is an experimental chatbot, so be cautious with its responses and fact-check the information. Also, avoid sharing personal or sensitive data with Bard, as it will use this to improve its algorithms. As more people interact with Bard and Bing AI, there is a risk of the chatbots surfacing biases, discriminatory output, and flawed reasoning. With more feedback, though, they may become more advanced and accurate.