
Google Expands AI Mode in Search with New Tools for Smarter Learning

Google is rolling out a fresh batch of updates to its AI Mode in Search, bringing new capabilities aimed at students, educators, and lifelong learners, just in time for the back-to-school season.

The latest features include support for images and PDFs on desktop, real-time video-based help, and a planning tool called Canvas, all designed to make information discovery more interactive and personalized.

Visual Search Now Supports Images and PDFs on Desktop

AI Mode previously allowed mobile users to ask questions about images using the Google app. Now, that feature is coming to desktop browsers, making it easier to interact with visual materials like lecture slides, infographics, or screenshots. Google is also introducing PDF support in the coming weeks, allowing users to upload documents and ask questions that draw context from both the file and the broader web.

[Image: AI Mode interface showing an attached PDF titled 'Psych...ality.pdf', the prompt "Summarize these slides for me", and icons for image upload and deep search.]

This multimodal capability is designed for deeper exploration. For example, uploading class notes or a syllabus could prompt the AI to generate clarifications or supplementary material. These features are launching everywhere AI Mode is available (the US, India, and the UK), with plans to expand support to additional file types, including those from Google Drive.

Canvas Brings Planning and Organization to the Sidebar

For anyone juggling multiple research sources or project ideas, AI Mode’s new Canvas tool offers a persistent planning space. Users can now create a dynamic side panel where information is saved across sessions. Typing a prompt like “help me study for biology” and selecting “Create Canvas” initiates a live workspace that updates as users refine their goals.

Canvas will soon support uploaded content too, such as class notes, giving users a way to personalize study guides or project plans. It’s initially available on desktop for users enrolled in the AI Mode Labs experiment.

Search Live: Like a Visual Chat with an AI Guide

[Image: A smartphone screen showing a yeast and hydrogen peroxide experiment with foam and bubbles forming, with on-screen text asking about the science behind it.]

Search Live introduces real-time video input to AI Mode by integrating with Google Lens. Available in the US on mobile through the AI Mode Labs program, it lets users interact with the world around them via their phone’s camera. Whether you’re trying to understand a concept, object, or environment, you can ask questions and receive context-aware answers with helpful links.

The tool is designed to feel like having an expert in your pocket who can literally see what you’re pointing at.

Lens in Chrome Adds Contextual Search from Any Page

For desktop users, Chrome is gaining a new “Ask Google about this page” option in the address bar dropdown. When activated, this launches Lens and AI Mode to provide quick overviews of content on the screen, whether it’s a webpage, diagram, or document. Users can then dig deeper by asking follow-up questions or hitting the “Dive deeper” button in the side panel.

This seamless integration of Lens and AI Mode continues to position Search as more than an old-fashioned query box. With each update, it’s becoming an interactive tool for navigating dense or unfamiliar content.

More details about the rollout and how to participate in the AI Mode Labs experiment can be found on Google’s official blog.

