Unsurprisingly, the bulk of Google's announcements at I/O this week focused on AI. Past Google I/O events also leaned heavily on AI, but what made this year's announcements different is that the features spanned nearly every Google offering and touched nearly every task people perform every day.
Also: Everything Google unveiled at I/O 2025: Gemini, AI Search, smart glasses, more
Because I'm an AI optimist, and my job as an AI editor involves testing tools, I have always been pretty open to using AI to optimize my daily tasks. However, Google's keynote made it clear that even those who may not be as open to it will soon find it unavoidable.
Moreover, the tech giant's announcements shed light on the industry's future, revealing three major trends about where AI is headed, which you can read more about below.
When AI first became popular, most companies raced to build the best AI tools, whether chatbots, image generators, or something else. Now that these tools have become more capable and commonplace, we have entered the next stage of AI development: implementing them into everyday processes, applications, and services. Ideally, they'll fit into people's existing workflows rather than adding new tools to reach for.
Also: I tried Google's XR glasses and they already beat my Meta Ray-Bans in 3 ways
A prime example is the set of announcements at I/O that will soon give users the option of integrating AI into nearly every function of their digital life, such as online shopping, managing their email inbox, video conferencing, searching the web, and coding. For now, these are all opt-in experiences that keep the user in control, but being presented with the option to use AI will soon become inescapable.
This can be a positive change if you want to optimize your workflows easily while minimizing context switching, or a negative one if you would prefer not to use AI. Either way, it is becoming increasingly difficult to avoid.
Like an effective human coworker, the more AI systems learn about you and your habits, the better they can cater to your needs, anticipating what you want done and using what they know as context when interacting with you. The difference is that, unlike with a human, you don't necessarily need to tell AI about yourself. Instead, you can feed it your data to use as relevant context. The result is AI that is grounded in your information and therefore requires less repetition and instruction from you.
Also: One of Gemini's coolest features is now open to all Android and iOS users - for free
For example, Google introduced a feature that will soon let users feed AI Mode with past searches and data from other Google apps, starting with Gmail, to provide more personal context. Similarly, Personalized Smart Replies in Gmail can draft emails that pull from your past email conversations and Google Drive.
While these integrations result in better AI assistance, they also mean you are handing over access to more of your data to fuel them. As AI applications continue to evolve into proper assistants, taking full advantage of them will require getting comfortable with sharing your data with AI models and the companies behind them.
While assistance on the digital plane continues to advance, there is a limit to how much AI can help in your everyday life when it isn't present in your physical environment. Hardware will play an important role in bridging the gap between digital AI assistance and your daily experiences in the physical world.
Also: Google offers AI certification for business leaders now - free trainings included
For instance, Google gave iOS and Android users access to Gemini Live's screen-sharing and video features, which give Gemini a glimpse into your routine and what you're encountering so it can provide that extra layer of assistance. Google's Android XR glasses offer an even better example: they feed Gemini everything you see, allowing for a more seamless, immersive interaction between you and AI.
Following this progression, more of your current hardware, such as watches, earbuds, and phones, will become vehicles for AI, and new hardware, such as AI pins and bands, will open up even more opportunities for AI integration.