Google is shaking up the way we navigate with a major update to its Maps app, powered by its new AI model, Gemini. Forget rote instructions like “turn left in 200 feet” – Maps is going conversational, making your journey smoother and more intuitive.
This isn’t just about talking to your phone; it’s about having a real conversation with it. Imagine asking complex questions like: “Are there any good Thai restaurants near me that are open late?” or “What’s the parking situation like at that new concert venue I was looking at?” Gemini, fueled by Google’s vast knowledge of the world and street-level imagery, can handle these requests and more.
One of the most significant changes is landmark-based navigation. Instead of generic distance cues, Maps will guide you with landmarks: “Turn right after the iconic clock tower” or “Take a left just past that quirky mural.” Google says this draws on data for more than 250 million places, cross-referenced with Street View imagery.
The update also brings some handy proactive features:
- Traffic alerts: Maps will now warn you about road closures or accidents even when you’re not actively navigating, helping you avoid frustrating detours.
- Lens mode: Point your camera at a building or landmark and ask Maps about it directly. Want to know whether that bakery is open, or what it’s known for? Lens has you covered.
These Gemini-powered features are rolling out on Android and iOS in the US this month, with plans for wider availability soon.
This is more than another tweak: Google is fundamentally changing how we interact with Maps, making it a more natural and helpful part of our daily lives.