....and that's a wrap!
Thank you for joining us! Do check back tomorrow for more information and analysis on everything announced today.
Building a better driver
To be capable, safe drivers, cars need to understand their environment, identify objects and predict their behaviour.
In one example, another car jumped a red light. Waymo's car tracked that car's speed, noted the change in the traffic light and predicted that the car would jump the light. The Waymo car pre-emptively slowed down, and may have saved lives.
Waymo's Dmitri Dolgov is on stage
Google is using ML to make driving easier and safer. 5 billion miles in simulation have already been driven.
In 2018, self-driving cars are already transforming the way we live and move.
With Waymo, Google claims to be building not just a driverless car, but a better driver.
Waymo detects things and places on the road using powerful machine learning systems.
Waymo is the only company in the world with fully self-driving, driverless cars on public streets today.
Dmitri Dolgov is on stage to talk more about how Waymo is using machine learning to make its cars better, safer drivers.
Spotted a lamp or a dress you like? Lens will find items that match that style. Just point your camera and let Lens do its thing.
Lens does all of this processing in real-time. Information is even anchored to things that you see. This is only possible with machine learning, both on-device and on the cloud.
Lens will also work with posters, say, and play trailers directly over them. All of these updates are coming out within the next few weeks.
Like voice, vision is a fundamental shift in computing.
Lens can now recognise and understand words. With smart text selection, you can copy and paste from the real world into your phone. The text can also be translated in real time.
If you're looking at a restaurant menu, you can get real-time information on the food and its ingredients.
Google has just waved goodbye to Apple, says @vijay
John Krafcik, CEO of Waymo, on stage to talk about self-driving cars
Cameras are good, but with AI, they can be great
You exit a subway, you're already running late for an appointment, and your phone says "head south on Market Way". Do you know where south is? Or Market Way?
With Google Maps and the camera, you can get LIVE, visual updates showing street names, locations as well as information on what's around you. You can even enable a virtual guide to show you the way.
This is beyond cool. It's like AR, but for Maps.
Google puts the words from the image right onto your phone
Google Lens recognises places and will also be integrated into the camera app.
Create a shortlist
Long-press on a place to add it to a list. Share it with a friend to get their input. Friends can then vote and share in real time.
Planning meet-ups couldn't be easier!
Updates on the places you care about
Maps will add a tab called "For You", which gives you updates and information about the places you care about.
Maps will use machine learning to combine information about a place with your preferences to give you a recommendation. This is not the same as choosing a place based on its rating. Think of it like Netflix recommendations.
Aparna Chennapragada is on stage to talk about the Google camera
Google Maps can now automatically add places, buildings and businesses that are spotted in Street View.
Maps can tell you whether parking is easy or hard, and give you different routes based on whether you're in a car or on a bike.
Here's @sankam's opinion on Sundar Pichai
Google Maps will use machine learning to suggest new places near you based on your previous choices.
Android P Beta
You knew it was coming, but did you know that it was coming to several flagship devices that are not the Pixel?
Google is partnering with Nokia, Vivo, OnePlus, Sony, Essential and even Oppo to bring the Android P Beta to their devices today. You can literally download it right now on compatible devices.
New feature to make Maps more accurate
Jen Fitzpatrick comes on stage to talk about Google Maps
With night mode, the screen goes black and white, reminding you that it's time to sleep.
Android P is now available in Beta.
The new Android P Dashboard will show you how much time you're spending on your device. It'll tell you how many times you've unlocked the device, engaged with it, etc.
It even tells you what you're engaging with. This is to separate meaningful engagement from wasted time.
Android P will let you set timers for apps, even graying out an app to let you know that you should avoid it.
The new Do Not Disturb mode (called Shush) will turn on the moment you place your phone face down on a table. No notifications, no display, no pings.
Google Wind Down
Want to put your phone down before sleeping? Google's Wind Down will help you do just that.
Google's DND will stop the constant distraction from texts and visual mediums. It's been appropriately called Shush.
Disconnecting from our phones is hard. Turning off a phone is like cutting off a limb.
Apparently, 70% of people want help cutting down on usage.
New Dashboard feature to show time spent on your Android P device. You can set time limits for apps as well.
A personal pain point for me has been the inability to control rotation lock on a per-app basis. With Android P, Google has fixed that.
iOS is certainly looking very lame right now. If only Google would sort out its hardware game.
Sameer Samat takes the stage to talk about Android P's Digital Wellbeing features
New Volume controls
Finally! The volume buttons adjust media volume by default. No more randomly muting audio when you want to mute media.
Android P Simplicity
Evolving Android UI
There's now a special emphasis on simplicity, says Google.
For example, there's a new home button, which looks like a smaller version of the iPhone X's home indicator.
Swiping up gives you an Overview of your recent apps as well as all your apps. This works from anywhere, across apps.
Sliding the home button sideways will slide through your recent apps. This is, I think, more useful than iOS gestures.
Android P now has a single navigation button at the bottom.
ML Kit is a new set of APIs available through Firebase.
This will also tap into cloud-based resources.
It's used to access AI models designed in TensorFlow and the like. It's also compatible with iOS and Android.
Slices of apps
It's a new API for developers that renders an interactive "slice" of an app when needed. You won't need to open the full app to perform a simple action like booking a cab.
Google Slices for Android P
App UI will show up intelligently in context using Google Slices.
Android P will learn your usage patterns and surface the apps it expects you to launch along the path you'd normally take to launch them. Google claims it's 60% accurate right now.
App Actions is a new feature that will even predict the actions you'll take.
Assistant, for example, will learn when you go jogging and will queue up your fitness tracking app at that time.
Google Action for Android P
Now Google will predict your next action on the phone
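To make the prediction idea concrete, here's a minimal, entirely hypothetical sketch (my own illustration, not Google's implementation): a frequency-based predictor that remembers which app you launch in a given context and suggests the most common one.

```python
from collections import Counter, defaultdict

class AppPredictor:
    """Toy frequency-based predictor: given a context (e.g. time of day,
    headphones plugged in), suggest the app launched most often in it."""

    def __init__(self):
        # Maps each context to a Counter of app launch counts.
        self.history = defaultdict(Counter)

    def record(self, context, app):
        """Log one app launch observed in the given context."""
        self.history[context][app] += 1

    def predict(self, context):
        """Return the most frequently launched app for this context, or None."""
        counts = self.history.get(context)
        if not counts:
            return None
        return counts.most_common(1)[0][0]

# Mimicking the on-stage example: jogging in the morning with headphones in.
predictor = AppPredictor()
predictor.record(("morning", "headphones"), "fitness_tracker")
predictor.record(("morning", "headphones"), "fitness_tracker")
predictor.record(("morning", "headphones"), "podcasts")
print(predictor.predict(("morning", "headphones")))  # fitness_tracker
```

The real system presumably uses far richer signals and on-device ML, but the contract is the same: observe context, launch pairs, then rank candidates.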
Auto brightness hasn't been too smart. Now, with Adaptive Brightness, Android P will learn your brightness preferences based on ambient lighting and will adapt the brightness accordingly.
Adaptive brightness with Android P
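For intuition, here's a toy sketch of the idea (my own illustration, not the actual Android model): fit the user's manually chosen brightness levels as a linear function of log ambient light, then predict a brightness for new light readings.

```python
import math

def fit_brightness_model(samples):
    """Least-squares fit of brightness = a * log(lux) + b from (lux, brightness)
    pairs the user has chosen. A stand-in for whatever model Android P really uses."""
    xs = [math.log(lux) for lux, _ in samples]
    ys = [b for _, b in samples]
    n = len(samples)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    var = sum((x - mean_x) ** 2 for x in xs)
    a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / var
    b = mean_y - a * mean_x
    # Clamp predictions to the valid brightness range [0, 1].
    return lambda lux: min(1.0, max(0.0, a * math.log(lux) + b))

# The user nudged brightness up in bright rooms and down in dim ones:
samples = [(10, 0.2), (100, 0.5), (1000, 0.8)]
model = fit_brightness_model(samples)
print(round(model(100), 2))  # 0.5
```

Each time the user overrides the slider, a new sample is added and the model is refit, which is roughly what "learning your brightness preferences" amounts to.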
It's a feature that was made in partnership with DeepMind. It learns how you use your phone, predicting the apps that you'd use, easing the strain on your battery.
Google claims to have seen a 30% increase in battery life.
New Adaptive battery is coming on Android P
Devices should be smarter. They should learn from you.
Data should also be more private.
It's all about:
- Intelligence
- Simplicity
- Digital Wellbeing
Dave Burke discusses Android
Android is more than just a smartphone OS. It's powering new wearables, TV, AR, VR, etc.
There's also a huge shift from desktop to mobile. On top of this, AI is now everywhere.