We were there
Google I/O 2017
Google released new and exciting tech during its three-day conference in late May 2017. We've written a summary from a frontend web developer's perspective.
Lazy and just want the clickbait list of news from the event?
Let’s set the stage. I’m writing this article on the plane back to Norway. I’m in seat 31B. Next to me is, coincidentally, a Danish developer who also attended the conference. What are the odds? Prior to the flight, he downloaded all the talks he didn’t have time to attend, and now he’s planning on binge-watching them for ten hours straight while sipping a Coca-Cola Zero. And you can too! Click here to binge-watch all of this year’s Google I/O talks.
Let’s back up to February earlier this year. A colleague sent out a message on our #nerdherd Slack channel saying we should sign up for the conference. Almost all of us did. It was a lottery system, so we’d be lucky if even one of us got picked. It was clear from the signup form that Google really wanted a diverse crowd this year, since the gender and skin color dropdowns were packed with options. And it really showed at the conference. Normally, tech conferences are filled with white males in their late twenties/early thirties, with beards, dad jeans, and lumberjack shirts. But not at Google I/O. Oh no no no.
As a matter of fact, I met a bunch of girls in a line waiting for pizza. The line was extremely long, so we got to know each other quite well. It turned out they had all met at something called Women Techmakers – an initiative by Google to bring women in tech together. They had all met and mingled a couple of days prior to the conference. I was almost jealous and wished for a couple of seconds that I too were a woman and could participate in this social gathering. Just kidding. But once we had gotten our pizza, they invited me to join their crew. Pizza brings people together!
Ok, I realize I hopped a little back and forth in time while telling you about the backstory. Let’s talk about the actual conference.
The conference started out with a two-hour opening keynote where Google revealed most of the big and exciting news. The Verge did a solid job of crunching the keynote down to ten minutes. Check it out here:
If you watched the video above, you'll notice there weren't many jaw-dropping announcements. Instead, there were tons of smaller incremental changes that felt like a natural step in the right direction. Here's a list of what I think most people will find interesting:
This is probably the most mind-blowing thing that was released at the conference. It is also a clear statement from Google that they are officially claiming the AI throne. Google Lens lets you do Google searches with your phone camera. This technology has existed for a while, but Google Lens takes a huge leap forward. Previously, you could take a picture of a flower and it would be able to tell you that you took a picture of a flower. With the new AI tech, it will now tell you that the flower is an Elatior Begonia and that it needs indirect sunlight to survive.
You and I can now use Google's cloud TPU (Tensor Processing Unit) for our own projects. This is the chip at the core of Google's machine learning. And by combining it with the TensorFlow software, you'll be able to create some really powerful machine learning applications.
Google Expedition – AR without goggles
At the conference they had a dedicated tent for VR and AR demos. The VR was Daydream and the AR was Google Expedition; an AR concept made for educational purposes in the classroom. A teacher will be able to place a “hologram” in the classroom, and the students will be able to hold their phones up and look at the object through their phone camera and screen - as if they were all looking at the same thing at the same fixed position. Imagine being ten years old learning about ancient Egypt and while the teacher is talking about the pyramids you are looking at a 100% accurate hologram of an actual pyramid. Of course this will be used for other things than education when officially released, but this is the way they demoed it.
Let’s say you have taken a bunch of photos on a weekend trip with your friends. You hit the “share” button. Google Photos will now, through face detection, suggest a list of people to share the photos with. You can also let Google Photos always share all of the pictures you take of one specific person – for example, if you want your mom to see all the photos of your kids. Let’s just hope the face detection is of such high quality that it doesn’t send the wrong pictures to your mom.
"New #Daydream standalone headsets from partners like @htcvive won't require a phone or PC. #io17 pic.twitter.com/7TpYPJGEdU" – Google (@Google) May 17, 2017
Google has teamed up with Qualcomm to produce a standalone Daydream VR headset. This means you won’t have to dock your phone in a headset/mount like the Google Cardboard. I personally tried a Daydream demo at the conference and my mind was blown. The way you interact with the virtual world through the remote control really felt like a step in the right direction.
Not familiar with Google Assistant? Maybe you’ve heard of Siri? Google Assistant is Google’s competing software, which uses speech recognition to help you out with just about anything. But whether it will be as well integrated as Siri remains an open question. Even if it turns out to be a better service, there’s no guarantee you can swap out Siri for Google’s AI tech. If you want that, you should consider getting an Android phone.
Apart from being able to use your Google Home as "just a speaker" via Bluetooth or making calls through it, the major update is the way you will interact with it. The interaction flow changes from “you giving it instructions and it returning information” to “it starting a conversation with you based on machine learning”. For example, if you’ve got an event in your calendar, Google Home will tell you when you should leave home based on current traffic, instead of you having to ask. Proactiveness is the key word here. Have you seen the movie “Her”? This is a giant leap in that direction.
As mentioned, the list above contains what most people/consumers will find interesting. These were the news that got presented at the opening keynote of the conference. But the conference lasted for three full days and there were lots of other really interesting releases I would like to mention.
Buzzword alert: #pwa.
A common denominator for all the talks I went to about the mobile web was that Google is trying to make the mobile web experience as “appy” as possible. Progressive web apps (PWAs) were on everyone's lips. I really enjoyed the talk called "Creating UX that 'just feels right' with Progressive Web Apps". Here they demonstrated how, with PWAs, you can pre-cache videos so the user won't experience any latency when hitting "play" on a video thumbnail. They also showed that you can store files locally on mobile. We should also soon be able to decide when to prompt the user with the annoying "add to home screen" message. And not only that: Google teamed up with the most popular frameworks, such as React and Vue, to make sure they now bundle their boilerplate code with everything you need to get started creating PWAs the right way. How cool is that? The developers behind Polymer, React, Angular and Vue all worked to release new and enhanced versions ahead of the conference to make PWAs as accessible and approachable for all developers as possible.
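The pre-caching they demoed is built on service workers and the Cache API. Here's a minimal sketch of the idea; the file name (sw.js), cache name, and video URLs are placeholders I made up, and the stub at the top only exists so the sketch can also run outside a real service worker context.

```javascript
// sw.js – a minimal sketch of pre-caching videos with a service worker.
// In a browser, `self` is the service worker's global scope; the no-op stub
// below is just a placeholder for running this file outside a worker.
const sw = typeof self !== 'undefined' ? self : { addEventListener() {} };

const CACHE_NAME = 'video-precache-v1';          // made-up cache name
const VIDEO_ASSETS = ['/videos/intro.mp4', '/videos/demo.mp4']; // made-up URLs

sw.addEventListener('install', (event) => {
  // Download and store the videos before the user ever presses "play".
  event.waitUntil(
    caches.open(CACHE_NAME).then((cache) => cache.addAll(VIDEO_ASSETS))
  );
});

sw.addEventListener('fetch', (event) => {
  // Serve pre-cached responses instantly; fall back to the network otherwise.
  event.respondWith(
    caches.match(event.request).then((hit) => hit || fetch(event.request))
  );
});
```

The nice part of this pattern is that the "play" button hits the cache, not the network, so playback starts instantly even on a flaky conference Wi-Fi.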
Lighthouse is a Google Chrome tool created to let you know how you can optimise your progressive web app. Right now you can install it as a Chrome Extension, but the plan is to integrate it into Chrome's dev tools. Can't wait! Check out a great talk demoing it here:
Payment on the web is normally a bad experience, at least on mobile. It's one of the main causes of abandonment on a website. I'm sure we can all agree that bad UX in a checkout form is one of the most annoying things on the web. Wouldn't it be great if we had a standard? Google to the rescue! With the Payment Request API, online stores will soon be able to scrap their crappy checkout forms and rely purely on Google Chrome's way of dealing with it. This means designers will spend less time rethinking the checkout for each online store they create. And probably the best part: online stores won't lose customers due to bad form UX.
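For the curious, the Payment Request API itself is pretty small. Here's a hedged sketch of what a checkout could look like; the payment method, labels and amounts are made-up placeholders, and `PaymentRequest` only exists in supporting browsers like Chrome.

```javascript
// A sketch of a Payment Request API checkout. All values are placeholders.
const methodData = [{ supportedMethods: ['basic-card'] }];
const details = {
  total: { label: 'Total', amount: { currency: 'USD', value: '64.00' } },
  displayItems: [
    { label: 'Conference T-shirt', amount: { currency: 'USD', value: '64.00' } },
  ],
};

async function checkout() {
  // Feature-detect: outside a supporting browser there is nothing to show.
  if (typeof PaymentRequest === 'undefined') return null;
  const request = new PaymentRequest(methodData, details);
  const response = await request.show(); // the browser renders its native sheet
  await response.complete('success');    // close the sheet after processing
  return response;
}
```

The point is that the browser owns the form: autofilled cards, addresses and validation all come for free, which is exactly the "scrap your crappy checkout form" promise.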
As I mentioned above, the web is slowly but surely creeping up on native apps. And this truly manifested itself in the talks about WebVR. What is WebVR? It's VR on the web! Developers have been able to do VR on the web for a couple of years now, but the technology is finally accessible enough for the everyday developer to start fiddling around and create awesome experiences. Check out the WebVR talk here:
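If you want to start fiddling yourself, feature detection is the natural first step. A small sketch using the 2017-era WebVR entry point, `navigator.getVRDisplays()`, which I've wrapped so it resolves with an empty list where WebVR isn't available:

```javascript
// Feature-detect WebVR and list any connected headsets. A sketch only:
// getVRDisplays() is the 2017-era WebVR API and is only present in
// browsers with WebVR enabled.
function vrDisplays() {
  if (typeof navigator !== 'undefined' && 'getVRDisplays' in navigator) {
    return navigator.getVRDisplays(); // resolves with an array of VRDisplay
  }
  return Promise.resolve([]); // no WebVR here (e.g. older browsers)
}

vrDisplays().then((displays) => {
  console.log(`Found ${displays.length} VR display(s)`);
});
```

From there, a VRDisplay hands you pose data per frame, which is what libraries like A-Frame and three.js build their WebVR support on.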
Ok, so those are (in my mind) the main takeaways from the conference. I'm aware that I totally ignored the Android part of the conference in this article. But if you're an Android developer, I'm sure you've already heard about the juicy stuff like Kotlin support, which I heard was a big deal.
Finally, I'd like to finish off this article by quoting Jeppe, the Danish Android developer, with what he said when I asked him about what he was most excited about after the conference: "There were so many small things they introduced this year that just felt right". And I totally agree.
I'd like to highlight AI, VR, and PWAs. They are coming for you! Hope to see you next year!
I did some filming. Here's a teeny tiny report from the conference.
I also took some pictures. Here's a photo album I created with pics from the conference.