
Google Code Blog

Easier Auth for Google Cloud APIs: Introducing the Application Default Credentials feature.

Mon, 07/20/2015 - 19:27

Originally posted to the Google Cloud Platform blog

When you write applications that run on Google Compute Engine instances, you might want to connect them to Google Cloud Storage, Google BigQuery, and other Google Cloud Platform services. Those services use OAuth2, the global standard for authorization, to help ensure that only the right callers can make the right calls. Unfortunately, OAuth2 has traditionally been hard to use. It often requires specialized knowledge and a lot of boilerplate auth setup code just to make an initial API call.

Today, with Application Default Credentials (ADC), we're making things easier. In many cases, all you need is a single line of auth code in your app:

Credential credential = GoogleCredential.getApplicationDefault();

If you're not already familiar with auth concepts, including 2LO, 3LO, and service accounts, you may find this introduction useful.

ADC takes all that complexity and packages it behind a single API call. Under the hood, it makes use of:

  • 2-legged vs. 3-legged OAuth (2LO vs. 3LO) -- OAuth2 includes support for user-owned data, where the user, the API provider, and the application developer all need to participate in the authorization dance. Most Cloud APIs don't deal with user-owned data, and therefore can use much simpler two-party flows between the API provider and the application developer.
  • gcloud CLI -- while you're developing and debugging your app, you probably already use the gcloud command-line tool to explore and manage Cloud Platform resources. ADC lets your application piggyback on the auth flows in gcloud, so you only have to set up your credentials once.
  • service accounts -- if your application runs on Google App Engine or Google Compute Engine, it automatically has access to a built-in "service account" that helps the API provider trust that the API calls are coming from a trusted source. ADC lets your application benefit from that trust.
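The pieces above form a simple fallback chain. The sketch below is an illustrative model only, not the client library's actual implementation: the class, method, and return values are ours, while the GOOGLE_APPLICATION_CREDENTIALS environment variable and the gcloud well-known file are the documented lookup locations.

```java
import java.util.Map;
import java.util.function.Predicate;

public class AdcLookup {
    // Well-known path where gcloud caches user credentials (Linux layout shown).
    static final String WELL_KNOWN =
            ".config/gcloud/application_default_credentials.json";

    // Returns which credential source an ADC-style lookup would pick, given the
    // environment and a predicate reporting whether a file path exists.
    static String resolveCredentialSource(Map<String, String> env,
                                          Predicate<String> fileExists) {
        String explicit = env.get("GOOGLE_APPLICATION_CREDENTIALS");
        if (explicit != null && fileExists.test(explicit)) {
            return "service-account-file";    // explicit JSON key file
        }
        if (fileExists.test(WELL_KNOWN)) {
            return "gcloud-user-credentials"; // cached by `gcloud auth login`
        }
        return "metadata-service";            // built-in service account on GCE/GAE
    }
}
```

The point of the chain is that the same one-line call works unchanged on a developer laptop (gcloud credentials) and in production (built-in service account).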

You can find more about Google Application Default Credentials here. ADC is available for Java, Python, Node.js, Ruby, and Go. Libraries for PHP and .NET are in development.

Categories: Programming

Chromecast drives higher visits, engagement and monetization for app developers

Wed, 07/15/2015 - 18:06

Posted by Jeanie Santoso, Merchandise Marketing Manager

Chromecast, our first Google Cast device, has seen great success, with 17 million devices sold and over 1.5 billion touches of the Cast button. Consumers now get all the benefits of their easy-to-use personal mobile devices, with content displayed on the largest and most beautiful screen in the house. By adding Google Cast functionality to their apps, developers can gain visits, engagement, and higher monetization. Here are four real-world examples showing how very different companies are successfully using Google Cast technology. Read on to learn more about their successes and how to get started.

Comedy Central sees 50% more videos viewed by Chromecast users

The Comedy Central app lets fans watch their favorite shows in full and on demand from mobile devices. The company created a cast-enabled app so users could bring their small screen experience to their TVs. Now with Chromecast, users watch at least 50 percent more video, with 1.5 times more visits than the average Comedy Central app user. “The user adoption and volume we saw immediately at launch was pleasantly surprising,” says Ben Hurst, senior vice president, mobile and emerging platforms, Viacom Music and Entertainment Group. “We feel that Chromecast was a clear success for the Comedy Central app.”

Read the full Comedy Central case study here

Just Dance Now sees 2.5x monetization with Chromecast users

Interactive-game giant Ubisoft adopted Google Cast technology as a new way to experience their Just Dance Now game. As the game requires a controller and a main screen on which the game is displayed, Ubisoft saw Chromecast as the simplest and most accessible way to play. Chromecast represents 30 percent of all songs launched on the game in the US. Chromecast players monetize 2.5 times more than other players - they’re more engaged, play longer and more often than other players. Ubisoft also has seen more long-term subscribers with Chromecast. “The best Just Dance Now experience is on a big screen, and Chromecast brings an amazingly quick launch and ease of use for players to get into the game,” says Björn Törnqvist, Ubisoft technical director.

Read the full Just Dance Now case study here

Fitnet sees 35% higher engagement in Chromecast users

Fitnet is a mobile app that delivers video workouts and converts your smartphone’s or tablet’s camera into a biometric sensor to intelligently analyze your synchronicity with the trainer. The app provides a real-time score based on the user’s individual performance. The company turned to Chromecast to provide a compelling, integrated big-screen user experience so users don’t need to stare at a tiny display to follow along. Chromecast users now perform 35 percent better on key engagement metrics Fitnet regards as critical to its success, such as logins, exploring new workouts, and actively engaging in workout content. “Integrating with Google Cast technology has been an excellent investment of our time and resources, and a key feature that has helped us to develop a unique, compelling experience for our users,” says Bob Summers, Fitnet founder and CEO.

Read the full Fitnet case study here

Haystack TV doubled average weekly viewing time

Haystack TV is a personal news channel that lets consumers watch news on any screen, at any time. The company integrated Google Cast technology so users can browse their headline news, choose other videos to play, and even remove videos from their play queue without disrupting the current video on their TV. With Chromecast, average weekly viewing time has doubled. One-third of Haystack TV users now view their news via Chromecast. “We’re in the midst of a revolution in the world of television. More and more people are ‘cutting the cord’ and favoring over-the-top (OTT) services such as Haystack TV,” says Ish Harshawat, Haystack TV co-founder. “Chromecast is the perfect device for watching Haystack TV on the big screen. We wouldn't be where we are today without Chromecast.”

Read the full Haystack TV case study here

Integrate with Google Cast technology today

More and more developers are discovering what Google Cast technology can do for their app. Check out the Google Cast SDK for API references and take a look at our great sample apps to help get you started.

To learn more, visit developers.google.com/cast

Categories: Programming

Lighting the way with BLE beacons

Tue, 07/14/2015 - 16:06

Posted by Chandu Thota, Engineering Director and Matthew Kulick, Product Manager

Just like lighthouses have helped sailors navigate the world for thousands of years, electronic beacons can be used to provide precise location and contextual cues within apps to help you navigate the world. For instance, a beacon can label a bus stop so your phone knows to have your ticket ready, or a museum app can provide background on the exhibit you’re standing in front of. Today, we’re beginning to roll out a new set of features to help developers build apps using this technology. This includes a new open format for Bluetooth low energy (BLE) beacons to communicate with people’s devices, a way for you to add this meaningful data to your apps and to Google services, as well as a way to manage your fleet of beacons efficiently.

Eddystone: an open BLE beacon format

Working closely with partners in the BLE beacon industry, we’ve learned a lot about the needs and the limitations of existing beacon technology. So we set out to build a new class of beacons that addresses real-life use-cases, cross-platform support, and security.

At the core of what it means to be a BLE beacon is the frame format—i.e., a language—that a beacon sends out into the world. Today, we’re expanding the range of use cases for beacon technology by publishing a new and open format for BLE beacons that anyone can use: Eddystone. Eddystone is robust and extensible: It supports multiple frame types for different use cases, and it supports versioning to make introducing new functionality easier. It’s cross-platform, capable of supporting Android, iOS or any platform that supports BLE beacons. And it’s available on GitHub under the open-source Apache v2.0 license, for everyone to use and help improve.
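As a concrete illustration of "multiple frame types", a receiver can tell which frame a beacon is broadcasting from the first byte of its service data. The minimal decoder below is a sketch, assuming the frame-type values listed in the published Eddystone specification; the class and method names are ours.

```java
public class EddystoneFrame {
    // Identify an Eddystone frame from its service data. The first byte is the
    // frame type; values follow the published Eddystone specification.
    static String frameType(byte[] serviceData) {
        if (serviceData == null || serviceData.length == 0) return "unknown";
        switch (serviceData[0] & 0xFF) {
            case 0x00: return "UID"; // opaque, stable beacon identifier
            case 0x10: return "URL"; // compressed URL (used by the Physical Web)
            case 0x20: return "TLM"; // telemetry: battery voltage, temperature, counters
            default:   return "unknown";
        }
    }
}
```

Versioning works the same way: a new frame type gets a new leading byte, so old clients simply ignore frames they don't understand.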

By design, a beacon is meant to be discoverable by any nearby Bluetooth Smart device, via its identifier which is a public signal. At the same time, privacy and security are really important, so we built in a feature called Ephemeral Identifiers (EIDs) which change frequently, and allow only authorized clients to decode them. EIDs will enable you to securely do things like find your luggage once you get off the plane or find your lost keys. We’ll publish the technical specs of this design soon.


Eddystone for developers: Better context for your apps

Eddystone offers two key developer benefits: better semantic context and precise location. To support these, we’re launching two new APIs. The Nearby API for Android and iOS makes it easier for apps to find and communicate with nearby devices and beacons, such as a specific bus stop or a particular art exhibit in a museum, providing better context. And the Proximity Beacon API lets developers associate semantic location (i.e., a place associated with a lat/long) and related data with beacons, stored in the cloud. This API will also be used in existing location APIs, such as the next version of the Places API.

Eddystone for beacon manufacturers: Single hardware for multiple platforms

Eddystone’s extensible frame formats allow hardware manufacturers to support multiple mobile platforms and application scenarios with a single piece of hardware. An existing BLE beacon can be made Eddystone compliant with a simple firmware update. At the core, we built Eddystone as an open and extensible protocol that’s also interoperable, so we’ll also introduce an Eddystone certification process in the near future by closely working with hardware manufacturing partners. We already have a number of partners that have built Eddystone-compliant beacons.

Eddystone for businesses: Secure and manage your beacon fleet with ease

As businesses move from validating their beacon-assisted apps to deploying beacons at scale in places like stadiums and transit stations, hardware installation and maintenance can be challenging: which beacons are working, broken, missing or displaced? So starting today, beacons that implement Eddystone’s telemetry frame (Eddystone-TLM) in combination with the Proximity Beacon API’s diagnostic endpoint can help deployers monitor their beacons’ battery health and displacement—common logistical challenges with low-cost beacon hardware.

Eddystone for Google products: New, improved user experiences

We’re also starting to improve Google’s own products and services with beacons. Google Maps launched beacon-based transit notifications in Portland earlier this year, to help people get faster access to real-time transit schedules for specific stations. And soon, Google Now will also be able to use this contextual information to help prioritize the most relevant cards, like showing you menu items when you’re inside a restaurant.

We want to make beacons useful even when a mobile app is not available; to that end, the Physical Web project will be using Eddystone beacons that broadcast URLs to help people interact with their surroundings.

Beacons are an important way to deliver better experiences for users of your apps, whether you choose to use Eddystone with your own products and services or as part of a broader Google solution like the Places API or Nearby API. The ecosystem of app developers and beacon manufacturers is important in pushing these technologies forward and the best ideas won’t come from just one company, so we encourage you to get some Eddystone-supported beacons today from our partners and begin building!

Update (July 16, 2015 11.30am PST) To clarify, beacons registered with proper place identifiers (as defined in our Places API) will be used in Place Picker. You have to use Proximity Beacon API to map a beacon to a place identifier. See the post on Google's Geo Developer Blog for more details.

Categories: Programming

Connect With the World Around You Through Nearby APIs

Tue, 07/14/2015 - 16:03

Posted by Akshay Kannan, Product Manager

Mobile phones have made it easy to communicate with anyone, whether they’re right next to you or on the other side of the world. The great irony, however, is that those interactions can often feel really awkward when you're sitting right next to someone.

Today, it takes several steps -- whether it’s exchanging contact information, scanning a QR code, or pairing via Bluetooth -- to get a simple piece of information to someone right next to you. Ideally, you should be able to just turn to them and share it, the same way you do in the real world.

This is why we built Nearby. Nearby provides a proximity API, Nearby Messages, for iOS and Android devices to discover and communicate with each other, as well as with beacons.

Nearby uses a combination of Bluetooth, Wi-Fi, and inaudible sound (using the device’s speaker and microphone) to establish proximity. We’ve incorporated Nearby technology into several products, including Chromecast Guest Mode, Nearby Players in Google Play Games, and Google Tone.

With the latest release of Google Play services 7.8, the Nearby Messages API becomes available to all developers across iOS and Android devices (Gingerbread and higher). Nearby doesn’t use or require a Google Account. The first time an app calls Nearby, users get a permission dialog to grant that app access.
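In outline, using the API is a publish/subscribe exchange. The sketch below assumes a connected GoogleApiClient built with the Nearby Messages API; treat it as a sketch of the flow rather than copy-paste code, and check the Nearby documentation for the exact signatures.

```java
// Sketch: publish a small payload and listen for payloads from nearby devices.
GoogleApiClient client = new GoogleApiClient.Builder(context)
        .addApi(Nearby.MESSAGES_API)
        .build();
client.connect();

// A Message wraps an arbitrary small byte array.
Message hello = new Message("hello nearby".getBytes());

// Broadcast to devices around you (triggers the permission dialog on first use)...
Nearby.Messages.publish(client, hello);

// ...and subscribe to receive what others are publishing.
Nearby.Messages.subscribe(client, new MessageListener() {
    @Override
    public void onFound(Message message) {
        Log.d("Nearby", "Found: " + new String(message.getContent()));
    }
});
```

Because proximity is established over Bluetooth, Wi-Fi, and inaudible sound, both sides simply call publish/subscribe and the transport details are handled for them.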

A few of our partners have built creative experiences to show what's possible with Nearby.

Edjing Pro uses Nearby to let DJs publish their tracklist to people around them. The audience can vote on tracks that they like, and their votes are updated in realtime.

Trello uses Nearby to simplify sharing. Share a Trello board to the people around you with a tap of a button.

Pocket Casts uses Nearby to let you find and compare podcasts with people around you. Open the Nearby tab in Pocket Casts to view a list of podcasts that people around you have, as well as podcasts that you have in common with others.

Trulia uses Nearby to simplify the house hunting process. Create a board and use Nearby to make it easy for the people around you to join it.

To learn more, visit developers.google.com/nearby.

Categories: Programming

This is Material Design Lite

Mon, 07/13/2015 - 18:05

Posted by Addy Osmani, Staff Developer Platform Engineer

Back in 2014, Google published the material design specification with a goal to provide guidelines for good design and beautiful UI across all device form factors. Today we are releasing our first effort to bring this to websites using vanilla CSS, HTML and JavaScript. We’re calling it Material Design Lite (MDL).

MDL makes it easy to add a material design look and feel to your websites. The “Lite” part of MDL comes from several key design goals: MDL has few dependencies, making it easy to install and use. It is framework-agnostic, meaning MDL can be used with any of the rapidly changing landscape of front-end tool chains. MDL has a low overhead in terms of code size (~27KB gzipped), and a narrow focus—enabling material design styling for websites.

Get started now and give it a spin or try one of our examples on CodePen.

MDL is a complementary implementation to the Paper elements built with Polymer. The Paper elements are fully encapsulated components that can be used individually or composed together to create a material design-style site, and they support more advanced user interaction. That said, MDL can be used alongside its Polymer element counterparts.

Out-of-the-box Templates

MDL is optimised for content-heavy websites such as marketing pages, text articles and blogs. We've built responsive templates to show the breadth of sites that can be created using MDL; you can download them from our Templates page. We hope these inspire you to build great looking sites.

  • Blogs
  • Text-heavy content sites
  • Dashboards
  • Standalone articles

and more.

Technical details and browser support

MDL includes a rich set of components, including material design buttons, text fields, tooltips, spinners and many more. It also includes a responsive grid and breakpoints that adhere to the new material design adaptive UI guidelines.

The MDL sources are written in Sass using BEM. While we hope you'll use our theme customizer or pre-built CSS, you can also download the MDL sources from GitHub and build your own version. The easiest way to use MDL is by referencing our CDN, but you can also download the CSS or import MDL via npm or Bower.
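A minimal page using the CDN looks something like the fragment below. This is a sketch based on the launch release: the 1.0.0 CDN URLs and the `mdl-button` classes shown were current at the time of writing, so check the MDL site for the latest versions.

```html
<!-- MDL styles and script from the CDN, plus the material icon font. -->
<link rel="stylesheet"
      href="https://storage.googleapis.com/code.getmdl.io/1.0.0/material.indigo-pink.min.css">
<link rel="stylesheet"
      href="https://fonts.googleapis.com/icon?family=Material+Icons">
<script defer
        src="https://storage.googleapis.com/code.getmdl.io/1.0.0/material.min.js"></script>

<!-- A raised material design button with a ripple effect; the mdl-js-* classes
     tell the MDL script to upgrade the element on page load. -->
<button class="mdl-button mdl-js-button mdl-button--raised mdl-js-ripple-effect">
  Get Started
</button>
```

No build step or framework is required, which is the point of the "Lite" design goals above.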

The complete MDL experience works in all modern evergreen browsers (Chrome, Firefox, Opera, Edge) and Safari, but gracefully degrades to CSS-only in browsers like IE9 that don’t pass our Cutting-the-mustard test. Our browser compatibility matrix has the most up to date information on the browsers MDL officially supports.

More questions?

We've been working with the designers evolving material design to build in additional thinking for the web. This includes working on solutions for responsive templates, high-performance typography and missing components such as badges. MDL is spec-compliant for today and provides guidance on aspects of the spec that are still evolving. As with the material design spec itself, your feedback and questions will help us evolve MDL, and in turn, how material design works on the web.

We’re sure you have plenty of questions and we have tried to cover some of them in our FAQ. Feel free to hit us up on GitHub or Stack Overflow if you have more. :)

Wrapping up

MDL is built on the core technologies of the web you already know and use every day—CSS, HTML and JS. By adopting MDL into your projects, you gain access to an authoritative and highly curated implementation of material design for the web. We can’t wait to see the beautiful, modern, responsive websites you’re going to build with Material Design Lite.

Categories: Programming

Cast Remote Display API: Processing

Fri, 07/10/2015 - 18:08

Posted by Leon Nicholls, Developer Programs Engineer

Remote Display on Google Cast allows your app to display on both your mobile device and your Cast device at the same time. Processing is a programming language that allows artists and hobbyists to create advanced graphics and interactive exhibitions. By putting these two things together, we were able to quickly create stunning visual art and display it on the big screen just by bringing our phone to the party or gallery. This article describes how we added support for the Google Cast Remote Display APIs to Processing for Android, and how you can too.

An example app from the popular Processing toxiclibs library on Cast. Download the code and run it on your own Chromecast!

A little background

Processing has its own IDE and has many contributed libraries that hide the technical details of various input, output and rendering technologies. Users of Processing with just basic programming skills can create complicated graphical scenes and visualizations.

To write a program in the Processing IDE you create a “sketch”, which involves adding code to life-cycle callbacks that initialize and draw the scene. You can run the sketch as a Java program on your desktop. You can also enable support for Processing for Android and then run the same sketch as an app on your Android mobile device. Processing for Android also supports touch events and sensor data for interacting with the generated apps.

Instead of just viewing the graphics on the small screen of the Android device, we can do better by projecting the graphics onto a TV screen. The Google Cast Remote Display APIs make it easy to bring graphically intensive apps to Google Cast receivers by using the GPUs, CPUs and sensors available on the mobile devices you already have.

How we did it

Adding support for Remote Display involved modifying the Processing for Android Mode source code. To compile the Android Mode you first need to compile the source code of the Processing IDE. We started with the source code of the current stable release version 2.2.1 of the Processing IDE and compiled it using its Ant build script (detailed instructions are included along with the code download). We then downloaded the Android SDK and source code for the Android Mode 0232. After some minor changes to its build config to support the latest Android SDK version, we used Ant to build the Android Mode zip file. The zip file was unzipped into the Processing IDE modes directory.

We then used the IDE to open one of the Processing example sketches and exported it as an Android project. In the generated project we replaced the processing-core.jar library with the source code for Android Mode. We also added a Gradle build config to the project and then imported the project into Android Studio.

The main Activity for a Processing app is a descendant of the Android Mode PApplet class. The PApplet class uses a GLSurfaceView for rendering 2D and 3D graphics. We needed to change the code to use that same GLSurfaceView for the Remote Display API.

It is a requirement in the Google Cast Design Checklist for the Cast button to be visible on all screens. We changed PApplet to be an ActionBarActivity so that we can show the Cast button in the action bar. The Cast button was added by using a MediaRouteActionProvider. To only list Google Cast devices that support Remote Display, we used a MediaRouteSelector with an App ID we obtained from the Google Cast SDK Developer Console for a Remote Display Receiver.

Next, we created a class called PresentationService that extends CastRemoteDisplayLocalService. The service allows the app to keep the remote display running even when it goes into the background. The service requires a CastPresentation instance for displaying its content. The CastPresentation instance uses the GLSurfaceView from the PApplet class for its content view. However, setting the CastPresentation content view requires some changes to PApplet so that the GLSurfaceView isn’t initialized in its onCreate, but waits until the service onRemoteDisplaySessionStarted callback is invoked.

When the user selects a Cast device in the Cast button menu and the MediaRouter onRouteSelected event is called, the service is started with CastRemoteDisplayLocalService.startService. When the user disconnects from a Cast device using the Cast button, MediaRouter onRouteUnselected event is called and the service is stopped by using CastRemoteDisplayLocalService.stopService.
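The service/presentation split described above can be sketched as follows. The Cast API class names come from the steps in this post; our subclass names and the exact callback shapes are illustrative, so consult the Remote Display reference before building on this.

```java
// Sketch: the background service owns the remote display's lifetime.
public class PresentationService extends CastRemoteDisplayLocalService {
    @Override
    public void onCreatePresentation(Display display) {
        // CastPresentation plays the role of an Activity for the remote screen.
        // Its content view is the sketch's existing GLSurfaceView, created here
        // (not in the Activity's onCreate) so it attaches to the remote display.
        new ProcessingCastPresentation(getApplicationContext(), display).show();
    }
}
```

The MediaRouter callbacks then bracket the session: onRouteSelected starts this service via CastRemoteDisplayLocalService.startService, and onRouteUnselected stops it via CastRemoteDisplayLocalService.stopService.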

For the mobile display, we display an image bitmap and forward the PApplet touch events to the existing surfaceTouchEvent method. When you run the Android app, you can use touch gestures on the display of the mobile device to control the interaction on the TV. Take a look at this video of some of the Processing apps running on a Chromecast.

Most of the new code is contained in the PresentationService and RemoteDisplayHelper classes. Your mobile device needs to have at least Android KitKat and Google Play services version 7.5.71.

You can too

Now you can try the Remote Display APIs in your Processing apps. Instead of changing the generated code every time you export your Android Mode project, we recommend that you use our project as a base and copy your generated Android code and libraries into it. Then modify the project build file and update the manifest to start the app with your sketch’s main Activity.

To see a more detailed description on how to use the Remote Display APIs, read our developer documentation. We are eager to see what Processing artists can do with this code in their projects.

Categories: Programming

What’s new with Google Fit: Distance, Calories, Meal data, and new apps and wearables

Tue, 06/30/2015 - 18:52

Posted by Angana Ghosh, Lead Product Manager, Google Fit

To help users keep track of their physical activity, we recently updated the Google Fit app with some new features, including an Android Wear watch face that helps users track their progress throughout the day. We also added data types to the Google Fit SDK and have new partners tracking data (e.g. nutrition, sleep, etc.) that developers can now use in their own apps. Find out how to integrate Google Fit into your app and read on to check out some of the cool new stuff you can do.


Distance traveled per day

The Google Fit app now computes the distance traveled per day. Subscribe to it using the Recording API and query it using the History API.

Calories burned per day

If a user has entered their details into the Google Fit app, the app now computes their calories burned per day. Subscribe to it using the Recording API and query it using the History API.
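The subscribe-then-query pattern for these daily streams looks roughly like the sketch below, assuming a connected GoogleApiClient with the Fitness APIs (the GoogleApiClient-era RecordingApi/HistoryApi surface current when this was posted); time-range values are placeholders.

```java
// Sketch: keep the distance stream recording in the background...
Fitness.RecordingApi.subscribe(client, DataType.TYPE_DISTANCE_DELTA);

// ...then query it as daily buckets via the History API.
// (Calories work the same way with TYPE_CALORIES_EXPENDED.)
DataReadRequest request = new DataReadRequest.Builder()
        .aggregate(DataType.TYPE_DISTANCE_DELTA, DataType.AGGREGATE_DISTANCE_DELTA)
        .bucketByTime(1, TimeUnit.DAYS)
        .setTimeRange(startMillis, endMillis, TimeUnit.MILLISECONDS)
        .build();

DataReadResult result =
        Fitness.HistoryApi.readData(client, request).await(1, TimeUnit.MINUTES);
```

Each bucket in the result then holds one day's aggregated distance for display in your app.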

Nutrition data from LifeSum, Lose It!, and MyFitnessPal

LifeSum and Lose It! are now writing nutrition data, like calories consumed, macronutrients (proteins, carbs, fats), and micronutrients (vitamins and minerals) to Google Fit. MyFitnessPal will start writing this data soon too. Query it from Google Fit using the History API.

Sleep activity from Basis Peak and Sleep as Android

Basis Peak and Sleep as Android are now writing sleep activity segments to Google Fit. Query this data using the History API.

New workout sessions and activity data from even more great apps and fitness wearables!

Endomondo, Garmin, the Daily Burn, the Basis Peak and the Xiaomi miBand are new Google Fit partners that will allow users to store their workout sessions and activity data. Developers can access this data with permission from the user, which will also be shown in the Google Fit app.

How are developers using the Google Fit platform?

Partners like LifeSum and Lose It! are reading all-day activity to help users keep track of their physical activity in their favorite fitness app.

Runkeeper now shows a Google Now card to its users encouraging them to “work off” their meals, based on their meals written to Google Fit by other apps.

Instaweather has integrated Google Fit into a new Android Wear face that they’re testing in beta. To try out the face, first join this Google+ community and then follow the link to join the beta and download the app.

We hope you enjoy checking out these Google Fit updates. Thanks to all our partners for making it possible! Find out more about integrating the Google Fit SDK into your app.

Categories: Programming

Quake® III on your TV with Cast Remote Display API

Wed, 06/24/2015 - 18:45

Posted by Leon Nicholls, Developer Programs Engineer and Antonio Fontan, Software Engineer

At Google I/O 2015 we announced the new Google Cast Remote Display APIs for Android and iOS that make it easy for mobile developers to bring graphically intensive apps or games to Google Cast receivers. Now you can use the powerful GPUs, CPUs and sensors of the mobile device in your pocket to render both a local display and a virtual one to the TV. This dual display model also allows you to design new game experiences for the display on the mobile device to show maps, game pieces and private game information.

We wanted to show you how easy it is to take an existing high-performance game and run it on a Chromecast. So we decided to port the classic Quake® III Arena open source engine to support Cast Remote Display. We reached out to id Software and they thought it was a cool idea too. When all was said and done, during our 2015 I/O session “Google Cast Remote Display APIs for Games” we were able to present the game in 720p at 60 fps!

During the demo we used a wired USB game controller to play the game, but we've also experimented with using the mobile device sensors, a bluetooth controller, a toy gun and even a dance mat as game controllers.

Since you're probably wondering how you can do this too, here are the details of how we added Cast Remote Display to Quake. The game engine was not modified in any way, and the whole process took less than a day, with most of our time spent removing UI code not needed for the demo. We started with an existing source port of Quake III to Android, which includes some usage of kwaak3 and ioquake3 source code.

Next, we registered a Remote Display App ID using the Google Cast SDK Developer Console. There’s no need to write a Cast receiver app as the Remote Display APIs are supported natively by all Google Cast receivers.

To render the local display, the existing main Activity was converted to an ActionBarActivity. To discover devices and to allow a user to select a Cast device to connect to, we added support for the Cast button using MediaRouteActionProvider. The MediaRouteActionProvider adds a Cast button to the action bar. We then set the MediaRouteSelector for the MediaRouter using the App ID we obtained and added a callback listener using MediaRouter.addCallback. We modified the existing code to display an image bitmap on the local display.

To render the remote display, we extended CastPresentation and called setContentView with the game’s existing GLSurfaceView instance. Think of the CastPresentation as the Activity for the remote display. The game audio engine was also started at that point.

Next we created a service extending CastRemoteDisplayLocalService which would then create an instance of our CastPresentation class. The service will manage the remote display even when the local app goes into the background. The service automatically provides a convenient notification to allow the user to dismiss the remote display.

Then we start our service when the MediaRouter onRouteSelected event is called by using CastRemoteDisplayLocalService.startService and stop the service when the MediaRouter onRouteUnselected event is called by using CastRemoteDisplayLocalService.stopService.

To see a more detailed description on how to use the Remote Display APIs, read our developer documentation. We have also published a sample app on GitHub that is UX compliant.

You can download the code that we used for the demo. To run the app you have to compile it using Gradle or Android Studio. You will also need to copy the "baseq3" folder from your Quake III game to the “qiii4a” folder in the root of the SD card of your Android mobile device. Your mobile device needs to have at least Android KitKat and Google Play services version 7.5.71.

With 17 million Chromecast devices sold and 1.5 billion touches of the Cast button, the opportunity for developers is huge, and it’s simple to add this extra functionality to an existing game. We're eager to see what amazing experiences you create using the Cast Remote Display APIs.

QUAKE II © 1997 and QUAKE III © 1999 id Software LLC, a ZeniMax Media company. QUAKE is a trademark or registered trademark of id Software LLC in the U.S. and/or other countries. QUAKE game assets used under license from id Software LLC. All Rights Reserved

QIII4A © 2012 n0n3m4. GNU General Public License.

Q3E © 2012 n0n3m4. GNU General Public License.

Categories: Programming

Attach Google Drive files to Calendar events with the Calendar API

Tue, 06/23/2015 - 20:43

Posted by Iskander Akishev, Software Engineer, Google Calendar

Originally posted to the Google Apps Developer blog

The Google Calendar API allows you to create and modify events on Google Calendar. Starting today, you can use the API to also attach Google Drive files to Calendar events to make them—and your app—even more useful and integrated. With the API, you can easily attach meeting notes or add PDFs of booking confirmations to events.

Here's how you set it up:

1) Get the file information from Google Drive (e.g. via the Google Drive API):

GET https://www.googleapis.com/drive/v2/files

{
  ...
  "items": [
    {
      "kind": "drive#file",
      "id": "9oNKwQI7dkW-xHJ3eRvTO6Cp92obxs1kJsZLFRGFMz9Q",
      ...
      "alternateLink": "https://docs.google.com/presentation/d/9oNKwQI7dkW-xHJ3eRvTO6Cp92obxs1kJsZLFRGFMz9Q/edit?usp=drivesdk",
      "title": "Workout plan",
      "mimeType": "application/vnd.google-apps.presentation",
      ...
    },
    ...
  ]
}

2) Pass this information into an event modification operation using the Calendar API:

POST https://www.googleapis.com/calendar/v3/calendars/primary/events?supportsAttachments=true

{
  "summary": "Workout",
  "start": { ... },
  "end": { ... },
  ...
  "attachments": [
    {
      "fileUrl": "https://docs.google.com/presentation/d/9oNKwQI7dkW-xHJ3eRvTO6Cp92obxs1kJsZLFRGFMz9Q/edit?usp=drivesdk",
      "title": "Workout plan",
      "mimeType": "application/vnd.google-apps.presentation"
    },
    ...
  ]
}

Voilà!
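The same POST can also be issued through the Calendar API Java client library. The sketch below assumes you already have an authorized Calendar service object (construction omitted), and the start/end times are placeholders:

```java
import java.io.IOException;
import java.util.Collections;

import com.google.api.client.util.DateTime;
import com.google.api.services.calendar.Calendar;
import com.google.api.services.calendar.model.Event;
import com.google.api.services.calendar.model.EventAttachment;
import com.google.api.services.calendar.model.EventDateTime;

class AttachmentExample {
    // `service` must be an authorized Calendar client; see the Java
    // quickstart in the Calendar API documentation for how to build one.
    static Event createWorkoutEvent(Calendar service) throws IOException {
        Event event = new Event()
                .setSummary("Workout")
                .setStart(new EventDateTime().setDateTime(
                        new DateTime("2015-06-24T10:00:00-07:00")))
                .setEnd(new EventDateTime().setDateTime(
                        new DateTime("2015-06-24T11:00:00-07:00")))
                .setAttachments(Collections.singletonList(new EventAttachment()
                        .setFileUrl("https://docs.google.com/presentation/d/"
                                + "9oNKwQI7dkW-xHJ3eRvTO6Cp92obxs1kJsZLFRGFMz9Q"
                                + "/edit?usp=drivesdk")
                        .setTitle("Workout plan")
                        .setMimeType(
                                "application/vnd.google-apps.presentation")));

        // setSupportsAttachments(true) corresponds to the
        // ?supportsAttachments=true query parameter shown above.
        return service.events().insert("primary", event)
                .setSupportsAttachments(true)
                .execute();
    }
}
```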

You don’t need to do anything special to see existing attachments; they are now always exposed as part of an event:

GET https://www.googleapis.com/calendar/v3/calendars/primary/events/ja58khmqndmulcongdge9uekm7

{
  "kind": "calendar#event",
  "id": "ja58khmqndmulcongdge9uekm7",
  "summary": "Workout",
  ...
  "attachments": [
    {
      "fileUrl": "https://docs.google.com/presentation/d/9oNKwQI7dkW-xHJ3eRvTO6Cp92obxs1kJsZLFRGFMz9Q/edit?usp=drivesdk",
      "title": "Workout plan",
      "mimeType": "application/vnd.google-apps.presentation",
      "iconLink": "https://ssl.gstatic.com/docs/doclist/images/icon_11_presentation_list.png"
    },
    ...
  ]
}

Check out the guide and reference in the Google Calendar API documentation for additional details.

For any questions about attachments or any other Calendar API features, you can reach out to us on StackOverflow.com using the tag google-calendar.

Categories: Programming

Grow your app installs with App Invites from Google

Fri, 05/29/2015 - 15:11

Posted by Ranjith Jayaram, Product Manager

If you’re looking to drive usage and grow a mobile app, you’re probably testing out referrals, recommendations, and the user onboarding experience. These product flows are resource-intensive to design, build, and optimize. What if you could use a set of tools that help your users share your app, and get more of the right people to download and use your app? What if you could craft a more personalized onboarding experience in your new user’s journey?

Now in beta, App Invites lets mobile app developers increase their reach, deep link new users to custom experiences, and tap into users’ device and Google-wide contacts as a source of referrals. It is available to both iOS and Android app developers. We’re launching with the UrbanSitter, Yummly, The CW, Coinbase and PicsArt apps.

Here’s what some of our early partners had to say:

  • For Andrea Barrett, co-founder and VP of Product at UrbanSitter, “App Invites gives our members the ability to easily share favorite sitters with their friends and Google contacts. As a service targeting busy parents, our user growth thrives on social recommendations and word-of-mouth referrals, so Google’s app invites are a natural fit for us.”
  • Sharing is an important part of TV network The CW’s app growth strategy. “Tools that help fans of our shows recommend The CW app to their contacts and friends are important. App Invites lets specific users share their favorite shows with selected friends; it's the next evolution of ‘word of mouth’. We’re integrating App Invites into our episode sharing capabilities, so that fans can speak to each other about The CW app.” - Zach Mannon, Director of Digital Media at Warner Brothers Television
  • For PicsArt, their fast growth to 250M installs has been driven by word-of-mouth. “Google’s new App Invites will accelerate our organic growth even further, giving people the opportunity to proactively invite their friends to join our mission of beautifying the world!" - Arusiak Kanetsyan, Director of Content and Communication
  • Yummly integrated app invites to expand their user base and generate awareness of their app, by allowing people to suggest the app to those who love to cook or are interested in food. “We see this expanding beyond just inviting new users to join our app. In the future, we hope to use this to share different meal ideas and have the opportunity to share your shopping list with family members or even invite friends over for dinner. With the power of Google and personalized app invites, making a dinner everyone agrees on will never be easier." -- Brian Witlin, Chief Operating Officer

App Invites is available on both Android and iOS. Here’s what’s in the beta toolkit:

  1. Use app invites for expanded reach: Tap into SMS and email invites via your users’ phone and Google contacts.
  2. Make it easy for your users to send invites. We’ll recommend their closest contacts to share your app with, and suggest a preferred method of delivery.
  3. Send actionable invite cards: Include an install button right in an email invite.
  4. Faster Android install flows: Your new Android users can click App Invite and download your app straight from the Play Store, bypassing the browser windows that usually open in between.
  5. Create personalized onboarding flows: New users can get deep linked into a specific onboarding experience - for example, you can offer custom discount codes or content.
  6. Measure app invites using Google Analytics custom reports.

With App Invites, our goal is to take the hard work out of building user referral and onboarding flows and to provide a toolkit that works across platforms, so that you can focus on your core app experience. Visit https://developers.google.com/app-invites/ to get started. To learn about opportunities to re-engage app users using goo.gl deep links, check out this post.

Categories: Programming

Polymer 1.0 Released!

Fri, 05/29/2015 - 01:33

Posted by Taylor Savage, Product Manager, Polymer

Today we released the 1.0 version of the Polymer library. Polymer is a new way of thinking about building web applications - a sugaring layer on top of Web Components, making it easy for you to create interoperable custom elements. These elements can then be put together to create app-like immersive experiences on the web.

Since the “Developer Preview” release, we’ve re-written the library from the ground up, focusing on cross-browser performance while keeping the developer-friendly ergonomics. The new library is about 3x faster on Chrome and 4x faster on Safari, with about a third less code than the developer preview. And it’s ready to be used in production applications.

Get started with the new library

We’ve updated the polymer-project.org site with documentation for the 1.0 release. Major new and updated features include:

  • Brand-new, fast, and easy-to-use data binding system
  • Element theming and styling using custom CSS properties
  • Fast and lightweight Shadow DOM shim for non-supporting browsers
  • “Behaviors” mechanism for sharing behavior between elements

It’s easier than ever to create high-quality, production-ready elements using Polymer, to use in your app or share with other developers.

Browse, demo, and learn how to use the latest elements built by the Polymer team

Check out the many brand-new element product lines built by the Polymer team with the Polymer elements catalog. There you can browse for elements to help create or add features to your web app - whether you need buttons or layouts, Google maps or push notifications. For just about any problem you might need to solve on the web, there’s an element for that.

Kick off your app with the Polymer Starter Kit

Looking for a fast and easy way to get started building a production-ready web application using Polymer? Use the Polymer starter kit. Packed with the latest elements, ready-to-use boilerplate, and an end-to-end toolchain to use from development through production deployment, the starter kit works out of the box so you can focus on adding features right away.

We’re incredibly excited about this release, and can’t wait to see what you’ll build!

Categories: Programming

Introducing the Google Identity Platform

Thu, 05/28/2015 - 22:34

Posted by Adam Dawes, Product Manager

Identity is an important building block to any app. Even if it’s not top of mind when you’re building out new features, authentication is critical to get right. We realize developers want their users to have an efficient and secure experience with their products rather than dealing with forgotten passwords that prevent app usage. The Google Identity Platform is a suite of developer tools that cuts friction out of every step of getting people signed in: Smart Lock for Passwords, Google Sign-in, and Identity Toolkit.

New: Get people automatically signed-in with Smart Lock for Passwords

Smart Lock is a password manager built into your Google account. It can automatically sign you into an Android app and autofill credentials on websites viewed in Chrome, removing the need to remember and type in passwords.

We're now extending Smart Lock as an API to app developers and publishers, starting with Android. Android developers can programmatically save and retrieve credentials, and automatically sign their users in across devices and websites in Chrome. Companies like Eventbrite, Orbitz, Netflix, Instacart, the New York Times, and soon LinkedIn are using Smart Lock for Passwords to get people automatically signed in across their properties.

For new account sign-ups, or first-time sign ins, your users now have the option to save their passwords to Smart Lock, and be automatically signed-in subsequently.

How does Smart Lock for Passwords work for your app or site? It recognizes and pre-fills login credentials across devices. When a user signs in to or signs up on your property, they’ll see a Smart Lock prompt to save their password for future use. If they accept, they’ll never have to enter a password again in your Android app or on your website in Chrome. In turn, this helps your app or site boost sign-up and sign-in rates for new and existing users.

This also means that when your users buy and activate new devices, they can easily download your app and get logged right back in with the proper account. You don’t have to spend resources re-acquiring them, or lose them to forgotten passwords along the way.

Integrating with Smart Lock for Passwords lets your users save their password on one device and be automatically signed in on other devices.
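On Android, the save-and-retrieve flow described above maps to the Credentials API in Google Play services. A rough sketch, assuming an already-connected GoogleApiClient with the Credentials API enabled (the signIn helper is hypothetical):

```java
import com.google.android.gms.auth.api.Auth;
import com.google.android.gms.auth.api.credentials.Credential;
import com.google.android.gms.auth.api.credentials.CredentialRequest;
import com.google.android.gms.auth.api.credentials.CredentialRequestResult;
import com.google.android.gms.common.api.GoogleApiClient;
import com.google.android.gms.common.api.ResultCallback;

class SmartLockSketch {
    // Ask Smart Lock for a saved password credential.
    void requestSavedCredential(GoogleApiClient apiClient) {
        CredentialRequest request = new CredentialRequest.Builder()
                .setPasswordLoginSupported(true)
                .build();
        Auth.CredentialsApi.request(apiClient, request).setResultCallback(
                new ResultCallback<CredentialRequestResult>() {
                    @Override
                    public void onResult(CredentialRequestResult result) {
                        if (result.getStatus().isSuccess()) {
                            Credential credential = result.getCredential();
                            // signIn() is your app's own sign-in logic
                            // (hypothetical helper, not part of the API).
                            signIn(credential.getId(), credential.getPassword());
                        }
                        // Otherwise fall back to your normal sign-in UI, then
                        // save the typed credential with Auth.CredentialsApi.save().
                    }
                });
    }

    void signIn(String id, String password) { /* app-specific */ }
}
```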

Here’s how early partners are using Smart Lock for Passwords, which you can see rolling out in their Android apps and websites on Chrome.

  • Eventbrite easily integrated and tested Smart Lock in their app to increase the number of signed-in users, because signing in "gives users a more personalized event discovery experience and a more seamless ticket purchase experience", says Atul Kakkar, Senior Product Manager at Eventbrite.
  • For Instacart, “Smart Lock makes sign-in much faster, sparing the need for users to type in their information - across devices. Many people first browse Instacart on mobile or desktop web and then download the app. With all their credentials ready, our customers can get their groceries even faster. We’re excited for integrating with Smart Lock to help save customer time, and increase business conversions.” - Maksim Golivkin, Android Engineer
  • Netflix is using Smart Lock for Passwords to keep viewers signed in and consuming content no matter what device they’re watching on, even in the living room on Android TV. "Wherever possible, we want to reduce friction in getting our members authenticated across devices. Smart Lock can help, particularly on big screens, where entering user credentials is more cumbersome than on mobile or PC." - Rob Caruso, Director of Partner Engagement
  • In addition to helping pave the way for paid subscribers, the New York Times is using Smart Lock to help build a better cross-device experience for their readers. “This is a retention strategy for us, and part of an ongoing push to create registration and signed-in usage in the app. Smart Lock for Passwords makes it essentially frictionless for someone to do so. Readers who log in or create a registration have a higher propensity to subscribe, and this goes up further if they’re interacting with us on multiple platforms.” - Rachel Kirscht, Android Marketing Manager
  • “When consumers sign-in on Orbitz they benefit in a number of ways. For example, people can get quick access to the travel searches they’ve done on Orbitz using a different device and have a streamlined booking experience with minimal data entry. While traveling, signed-in app users receive push notifications to stay on top of flight delays and gate changes. Desktop Orbitz.com users can now get all these benefits automatically in our app when they utilize Smart Lock, avoiding the usual friction of sign-in or having to know that sign-in comes with benefits.” - Ryan Kowalczyk, Senior Director, Product Strategy
  • For beta partner LinkedIn, "We are thrilled to be Google's launch partner for their Smart Lock debut, which helps to further simplify the log-in experience for our existing LinkedIn members as well as streamline the sign-up process for new members.” - Bob Rosin, VP of Business Development at LinkedIn

Later on this year, we’re planning to offer new features including support for sync passphrases and Smart Lock for Passwords API support on other platforms. Stay tuned!

Give people a trusted registration system with Google Sign-in

Google Sign-in is the gateway to securely connecting Google’s users and services to your app or site. It reduces the burden of login for your users, by enabling them to sign in with their Google account—the same account they already use with Gmail, Play, Google+, and other Google services. The New York Times was able to attribute a 20% lift in new registered user sign-ups to the implementation of Google Sign-In in the months immediately following launch.

Implement authentication in-a-box with Identity Toolkit

Identity Toolkit gives you a robust, more secure authentication system that helps you do sign-in the right way, and can grow with your app. The toolkit supports email and password authentication as well as major identity providers including Google and Facebook. Even non-technical developers can add future sign-in options and migrate existing users with simple configuration changes. Getting started with Identity Toolkit lets you easily scale your auth system as you grow across platforms.

We hope that the Google Identity Platform makes it easy for your users to sign in, so that you can deliver them more personalized, engaging experiences. To view documentation, visit developers.google.com/identity

Categories: Programming

Add Google to your iOS Apps with CocoaPods

Thu, 05/28/2015 - 20:43

Posted by Jonathan Beri, Product Manager

Today, we’re announcing that CocoaPods will be the primary channel to acquire Google SDKs for iOS. CocoaPods is a popular, free dependency manager for iOS that significantly simplifies the process of importing libraries and frameworks into Xcode. And it will help you manage your dependencies between our various libraries.

While there was a small but growing number of official Google Pods, as well as a few Pods written by other iOS developers, it was often difficult for developers to find the right SDK. With today’s announcement, some of our key libraries, like Google Cloud Messaging and the Google Maps SDK for iOS, can now be found on the public CocoaPods service. Going forward, new iOS SDKs will be packaged and documented as Pods and published on cocoapods.org. You can find the full list of Google Pods and all supporting documentation at developer.google.com/ios/cocoapods.

If you haven't used CocoaPods yet, now is a great time to give it a try. Add GoogleAnalytics to a Podfile to start counting new users. Or pod install your way to adding AdMob ads in your app. And check out pod try, a CocoaPods plugin that allows you to quickly launch our demo projects.
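A minimal Podfile for the steps above might look like this. The target name is hypothetical, and the exact pod names for each SDK are listed on the Google CocoaPods page:

```ruby
# Podfile — target name 'MyApp' is hypothetical
platform :ios, '7.0'

target 'MyApp' do
  pod 'GoogleAnalytics'         # user analytics
  pod 'Google-Mobile-Ads-SDK'   # AdMob ads
end
```

After editing the Podfile, run `pod install` and open the generated `.xcworkspace` instead of your `.xcodeproj`.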

For more iOS developer protips and to learn more about Google’s iOS SDKs, please check out the Route 85 video series from Todd Kerpelman. We look forward to all the great Google-powered iOS apps you will create!

Categories: Programming

Google I/O 2015: Develop, Engage & Earn

Thu, 05/28/2015 - 20:41

Posted by Jason Titus, Senior Director of Engineering

This morning, we kicked off our 8th annual Google I/O conference, joined by 5,600 developers at Moscone Center, 530 I/O Extended communities around the world, and millions via the I/O Live stream. This year’s event is focused on delivering incredible experiences on mobile and helping developers build successful businesses through these key themes: Develop, Engage, and Earn.

During the keynote, we had a number of announcements that covered the evolution of the mobile platform, tools for Android, iOS, and Web developers, and the push towards more immersive mobile experiences with VR. Here’s a closer look at the news:

Develop
  • Android M Developer Preview — Get an early look at the next version of Android.
  • Google Play services 7.5 — Check out new features like Smart Lock for Passwords, new APIs for Google Cloud Messaging and Google Cast, and the Google Maps API on Android Wear devices.
  • Android Studio 1.3 — Preview Android Studio v1.3, now with improved Gradle build speed, a new memory profiler, and full editing and debugging support for C/C++.
  • Polymer 1.0 — As the first production-ready release for the web components library, we’re also adding new product lines of elements. Features range from toolbars and menus to offline caching.
  • iOS & CocoaPods — Access your favorite libraries via CocoaPods, the official distribution channel for Google SDKs on iOS.
  • Cloud Test Lab — Building on the technology from our acquisition of Appurify, Cloud Test Lab automates the testing of your mobile apps.
  • Firebase — Firebase makes it easy to build apps quickly, without spinning up servers or writing server-side code.
Engage
  • App Indexing — Index your app’s content into Google Search, just like you would with a website. Available for Android, and now piloting on iOS.
  • Google Cloud Messaging — As one of the most popular ways to send messages from the cloud to users' devices on Android and Chrome, GCM is now also available on iOS.
  • Engagement on the Web — Access previously unavailable native functionality on the web with push notifications from your mobile website and add to homescreen.
  • Empowering successful global businesses on Google Play — Get new and powerful tools to help you further grow your business, improve decision making based on smarter insights, and engage and retain a valuable user base.
  • Universal App Campaigns coming soon — Set up app install ads with just a few inputs, like your budget and the cost you’re willing to pay per user; we take care of the rest.
  • Measure App Installs Ads with Google Analytics — Measure app install campaigns based on engagement and lifetime value. On iOS, Google Analytics is now integrated with 20+ ad networks making it easy to understand how to spend your acquisition budget.
Earn
  • AdMob — Monetize your apps intelligently on a single platform, so you can gain insights about your users, drive more in-app purchases and maximize your ad revenue.
What’s Next
  • Android Nanodegree — Enroll in the industry's first Android Nanodegree, developed in collaboration with Udacity. The curriculum was built from the ground up to meet the high standard we have for Android development. The Nanodegree covers everything from fundamentals to advanced development skills, as well as Google Play services and Material Design.
  • Cardboard SDK — The Cardboard SDK for Unity now supports both Android and iOS.
Categories: Programming

Announcing new Google Cast APIs

Thu, 05/28/2015 - 20:20

Posted by Nathan Camarillo and David Allison, Google Cast Product Managers

Google Cast makes it easy for developers to extend their mobile experiences to bigger screens and speakers. Since the launch of Chromecast, the first Google Cast device, we’ve seen tremendous success. 17 million Chromecast devices have been sold, with 1.5 billion touches of the Cast button, changing the way people watch TV. In the US, active Chromecast users are consuming 66% more content per day than at launch in July 2013. We see a 45% increase in YouTube watch time upon a Chromecast activation.

The Google Cast ecosystem continues to grow, with a range of new entertainment devices with Google Cast built in: from Chromecast and Android TV devices to speakers and soundbars.

Today at Google I/O, we are announcing new developer tools specifically for mobile app and game developers to bring new experiences to the TV.

Google Cast Remote Display APIs (Beta)

We are making it easy for mobile developers to bring graphically intensive apps or games to Google Cast receivers with Google Cast Remote Display APIs for Android and iOS.

While Casting your Android screen is an existing option for users, the new Remote Display API allows mobile developers to build a tailored, integrated second screen experience, without requiring an identical mirroring of content between mobile devices and the Google Cast device.

Remote Display will be available on both iOS and Android. Your app will contain the same Google Cast icon, allowing users to select a device. On Android, you’ll connect to the device using the helper class CastRemoteDisplayLocalService, then create a subclass of CastPresentation where you can call setContentView() to display any content you want (including drawing directly on a SurfaceView!). The contents of your CastPresentation are then sent over to the connected Google Cast device for display. On iOS, you’ll use the core Google Cast SDK to discover receivers and connect to one, then GCKRemoteDisplayChannel to establish a new Remote Display session. Your app will then send video and audio frames via the session. Adaptors for OpenGL ES and Metal are provided to help integrate Remote Display with existing rendering code.

There’s no need to write a receiver app as the Remote Display APIs are supported natively by all Google Cast receivers. However, you’ll still need to register a Remote Display App ID using the Google Cast SDK Developer Console.

See Remote Display in action with Driver® Speedboat Paradise.

Autoplay and Queuing APIs

Autoplay and Queuing APIs allow developers to create content queues and begin buffering a second video while a first video is finishing playback. This enables the creation of a continuous playback experience and can significantly increase watch time, helping lift per-session watch times by 10-20% or more.

Playing single media items on Chromecast is something RemoteMediaPlayer (or CastCompanionLibrary’s VideoCastController) has handled well for some time. With this release, RemoteMediaPlayer gains a full media queue, allowing you to queue up multiple MediaQueueItems, skip between items, set a repeat mode, reorder the queue, and enable autoplay for a seamless media playback experience. The MediaStatus returned to you is augmented with the current queue of items, as well as a new OnPreloadStatusUpdatedListener, allowing you to display what will be played next. This ensures that all connected devices can easily maintain a synchronized queue of upcoming media items, opening up new possibilities for collaborative Google Cast media experiences.
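The queueing flow described above looks roughly like this in code. It is a sketch, assuming an existing RemoteMediaPlayer and a connected GoogleApiClient; building the MediaInfo objects is elided:

```java
import org.json.JSONObject;

import com.google.android.gms.cast.MediaInfo;
import com.google.android.gms.cast.MediaQueueItem;
import com.google.android.gms.cast.MediaStatus;
import com.google.android.gms.cast.RemoteMediaPlayer;
import com.google.android.gms.common.api.GoogleApiClient;

class QueueSketch {
    // Build autoplaying queue items and load them onto the receiver.
    void loadQueue(RemoteMediaPlayer player, GoogleApiClient apiClient,
                   MediaInfo firstVideo, MediaInfo secondVideo) {
        MediaQueueItem[] items = {
            new MediaQueueItem.Builder(firstVideo)
                    .setAutoplay(true)
                    .build(),
            new MediaQueueItem.Builder(secondVideo)
                    .setAutoplay(true)
                    .setPreloadTime(20)  // start buffering ~20s before playback
                    .build(),
        };
        player.queueLoad(apiClient, items, /* startIndex */ 0,
                MediaStatus.REPEAT_MODE_REPEAT_OFF,
                /* customData */ (JSONObject) null);
    }
}
```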

Game Manager APIs for Google Cast

Since last November, Cast has had the ability to turn your TV screen into a game board, racetrack, dance floor, or trivia quiz—inviting everyone in your living room to use their phone as a personal game controller. Now we’re introducing new Game Manager APIs that simplify the process of building multiplayer games. To make it easier to synchronize game state across a potentially large number of players, Google Play services 7.5 introduces the Cast Game Manager API. The API is available on Android via the GameManagerClient class, as well as on iOS, Chrome, and receivers, and allows you to send messages and state changes to all connected clients and the receiver. All of the send methods also have a variant that takes a specific player ID, making it possible for multiple players to share a single sender device.

All of these APIs are available on Android today. iOS and Chrome SDK updates will be available in the coming days.

You can learn more about these Google Cast APIs on the Google Cast Developers Site. We’re excited to see what you come up with to bring more mobile-centric experiences to the living room.

Categories: Programming

Get ready for even more material design

Wed, 05/27/2015 - 20:11

Posted by Monica Bagagem, Developer Marketing

Google I/O 2015 starts tomorrow, and, like last year, we’ve got an exciting lineup of design-focused content for both developers and designers to experience in-person and online. Just a year ago, we announced material design - a system for cross-platform visual, motion, and interaction design. This year at I/O, we’ll see how material has been adopted and implemented by the community, and our approach to design across our platforms.

Sessions

At 4PM PDT on Thursday, May 28, join Matias Duarte’s “Material Now” session to recap where we’ve been and get a sneak peek of where we’re going with design at Google. We’ll be recognizing some of the phenomenal material design work from across the community, so definitely tune in if you’re an Android developer or designer. For more details, check Matias’ post on Google+.

The session will be live streamed so you can follow along in real-time even if you’re not at Moscone. Recordings will also be available shortly after on the I/O website.

Add Design Sessions to your I/O schedule

Design Sandbox

We’ve dedicated an entire section of Moscone West to design-related programming, including one-on-one and group UX reviews with members of the material design team. Appointments will be on a first-come, first-served basis, but we'll also have Google designers on hand for more casual questions.

Add Material Design Reviews to your I/O schedule

Sandbox Talks

Google designers and engineers will host several deep-dive, 20-minute tech talks in a breakout area within the Design Sandbox on Level 2. The space has been designed to facilitate conversation and discussion with small audiences, so come prepared with questions! We’ll be covering a range of topics such as cross-platform and responsive design, designing for platforms like Google Cast and Android Auto, and how to adapt material design to your brand. As an added bonus, most Sandbox Talks will take place twice throughout the conference, giving you more flexibility to adjust your schedule.

Add Design Sandbox Talks to your I/O schedule

Explore the full Google I/O schedule here.

Be sure to follow +GoogleDesign and @GoogleDesign, where we’ll be posting design-related announcements throughout the conference. You can also follow and join the general conversation about I/O at #io15. See you tomorrow!

Categories: Programming

App deep linking with goo.gl

Wed, 05/27/2015 - 19:36

Posted by Fabian Schlup, Software Engineer

Starting now, goo.gl short links function as a single link you can use to reach all your content — whether that content is in your Android app, iOS app, or website. Once you’ve taken the necessary steps to set up App Indexing for Android and iOS, goo.gl URLs will send users straight to the right page in your app if they have it installed, and everyone else to your website. This will provide additional opportunities for your app users to re-engage with your app.

This feature works for both new short URLs and retroactively, so any existing goo.gl short links to your content will now also direct users to your app.

Zillow shares a short URL on Twitter that automatically links to their native app.

Share links that ‘do the right thing’

You can also make full use of this feature by integrating the URL Shortener API into your app’s share flow, so users can share links that automatically redirect to your native app cross-platform. This will also allow others to embed links in their websites and apps which deep link directly to your app.

Take Google Maps as an example. With the new cross-platform goo.gl links, the Maps share button generates one link that provides the best possible sharing experience for everyone. When opened, the link auto-detects the user’s platform and if they have Maps installed. If the user has the app installed, the short link opens the content directly in the Android or iOS Maps app. If the user doesn’t have the app installed or is on desktop, the short link opens the page on the Maps website.

Try it out for yourself! Don’t forget to use a phone with the Google Maps app installed: http://goo.gl/maps/xlWFj.

How to set it up

To set up app deep linking on goo.gl:

  1. Complete the necessary steps to participate in App Indexing for Android and iOS at g.co/AppIndexing. Note that goo.gl deep links are open to all iOS developers, unlike deep links from Search currently. After this step, existing goo.gl short links will start deep linking to your app.
  2. Optionally integrate the URL Shortener API with your app’s share flow, your email campaigns, etc. to programmatically generate links that will deep link directly back to your app.
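For step 2, generating a short link from the URL Shortener API is a single POST. A minimal sketch using only the JDK; the API key and the long URL are placeholders you must replace with your own, and error handling is omitted:

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Scanner;

public class Shorten {
    public static void main(String[] args) throws Exception {
        // Replace with a deep-linkable page URL from your own site.
        String body = "{\"longUrl\": \"https://example.com/some/page\"}";
        URL endpoint = new URL(
                "https://www.googleapis.com/urlshortener/v1/url?key=YOUR_API_KEY");

        HttpURLConnection conn = (HttpURLConnection) endpoint.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(body.getBytes(StandardCharsets.UTF_8));
        }

        // The JSON response carries the goo.gl short URL in its "id" field.
        try (Scanner s = new Scanner(conn.getInputStream(), "UTF-8")) {
            System.out.println(s.useDelimiter("\\A").next());
        }
    }
}
```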

We hope you enjoy this new functionality and happy cross-platform sharing!

Categories: Programming

Surfacing content from iOS apps in Google Search

Wed, 05/27/2015 - 15:51

Posted by Eli Wald, Product Manager

We’ve been helping users discover relevant content from Android apps in Google search results for a while now. Starting today, we’re bringing App Indexing to iOS apps as well. This means users on both Android and iOS will be able to open mobile app content straight from Google Search.

Indexed links from an initial group of apps we’ve been working with will begin appearing in search results on iOS, both in the Google App and in Chrome, for signed-in users globally in the coming weeks.

How to get your iOS app indexed

While App Indexing for iOS is launching with a small group of test partners initially, we’re working to make this technology available to more app developers as soon as possible. In the meantime, here are the steps to get a head start on App Indexing for iOS:

  1. Add deep linking support to your iOS app.
  2. Make sure it’s possible to return to Search results with one click.
  3. Provide deep link annotations on your site.
  4. Let us know you’re interested. Keep in mind that expressing interest does not automatically guarantee getting app deep links in iOS search results.
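The deep link annotations in step 3 are plain link tags in your page's head. A hedged example for a hypothetical app, where 123456789 stands in for your app's iTunes ID, exampleapp for your custom URL scheme, and com.example.app for your Android package name:

```html
<head>
  <!-- iOS format: ios-app://{itunes_id}/{url_scheme}/{host_path} -->
  <link rel="alternate"
        href="ios-app://123456789/exampleapp/example.com/recipes/grilled-cheese" />
  <!-- Android format: android-app://{package_name}/{scheme}/{host_path} -->
  <link rel="alternate"
        href="android-app://com.example.app/http/example.com/recipes/grilled-cheese" />
</head>
```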

If you happen to be attending Google I/O this week, stop by our talk titled “Get your app in the Google index” to learn more about App Indexing. You’ll also find detailed documentation on App Indexing for iOS at g.co/AppIndexing. If you’ve got more questions, drop by our Webmaster help forum.

Categories: Programming

Get ready for Google I/O 2015

Tue, 05/26/2015 - 19:24

Posted by Mike Pegg, reppin' I/O since 2011

Google I/O is almost here! We’ll officially kick-off live from the Moscone Center in San Francisco at 9:30AM PDT this Thursday, May 28th. While we’re putting the finishing touches on the keynote, sessions, sandbox talks, and code labs, we wanted to provide you with some tips to get ready to experience I/O, either in-person or offsite.

Navigate the conference with the Web & Android apps

To get the most out of Google I/O, download the I/O Android app and/or add the I/O web app to your mobile home screen (both work offline!). From either, you can plan your schedule, view the venue map, and keep up with the latest I/O details. We updated the website this morning to optimize it for real-time content, and we also updated the Android app on Google Play; make sure to download the latest version (3.3.2) before the conference starts.

Attending in person?

New this year, keynote access will be assigned on a first-come, first-served basis during badge pickup. Be sure to swing by Moscone West tomorrow, Wednesday, May 27th, between 9AM and 8PM PDT to pick up your badge (full badge pick-up schedule). Don’t forget to bring your government-issued photo ID and a copy of your ticket (on your phone or printed). If you’re an Academic attendee, please remember to bring proof of eligibility. You might also want to read through the pro tips in our FAQ before you arrive to learn how to best navigate the conference.

Last but not least, we’re looking forward to kicking back and relaxing with you at the After Hours party during the evening of Day 1. Expect good food, good drinks, and a few Googley surprises. Be sure to check your email during the event for further instructions.

Attending remotely?

Can’t join us in person? Don’t worry, we’ve got you covered! Whether you’re looking to experience I/O with other devs in your neighborhood or streaming it live from your couch, here are some ways you can connect with I/O in real time:

  • I/O Extended: There’s still time to find an I/O Extended event happening near you. We have 460+ events happening in 91+ countries, so your chances of finding one near you are pretty good. These events are organized by Google Developer Groups, Student Ambassadors, and local developers, and include activities such as code labs, hackathons and more.
  • I/O Live: Tune into I/O Live on the web at google.com/io or via the I/O Android app. We will live stream the keynote and all sessions over the course of the event. During breaks, you can watch live interviews happening right from Moscone and educational pre-recorded content. If you want to bring the live stream and/or the #io15 conversation to your audience, simply customize our I/O Live widget and embed it on your site or blog.
  • #io15request: Send us your questions about what’s happening at I/O and a team of onsite Googlers will do their best to track down an answer for you. To submit a request, just make a public post on Google+ or Twitter with the #io15request hashtag. We’ll only be replying to requests made on May 28-29, in English, French or German. Learn more.
  • I/O in photos: Be sure to check out our photo stream during the event. We’ll upload photos in real time, live from Moscone.

We’re looking forward to seeing you in person or remotely on Thursday and Friday. Don’t forget to join the social conversation at #io15!

Categories: Programming

Rolling out the red carpet for app owners in Search Console

Fri, 05/22/2015 - 19:22

Posted by Hillel Maoz, Engineering Lead, Search Console Team and Mariya Moeva, Webmaster Trends Analyst

Originally posted to the Webmaster Central blog

Wouldn’t it be nifty if you could track where your indexed app content shows up in search results, for which queries, which app pages are most popular, and which ones have errors? Yeah, we thought so too! So we’ve equipped our freshly renamed Search Console with new reports to show you how Google understands and treats your app content in search results.

Our goal is to make Search Console a comprehensive source of information for everyone who cares about search, regardless of the format of their content. So, if you own or develop an app, Search Console is your new go-to place for search stats.

Add your app to Search Console

Simply open Search Console and enter your app name: android-app://com.example. Of course, we’ll only show data to authorized app owners, so you’ll need to use your Google Play account to let Search Console know you have access to the app. If you don’t have access to your app in Google Play, ask an owner to verify the app in Search Console and then add you.

Connect your site to your app

Associating your site with your app is necessary for App Indexing to work, and it helps Google understand and rank your app content better.

Track your app content’s performance in search

The new Search Analytics report provides detailed information on top queries, top app pages, and traffic by country. It also has a comprehensive set of filters, allowing you to narrow down to a specific query type or region, or sort by clicks, impressions, CTR, and position.

Use the Search Analytics report to compare which app content you consider most important with the content that actually shows up in search and gets the most clicks. If they match, you’re on the right track! Your users are finding and liking what you want them to see. If there’s little overlap, you may need to restructure your navigation, or make the most important content easier to find. Also worth checking in this case: have you provided deep links to all the app content you want your users to find?

Make sure Google understands your app content

If we encounter errors while indexing your app content, we won’t be able to show deep links for those app pages in search results. The Crawl Errors report will show you the type and number of errors we’ve detected.

See your app content the way Google sees it

We’ve created an alpha version of the Fetch as Google tool for apps to help you check if an app URI works and see how Google renders it. It can also be useful for comparing the app content with the webpage content to debug errors such as content mismatch. In many cases, the mismatch errors are caused by blocked resources within the app or by pop-ups asking users to sign in or register. Now you can see and resolve these issues.

To get started on optimizing and troubleshooting your own app, add it to Search Console now. If you want to know more about App Indexing, read about it on our Developer Site. And, as always, you’re welcome to drop by the help forum with more questions.

Categories: Programming