Gmail Add-ons: How to be on all desktops and mobile devices with a simple app

A few weeks ago, Google opened up the possibility of creating Gmail add-ons to all developers. Let’s see what the possibilities are and how to create and distribute our own Gmail add-on.

This is not a hard or abstract post; add-ons are easy, so I hope my explanation is simple too. It is based on an add-on I created that allows any user to add an email to Google Tasks (probably more useful than the add-on for MSN Messenger that I created years ago, which read received messages aloud with a “real” voice). This functionality has always been present in the Gmail web client, but it is not available in the mobile clients.

In order to create a Gmail add-on you need to create a new Google Apps Script project. After that you can add as many scripts as you want to build an interface using Cards and the functionality to interact with other services, APIs and so on.

For instance, in our example the functionality is very basic:

function getContextualAddOn(e) {
  var message = getMessage(e);
  var subject = message.getSubject();
  var permaLink = message.getThread().getPermalink();
  var card = CardService.newCardBuilder()
      .setHeader(CardService.newCardHeader().setTitle('Save as task'));
  var section = CardService.newCardSection();
  // Build the form widgets and the action button.
  section.addWidget(prepareTitle(subject));
  section.addWidget(prepareDueDate());
  section.addWidget(prepareList());
  section.addWidget(prepareSend(permaLink));
  section.addWidget(prepareSign());
  card.addSection(section);
  return [card.build()];
}

function getMessage(e) {
  var accessToken = e.messageMetadata.accessToken;
  var messageId = e.messageMetadata.messageId;
  GmailApp.setCurrentMessageAccessToken(accessToken);
  return GmailApp.getMessageById(messageId);
}

function prepareSend(permaLink) {
  var action = CardService.newAction()
      .setFunctionName('saveTask')
      .setParameters({permaLink: permaLink});
  return CardService.newTextButton()
      .setText('Save') // button label was lost in the original listing
      .setOnClickAction(action);
}

function saveTask(e) {
  var res = e['formInput'];
  var task = Tasks.newTask();
  task.title = res['title'];
  task.notes = e['parameters'].permaLink;
  if (res['due-date'] && res['due-date'] != '')
    task.due = res['due-date'] + 'T00:00:00Z';
  task = Tasks.Tasks.insert(task, res['list']);
}

function prepareList() {
  var response = Tasks.Tasklists.list();
  var taskLists = response.items;
  var dropdown = CardService.newSelectionInput()
      .setType(CardService.SelectionInputType.DROPDOWN)
      .setTitle('Task list')
      .setFieldName('list');

  if (taskLists && taskLists.length > 0) {
    Logger.log('Task lists:');
    for (var i = 0; i < taskLists.length; i++) {
      var taskList = taskLists[i];
      Logger.log('%s (%s)', taskList.title, taskList.id);
      dropdown.addItem(taskList.title, taskList.id, i == 0);
    }
  } else {
    Logger.log('No task lists found.');
  }
  return dropdown;
}

function prepareTitle(subject) {
  return CardService.newTextInput()
      .setFieldName('title')
      .setTitle('Title')
      .setValue(subject);
}

function prepareDueDate() {
  // Default to the next Sunday (or today, if today is Sunday).
  var d = new Date();
  var dd = new Date(d.setDate(d.getDate() + (7 - d.getDay()) % 7));
  return CardService.newTextInput()
      .setFieldName('due-date')
      .setTitle("Due date")
      .setValue(dd.getFullYear() + "-" + (dd.getMonth() + 1) + "-" + dd.getDate());
}

function prepareSign() {
  var button = CardService.newTextButton()
      .setText('the author'); // the link target was lost in the original listing
  return CardService.newKeyValue()
      .setContent("Developed and maintained by")
      .setButton(button);
}
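The due-date default in prepareDueDate computes the upcoming Sunday. Extracted as a pure plain-JavaScript function (the name nextSunday is mine, not part of the add-on), the arithmetic can be checked outside Apps Script:

```javascript
// Standalone sketch of the add-on's default due date: the next Sunday,
// or the given day itself if it is already a Sunday.
// (7 - getDay()) % 7 is 0 on Sunday and counts the remaining days otherwise.
function nextSunday(from) {
  var d = new Date(from.getTime()); // copy so the input is not mutated
  d.setDate(d.getDate() + (7 - d.getDay()) % 7);
  return d;
}
```

For example, Monday 2021-01-04 maps to Sunday 2021-01-10, and a Sunday maps to itself.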

You also have to change the script’s manifest. To do this, open the menu “View > Show manifest file”. Now you can modify it as you want; following our example, replace the content with the following lines:

{
  "timeZone": "GMT",
  "dependencies": {
    "enabledAdvancedServices": [{
      "userSymbol": "Tasks",
      "serviceId": "tasks",
      "version": "v1"
    }]
  },
  "oauthScopes": ["", "", ""],
  "gmail": {
    "name": "Save as task",
    "logoUrl": "",
    "contextualTriggers": [{
      "unconditional": {},
      "onTriggerFunction": "getContextualAddOn"
    }],
    "primaryColor": "#000000",
    "secondaryColor": "#888888",
    "version": "TRUSTED_TESTER_V2",
    "openLinkUrlPrefixes": []
  }
}

As soon as you save all the changes you are ready to publish your add-on. Unfortunately the Gmail add-ons store is not publicly available yet. However, you can get the ID of your add-on and distribute it to your co-workers so they can install your add-on in their Gmail accounts. See the images below:

Gmail Add-on ID
Add-ons in Gmail settings

I know it’s frustrating not being able to publish an add-on to all Gmail users around the world yet. Anyway, I already have two cool ideas to implement as soon as I am able to distribute them, at least to my friends: something to draw an ordered tree of a conversation thread, and an add-on to respond to an e-mail with a video in an easy way. And you? What ideas come to your mind?

How to implement a distributed OAuth 2.0 system

Some months ago (more than a year, actually), I was playing with the WordPress REST API. I wrote an analysis of how to implement a distributed OAuth 2.0 system as an attempt to collaborate with the community. I posted it as a comment on a discussion thread, but I am going to replicate it here in order to keep it with my other ideas and works.

  • is the site of the app
  • is the WP site of the developer in which I defined the app as multitenant
  • is the WP site of the user who wants authorize to interact

01- User access to and says “Hey, I want to use this cool app”
02- ask for its server and user writes
03- ask to to identify the user
04- the user writes his credentials in
05- redirects to with a code saying “Yeah, this man is my man”
06- then ask to and says “Hey, I have a user with a code that wants to access the resource and I am the (this is my client_id and this is my client_secret)”
07- generates a token
08- says to “Hey, I just generated this token that expires in 3600 seconds”
09- says “I would prefer not to work, but you know, it’s ok”
10- sends the response to with the token
11- then using the token, can now ask to to do some stuff
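Step 6 above is a standard OAuth 2.0 authorization-code exchange. A minimal sketch of how the app site could build that token request (the /oauth/token path and all names here are illustrative assumptions, not part of the WP REST API discussion):

```javascript
// Build the step-6 token request that the app sends to the user's WP site.
// Hypothetical helper: parameter names follow RFC 6749, section 4.1.3.
function buildTokenRequest(userSite, code, clientId, clientSecret) {
  var params = {
    grant_type: 'authorization_code',
    code: code,
    client_id: clientId,
    client_secret: clientSecret
  };
  var body = Object.keys(params)
      .map(function (k) { return k + '=' + encodeURIComponent(params[k]); })
      .join('&');
  return { method: 'POST', url: userSite + '/oauth/token', body: body };
}
```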

If you look at this diagram, the mapping between the roles and our actors is:

Abstract Protocol Flow

Client =
Resource Owner =
Authorization Server =
Resource Server =

The RFC says:

“The interaction between the authorization server and resource server
is beyond the scope of this specification. The authorization server
may be the same server as the resource server or a separate entity.
A single authorization server may issue access tokens accepted by
multiple resource servers.”

but with a conversation like the one in steps 8 and 9, it could be resolved.
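One way to pin down that step 8-9 conversation is a notification from the authorization server to the resource server whenever a token is issued. The shape below is purely illustrative; the field names are my assumption, loosely modeled on RFC 7662 introspection responses:

```javascript
// Hypothetical step-8 message: the authorization server tells the resource
// server which token it just issued, for which client, and when it expires.
function buildTokenNotification(token, clientId, issuedAtSec, expiresInSec) {
  return {
    active: true,
    token: token,
    client_id: clientId,
    exp: issuedAtSec + expiresInSec // absolute expiry, as in RFC 7662 "exp"
  };
}
```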

Big Data talks

I attended a talk about Big Data last week. The organizer was Ascentic and the speakers were IT workers from Cantabria with high responsibilities in their companies. I think it is interesting to learn about different use cases in different sectors.

[Read it in Spanish at CantabriaTIC]

Celestino Güemes works at Atos Worldgrid, where he is a member of its expert committee. He talked about “new” types of analytics and their issues. He explained that analytics of historic data is easy nowadays. However, systems are evolving towards predictive and prescriptive analytics, which present an operator with possible actions to take and indicate the recommended one.

He also remarked on the use of deep learning and multi-sided market analytics platforms to develop new products and services.

He presented some interesting real cases as examples of the different ways to use Big Data:

Operational excellence: An oil company uses drilling heads with 120 sensors. They can analyze the data in real time and compare it with the historic data to predict when a head will break, avoiding problems.

User experience: They work for a telecommunications provider on the relation between mobile network configuration and clients’ usage. For example, they can detect where a user lives or works based on the network elements the user connects to. They can also improve the quality of service for a specific VIP user when she is using YouTube during a trip, or notify a client with information or an offer when he walks into a street (this is very similar to what I did with my team at GPMESS, my last company).

Business re-invention: An electricity seller is combining their data with other data sources to look for possible new services and business models. They are exploring things like detecting the different appliances in a house and offering discounts on new appliances when they detect a problem in one of them, based on its electricity usage.

Confidence and compliance: He is working on a solution to detect non-technical economic losses (fraud and errors) for electricity companies. These losses represent 1% of the business (3.7M€ per year).

Miguel Sierra is a manager at CIC. He leads a product called IDbox, an Operational Intelligence software. It integrates all available information sources, processes the captured signals, and offers analysis tools to assist in operational decision making. This product is used by companies in different sectors: nuclear plants, electricity companies, private parking companies, water companies, and even high performance sports training.

He talked about their history and how they became a company with deep expertise in Big Data.

He said that size is important, but frequency is even more important. They process 1.5M signals per second from Iberdrola, and 80K signals every 20 milliseconds from a nuclear plant (which is almost the same as 4M signals per second).
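The equivalence the speaker quotes is easy to check: 80K signals every 20 ms means 50 batches per second.

```javascript
// Scale a per-interval signal count to signals per second.
function signalsPerSecond(countPerInterval, intervalMs) {
  return countPerInterval * (1000 / intervalMs);
}
```

So 80,000 × 50 = 4,000,000 signals per second, matching the “almost the same as 4M per second” remark.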

They help businesses that are not scalable at first sight by providing ways to become scalable and more profitable. He used the example of a clinic that works with professional athletes. They used to need a doctor attending a single athlete on their premises. Now they can provide a service to other clinics and gyms, monitoring trainings from a control center operated by a group of doctors. A single doctor can now work with more than 20 athletes who are training anywhere.

Raul Uría is the CEO at Zzircon Business Intelligence. He gave a basic presentation aimed at non-technical attendees. He explained what data mining is and what it is for. He showed a complete example with a single product (a slide for kids), discussing how data mining helps you know which users to offer the product to, how you should reach them, and what message you should use.

I am sure it was a great explanation for people who are not involved in IT every day.

WSO2 vs Azure API Management

The following values are estimates, and all of them depend on the final production needs.

                              WSO2                                        Azure API Management
Deployment effort             24-40 hours                                 1 hour
New API effort                1-2 hours                                   1-2 hours
Scale effort                  4-8 hours                                   1 hour
Effort to deploy in
another location              16-24 hours                                 1 hour
Min dev. environment costs    1 machine = 587 euros yearly                486 euros yearly
Min pro. environment costs    12 machines + 8,500 euros yearly for        28,288 euros yearly + 1,404 euros for a VPN
                              production support = 15,544 euros yearly    connection to private endpoints = 29,692 euros yearly
Pros                          More flexible in configuration              Easier and faster management
                              and deployment options
Cons                          Hard maintenance                            More expensive; at this moment it can only
                                                                          run on Azure infrastructure
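The totals in the comparison can be reconstructed from the per-item figures (a quick check, assuming the 12 production machines cost the same 587 euros yearly as the dev machine):

```javascript
// Yearly cost reconstruction for both options, in euros.
function wso2YearlyCost(machines, machineCost, supportCost) {
  return machines * machineCost + supportCost;
}
function azureYearlyCost(serviceCost, vpnCost) {
  return serviceCost + vpnCost;
}
```

With those inputs, WSO2 comes to 12 × 587 + 8,500 = 15,544 euros and Azure to 28,288 + 1,404 = 29,692 euros yearly.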

Reverse Geocoding: Bing Maps REST Services

Provider: Microsoft Corp.
Provider Client Libraries: Javascript, .Net
Multiple Languages: Yes
Limitations: Developer account: 10,000 transactions within a 30-day period

Batch processing: 1,000,000 non-billable geocode-entity transactions in any 12-month period

Windows Apps: 50,000 transactions per day

Non-Windows mobile apps: 125,000 per year

Information wizard:

If Enterprise prices do not need to be considered, base prices can be seen as a component of Azure subscriptions:

Price: Different prices depending on usage.


Example request:,-122.12934?o=json&key=BingMapsKey


Example response:

{
   "copyright":"Copyright © 2011 Microsoft and its suppliers. All rights reserved. This API cannot be accessed and the content and any results may not be used, reproduced or transmitted in any manner without express written permission from Microsoft Corporation.",
   ...
            "name":"1 Microsoft Way, Redmond, WA 98052",
            "address":{
               "addressLine":"1 Microsoft Way",
               "adminDistrict2":"King Co.",
               "countryRegion":"United States",
               "formattedAddress":"1 Microsoft Way, Redmond, WA 98052",
               ...
            },
   ...
}


Azure API Management

It is a cloud API Manager that can connect to public and private endpoints. It is provided by Microsoft and cannot be installed as an on-premises service (but they are open to suggestions on this point and it is under review: “There is no on-premises deployment option available at this time but you can vote on uservoice if you’d like this capability. However, you can certainly use Azure-based API management with on-premises systems and data.”).

At this moment it does not provide a standard monetization mechanism (but it is under review: “We’ll monitor continued feedback on this item”).

It is also planned to provide a standard way to have testing and live environments (Sandbox Environment, SBE).

Set Up

Following a two-step wizard is all you need to do to set up an Azure API Management instance.

Azure APIM 1

Azure APIM 2

Advanced configuration can be provided in order to personalize security, rate limits or monitoring capabilities.


A publisher can add a new API manually or import it using WADL or Swagger definition files.
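As an illustration, a minimal Swagger 2.0 definition that the import option can consume might look like the sketch below (the host, path, and title are hypothetical):

```json
{
  "swagger": "2.0",
  "info": { "title": "Sample API", "version": "1.0" },
  "host": "api.example.com",
  "paths": {
    "/items": {
      "get": {
        "summary": "List items",
        "responses": { "200": { "description": "OK" } }
      }
    }
  }
}
```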

Azure APIM 3

A subscriber can discover and subscribe to an API through the developer portal.

Reverse Geocoding: Nominatim

Provider: OpenStreetMaps
Provider Client Libraries:
Multiple Languages: Yes
Limitations: 1 request per second (86400 per day)
Price: Free


You can use third-party providers or install your own instance.
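To stay under the 1-request-per-second policy on the public instance, a client can compute the wait time before each call. A minimal sketch (the function name is mine):

```javascript
// Milliseconds to wait before the next request is allowed, given the time
// of the last request and a minimum interval (1000 ms for Nominatim).
function msUntilNextAllowed(lastRequestMs, nowMs, minIntervalMs) {
  return Math.max(0, minIntervalMs - (nowMs - lastRequestMs));
}
```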


Example request:

Example response:

<reversegeocode timestamp="Fri, 06 Nov 09 16:33:54 +0000" querystring="...">
  <result place_id="1620612" osm_type="node" osm_id="452010817">
    135, Pilkington Avenue, Wylde Green, City of Birmingham, West Midlands (county), B72, United Kingdom
  </result>
  <addressparts>
    <road>Pilkington Avenue</road>
    <village>Wylde Green</village>
    <town>Sutton Coldfield</town>
    <city>City of Birmingham</city>
    <county>West Midlands (county)</county>
    <country>United Kingdom</country>
  </addressparts>
</reversegeocode>