Tech roundup 4: a journal published by a bot

Read a tech roundup with this week’s news that our powerful bot has chosen: blockchain, AI, development, corporate news and more.

Gooooooood morning, Hyperspace!!! Hey, this is not a test, this is a tech roundup. Time to rock it from the Delta to the DMZ.

AI, bots and robots

Blockchain and decentralization

Woman computer scientist of the week
Tamara “Tammy” G. Kolda is an American applied mathematician and Distinguished Member of Technical Staff at Sandia National Laboratories. She is noted for her contributions to computational science, multilinear algebra, data mining, graph algorithms, mathematical optimization, parallel computing, and software engineering. She is currently a member of the SIAM Board of Trustees and serves as associate editor for both the SIAM Journal on Scientific Computing and the SIAM Journal on Matrix Analysis and Applications. She received her bachelor's degree in mathematics in 1992 from the University of Maryland, Baltimore County and her PhD in applied mathematics from the University of Maryland, College Park in 1997. She was a Householder Postdoctoral Fellow at Oak Ridge National Laboratory from 1997 to 1999 before joining Sandia National Laboratories. Kolda received a Presidential Early Career Award for Scientists and Engineers in 2003, best paper prizes at the 2008 IEEE International Conference on Data Mining and the 2013 SIAM International Conference on Data Mining, and has been a distinguished member of the Association for Computing Machinery since 2011. She was elected a Fellow of the Society for Industrial and Applied Mathematics in 2015.

Cloud and architecture

Development and languages

Quote of the week

{ajh} I always viewed HURD development like the Special Olympics of free software.

Enterprises

Other news

Tech roundup 3: a journal published by a bot

Read a tech roundup with this week’s news that our powerful bot has chosen: blockchain, AI, development, corporate news and more.

Gooooooood morning, Community!!! Hey, this is not a test, this is a tech roundup. Time to rock it from the Delta to the DMZ.

AI, bots and robots

Blockchain and decentralization

Woman computer scientist of the week
Yolanda Gil is a Spanish computer scientist specializing in knowledge discovery and knowledge-based systems at the University of Southern California (USC). She served as chair of SIGAI, the Association for Computing Machinery (ACM) Special Interest Group on Artificial Intelligence, and as president of the Association for the Advancement of Artificial Intelligence (AAAI).

Cloud and architecture

Development and languages

Quote of the week

Every language has an optimization operator. In C++ that operator is ‘//’

Enterprises

Other news

Tech roundup 2: a journal published by a bot

Read a tech roundup with this week’s news that our powerful bot has chosen: blockchain, AI, development, corporate news and more.

Gooooooood morning, Bodies!!! Hey, this is not a test, this is a tech roundup. Time to rock it from the Delta to the DMZ.

AI, bots and robots

Blockchain and decentralization

Woman computer scientist of the week
Gladys del Estal was a computer programmer and environmental activist from San Sebastián, Gipuzkoa, Spain. She was shot and killed by the Civil Guard during a protest in Tudela, Navarre, against the nuclear power station construction program for the Basque Country and the aerial firing range of Bardenas. She has since become an important icon of the environmental movement.

Cloud and architecture

Development and languages

Quote of the week

Before software can be reusable it first has to be usable.

Enterprises

Other news

Tech roundup 1: a journal published by a bot

Read a tech roundup with this week’s news that our powerful bot has chosen: blockchain, AI, development, corporate news and more.

Gooooooood morning, Y’all!!! Hey, this is not a test, this is a tech roundup. Time to rock it from the Delta to the DMZ.

AI, bots and robots

Blockchain and decentralization

Woman computer scientist of the week
Carolina Cruz-Neira is a Spanish-Venezuelan-American computer engineer, researcher, designer, educator, and a pioneer of virtual reality (VR) research and technology. She is known for inventing the CAVE automatic virtual environment. She previously worked at Iowa State University (ISU) and the University of Louisiana at Lafayette, and is currently the director of the Emerging Analytics Center at the University of Arkansas at Little Rock.

Cloud and architecture

Development and languages

Quote of the week

The best code is no code at all.

Enterprises

Other news

Chatbots (III): The magic of creating chatbots without Visual Studio thanks to FormFlow

Chatbots and conversational interfaces, whose hype started two years ago, are great fun, but they aren’t always useful. Designing a conversation that anticipates every possible path takes a lot of effort. Is it really necessary to handle every possible input just to complete a simple business task? For most apps the answer is "no"; otherwise we wouldn’t have so many form-based apps that simply gather the information needed to do something.

[En castellano]

Inside Bot Framework, Microsoft gives us an easy way to model small pieces of business functionality that in other environments we would solve with a simple form. FormFlow is an engine that, starting from a data model, can automate the conversation needed to fill that model and then lets us do whatever we want with the data.

Thanks to it, we can use the omnipresence of chatbots to get closer to our audience (chatbots live on many channels where users are already present).

Let’s see how to deploy a simple chatbot on Azure. You will see that FormFlow is so easy to use that we don’t even need to open Visual Studio to code it. It will be like magic! We are going to build an emulator of the Sorting Hat from Hogwarts, which in J. K. Rowling’s stories decides which house each student of the school of magic belongs to.

It is a very simple example, but it is full of functionality and detail, and it will serve as a base for any small app that needs to collect some data and do something with it: calculate a result (as in this case, but it could just as well be loan conditions or a car insurance quote), call an API (for instance to record a loan request, or to request a call from a salesperson), or whatever else comes to mind.

Since most of the bot is driven by the data model definition, this will be an easy task no matter how complex the model is, because the magic is already coded by Microsoft inside FormFlow.

Our first move is to create a new bot on Azure (a Web App Bot). To take the easy path, let’s start from a sample bot: when creating the app, we tell Azure to use the FormFlow template.

Once it is deployed, we will have two resources with the same name: a Web App and the Web App Bot. Among the Web App Bot options (under the Build menu) there is one to edit the code online; that is the one we will use to edit our bot’s code, because the bot is simple enough that we don’t need anything more sophisticated.

Bots based on FormFlow use a model, so we have to define one. Our Sorting Hat isn’t as magical as the one from Hogwarts: ours gathers information from the students in order to make a decision.

The fields of our model can be of basic types (integers, floating-point numbers, strings, DateTimes, enumerations) or lists of enumerations. In our case we will start with enumerations only, because the values we handle are specific to our domain.

 public enum Nombres {
  Godric,
  Helga,
  Rowena,
  Salazar
 };  
 public enum Cualidades {
  Valentía, 
  Honestidad,
  Inteligencia,
  Ambición
 }; 
 public enum Colores {
  Rojo, 
  Amarillo,
  Azul,
  Verde
 }; 
 public enum ColoresDeJoyas {
  Oro,
  Negro,
  Bronce,
  Plata
 }; 
 public enum Animales {
  Leon,
  Tejon,
  Aguila,
  Serpiente
 }; 
 public enum LugaresParaVivir {
  TorreOscura,
  Bodega,
  TorreLuminosa,
  Mazmorras
 }; 
 public enum Titulos {
  Sir,
  Fraile,
  Dama,
  Barón
 }; 
 public enum Amigos {
  Ron,
  Neville,
  Hermione,
  Harry
 }; 
 public enum Accesorios {
  Espada,
  Copa,
  Diadema,
  Guardapelo
 } 
 [Serializable]
 public class SortingTest
 {
     public Nombres? Nombre;
     public Cualidades? Cualidad;
     public Colores? Color;
     public ColoresDeJoyas? ColorDeJoyas;
     public Animales? Animal;
     public LugaresParaVivir? LugarParaVivir;
     public Titulos? Titulo;
     public Amigos? Amigo;
     public Accesorios? Accesorio;

     public static IForm<SortingTest> BuildForm()
     {
         OnCompletionAsyncDelegate<SortingTest> evaluate = async (context, state) =>
         {
               await context.PostAsync("OK");
         };
         return new FormBuilder<SortingTest>()
                   .Message("Hola")
                  .OnCompletion(evaluate)
                  .Build();
     }
};

If you also want to capture other information, such as the student’s name or birth date, you only need to add new properties to the model; the bot will ask for and validate them for you.

    public string TuNombre;
    public DateTime? FechaDeNacimiento;

Once our model class is defined, we just add it to the project by creating a new file in the online editor, and we can also delete the sample model, which has nothing to do with our magical world.

We also have to make some minor changes in the controller so it uses our new model instead of the one we deleted. The file is MessagesController.cs, and we will change the references to the model in the MakeRootDialog method.

    internal static IDialog<SortingTest> MakeRootDialog()
    {
        return Chain.From(() => FormDialog.FromForm(SortingTest.BuildForm));
    }

From this point, we can compile (build.cmd) and test our bot. Again, among the Web App Bot options there is one to test the bot with a web client without leaving the Azure Portal. As soon as we say "Hi", it will answer and start asking us to fill in the model.

Once we are capturing all the data, we only have to process it, and for that we change the code inside the BuildForm method of SortingTest (see the sketch below). If we test again, we will see that everything already works. However, it is not very pretty that a bot made for Spanish speakers speaks English. FormFlow can be localized to different languages, but in our case we will just tweak some details using attributes on our model.
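
A minimal sketch of what that evaluate delegate could do with the collected state (the scoring rules and the English house names are my own illustration, not taken from the original post):

    // Requires: using System.Collections.Generic; using System.Linq;
    OnCompletionAsyncDelegate<SortingTest> evaluate = async (context, state) =>
    {
        // Hypothetical scoring: each answer votes for the house it is tied to.
        var puntos = new Dictionary<string, int>
        {
            { "Gryffindor", 0 }, { "Hufflepuff", 0 }, { "Ravenclaw", 0 }, { "Slytherin", 0 }
        };

        if (state.Cualidad == Cualidades.Valentía) puntos["Gryffindor"]++;
        if (state.Cualidad == Cualidades.Honestidad) puntos["Hufflepuff"]++;
        if (state.Cualidad == Cualidades.Inteligencia) puntos["Ravenclaw"]++;
        if (state.Cualidad == Cualidades.Ambición) puntos["Slytherin"]++;
        // ...repeat the same idea for Color, Animal, Titulo and the rest.

        var casa = puntos.OrderByDescending(p => p.Value).First().Key;
        await context.PostAsync($"¡{casa}!");
    };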

There are attributes for many things. For instance, we can mark fields as optional or write our own validations. Here we will use a template attribute to change the question asked for each field.

    [Template(TemplateUsage.EnumSelectOne, "Elige un {&} {||}", ChoiceStyle = ChoiceStyleOptions.Auto)]

There is a whole pattern language for formatting messages. In our case, {&} stands for the field name and {||} for the options offered to the user. The ChoiceStyleOptions enumeration lets us control how the options are displayed.

If we test again, we will see it is more elegant, but still not perfect, because of some language issues. For instance, "Cualidad" is a feminine noun in Spanish and the question is not gender-neutral, so it is not well formed for feminine names. The same happens with the string and DateTime properties, whose templates we did not change. We can use a similar attribute that applies to a single property.

    [Prompt("Elige una cualidad {||}")]
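
The optional fields and custom validations mentioned above are only touched on in passing here. As a hedged sketch, assuming the standard FormFlow attributes from the Bot Builder SDK and a hypothetical Edad property that is not part of the original model, they can be applied like this:

    // Sketch only: extra FormFlow attributes applied to hypothetical fields.
    [Optional]                        // the user may skip this question
    public Accesorios? AccesorioFavorito;

    [Numeric(11, 18)]                 // accept only values in this range
    [Describe("edad del estudiante")] // how FormFlow refers to the field
    public int? Edad;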

FormFlow has more capabilities, but with just these we can do some "tech magic" in a few minutes and get something beautiful and functional. We only have to pick one or more channels to publish to, and we start reaching our audience right in the channel they already use every day. For instance, if you want to know what the Sorting Hat thinks about you, just visit javilopezg.com/SortingHat and talk to it.

Chatbots (II): How to make your phone read Hacker News to you (contents included)

Continuing the post series about chatbots, I built an app for Google Assistant that lets you navigate Hacker News (Y Combinator‘s tech news forum) with your voice and listen to the body of the stories you are most interested in.

[En castellano]

If you are only interested in using it, just talk to Google and say "Talk to Hacker News Reader". However, if you want to know the key points so you can build something similar in a few hours and a few lines of code, stay tuned: we are going to look at the six human capabilities that are really easy to get with Dialogflow.

Dialogflow is a Google platform (formerly API.ai) that lets us create chatbots easily.

It is so powerful that it lets us do a lot of things without a single line of code.

For instance, it has a built-in natural language processing system. If you give it a few training examples, it will work out what our users are trying to say and route them to the right part of our chatbot so it can give them the correct answer.

In short, it lets us give our chatbot human capabilities in a really easy way.

1. Listening

Intents are the main building block of a chatbot in Dialogflow. An intent is something like a conversation unit: each piece that our system can understand and respond to.

Dialogflow lets us set events and other parameters that trigger an intent. In particular, it lets us write training phrases to guide the chatbot: when it detects one of them, it knows which intent it has to launch.

It also lets us write several responses, which it will pick at random, saving us from writing any code.

2. Understanding

A chatbot that picks actions based on sentences, without a single line of code, is great, but not powerful enough. In the end, what matters is not hearing the sentences but understanding the concepts wrapped inside them.

When you type the training phrases, you can select certain words to tell the platform that they are important and that it should abstract them away from their literal values.

The moment the language understanding engine detects any of the entities we mapped to variables, it extracts them and sends them to us as parameters in each request.

The system is ready to understand many different things out of the box, but it also lets us define our own entities to model exactly what we need. For instance, I created an entity with the different Hacker News sections: top, new, best, ask, show and job. With it, the system can understand that a user wants the jobs posted on Hacker News to be read aloud.

3. Intelligence

If the intents’ canned responses are not enough, we can create our own web service to answer the requests the system understands.

Google offers libraries that make it really easy to create such a service in any language on any platform. However, for small things like Hacker News Reader we can code right inside the platform using Node.js, and that code is deployed to Firebase with a single click.

When you think about what is possible, remember that by coding a service (on Firebase or anywhere else) you can do literally anything.

That is, you are not limited to APIs to access content, because there are no cross-origin restrictions on the server side. You have the whole Internet in your hands.

For instance, my action lets users listen to the articles linked from Hacker News. To do this, it downloads the web page (just like a browser would) and processes it to extract the content (I did not put much work into this part; it could be better).
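
The post does not show that extraction code. As a naive sketch of the idea, using only Node’s standard https module (no redirect handling or encoding detection; the helper name is mine):

    const https = require('https');

    // Download an article and reduce it to plain, readable text.
    function fetchArticleText(url) {
        return new Promise((resolve, reject) => {
            https.get(url, (res) => {
                let html = '';
                res.on('data', (chunk) => (html += chunk));
                res.on('end', () => {
                    const text = html
                        .replace(/<script[\s\S]*?<\/script>/gi, '')  // drop scripts
                        .replace(/<style[\s\S]*?<\/style>/gi, '')    // drop styles
                        .replace(/<[^>]+>/g, ' ')                    // strip remaining tags
                        .replace(/\s+/g, ' ')                        // collapse whitespace
                        .trim();
                    resolve(text);
                });
            }).on('error', reject);
        });
    }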

4. Analysis

To use the inline editor, we have to respect some restrictions, such as the one that says our function must be named "dialogflowFirebaseFulfillment" if we want automated deployment.

However, thanks to Dialogflow’s listening and understanding, once we give it some intelligence it becomes really easy to add the ability to analyze the requests our chatbot receives.

Mapping each intent to a function we write ourselves is really easy. The intents do the listening, so they tell us what the user wants.

We can also access the parameters the system has understood thanks to entities (understanding).

exports.dialogflowFirebaseFulfillment = functions.https.onRequest((request, response) => {
    const agent = new WebhookClient({
        request,
        response
    });
    //...

    function read(agent) {
        var number = agent.parameters.number || 0;
        //...
    }

    let intentMap = new Map();
    intentMap.set('Default Welcome Intent', welcome);
    intentMap.set('Repeat', repeat);
    intentMap.set('Back', repeat);
    intentMap.set('Read', read);
    //...
    var promise = null;
    try {
        promise = agent.handleRequest(intentMap).then(function() {
            log('all done');
        }).catch(manageError);
    } catch (e) {
        manageError(e);
    }
    return promise;
});

5. Answering

To give our chatbot the ability to answer, we only have to use the add method of WebhookClient. We can pass it plain text, suggestion chips for the user, or rich cards where we can embed images, use emoticons, and so on.
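
A minimal sketch with the dialogflow-fulfillment library already used above (the texts, suggestions and card below are placeholders of mine, not the original action’s responses):

    const { Card, Suggestion } = require('dialogflow-fulfillment');

    function welcome(agent) {
        agent.add('Welcome to Hacker News Reader. Which section do you want to hear?'); // plain text
        agent.add(new Suggestion('top'));    // suggestion chips
        agent.add(new Suggestion('best'));
        agent.add(new Card({                 // rich card, shown only on devices with a display
            title: 'Hacker News Reader',
            text: 'Ask for a section and then for a story number.'
        }));
    }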

Keep in mind that some devices where your action may run have no display or browser. This matters if we want a strictly conversational interface: we should avoid relying on visual elements and help our bot get by with words alone.

6. Memory

The most annoying thing in a conversation is having to repeat yourself all the time, so it is really important that our bot remembers what the user said in the previous turns of the conversation.

For this we will use contexts. Contexts are an element managed by Dialogflow to help choose between intents and launch the correct one. They can be used, for instance, to know whether the client device has a display available.

Their use is not very well documented, but once you debug the basic methods you see that it is trivial to use them to save information between conversation turns.

    //...
    var context = agent.getContext('vars');
    var vars = context ? context.parameters : {
        ts: (new Date()).getTime() % 10000,
        items: [],
        pointer: 0,
        list: 0,
        //...
    };
    //...
    agent.setContext({
        name: 'vars',
        lifespan: 1000,
        'parameters': vars
    });
    //...

With these six human capabilities, you have the keys to build something similar on your own and add a lot of functionality to Google Assistant.

I hope the action and the information in this post are useful to you. If so, please share and spread the word.

We will continue with other systems that make it easy to build chatbots, and with how to integrate our chatbots into other channels.