Blog / Development

Powering Alexa with Kentico Cloud


by Bryan Soltis

May 17, 2017

Websites. Mobile apps. Smartwatches. Wearables. Digital assistants. How you deliver content is limitless, with so many options for how people consume and leverage information. Thanks to AI, people are integrating technology into their everyday lives and using it to get information from their favorite sources. Enter the headless CMS. By hosting your content in a central location, you can easily serve your data to a digital butler. In this article, I’ll show you how I powered Alexa with my Kentico Cloud content.

It’s no secret that one of the biggest benefits of a headless CMS is the ability to store content in a single location and deliver it across multiple channels. While the web will probably be most companies’ primary focus, there are so many other ways data can be consumed. From mobile to smartwatches, billboards to digital assistants, the possibilities are seemingly endless as to how and when people want to get information. 

Digital assistants are rocketing in popularity, with many households integrating them into their daily lives. Whether it’s for home automation or streaming, Amazon’s Alexa is certainly one of the front runners when it comes to which AI persona people want to have help them get information. Privacy concerns aside, these devices can be a great tool for you to reach your audience and market your brand.

In this article, I thought it would be cool to show you how to integrate your Kentico Cloud content with Alexa. The process is surprisingly simple, requiring only a little bit of code and configuration. And because the content is stored in Kentico Cloud, it could easily be repurposed for other communication and presentations. So how did I do it?

Create Kentico Cloud Content

Before I could get started with Alexa, I needed to create my content. Because I knew I wanted to provide facts about Kentico, this meant defining my content types in Kentico Cloud. Note that I used a radio select element to determine the type of fact for each item.


With the content type created, I added my facts. 


Because the facts would be used for Alexa and not a website, I didn’t need to create any other types of content. This greatly simplified the creation process and allowed me to focus on the development.

Create a Service to Retrieve Content

With the content handled, I was ready to get to the fun stuff: coding! The first step of the integration was to create a service that would retrieve the content from Kentico Cloud. Because it’s all accessible via REST API, nearly any solution would have been a good fit. AWS Lambda functions, REST services, or even a standalone web application would all be fine for retrieving the data. Not surprising to anyone who knows me, I chose Azure Functions for the job. I knew I would be able to code the function very quickly and it would fit my needs perfectly.

In the Azure portal, I created a new HTTP trigger function. After creating it, I added a project.json file to pull in my dependencies.


Because I would only be integrating with Kentico Cloud and returning JSON, the list of packages was very short. Here is the full project.json file.

{
  "frameworks": {
    "net46": {
      "dependencies": {
        "KenticoCloud.Delivery": "4.1.0"
      }
    }
  }
}


Next, I added my function logic, including the Delivery API code to retrieve the data. Because this was an HTTP trigger, the function receives an HttpRequestMessage as an input parameter, as well as a TraceWriter for logging.


Because Alexa would be making requests to the function, I also added some code to determine which type of fact was being requested. Don’t worry, this will make more sense when I get to the Alexa Custom Skill.

    dynamic data = await req.Content.ReadAsAsync<object>();
    string type = "general";
    string fact = "";

    if (data.request.intent.slots.type.value != null) 
    {
        switch(data.request.intent.slots.type.value.ToString().ToLower())
        {
            case "history":
                type = "history";
                break;
            case "general":
                type = "general";
                break;
            case "product":
                type = "product";
                break;
            default:
                type = "general";
                break;            
        }
    }
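To see where that slot value comes from, here is a trimmed example of the request body Alexa posts to the function. The exact envelope is defined by the Alexa Skills Kit, so treat the fields outside of `request.intent.slots` as illustrative:

```json
{
  "version": "1.0",
  "request": {
    "type": "IntentRequest",
    "intent": {
      "name": "GetKenticoFact",
      "slots": {
        "type": {
          "name": "type",
          "value": "history"
        }
      }
    }
  }
}
```

The code above walks the `request.intent.slots.type.value` path of this payload to pick the fact type.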


I then passed the type value to the Kentico Cloud Delivery API to retrieve the filtered items. Using a Random instance, I then pulled a random fact from the returned collection.

    // Get the Kentico Cloud content
    DeliveryClient client = new DeliveryClient(ConfigurationManager.AppSettings["KenticoFactsProjectID"]);
    // Get the fact items from Kentico Cloud
    var responseFact = await client.GetItemsAsync(
        new EqualsFilter("system.type", "fact"),
        new ContainsFilter("elements.facttype", type)
    );

    int r = rnd.Next(responseFact.Items.Count);

    fact = responseFact.Items[r].GetString("facttext");

    log.Info(fact);
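For reference, that client call boils down to a Delivery API request along the lines of `GET /items?system.type=fact&elements.facttype[contains]=history`, which returns a payload roughly shaped like the sketch below. The element values are invented for illustration; the codenames match my content type:

```json
{
  "items": [
    {
      "system": {
        "codename": "fact_1",
        "type": "fact"
      },
      "elements": {
        "facttext": {
          "type": "text",
          "value": "Kentico was founded in 2004."
        },
        "facttype": {
          "type": "multiple_choice",
          "value": [ { "codename": "history", "name": "History" } ]
        }
      }
    }
  ],
  "modular_content": {},
  "pagination": { "skip": 0, "limit": 0, "count": 1 }
}
```

The SDK surfaces each entry in `items` as a content item, which is why `GetString("facttext")` can pull out the fact text.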


Lastly, I coded the function to return the JSON that Alexa would be expecting. You can find more about this JSON structure in the Alexa Skills Kit documentation.

    return req.CreateResponse(HttpStatusCode.OK, new
    {
        version = "1.0",
        sessionAttributes = new {},
        response = new
        {
            outputSpeech = new 
            {
                type = "PlainText",
                text = fact
            },
            card = new
            {
                type = "Simple",
                title = "Fact",
                content = fact
            },
            shouldEndSession = true
        }
    });
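Serialized, that anonymous object produces the minimal response envelope Alexa expects. Assuming the retrieved fact was a history item, the function would return something like:

```json
{
  "version": "1.0",
  "sessionAttributes": {},
  "response": {
    "outputSpeech": {
      "type": "PlainText",
      "text": "Kentico was founded in 2004."
    },
    "card": {
      "type": "Simple",
      "title": "Fact",
      "content": "Kentico was founded in 2004."
    },
    "shouldEndSession": true
  }
}
```

The `outputSpeech` value is what Alexa reads aloud, while the `card` appears in the Alexa app.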


Here is the full Azure Function code.

// The System.Configuration assembly must be referenced explicitly in a C# script function
#r "System.Configuration"

using System.Net;
using KenticoCloud.Delivery;
using System.Text;
using System.Configuration;

static Random rnd = new Random();

public static async Task<object> Run(HttpRequestMessage req, TraceWriter log)
{
    dynamic data = await req.Content.ReadAsAsync<object>();
    string type = "general";
    string fact = "";

    if (data.request.intent.slots.type.value != null) 
    {
        switch(data.request.intent.slots.type.value.ToString().ToLower())
        {
            case "history":
                type = "history";
                break;
            case "general":
                type = "general";
                break;
            case "product":
                type = "product";
                break;
            default:
                type = "general";
                break;            
        }
    }

    // Get the Kentico Cloud content
    DeliveryClient client = new DeliveryClient(ConfigurationManager.AppSettings["KenticoFactsProjectID"]);
    // Get the fact items from Kentico Cloud
    var responseFact = await client.GetItemsAsync(
        new EqualsFilter("system.type", "fact"),
        new ContainsFilter("elements.facttype", type)
    );

    int r = rnd.Next(responseFact.Items.Count);

    fact = responseFact.Items[r].GetString("facttext");

    log.Info(fact);

    return req.CreateResponse(HttpStatusCode.OK, new
    {
        version = "1.0",
        sessionAttributes = new {},
        response = new
        {
            outputSpeech = new 
            {
                type = "PlainText",
                text = fact
            },
            card = new
            {
                type = "Simple",
                title = "Fact",
                content = fact
            },
            shouldEndSession = true
        }
    });
}

Create an Alexa Custom Skill

With the Azure Function in place, I was ready to create my Alexa Custom Skill. Creating an Alexa skill involves some basic configuration, telling Alexa how to interact with the service, and setting up the plumbing so Alexa and your service can talk to each other. 

Skill Information

In the Amazon Developer portal, I created a new skill named Kentico Facts. Note that I used specific values for the Name and Invocation Name. These are what will be displayed in the Alexa marketplace and the phrase that Alexa will listen for to start the skill.


Interaction Model

In the Interaction Model settings, I defined how Alexa would work with the skill. 


I defined the intent scheme, which specified the JSON that Alexa would use to make the request. Note the slots definition. This would determine the type of fact to retrieve.

{
  "intents": [
    {
      "slots": [
        {
          "name": "type",
          "type": "LIST_OF_FACTTYPES"
        }
      ],
      "intent": "GetKenticoFact"
    }
  ]
}


For the Custom Slot Types, I defined a new list of fact types. These values correspond to the radio selection element in my Kentico Cloud content type.

history
general
product

Lastly, I defined Sample Utterances. This setting specifies the phrases Alexa will listen for when handling a skill request. Note how the type slot is used in the various phrases.

GetKenticoFact get me a {type} fact
GetKenticoFact get a {type} fact
GetKenticoFact tell me a {type} fact
GetKenticoFact tell me something about kentico
GetKenticoFact tell me something about the company
GetKenticoFact tell me something about kentico's {type}
GetKenticoFact tell me something about the company's {type}
GetKenticoFact what is something interesting about kentico
GetKenticoFact what is something interesting about the company
GetKenticoFact what is something interesting about kentico's {type}
GetKenticoFact what is something interesting about the company's {type}
GetKenticoFact what's something interesting about kentico
GetKenticoFact what's something interesting about the company
GetKenticoFact what's something interesting about kentico's {type}
GetKenticoFact what's something interesting about the company's {type}

Note: The list of Sample Utterances can evolve over time as you learn how users interact with the skill. 

Configuration

For the Configuration, I selected HTTPS and entered the URL for the Azure Function.


Tip: You can get your function URL in the Azure Portal using the </> Get function URL link within your function.


SSL Certificate

For the SSL Certificate, I selected the My development endpoint is a sub-domain… option. Because my function was hosted in Azure Functions, I leveraged the *.azurewebsites.net certificate.

Testing

The last part of the skill definition was to test it. Using the interface, I sent a command to the skill to confirm it returned the appropriate values. I used the product type as part of the request, to make sure it returned the right kind of fact.


I also confirmed the Azure Function was receiving the request and returning the fact information.

Talking to My Digital Friend

With everything in place, I was ready to test my skill using my Echo Dot. In the Alexa app, I confirmed the custom skill was listed under My Skills. I then asked Alexa for a fact.

Here is a video I made demonstrating the functionality. On the screen, I had the Azure Function Log displayed, which updated as each new request came in. Using the TraceWriter in the function, I wrote out the returned fact to the log.

Moving Forward

Pretty awesome, right?!? By managing content in Kentico Cloud, you can easily deliver it to nearly any channel. Perhaps your site has a knowledge base that you would like to make more accessible. With this solution, you could keep all your articles in Kentico Cloud, display them on your site, and provide an interactive AI experience for people to consume them. I hope this article shows you how easy it is to integrate Alexa with Kentico Cloud and build unique solutions. Good luck!