Deploy a Swarm Cluster with Alexa

Serverless and containers have changed the way we leverage public clouds and how we write, deploy, and maintain applications. A great way to combine the two paradigms is to build a voice assistant with Alexa, based on Lambda functions written in Go, that deploys a Docker Swarm cluster on AWS.

The figure below shows all components needed to deploy a production-ready Swarm cluster on AWS with Alexa.

Note: Full code is available on my GitHub.

A user will ask Amazon Echo to deploy a Swarm Cluster:

Echo will intercept the user’s voice command with built-in natural language understanding and speech recognition, and convey it to the Alexa service. A custom Alexa skill will convert the voice commands to intents:

The Alexa skill will trigger a Lambda function for intent fulfilment:

The Lambda function will use the AWS EC2 API to deploy a fleet of EC2 instances from an AMI with Docker CE preinstalled (I used Packer to bake the AMI to reduce the cold start of the instances). Then, it will push the cluster IP addresses to an SQS queue:
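A minimal Go sketch of that step, using the aws-sdk-go EC2 and SQS clients, might look like this (the AMI ID, instance type, queue URL, and function shape are illustrative assumptions, not the post's actual code):

```go
package main

import (
	"encoding/json"

	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/ec2"
	"github.com/aws/aws-sdk-go/service/sqs"
)

// deployCluster launches a fleet of EC2 instances from the pre-baked
// Docker CE AMI, then pushes their IP addresses to SQS for provisioning.
func deployCluster(sess *session.Session, size int64) error {
	res, err := ec2.New(sess).RunInstances(&ec2.RunInstancesInput{
		ImageId:      aws.String("ami-xxxxxxxx"), // hypothetical Packer-baked AMI
		InstanceType: aws.String("t2.micro"),
		MinCount:     aws.Int64(size),
		MaxCount:     aws.Int64(size),
	})
	if err != nil {
		return err
	}

	// Collect the private IP addresses of the new instances.
	ips := []string{}
	for _, instance := range res.Instances {
		ips = append(ips, aws.StringValue(instance.PrivateIpAddress))
	}
	body, _ := json.Marshal(ips)

	_, err = sqs.New(sess).SendMessage(&sqs.SendMessageInput{
		QueueUrl:    aws.String("https://sqs.us-east-1.amazonaws.com/123456789012/swarm-clusters"), // hypothetical queue
		MessageBody: aws.String(string(body)),
	})
	return err
}
```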

Next, the function will insert a new item into a DynamoDB table with the current state of the cluster:
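The DynamoDB write could be sketched like this (the table and attribute names are assumptions):

```go
package main

import (
	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/dynamodb"
)

// saveClusterState records the new cluster with a Pending status.
func saveClusterState(sess *session.Session, clusterID string) error {
	_, err := dynamodb.New(sess).PutItem(&dynamodb.PutItemInput{
		TableName: aws.String("SwarmClusters"), // hypothetical table name
		Item: map[string]*dynamodb.AttributeValue{
			"ClusterID": {S: aws.String(clusterID)},
			"Status":    {S: aws.String("Pending")},
		},
	})
	return err
}
```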

Once the message lands in SQS, a CloudWatch alarm monitoring the ApproximateNumberOfMessagesVisible metric will be triggered and, as a result, will publish a message to an SNS topic:

The SNS topic triggers a subscribed Lambda function:

The Lambda function will poll the queue for a new cluster and use the AWS Systems Manager API to provision a Swarm cluster on the fleet of EC2 instances created earlier:
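A sketch of that provisioning step with the Systems Manager Run Command API (the helper and the join flow are simplified assumptions):

```go
package main

import (
	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/ssm"
)

// runOnInstances executes a shell command on the given EC2 instances
// through the AWS-RunShellScript SSM document.
func runOnInstances(sess *session.Session, instanceIDs []*string, command string) error {
	_, err := ssm.New(sess).SendCommand(&ssm.SendCommandInput{
		DocumentName: aws.String("AWS-RunShellScript"),
		InstanceIds:  instanceIDs,
		Parameters: map[string][]*string{
			"commands": {aws.String(command)},
		},
	})
	return err
}

// Usage sketch: run "docker swarm init" on the manager first, then
// "docker swarm join --token <token> <manager-ip>:2377" on the workers.
```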

For debugging, the function will output the Swarm join token to CloudWatch Logs:

Finally, it will update the DynamoDB item state from Pending to Done and delete the message from SQS.

You can test your skill on your Amazon Echo, Echo Dot, or any Alexa device by saying, “Alexa, open Docker”.

At the end of the workflow described above, a Swarm cluster will be created:

At this point, you can see your Swarm status by firing the following command:
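For example, from an SSH session on one of the Swarm managers:

```
docker node ls
```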

Improvements & Limitations:

  • Lambda execution can time out if the cluster size is huge. You can use a master Lambda function to spawn child Lambdas.
  • The CloudWatch & SNS parts could be dropped if SQS were supported as a Lambda event source (AWS PLEAAASE!). DynamoDB streams or Kinesis streams cannot be used to notify Lambda, as I wanted to create some kind of delay for the instances to be fully created before setting up the Swarm cluster (maybe Simple Workflow Service?).
  • Inject SNS before SQS. SNS can add the message to SQS and trigger the Lambda function, so we won’t need the CloudWatch alarm.
  • You can improve the skill by adding new custom intents to deploy Docker containers on the cluster, or to ask Alexa to deploy the cluster in a VPC.

In-depth details about the skill can be found on my GitHub. Make sure to drop your comments, feedback, or suggestions below — or connect with me directly on Twitter @mlabouardy.

Serverless Golang API with AWS Lambda

A few days ago, AWS announced Go as a supported language for AWS Lambda. So I got my hands dirty and made a serverless Golang Lambda function to discover new movies by genre. I went even further and created a frontend on top of my API with Angular 5.

Note: The full source code for this application can be found on GitHub.

To get started, install the dependencies below:
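The only hard dependency for a Go Lambda handler is the official runtime library:

```
go get github.com/aws/aws-lambda-go/lambda
```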

Create a main.go file with the following code:
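The post's full code lives on GitHub; here is a minimal sketch of what main.go might look like (the struct shapes and the API_KEY variable name are assumptions):

```go
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
	"os"

	"github.com/aws/aws-lambda-go/lambda"
)

type Request struct {
	ID int `json:"id"` // TMDb genre ID
}

type Movie struct {
	Title    string `json:"title"`
	Overview string `json:"overview"`
}

type Response struct {
	Movies []Movie `json:"results"`
}

// findAll queries the TMDb discover endpoint for movies of a given genre.
func findAll(req Request) (Response, error) {
	url := fmt.Sprintf("https://api.themoviedb.org/3/discover/movie?api_key=%s&with_genres=%d",
		os.Getenv("API_KEY"), req.ID)
	res, err := http.Get(url)
	if err != nil {
		return Response{}, err
	}
	defer res.Body.Close()

	var response Response
	if err := json.NewDecoder(res.Body).Decode(&response); err != nil {
		return Response{}, err
	}
	return response, nil
}

func main() {
	lambda.Start(findAll)
}
```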

The handler function takes the movie genre ID as a parameter, then queries the TMDb API – an awesome free API for movies and TV shows – to get a list of movies. I registered the handler using the lambda.Start() method.

To test our handler before deploying it, we can create a basic Unit Test:
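Assuming the findAll handler sketched above, a basic test could be:

```go
package main

import "testing"

func TestFindAll(t *testing.T) {
	// 28 is TMDb's "Action" genre ID.
	res, err := findAll(Request{ID: 28})
	if err != nil {
		t.Fatalf("handler returned an error: %v", err)
	}
	if len(res.Movies) == 0 {
		t.Error("expected at least one movie in the response")
	}
}
```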

Issue the following command to run the test:
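Since the handler calls the real TMDb API, pass your key along (the value shown is a placeholder):

```
API_KEY=<your-tmdb-key> go test -v
```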

Next, build an executable binary for Linux:
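For instance:

```
GOOS=linux go build -o main main.go
```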

Zip it up into a deployment package:
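For example:

```
zip deployment.zip main
```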

Use the AWS CLI to create a new Lambda Function:
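Something along these lines (the function name and role ARN are placeholders):

```
aws lambda create-function \
  --function-name DiscoverMovies \
  --runtime go1.x \
  --handler main \
  --zip-file fileb://deployment.zip \
  --role arn:aws:iam::<account-id>:role/<lambda-execution-role>
```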

Note: substitute the --role flag with your own IAM role ARN.

Sign in to the AWS Management Console and navigate to the Lambda dashboard; you should see that your Lambda function has been created:

Set the TMDb API key (sign up for an account to get one) as an environment variable:
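This can be done from the console, or with a command like the following (API_KEY matches the variable read in main.go):

```
aws lambda update-function-configuration \
  --function-name DiscoverMovies \
  --environment "Variables={API_KEY=<your-tmdb-api-key>}"
```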

Create a new test event:
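Matching the Request struct sketched earlier, a minimal test event is just a genre ID, for instance:

```json
{
  "id": 28
}
```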

Upon successful execution, view results in the console:

To provide an HTTPS endpoint for our API, let’s add API Gateway as a trigger to the function:

Deployment:

Now, if you point your favorite browser to the Invoke URL:

Congratulations! You have created your first Lambda function in Go.

Let’s build a quick UI on top of the API with Angular 5. Create an Angular project from scratch using the Angular CLI. Then, generate a new service that calls the API Gateway URL:
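With the Angular CLI, that scaffolding is roughly (the project and service names are illustrative):

```
ng new movies-ui
cd movies-ui
ng generate service movie
```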

In the main component, iterate over the API response:

Note: the full code is in GitHub.

Generate production grade artifacts:
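For example:

```
ng build --prod
```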

The build artifacts will be stored in the dist/ directory.

Next, create an S3 bucket with AWS CLI:
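Something like this (the bucket name is a placeholder):

```
aws s3 mb s3://movies-ui-demo
```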

Upload the build artifacts to the bucket:
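For example:

```
aws s3 cp dist/ s3://movies-ui-demo --recursive
```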

Finally, turn on website hosting for your bucket:
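For example:

```
aws s3 website s3://movies-ui-demo --index-document index.html
```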

If you point your browser to the S3 Bucket URL, you should be happy:

Amazon Alexa GitHub Followers Counter

This post is part of the “Alexa” series. I will walk you through how to build an Amazon Alexa skill with Node.JS and Lambda to get the number of followers & repositories on GitHub in real time.

Note: all the code is available in my GitHub.

Amazon Echo captures voice commands and sends them to the Alexa skill, which converts them into structured text commands. A recognized command is sent to an AWS Lambda function that calls the GitHub API to build the response.

To get started, sign up for the Amazon Developer Console and create a new Alexa skill:

The invocation name is what the user will say to trigger the skill. In our case, it will be “github”.

Click on “Next” to bring up the Interaction Model page, use the intent schema below:
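The schema itself was shown as a screenshot; in the classic intent-schema format it would look roughly like this (the GetGithubRepositoryCount intent and the LANGUAGES slot type name are assumptions based on the intents used later in this post):

```json
{
  "intents": [
    { "intent": "GetGithubFollowerCount" },
    { "intent": "GetGithubRepositoryCount" },
    {
      "intent": "GetGithubRepositoryCountByLanguage",
      "slots": [
        { "name": "Language", "type": "LANGUAGES" }
      ]
    }
  ]
}
```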

Intents map the user’s voice commands to services that our Alexa skill can address. For instance, here I defined an intent called GetGithubFollowerCount, which lines up with a portion of code in my Lambda function that I leverage in a bit.

The programming languages are defined as a Custom Slot Type, with the following possible values:
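For example (the exact value list is up to you):

```
java
javascript
python
go
ruby
swift
```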

Now that our intents are defined, we need to link them to the human requests that will trigger them. To do this, multiple sentences (utterances) are listed to make the interaction as natural as possible.
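Each line pairs an intent name with a phrase the user might say; a few illustrative examples:

```
GetGithubFollowerCount how many followers do I have
GetGithubRepositoryCount how many repositories do I have
GetGithubRepositoryCountByLanguage how many {Language} repositories do I have
```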

Result:

Click on “Next” and you will move on to a page that allows you to use an ARN (Amazon Resource Name) to link to AWS Lambda.

Before that, let’s create our Lambda function: log in to the AWS Management Console, then navigate to the Lambda dashboard and create a new function from scratch:

Select Alexa Skills Kit as trigger:

I wrote the Lambda function in Node.JS; that code isn’t actually that interesting, so I won’t go into it in much detail.

This function is fired when there is an incoming request from Alexa. The function will:

  • Process the request
  • Call GitHub API
  • Send the response back to Alexa

Create a zip file consisting of the function above and any dependencies (node_modules). Then, specify the .zip file name as your deployment package at the time you create the Lambda function. Don’t forget to set your GitHub Username as an environment variable:
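To build that deployment package, assuming the handler lives in index.js, something like:

```
zip -r deployment.zip index.js node_modules/
```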

Back in the Alexa skill, we need to link our Lambda function as the endpoint for the skill:

That’s it, let’s test it out using the Service Simulator by clicking on “Next”.

GetFollowerCount Intent:

GetRepositoryCount Intent:

GetGithubRepositoryCountByLanguage Intent:

You can see that the Lambda responds as expected!

Test it now with Amazon Echo by saying “Alexa, ask GitHub for …”:


Youtube to MP3 using S3, Lambda & Elastic Transcoder

In this tutorial, I will show you how to convert a Youtube video 📺 to an mp3 file 💿 using AWS Elastic Transcoder. How can we do that?

We will create a Lambda function to consume events published by S3. For any video uploaded to a bucket, S3 will invoke our Lambda function by passing event information. As the function executes, it reads the S3 event data, logs some of the event information to Amazon CloudWatch, and then kicks off a transcoding job.

Let’s start by creating an S3 bucket to store the input files (videos) and the output files (audio):
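For instance (the bucket name is a placeholder):

```
aws s3 mb s3://youtube-to-mp3-demo
```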

Next, let’s define a Transcoder pipeline. A pipeline essentially defines a queue for future transcoding jobs. To create a pipeline, we need to specify the input bucket (where the videos will be).

Note: Copy down the Pipeline ID, we will need it later on.

Having created a pipeline, go to the AWS Management Console, navigate to the Lambda service, click on “Create a Lambda Function”, and add S3 as the event source for the Lambda function:

I used the following Node.JS code:

The script does the following:

  • Extract the filename of the uploaded file from the event object
  • Create a Transcoder job and specify the required outputs
  • Launch the job

Note: you might notice the use of a preset (1351620000001-300040) in the function above. A preset describes how to encode the given file (in this case to mp3). The full list of available presets can be found in the AWS Documentation.
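The original function is written in Node.JS (full code on GitHub); as an equivalent sketch of the same logic in Go with aws-sdk-go, under the assumptions of mp4 uploads in an inputs/ prefix and a PIPELINE_ID environment variable:

```go
package main

import (
	"os"
	"strings"

	"github.com/aws/aws-lambda-go/events"
	"github.com/aws/aws-lambda-go/lambda"
	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/elastictranscoder"
)

// handler fires on S3 upload events and kicks off an Elastic Transcoder
// job that converts each uploaded video to MP3.
func handler(event events.S3Event) error {
	svc := elastictranscoder.New(session.Must(session.NewSession()))

	for _, record := range event.Records {
		key := record.S3.Object.Key // e.g. "inputs/video.mp4"

		// Derive the output key: inputs/video.mp4 -> outputs/video.mp3
		output := strings.Replace(key, "inputs/", "outputs/", 1)
		output = strings.TrimSuffix(output, ".mp4") + ".mp3"

		_, err := svc.CreateJob(&elastictranscoder.CreateJobInput{
			PipelineId: aws.String(os.Getenv("PIPELINE_ID")),
			Input:      &elastictranscoder.JobInput{Key: aws.String(key)},
			Outputs: []*elastictranscoder.CreateJobOutput{
				{
					Key:      aws.String(output),
					PresetId: aws.String("1351620000001-300040"), // MP3 preset
				},
			},
		})
		if err != nil {
			return err
		}
	}
	return nil
}

func main() {
	lambda.Start(handler)
}
```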

Finally, set the pipeline ID as an environment variable and select an IAM role with permission to access Elastic Transcoder:

Once created, upload a video file to the inputs bucket:

If everything went well, you should see the file in your bucket:

S3 will trigger our Lambda function, which will execute and log the S3 object name to CloudWatch Logs:

After a couple of seconds (or minutes, depending on the size of the video), you should see a new MP3 file generated by the Elastic Transcoder job inside the outputs directory of the S3 bucket:

Create Front-End for Serverless RESTful API

In this post, we will build a UI for the serverless REST API we built in the previous tutorial, so make sure to read it before following this part.

Note: make sure to enable CORS for the endpoint. In the API Gateway console, select the resource, then under Actions choose Enable CORS:

The first step is to clone the project:

Head into the ui folder and update js/app.js with your own API Gateway Invoke URL:

Once done, you are ready to create a new S3 bucket:
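Something like this (the bucket name is a placeholder):

```
aws s3 mb s3://serverless-api-ui
```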

Copy all the files in the ui directory into the bucket:
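For example:

```
aws s3 cp ui/ s3://serverless-api-ui --recursive
```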

Finally, turn on website hosting for your bucket:
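For example:

```
aws s3 website s3://serverless-api-ui --index-document index.html
```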

After running these commands, all of our static files should appear in the S3 bucket:

Your bucket is now configured for static website hosting, and you have an S3 website URL like this: http://<bucket_name>.s3-website-us-east-1.amazonaws.com