YouTube to MP3 using S3, Lambda & Elastic Transcoder

In this tutorial, I will show you how to convert a YouTube video 📺 to an MP3 file 💿 using AWS Elastic Transcoder. How can we do that?

We will create a Lambda function to consume events published by S3. For any video uploaded to a bucket, S3 will invoke our Lambda function, passing in the event information. When the function executes, it reads the S3 event data, logs some of the event information to Amazon CloudWatch, and then kicks off a transcoding job.

Let’s start by creating an S3 bucket to store the input files (videos) and the output files (audio):

Next, let’s define a Transcoder pipeline. A pipeline essentially defines a queue for future transcoding jobs. To create a pipeline, we need to specify the input bucket (where the videos will be).
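If you prefer the command line to the console, the same pipeline can be sketched with the AWS CLI (the bucket name and role ARN below are placeholders):

    aws elastictranscoder create-pipeline \
        --name youtube-to-mp3 \
        --input-bucket video-mp3-bucket \
        --output-bucket video-mp3-bucket \
        --role arn:aws:iam::<account-id>:role/Elastic_Transcoder_Default_Role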

Note: Copy down the Pipeline ID, we will need it later on

Having created a pipeline, go to the AWS Management Console, navigate to the Lambda service, click on “Create a Lambda Function”, and add S3 as the event source for the Lambda function:

I used Node.js for the function code.
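A minimal sketch of such a handler, assuming the pipeline ID is exposed through a PIPELINE_ID environment variable (which we set below) and that the MP3 output is written under an outputs/ prefix:

    // Node.js Lambda handler: reacts to S3 uploads and submits an
    // Elastic Transcoder job that converts the video to MP3.
    const AWS = require('aws-sdk');
    const transcoder = new AWS.ElasticTranscoder();

    exports.handler = (event, context, callback) => {
      // Extract the object key of the uploaded file from the S3 event
      const key = decodeURIComponent(
        event.Records[0].s3.object.key.replace(/\+/g, ' ')
      );
      console.log('New video uploaded:', key);

      const params = {
        PipelineId: process.env.PIPELINE_ID, // pipeline created earlier
        Input: { Key: key },
        Outputs: [{
          Key: 'outputs/' + key.split('/').pop().split('.')[0] + '.mp3',
          PresetId: '1351620000001-300040' // MP3 preset (see note below)
        }]
      };

      // Launch the transcoding job
      transcoder.createJob(params, (err, data) => {
        if (err) return callback(err);
        console.log('Transcoder job created:', data.Job.Id);
        callback(null, data);
      });
    };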

The script does the following:

  • Extract the filename of the uploaded file from the event object
  • Create a Transcoder job and specify the required outputs
  • Launch the job

Note: you might notice the use of a preset (1351620000001-300040) in the function above. A preset describes how to encode the given file (in this case, to MP3). The full list of available presets can be found in the AWS documentation.

Finally, set the pipeline ID as an environment variable and select an IAM role with permission to access Elastic Transcoder:

Once created, upload a video file to the inputs bucket:

If everything went well, you should see the file in your inputs bucket:

S3 will then trigger our Lambda function, which logs the S3 object name to CloudWatch Logs:

After a couple of seconds (or minutes, depending on the size of the video), you should see a new MP3 file generated by the Elastic Transcoder job inside the outputs directory of the S3 bucket:

Create Front-End for Serverless RESTful API

In this post, we will build a UI for the Serverless REST API we built in the previous tutorial, so make sure to read it before following this part.

Note: make sure to enable CORS for the endpoint. In the API Gateway Console, select the resource, then choose “Enable CORS” under “Actions”:

The first step is to clone the project:
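Hypothetically (the repository path below is a placeholder; use the project linked from the previous tutorial):

    git clone https://github.com/<your-username>/serverless-movies-api.git
    cd serverless-movies-api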

Head into the ui folder, and modify js/app.js with your own API Gateway Invoke URL:
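The endpoint is presumably defined as a constant in that file; hypothetically, something along these lines (the variable name is an assumption):

    // js/app.js: replace the placeholder with your own Invoke URL
    var API_URL = 'https://<api-id>.execute-api.us-east-1.amazonaws.com/prod/movies';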

Once done, you are ready to create a new S3 bucket:
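With the AWS CLI, for example (the bucket name is a placeholder):

    aws s3 mb s3://movies-ui-bucket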

Copy all the files in the ui directory into the bucket:
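For example:

    aws s3 cp ui/ s3://movies-ui-bucket/ --recursive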

Finally, turn website hosting on for your bucket:
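For example, assuming index.html is the entry page:

    aws s3 website s3://movies-ui-bucket/ --index-document index.html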

After running these commands, all of our static files should appear in our S3 bucket:

Your bucket is now configured for static website hosting, and you have an S3 website URL like this: http://<bucket_name>.s3-website-us-east-1.amazonaws.com

Create a Serverless REST API with Node.js, AWS Lambda, DynamoDB & API Gateway

In this post, I will show you how to build a RESTful API in Node.js following the Serverless approach, using AWS Lambda, API Gateway & DynamoDB.

Serverless computing is a cloud computing execution model in which the cloud provider dynamically manages the allocation of machine resources.

All the code is available on my GitHub 😎

1 – API Specification

The REST API service will expose endpoints to manage a store of movies. The operations that our endpoints will allow are:

  • POST /movies – add a new movie to the store
  • GET /movies – list all movies

2 – DynamoDB Table

Go to the DynamoDB Console, click on the “Create table” button, then fill in the table name and set a primary key:
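The same table can be sketched from the CLI, assuming it is called movies with a string primary key id (both names are assumptions that the rest of this post follows):

    aws dynamodb create-table \
        --table-name movies \
        --attribute-definitions AttributeName=id,AttributeType=S \
        --key-schema AttributeName=id,KeyType=HASH \
        --provisioned-throughput ReadCapacityUnits=5,WriteCapacityUnits=5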

3 – Write a new Movie Lambda Function

The code is self-explanatory: I used the put method to insert a new movie item, and the uuid library to generate a random unique ID for it:
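A sketch of that function, assuming the table name arrives through a TABLE_NAME environment variable and that the movie name is passed in the event payload:

    const AWS = require('aws-sdk');
    const uuid = require('uuid');

    const docClient = new AWS.DynamoDB.DocumentClient();

    exports.handler = (event, context, callback) => {
      const movie = {
        id: uuid.v4(),    // random unique id for the movie item
        name: event.name  // movie name from the incoming event
      };

      // Insert the new movie item into the DynamoDB table
      docClient.put({
        TableName: process.env.TABLE_NAME,
        Item: movie
      }, (err) => {
        if (err) return callback(err);
        callback(null, movie);
      });
    };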

Go to the Lambda Dashboard and create a new function as below:

Add the DynamoDB table name as an environment variable, and update the handler to match the function name in the code above:

Note: you should use an IAM Role with permission to access DynamoDB & Lambda.

Once created, you can click on “Test” and send a JSON payload like the one below:
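For instance, a hypothetical payload matching the handler sketched above:

    {
      "name": "The Big Lebowski"
    }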

The item has been successfully inserted into DynamoDB:

4 – List all Movies Lambda Function

The code is self-explanatory: I used the scan method to fetch all the items from the table:
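A sketch of the list function, under the same TABLE_NAME assumption as above:

    const AWS = require('aws-sdk');
    const docClient = new AWS.DynamoDB.DocumentClient();

    exports.handler = (event, context, callback) => {
      // Fetch every movie item from the table
      docClient.scan({ TableName: process.env.TABLE_NAME }, (err, data) => {
        if (err) return callback(err);
        callback(null, data.Items); // return the list of movies
      });
    };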

Fill the function configuration as below:

Similar to the write function, we need to add the DynamoDB table name as an environment variable:

Once created, you can test the function by clicking on the “Test” button:

Congratulations! 😁 We have successfully created our Lambda functions:

5 – Setup API Gateway

Go to API Gateway Console and create a new API.

5.1 – GET /movies

5.2 – POST /movies

Once done, deploy the API:

Copy down the API Invoke URL:

6 – Test

Let’s test it out:

6.1 – Create a Movie

With Postman:

With cURL:
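A hypothetical call (substitute your own Invoke URL):

    curl -X POST \
      -H "Content-Type: application/json" \
      -d '{"name": "The Big Lebowski"}' \
      https://<api-id>.execute-api.us-east-1.amazonaws.com/prod/movies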

6.2 – List Movies

With Postman:

With cURL:
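Again with a placeholder URL:

    curl https://<api-id>.execute-api.us-east-1.amazonaws.com/prod/movies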

If we check our DynamoDB Table 😇

Setup AWS Lambda with Scheduled Events

This post is part of my “Serverless” series. In this part, I will show you how to set up a Lambda function to send emails on a schedule defined by a CloudWatch event.

1 – Create Lambda Function

So start by cloning the project:

I implemented a simple Lambda function in Node.js to send an email using the Mailgun library.
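A sketch of such a function using the mailgun-js package (the recipient address and message content are placeholders; the API key and domain come from environment variables set below):

    const mailgun = require('mailgun-js')({
      apiKey: process.env.MAILGUN_API_KEY,
      domain: process.env.MAILGUN_DOMAIN
    });

    exports.handler = (event, context, callback) => {
      const data = {
        from: 'AWS Lambda <noreply@' + process.env.MAILGUN_DOMAIN + '>',
        to: 'you@example.com', // hypothetical recipient
        subject: 'Hello from Lambda',
        text: 'This email was sent by a scheduled Lambda function.'
      };

      // Send the email through the Mailgun API
      mailgun.messages().send(data, (err, body) => {
        if (err) return callback(err);
        callback(null, body);
      });
    };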

Note: you could use another service like AWS SES or your own SMTP server

Then, create a zip file:
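For example, assuming the handler code lives in index.js:

    zip -r function.zip index.js node_modules/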

Next, we need to create an Execution Role for our function:
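A sketch with the CLI, where trust-policy.json is a file you provide that allows lambda.amazonaws.com to assume the role (the role name is a placeholder):

    aws iam create-role \
        --role-name lambda-mailgun-role \
        --assume-role-policy-document file://trust-policy.json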

Execute the following Lambda CLI command to create a Lambda function. We need to provide the zip file, IAM role ARN we created earlier & set MAILGUN_API_KEY and MAILGUN_DOMAIN as parameters.
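A sketch of that command, with placeholder values:

    aws lambda create-function \
        --function-name SendEmail \
        --zip-file fileb://function.zip \
        --handler index.handler \
        --runtime nodejs6.10 \
        --role arn:aws:iam::<account-id>:role/lambda-mailgun-role \
        --environment "Variables={MAILGUN_API_KEY=<your-api-key>,MAILGUN_DOMAIN=<your-domain>}"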

Note: the --runtime parameter uses Node.js 6.10, but you can also specify Node.js 4.3

Once created, AWS Lambda returns function configuration information as shown in the following example:

Now, if we go back to the AWS Lambda Dashboard, we should see that our function has been successfully created:

2 – Configure a CloudWatch Rule

Create a new rule which will trigger our Lambda function every 5 minutes:
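The same rule can be sketched with the CLI (rule and function names are placeholders):

    aws events put-rule \
        --name SendEmailEvery5Min \
        --schedule-expression "rate(5 minutes)"

    aws events put-targets \
        --rule SendEmailEvery5Min \
        --targets "Id"="1","Arn"="arn:aws:lambda:us-east-1:<account-id>:function:SendEmail"

Note that when wiring the rule from the CLI you would also need to grant CloudWatch Events permission to invoke the function (via aws lambda add-permission); the console does this for you.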

Note: you can specify the value as a rate or in the cron expression format. All schedules use the UTC time zone, and the minimum precision for schedules is one minute

If you go back now to the Lambda Function Console and navigate to the Triggers tab, you should see that the CloudWatch Events rule has been added:

After 5 minutes, CloudWatch will trigger the Lambda Function and you should get an email notification: