Attach an IAM Role to an EC2 Instance with CloudFormation

CloudFormation allows you to manage your AWS infrastructure by defining it in code.

In this post, I will show you how to create an EC2 instance and attach an IAM role to it so you can access your S3 buckets from it.

First, you’ll need a template that specifies the resources you want in your stack. For this step, you can use a sample template I already prepared:
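Here is a minimal sketch of what such a template might look like (the AMI ID is a placeholder, and the resource names and the exact list-only policy are my assumptions):

```json
{
  "AWSTemplateFormatVersion": "2010-09-09",
  "Description": "EC2 instance with an IAM role allowing S3 List operations",
  "Parameters": {
    "KeyName": {
      "Type": "AWS::EC2::KeyPair::KeyName",
      "Description": "SSH KeyPair used to connect to the instance"
    },
    "InstanceType": {
      "Type": "String",
      "Default": "t2.micro",
      "Description": "EC2 instance type"
    }
  },
  "Resources": {
    "S3ListRole": {
      "Type": "AWS::IAM::Role",
      "Properties": {
        "AssumeRolePolicyDocument": {
          "Version": "2012-10-17",
          "Statement": [{
            "Effect": "Allow",
            "Principal": { "Service": "ec2.amazonaws.com" },
            "Action": "sts:AssumeRole"
          }]
        },
        "Policies": [{
          "PolicyName": "S3ListPolicy",
          "PolicyDocument": {
            "Version": "2012-10-17",
            "Statement": [{ "Effect": "Allow", "Action": ["s3:List*"], "Resource": "*" }]
          }
        }]
      }
    },
    "S3ListInstanceProfile": {
      "Type": "AWS::IAM::InstanceProfile",
      "Properties": { "Roles": [{ "Ref": "S3ListRole" }] }
    },
    "SSHSecurityGroup": {
      "Type": "AWS::EC2::SecurityGroup",
      "Properties": {
        "GroupDescription": "Allow SSH access from anywhere",
        "SecurityGroupIngress": [{
          "IpProtocol": "tcp", "FromPort": 22, "ToPort": 22, "CidrIp": "0.0.0.0/0"
        }]
      }
    },
    "EC2Instance": {
      "Type": "AWS::EC2::Instance",
      "Properties": {
        "ImageId": "ami-xxxxxxxx",
        "InstanceType": { "Ref": "InstanceType" },
        "KeyName": { "Ref": "KeyName" },
        "IamInstanceProfile": { "Ref": "S3ListInstanceProfile" },
        "SecurityGroups": [{ "Ref": "SSHSecurityGroup" }]
      }
    }
  },
  "Outputs": {
    "SSHCommand": {
      "Description": "How to connect to the instance",
      "Value": { "Fn::Join": ["", ["ssh ec2-user@", { "Fn::GetAtt": ["EC2Instance", "PublicDnsName"] }]] }
    }
  }
}
```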

The template creates a basic EC2 instance that uses an IAM role with an S3 list policy. It also creates a security group that allows SSH access from anywhere.

Note: I also used the Parameters section to declare values that can be passed to the template when you create the stack.

Now that we have defined the template, sign in to the AWS Management Console, navigate to CloudFormation, and click on “Create Stack”. Upload the JSON file:

You will be asked to assign a name to this stack, and to choose your EC2 instance configuration & SSH KeyPair:

Make sure to check the box “I acknowledge that AWS CloudFormation might create IAM resources” in order to create the IAM Policy & Role:

Once launched, you will see the following screen with the stack creation events:

After a while, you will see the CREATE_COMPLETE message in the Status column:

Once done, on the Outputs tab, you should see how to connect to your instance via SSH:

If you run the SSH command shown in the Outputs tab from your terminal, you should be able to connect to the server:

Let’s check if we can list the S3 buckets using the AWS CLI:
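From inside the instance (the AWS CLI comes pre-installed on Amazon Linux AMIs):

```bash
aws s3 ls
```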

Awesome! So we are able to list the buckets. But what if we want to create a new bucket?
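The bucket name below is hypothetical; the call should fail with an AccessDenied error:

```bash
aws s3 mb s3://my-new-bucket
```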

It didn’t work, and that’s normal: the IAM role attached to the instance doesn’t have enough permissions (it lacks the s3:CreateBucket action).

Continuous Deployment with AWS CodeDeploy & Github

This post will walk you through how to automatically deploy your application from GitHub using AWS CodeDeploy.

Let’s start by creating the two IAM roles we will use in this tutorial:

  • IAM role for CodeDeploy to talk to EC2 instances.
  • IAM role for EC2 to access S3.

1 – CodeDeployRole

Go to the AWS IAM Console, navigate to “Roles”, choose “Create New Role”, select “CodeDeploy”, and attach the “AWSCodeDeployRole” policy:

2 – EC2S3Role

Create another IAM role, but this time choose EC2 as the trusted entity. Then, attach the “AmazonS3ReadOnlyAccess” policy:

Now that we’ve created the IAM roles, let’s launch an EC2 instance that CodeDeploy will use to deploy our application.

3 – EC2 Instance

Launch a new EC2 instance with the IAM role we created in the last section:

Next to User Data, type the following script to install the AWS CodeDeploy agent at boot time:
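A typical user-data script for an Amazon Linux AMI (the region in the bucket URL, us-east-1 here, should match your instance’s region):

```bash
#!/bin/bash
yum update -y
yum install -y ruby wget
cd /home/ec2-user
# Download and run the CodeDeploy agent installer for this region
wget https://aws-codedeploy-us-east-1.s3.amazonaws.com/latest/install
chmod +x ./install
./install auto
```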

Note: make sure to allow HTTP traffic in the security group.

Once created, connect to the instance via SSH using its public IP, and verify that the CodeDeploy agent is running:
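On Amazon Linux:

```bash
sudo service codedeploy-agent status
```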

4 – Application

Add the appspec.yml file to the application to describe to AWS CodeDeploy how to manage the lifecycle of your application:
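A minimal sketch (the script paths and the /var/www/html destination are my assumptions; the script names must match the files in your repository):

```yaml
version: 0.0
os: linux
files:
  - source: /
    destination: /var/www/html
hooks:
  BeforeInstall:
    - location: scripts/install_dependencies.sh
      timeout: 300
      runas: root
  AfterInstall:
    - location: scripts/restart_server.sh
      timeout: 300
      runas: root
```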

The BeforeInstall hook will install the Apache server:
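Something like the following should do it (the file name is assumed to match the appspec sketch above):

```bash
#!/bin/bash
# scripts/install_dependencies.sh
yum install -y httpd
```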

The AfterInstall hook will restart the Apache server:
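Again, the file name is assumed to match the appspec above:

```bash
#!/bin/bash
# scripts/restart_server.sh
service httpd restart
```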

5 – Setup CodeDeploy

Go to AWS CodeDeploy and create a new application:

Select In-Place deployment (with downtime):

Click on “Skip”, because we have already set up our EC2 instance:

The above will take you to the following page where you need to give a name to your application:

Select the EC2 instance and assign a name to the deployment group:

Select the CodeDeployRole we created in the first part of the tutorial:

Then click on “Deploy“:

Create a deployment and select GitHub as the data source:

Just select “Connect to GitHub”. Doing so will pop up a new browser window and take you to the GitHub login page, where you will have to enter your username and password.

After that, come back to this page; you should see something like the screenshot below. Just enter the remaining details and click “Deploy”:

This will take you to a page as follows:

If you point your browser to the EC2 public IP, you should see:

Now, let’s automate the deployment using Github Integrations.

6 – Continuous Deployment

Go to the IAM Dashboard, and create a new policy that grants access to register application revisions and create new deployments from GitHub:
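A policy along these lines should work; these are the CodeDeploy actions AWS documents for the GitHub integration:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "codedeploy:CreateDeployment",
        "codedeploy:GetDeploymentConfig",
        "codedeploy:GetApplicationRevision",
        "codedeploy:RegisterApplicationRevision"
      ],
      "Resource": "*"
    }
  ]
}
```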

Next, create a new user and attach the policy we created before:

Note: copy the user’s AWS Access Key ID & AWS Secret Access Key; they will come in handy later.

7 – Github Integration

Generate a new token to invoke CodeDeploy from Github:

Once the token is generated, copy it and keep it somewhere safe. Then, add the AWS CodeDeploy integration:

Fill the fields as below:

Finally, add the GitHub Auto-Deployment integration:

Fill the form as below:

To test it out, let’s edit a file or commit a new file. You should see a new deployment on AWS CodeDeploy:

Youtube to MP3 using S3, Lambda & Elastic Transcoder

In this tutorial, I will show you how to convert a Youtube video 📺 to an MP3 file 💿 using AWS Elastic Transcoder. How can we do that?

We will create a Lambda function to consume events published by S3. For any video uploaded to a bucket, S3 will invoke our Lambda function by passing event information. As the function executes, it reads the S3 event data, logs some of the event information to Amazon CloudWatch, and then kicks off a transcoding job.

Let’s start by creating an S3 bucket to store the input files (videos) and the output files (audio):

Next, let’s define a Transcoder pipeline. A pipeline essentially defines a queue for future transcoding jobs. To create a pipeline, we need to specify the input bucket (where the videos will be).

Note: Copy down the Pipeline ID; we will need it later on.

Having created a pipeline, go to the AWS Management Console, navigate to the Lambda service, and click on “Create a Lambda Function”. Add S3 as the event source for the Lambda function:

I used the following Node.JS code:
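The original snippet isn’t reproduced here, so below is a minimal sketch of what it might look like (the handler structure, the outputs/ key prefix, and the output file naming are my assumptions; the preset ID is the one discussed in the note below):

```javascript
'use strict';

var AWS = require('aws-sdk');
var transcoder = new AWS.ElasticTranscoder();

exports.handler = function(event, context, callback) {
    // Extract the S3 object key of the uploaded video from the event
    // (keys arrive URL-encoded, with spaces as '+')
    var key = decodeURIComponent(event.Records[0].s3.object.key.replace(/\+/g, ' '));
    console.log('New video uploaded:', key);

    // Drop the file extension to build the output file name
    var outputKey = key.split('.')[0];

    var params = {
        PipelineId: process.env.PIPELINE_ID, // set as an environment variable (see below)
        Input: { Key: key },
        Outputs: [{
            Key: 'outputs/' + outputKey + '.mp3',
            PresetId: '1351620000001-300040' // system preset for MP3 output (see the note below)
        }]
    };

    // Kick off the transcoding job
    transcoder.createJob(params, function(err, data) {
        if (err) {
            console.log('Failed to create transcoder job:', err);
            return callback(err);
        }
        console.log('Transcoder job started:', data.Job.Id);
        callback(null, data);
    });
};
```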

The script does the following:

  • Extract the filename of the uploaded file from the event object
  • Create a Transcoder job and specify the required outputs
  • Launch the job

Note: you might notice the use of a preset (1351620000001-300040) in the function above. A preset describes how to encode the given file (in this case, as MP3). The full list of available presets can be found in the AWS Documentation.

Finally, set the pipeline ID as an environment variable and select an IAM role with permission to access Elastic Transcoder:

Once created, upload a video file to the inputs bucket:
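For example, with the AWS CLI (the file and bucket names are hypothetical):

```bash
aws s3 cp video.mp4 s3://video-mp3-inputs/video.mp4
```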

If everything went well, you should see the uploaded file in your inputs bucket:

S3 will trigger our Lambda function, which will execute and log the S3 object name to CloudWatch Logs:

After a couple of seconds (or minutes, depending on the size of the video), you should see a new MP3 file generated by the Elastic Transcoder job inside the outputs directory of the S3 bucket:

Create Front-End for Serverless RESTful API

In this post, we will build a UI for the Serverless REST API we built in the previous tutorial, so make sure to read it before following this part.

Note: make sure to enable CORS for the endpoint. In the API Gateway Console, select the resource, then go to Actions and choose Enable CORS:

The first step is to clone the project:
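Something like the following; the repository URL below is a placeholder for the project from the previous tutorial:

```bash
git clone https://github.com/<username>/<repository>.git
cd <repository>
```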

Head into the ui folder and update js/app.js with your own API Gateway Invoke URL:

Once done, you are ready to create a new S3 bucket:
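For example, with the AWS CLI (the bucket name below is hypothetical and must be globally unique):

```bash
aws s3 mb s3://serverless-api-ui
```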

Copy all the files in the ui directory into the bucket:
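Using the same hypothetical bucket name, and making the files publicly readable:

```bash
aws s3 cp ui/ s3://serverless-api-ui/ --recursive --acl public-read
```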

Finally, turn on website hosting for your bucket:
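Again assuming the hypothetical bucket name from above:

```bash
aws s3 website s3://serverless-api-ui/ --index-document index.html
```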

After running these commands, all of our static files should appear in the S3 bucket:

Your bucket is now configured for static website hosting, and you have an S3 website URL like this: http://<bucket_name>.s3-website-us-east-1.amazonaws.com

Build RESTful API in Go and MongoDB

In this tutorial I will illustrate how you can build your own RESTful API in Go and MongoDB. All the code used in this demo can be found on my Github. 😊

1 – API Specification

The REST API service will expose endpoints to manage a store of movies. The operations that our endpoints will allow are:

  • Create a new movie (POST /movies)
  • List all movies (GET /movies)
  • Find a movie by its ID (GET /movies/{id})
  • Update an existing movie (PUT /movies)
  • Delete an existing movie (DELETE /movies)

2 – Fetching Dependencies

Before we begin, we need to grab the packages required to set up the API (the go get commands are shown after the list):

  • toml: parses the configuration file (MongoDB server & credentials)
  • mux: request router and dispatcher for matching incoming requests to their respective handler
  • mgo: MongoDB driver
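Assuming the commonly used import paths for these packages:

```bash
go get github.com/BurntSushi/toml
go get github.com/gorilla/mux
go get gopkg.in/mgo.v2
```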

3 – API structure

Once the dependencies are installed, we create a file called “app.go“, with the following content:
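The file’s contents aren’t reproduced here, so here is a minimal sketch of what app.go might look like (the listing/finding handler names are my assumptions; the others follow names used later in this post):

```go
package main

import (
	"log"
	"net/http"

	"github.com/gorilla/mux"
)

// Placeholder controllers; each endpoint is fleshed out later in this post.
func AllMoviesEndPoint(w http.ResponseWriter, r *http.Request)   {}
func FindMovieEndpoint(w http.ResponseWriter, r *http.Request)   {}
func CreateMovieEndpoint(w http.ResponseWriter, r *http.Request) {}
func UpdateMovieEndPoint(w http.ResponseWriter, r *http.Request) {}
func DeleteMovieEndPoint(w http.ResponseWriter, r *http.Request) {}

func main() {
	// Map each route and HTTP verb to its controller
	r := mux.NewRouter()
	r.HandleFunc("/movies", AllMoviesEndPoint).Methods("GET")
	r.HandleFunc("/movies", CreateMovieEndpoint).Methods("POST")
	r.HandleFunc("/movies", UpdateMovieEndPoint).Methods("PUT")
	r.HandleFunc("/movies", DeleteMovieEndPoint).Methods("DELETE")
	r.HandleFunc("/movies/{id}", FindMovieEndpoint).Methods("GET")

	// Expose the HTTP server on port 3000
	if err := http.ListenAndServe(":3000", r); err != nil {
		log.Fatal(err)
	}
}
```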

The code above creates a controller for each endpoint, then exposes an HTTP server on port 3000.

Note: We are using GET, POST, PUT, and DELETE where appropriate. We are also defining parameters that can be passed in.

To run the server locally, type the following command:
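```bash
go run app.go
```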

If you point your browser to http://localhost:3000/movies, you should see:

4 – Model

Now that we have a minimal application, it’s time to create a basic Movie model. In Go, we use the struct keyword to create a model:
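A sketch of the model (the exact field set is my assumption):

```go
import "gopkg.in/mgo.v2/bson"

// Movie represents a movie document stored in MongoDB
type Movie struct {
	ID          bson.ObjectId `bson:"_id" json:"id"`
	Name        string        `bson:"name" json:"name"`
	CoverImage  string        `bson:"cover_image" json:"cover_image"`
	Description string        `bson:"description" json:"description"`
}
```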

Next, we will create the Data Access Object to manage database operations.

5 – Data Access Object

5.1 – Establish Connection

The Connect() method, as its name implies, establishes a connection to the MongoDB database:
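A minimal sketch, assuming the server address and database name come from the TOML configuration file:

```go
import (
	"log"

	mgo "gopkg.in/mgo.v2"
)

// MoviesDAO holds the connection parameters read from the TOML config file
type MoviesDAO struct {
	Server   string
	Database string
}

// db is the shared database handle used by the query methods
var db *mgo.Database

// COLLECTION is the MongoDB collection holding our movies
const COLLECTION = "movies"

// Connect establishes a session with the MongoDB server
func (m *MoviesDAO) Connect() {
	session, err := mgo.Dial(m.Server)
	if err != nil {
		log.Fatal(err)
	}
	db = session.DB(m.Database)
}
```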

5.2 – Database Queries

The implementation is relatively straightforward: each method just issues the right query on the db.C(COLLECTION) object and returns the results. These methods can be implemented as follows:
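A sketch, assuming the Movie model and the db handle from the previous snippets (plus the gopkg.in/mgo.v2/bson import):

```go
import (
	"errors"

	"gopkg.in/mgo.v2/bson"
)

// FindAll returns every movie in the collection
func (m *MoviesDAO) FindAll() ([]Movie, error) {
	var movies []Movie
	err := db.C(COLLECTION).Find(bson.M{}).All(&movies)
	return movies, err
}

// FindById looks a movie up by the hex representation of its ObjectId
func (m *MoviesDAO) FindById(id string) (Movie, error) {
	var movie Movie
	if !bson.IsObjectIdHex(id) {
		return movie, errors.New("invalid Movie ID")
	}
	err := db.C(COLLECTION).FindId(bson.ObjectIdHex(id)).One(&movie)
	return movie, err
}

// Insert persists a new movie document
func (m *MoviesDAO) Insert(movie Movie) error {
	return db.C(COLLECTION).Insert(&movie)
}

// Update replaces an existing movie, matched by its ID
func (m *MoviesDAO) Update(movie Movie) error {
	return db.C(COLLECTION).UpdateId(movie.ID, &movie)
}

// Delete removes a movie from the collection, matched by its ID
func (m *MoviesDAO) Delete(movie Movie) error {
	return db.C(COLLECTION).RemoveId(movie.ID)
}
```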

6 – Setup API Endpoints

6.1 – Create a Movie

Update the CreateMovieEndpoint method as follows:
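A sketch, assuming a package-level DAO instance (e.g. var dao = MoviesDAO{...}), the encoding/json import, and two small JSON helpers defined at the end:

```go
func CreateMovieEndpoint(w http.ResponseWriter, r *http.Request) {
	defer r.Body.Close()
	var movie Movie
	if err := json.NewDecoder(r.Body).Decode(&movie); err != nil {
		respondWithError(w, http.StatusBadRequest, "Invalid request payload")
		return
	}
	// Assign a fresh ObjectId before persisting the document
	movie.ID = bson.NewObjectId()
	if err := dao.Insert(movie); err != nil {
		respondWithError(w, http.StatusInternalServerError, err.Error())
		return
	}
	respondWithJson(w, http.StatusCreated, movie)
}

// Helpers shared by all endpoints
func respondWithError(w http.ResponseWriter, code int, msg string) {
	respondWithJson(w, code, map[string]string{"error": msg})
}

func respondWithJson(w http.ResponseWriter, code int, payload interface{}) {
	response, _ := json.Marshal(payload)
	w.Header().Set("Content-Type", "application/json")
	w.WriteHeader(code)
	w.Write(response)
}
```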

It decodes the request body into a movie object, assigns it an ID, and uses the DAO Insert method to create the movie in the database.

Let’s test it out:

With Postman:

With cURL:
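The field values below are just an example payload:

```bash
curl -X POST -H "Content-Type: application/json" \
  -d '{"name": "Avengers", "cover_image": "https://example.com/avengers.jpg", "description": "A superhero movie"}' \
  http://localhost:3000/movies
```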

6.2 – List of Movies

The code below is self-explanatory:
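A sketch, reusing the helpers and DAO instance from the previous snippet:

```go
func AllMoviesEndPoint(w http.ResponseWriter, r *http.Request) {
	movies, err := dao.FindAll()
	if err != nil {
		respondWithError(w, http.StatusInternalServerError, err.Error())
		return
	}
	respondWithJson(w, http.StatusOK, movies)
}
```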

It uses the FindAll method of the DAO to fetch the list of movies from the database.

Let’s test it out:

With Postman:

With cURL:
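```bash
curl -X GET http://localhost:3000/movies
```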

6.3 – Find a Movie

We will use the mux library to get the parameters that the user passed in with the request:
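A sketch, again reusing the DAO and helpers from above:

```go
func FindMovieEndpoint(w http.ResponseWriter, r *http.Request) {
	// mux.Vars extracts the path parameters declared in the route ("/movies/{id}")
	params := mux.Vars(r)
	movie, err := dao.FindById(params["id"])
	if err != nil {
		respondWithError(w, http.StatusBadRequest, "Invalid Movie ID")
		return
	}
	respondWithJson(w, http.StatusOK, movie)
}
```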

Let’s test it out:

With Postman:

With cURL:
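Replace <movie_id> with an ID returned by the create endpoint:

```bash
curl -X GET http://localhost:3000/movies/<movie_id>
```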

6.4 – Update an existing Movie

Update the UpdateMovieEndPoint method as follows:
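A sketch; it expects the full movie document (including its ID) in the request body:

```go
func UpdateMovieEndPoint(w http.ResponseWriter, r *http.Request) {
	defer r.Body.Close()
	var movie Movie
	if err := json.NewDecoder(r.Body).Decode(&movie); err != nil {
		respondWithError(w, http.StatusBadRequest, "Invalid request payload")
		return
	}
	if err := dao.Update(movie); err != nil {
		respondWithError(w, http.StatusInternalServerError, err.Error())
		return
	}
	respondWithJson(w, http.StatusOK, map[string]string{"result": "success"})
}
```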

Let’s test it out:

With Postman:

With cURL:
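Replace <movie_id> with an existing ID; the other field values are just examples:

```bash
curl -X PUT -H "Content-Type: application/json" \
  -d '{"id": "<movie_id>", "name": "Avengers", "cover_image": "https://example.com/avengers.jpg", "description": "An updated description"}' \
  http://localhost:3000/movies
```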

6.5 – Delete an existing Movie

Update the DeleteMovieEndPoint method as follows:
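A sketch; since the DAO deletes by ID, only the id field of the body is actually needed:

```go
func DeleteMovieEndPoint(w http.ResponseWriter, r *http.Request) {
	defer r.Body.Close()
	var movie Movie
	if err := json.NewDecoder(r.Body).Decode(&movie); err != nil {
		respondWithError(w, http.StatusBadRequest, "Invalid request payload")
		return
	}
	if err := dao.Delete(movie); err != nil {
		respondWithError(w, http.StatusInternalServerError, err.Error())
		return
	}
	respondWithJson(w, http.StatusOK, map[string]string{"result": "success"})
}
```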

Let’s test it out:

With Postman:

With cURL:
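Replace <movie_id> with the ID of the movie to delete:

```bash
curl -X DELETE -H "Content-Type: application/json" \
  -d '{"id": "<movie_id>"}' \
  http://localhost:3000/movies
```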

7 – Taking this further

In my upcoming posts, I will show you how to:

  • Write unit tests in Go for each endpoint
  • Build a UI in Angular 4
  • Set up a CI/CD pipeline with CircleCI
  • Deploy the stack on AWS, and much more …

So stay tuned 😇