A Hands on Crash Course in AWS’s Serverless Technologies

Lucas Kardonski · Published in codeburst · May 18, 2020

This tutorial is intended to be as complete and concise as possible, covering as much ground as it can without taking too much of your time. You will build and deploy serverless resources using the AWS console, then build and deploy a static app that uses those serverless resources to copy objects from one S3 bucket to another.

Background: I'm currently working on an open source project that uses a lot of serverless components, and I thought I could share some of what I've learned with the community :)

What you will learn:

  1. A brief overview of serverless technologies
  2. How to create serverless resources in the AWS console
  3. How to test an API with Postman
  4. How to call a REST API from a React.js app using Axios
  5. How to add styling to your React.js app with Semantic UI
  6. How to deploy a React.js app to a static S3 bucket using Amplify

What you will build:

  1. A simple Node.js Lambda function that uses the AWS SDK for JavaScript to copy objects from one S3 bucket to another.
  2. A REST API with a POST method that takes in a JSON payload.
  3. A React.js app that uses the method created above to copy objects from one S3 bucket to another.

1. Create a Lambda Function and a REST API

Pre-Requisites

  • Have an AWS account. If you don't have one, create one here.
  • Create two or more S3 buckets. You can follow this tutorial and repeat it to end up with multiple S3 buckets, or script it with the SDK as sketched below.
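If you would rather script the bucket creation than click through the console, here is a minimal sketch using the AWS SDK for JavaScript (this is not part of the original tutorial flow; the bucket names are placeholders and your AWS credentials are assumed to be configured locally):

// Create the source and destination buckets with the AWS SDK for JavaScript (sketch)
const aws = require('aws-sdk');
const s3 = new aws.S3({ apiVersion: '2006-03-01' });

// Placeholder names; replace with your own globally unique bucket names
['your-source-bucket-name', 'your-destination-bucket-name'].forEach(name => {
  s3.createBucket({ Bucket: name }, (err, data) => {
    if (err) console.log('Error creating ' + name + ': ' + err);
    else console.log('Created ' + name + ' at ' + data.Location);
  });
});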

Create an IAM Policy and an IAM Role to manage your Lambda function

  1. Sign in to the AWS console
  2. Go to services -> IAM
  3. On the sidebar, click on Policies and click Create Policy
  4. Click on the JSON tab and paste in the following policy. **Please make sure to change yoursourcebucket and yourdestinationbucket to the names of your respective source and destination buckets.
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ListSourceAndDestinationBuckets",
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket",
        "s3:ListBucketVersions"
      ],
      "Resource": [
        "arn:aws:s3:::yoursourcebucket",
        "arn:aws:s3:::yourdestinationbucket"
      ]
    },
    {
      "Sid": "SourceBucketGetObjectAccess",
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:GetObjectVersion"
      ],
      "Resource": "arn:aws:s3:::yoursourcebucket/*"
    },
    {
      "Sid": "DestinationBucketPutObjectAccess",
      "Effect": "Allow",
      "Action": [
        "s3:PutObject"
      ],
      "Resource": "arn:aws:s3:::yourdestinationbucket/*"
    }
  ]
}

5. Click on Review Policy

6. Give your policy a name and optionally a description.

7. Click on Create Policy.

8. Next, create a role that uses this policy. Go back to the IAM console home

9. On the sidebar, click on Roles

10. Click on Create Role

11. Under Choose the service that will use this role, select Lambda

12. Click on Next: Permissions

13. Search for your newly created Policy

14. Select the checkbox beside it

15. Click on Next: Tags

16. Click on Next: Review

17. Give your role a name and optionally a description.

18. Click on Create Role

Create a Lambda function to copy objects from one S3 bucket to another

Overview: Lambda is AWS's serverless compute service, and it's pretty awesome! It isn't literally serverless, in the sense that servers are still involved, but they are provisioned only when your Lambda function is invoked. Lambda creates the execution environment and all of the necessary resources and executes your function in milliseconds, which lets you pay for infrastructure on a per-request basis.

  1. Sign in to the AWS console
  2. Go to Services -> Lambda
  3. Click on Create Function
  4. Use the Author from scratch option and leave the Runtime set to Node.js 8.10 (or the most recent Node.js runtime available)
  5. Under Permissions, for Execution role, select Use an existing role
  6. For Existing role, click on the dropdown arrow in the input and select your newly created IAM role.
  7. Click on Create function
  8. You should see a screen like the following

9. Under Function code, delete all of the default code and paste in the following code. **Please make sure to change your-source-bucket-name and your-destination-bucket-name to the names of your respective source and destination buckets.

// Load the AWS SDK
const aws = require('aws-sdk');

// Construct the S3 service object - http://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#constructor-property
const s3 = new aws.S3({
  apiVersion: '2006-03-01'
});

// Define the source and destination buckets
const srcBucket = "your-source-bucket-name";
const destBucket = "your-destination-bucket-name";

// Main handler
exports.handler = (event, context, callback) => {
  var copySource = `${srcBucket}/${event.sourceObject}`;
  var destKey = event.sourceObject;
  // If a source route was provided, copy from that prefix and write under the destination route
  if (event.sourceRoute.length > 1) {
    copySource = `${srcBucket}/${event.sourceRoute}/${event.sourceObject}`;
    destKey = `${event.destRoute}/${event.sourceObject}`;
  }
  // Copy the object to the destination bucket - http://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#copyObject-property
  // Note: the destination route goes into the Key; the Bucket parameter must be the bucket name only.
  s3.copyObject({
    CopySource: copySource,
    Bucket: destBucket,
    Key: destKey
  }, function(copyErr, copyData) {
    if (copyErr) {
      console.log("Error: " + copyErr);
      // Signal failure only after the copy attempt has finished
      callback(copyErr);
    } else {
      console.log('Copied OK');
      callback(null, 'All done!');
    }
  });
};

10. Click on Save to save your new Lambda function.

11. Next, configure a test by clicking on the dropdown next to the Test button and selecting the Configure test events option.

12. Give the Event a name, like copyBucketTest

13. Enter the following JSON payload. **Change source_route, source_object, destination_route, etc. to valid values from your source bucket.

{
  "sourceRoute": "source_route/source_sub_route",
  "sourceObject": "source_object.html",
  "destRoute": "destination_route"
}
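With this payload and the example bucket names above, the handler will copy your-source-bucket-name/source_route/source_sub_route/source_object.html into your-destination-bucket-name under the key destination_route/source_object.html.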

14. Next, go to the S3 console: Services -> S3

15. Go into your source bucket, click on Permissions, and click on CORS configuration.

16. Paste in the following CORS policy

<?xml version="1.0" encoding="UTF-8"?>
<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
  <CORSRule>
    <ID>S3CORSRuleId1</ID>
    <AllowedOrigin>*</AllowedOrigin>
    <AllowedMethod>GET</AllowedMethod>
    <AllowedMethod>HEAD</AllowedMethod>
    <AllowedMethod>PUT</AllowedMethod>
    <AllowedMethod>POST</AllowedMethod>
    <AllowedMethod>DELETE</AllowedMethod>
    <MaxAgeSeconds>3000</MaxAgeSeconds>
    <ExposeHeader>x-amz-server-side-encryption</ExposeHeader>
    <ExposeHeader>x-amz-request-id</ExposeHeader>
    <ExposeHeader>x-amz-id-2</ExposeHeader>
    <ExposeHeader>ETag</ExposeHeader>
    <AllowedHeader>*</AllowedHeader>
  </CORSRule>
</CORSConfiguration>
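Note: newer versions of the S3 console only accept CORS rules as JSON. If your console asks for JSON instead of XML, the equivalent of the rule above should look roughly like this:

[
  {
    "AllowedHeaders": ["*"],
    "AllowedMethods": ["GET", "HEAD", "PUT", "POST", "DELETE"],
    "AllowedOrigins": ["*"],
    "ExposeHeaders": [
      "x-amz-server-side-encryption",
      "x-amz-request-id",
      "x-amz-id-2",
      "ETag"
    ],
    "MaxAgeSeconds": 3000
  }
]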

Create a POST API to call this function with AWS API Gateway

Now that you have created the Lambda function, you want to be able to access it from an external source, like a website or an app that has a form. To do this you have to create an API for your Lambda function. You will want to create a POST method for your API that receives a payload (a JSON object) with the parameters of your copyBucketFunction, e.g. the source route, object, and destination route. This might sound a bit complicated, but AWS makes it really easy.

  1. Sign in to the AWS console
  2. Go to Services -> API Gateway
  3. Click on + Create API

4. Give your API a name, e.g. copyToBucketApi and optionally a description.

5. Click Create API

6. Next click on Actions -> Create Method

7. Select POST from the little dropdown menu that will pop up at your resources sidebar.

8. Click on the little check mark

9. Leave the default Lambda Function integration type selected.

10. In Lambda Function, type the name of your Lambda function.

11. Click Save.

12. Confirm the Add Permission to Lambda Function pop-up.

13. You're all set. Now you can test your function by providing a payload and clicking Test. *You can use the same payload from a couple of steps before.

14. You should receive a response message and status.

15. Now you are ready to deploy your API. Go to Actions -> Deploy API.

16. Give your deployment stage a name and click Deploy.

17. Your Invoke URL is your API's URL, i.e. the one you will paste into your Axios request later on.
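If you want a quick sanity check of the deployed API from the command line before wiring up the React app, a minimal Node.js sketch like the following should work (assuming you have axios installed locally and you substitute your own invoke URL and real object names):

// Quick sanity check of the deployed API (sketch); run with node after npm install axios
const axios = require('axios');

const payload = {
  sourceRoute: 'source_route/source_sub_route',
  sourceObject: 'source_object.html',
  destRoute: 'destination_route'
};

// Replace with your own Invoke URL from the API Gateway stage
axios.post('https://your-api-gateway-url', payload)
  .then(res => console.log(res.status, res.data))
  .catch(err => console.log('Request failed: ' + err.message));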

Export and test your API in Postman (optional)

Postman is a great tool for testing your APIs before you implement them in your apps.

  1. In the API Gateway console, click on your API -> Stages and then click on your current stage.
  2. To export your API as JSON specially formatted for Postman, click on the Export tab -> Export as Swagger + Postman Extensions -> JSON

3. Download Postman for your OS here and complete the signup process.

4. In the home screen, next to + New, click on Import.

5. Click on select files, and select your newly created file.

2. Create a simple React.js app and deploy it to a static S3 bucket using Amplify

Pre-requisites

  • Have Node.js and NPM installed. You can follow this tutorial to install Node.js and NPM using NVM.

Creating the React.js App that calls the API method you created previously

  1. Create a new directory on your desktop or wherever else you like: mkdir directory_name
  2. cd into the new directory: cd directory_name **You can skip the next steps by simply cloning the GitHub repo: git clone https://github.com/gkpty/axios_form_post_example.git
  3. Create a new React app by running npx create-react-app APP_NAME **Change APP_NAME to your desired app name
  4. Go into the directory of your newly created app: cd APP_NAME
  5. Run npm start to preview the default app
  6. Edit the src/App.js file with your favorite text editor
  7. You can download the Babel extension for Sublime Text here
  8. Or you can use VS Code (give it a try! I don't use much Microsoft software myself, but as a text editor it's pretty fast and light, and it's packed with features and extensions).
  9. Run npm install axios --save to install Axios. Axios is a promise-based HTTP client for the browser and Node.js; it works well with React and has widespread support.
  10. You need to install TypeScript as well: npm install typescript
  11. Paste the following code into the App.js file.
import React, { Component } from 'react';
import axios from 'axios';
import './App.css'

class App extends Component {
  state = {
    sourceRoute: '',
    sourceObject: '',
    destRoute: '',
  }

  handleChange = event => {
    const target = event.target;
    const value = target.value
    const name = target.name;
    this.setState({
      [name]: value
    });
  }

  handleSubmit = event => {
    event.preventDefault();
    const bucketVars = {
      sourceRoute: this.state.sourceRoute,
      sourceObject: this.state.sourceObject,
      destRoute: this.state.destRoute
    };
    axios.post(`https://your-api-gateway-url`, bucketVars, { headers: {'Accept': 'application/json', 'Content-Type': 'application/x-www-form-urlencoded' }})
      .then(res => {
        console.log(res);
        console.log(res.data);
        // if res.data.contains("success")
        //   notice("success")
        //   do something
        // else
        //   alert("fail")
      })
  }

  render() {
    return (
      <div>
        <form onSubmit={this.handleSubmit}>
          <h2>Copy To Bucket</h2>
          <div>
            <label>
              Source Route
              <input type="text" placeholder="Source Route" name="sourceRoute" onChange={this.handleChange} />
            </label>
          </div>
          <div className="field">
            <label>
              Source Object
              <input type="text" placeholder="Source Object" name="sourceObject" onChange={this.handleChange} />
            </label>
          </div>
          <div className="field">
            <label>
              Destination Route
              <input type="text" placeholder="Destination Route" name="destRoute" onChange={this.handleChange} />
            </label>
          </div>
          <button className="ui button" type="submit">Add</button>
        </form>
      </div>
    )
  }
}

export default App;

The code explained: Notice that the change event handler is set up to handle multiple inputs; it sets the state for each input according to its name and value.

Also check out the structure of the request made to the API we created in AWS API Gateway. You'll notice we have our API's URL, followed by bucketVars, which is our JSON payload containing the names and values of the three form inputs at the moment the form is submitted. Then we add two headers, Accept and Content-Type.

**You can find your API's URL by going to the API Gateway console, clicking on your API, and clicking on Stages in the sidebar.

axios.post(`https://your-api-gateway-url`, bucketVars, { headers: {'Accept': 'application/json', 'Content-Type': 'application/x-www-form-urlencoded' }})
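If you want to surface the result in the UI rather than only logging it, one option (a sketch, not part of the original code) is to keep a status message in state. With the Lambda above, a successful call returns the string 'All done!' as the response body, so the .then handler could look roughly like this:

axios.post(`https://your-api-gateway-url`, bucketVars, { headers: {'Accept': 'application/json', 'Content-Type': 'application/x-www-form-urlencoded' }})
  .then(res => {
    // The Lambda's callback returns 'All done!', so res.data should contain it on success
    if (typeof res.data === 'string' && res.data.includes('All done')) {
      this.setState({ message: 'Copy succeeded!' });
    } else {
      this.setState({ message: 'Unexpected response: ' + JSON.stringify(res.data) });
    }
  })
  .catch(err => this.setState({ message: 'Copy failed: ' + err.message }));

You would also add message: '' to the initial state and render {this.state.message} somewhere inside the form.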

12. Now you can run npm start and you should see something like this

Adding Styling to your form

Using the Semantic UI Library

  1. Run npm i semantic-ui-css to install Semantic UI
  2. Add import 'semantic-ui-css/semantic.min.css' at the top of your App.js file.
  3. Now that you have installed Semantic UI you can use its class names through the className attribute (in JSX you still use className rather than class).
  4. Modify your code in the following way to include the Semantic UI form and field classes.
import React, { Component } from 'react';
import axios from 'axios';
import './App.css'
import 'semantic-ui-css/semantic.min.css'

class App extends Component {
  state = {
    sourceRoute: '',
    sourceObject: '',
    destRoute: '',
  }

  handleChange = event => {
    const target = event.target;
    const value = target.value
    const name = target.name;
    this.setState({
      [name]: value
    });
  }

  handleSubmit = event => {
    event.preventDefault();
    const bucketVars = {
      sourceRoute: this.state.sourceRoute,
      sourceObject: this.state.sourceObject,
      destRoute: this.state.destRoute
    };
    axios.post(`https://your-api-gateway-url`, bucketVars, { headers: {'Accept': 'application/json', 'Content-Type': 'application/x-www-form-urlencoded' }})
      .then(res => {
        console.log(res);
        console.log(res.data);
        // if res.data.contains("success")
        //   notice("success")
        //   do something
        // else
        //   alert("fail")
      })
  }

  render() {
    return (
      <div className="ui container">
        <form className="ui form" onSubmit={this.handleSubmit}>
          <h2 className="ui dividing header">Copy To Bucket</h2>
          <div className="field">
            <label>
              Source Route
              <input type="text" placeholder="Source Route" name="sourceRoute" onChange={this.handleChange} />
            </label>
          </div>
          <div className="field">
            <label>
              Source Object
              <input type="text" placeholder="Source Object" name="sourceObject" onChange={this.handleChange} />
            </label>
          </div>
          <div className="field">
            <label>
              Destination Route
              <input type="text" placeholder="Destination Route" name="destRoute" onChange={this.handleChange} />
            </label>
          </div>
          <button className="ui button" type="submit">Add</button>
        </form>
      </div>
    )
  }
}

export default App;

Now, when you preview your app, you should see something like this.

It looks a lot better than default HTML, and it's pretty fast to prototype with! Semantic UI comes packed with features that let you prototype some fairly advanced interfaces quite quickly.

Add bootstrap to your project using npm

  1. Run npm install --save bootstrap
  2. Add the following at the top of your App.js file:
import 'bootstrap/dist/css/bootstrap.css';
// Put any other imports below so that CSS from your
// components takes precedence over default styles.

Now add your own custom stylesheets to your app and make use of Bootstrap's row and column classes.
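For example, a rough sketch of a two-column layout using Bootstrap's grid classes (the content here is just placeholder text):

// A rough sketch of Bootstrap's grid classes in JSX (placeholder content)
import React from 'react';

function Layout() {
  return (
    <div className="container">
      <div className="row">
        <div className="col-md-6">Left column</div>
        <div className="col-md-6">Right column</div>
      </div>
    </div>
  );
}

export default Layout;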

Install Amplify and add authentication to your newly created React.js App

  1. In your app's directory, install Amplify: npm install aws-amplify aws-amplify-react
  2. Install the Amplify CLI: npm install -g @aws-amplify/cli
  3. Check that Amplify installed correctly by running amplify
  4. Configure Amplify by running amplify configure
  5. The following short video tutorial will help you with the configuration
  6. Initialize a new Amplify project inside your React app: amplify init
  7. Add authentication by running amplify add auth and use the default configuration (a sketch of wiring the auth flow into the app follows below).
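The steps above create the auth backend but don't show wiring it into the app. If you want Amplify's prebuilt sign-in/sign-up flow, a minimal sketch using the withAuthenticator higher-order component from aws-amplify-react looks roughly like this (it assumes src/aws-exports.js exists, which is generated once you push or publish your Amplify backend):

// A minimal sketch: wrap the existing App in Amplify's prebuilt authentication UI
import Amplify from 'aws-amplify';
import { withAuthenticator } from 'aws-amplify-react';
import awsconfig from './aws-exports';
import App from './App';

// Point Amplify at the backend resources generated by the CLI
Amplify.configure(awsconfig);

// Use this wrapped component in place of App (e.g. render it from src/index.js)
export default withAuthenticator(App);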

Deploying your new react app to a static s3 bucket using Amplify

Before deploying your Amplify app, it's very important that you set the intended homepage for the app. I'll be using admin.almostcms.org as my chosen hosting address in this example.

  1. Open up your package.json file, search for the "homepage" field (cmd/ctrl + F), and change its value (adding the field if it isn't there) so that it matches the homepage address you want for your newly created app, for example as shown below.
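For example, using the hosting address from above, the homepage field in package.json would look something like this (use your own address):

"homepage": "https://admin.almostcms.org",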

2. Now you can run amplify hosting add to configure your hosting resources.

You can choose between Dev and Production environments. Dev will only deploy your app to an S3 bucket over HTTP; it will not have a CloudFront distribution or HTTPS. In this example we will set up a Dev environment to start alpha testing our app.

3. Now run amplify hosting configure and again type in the name of your desired hosting address next to hosting bucket name.

You can run this command at any time to modify the configuration of your resources (Website, CloudFront, Publish, Exit).

4. Now you are ready to publish your app! Run amplify publish and confirm with yes.

Be a bit patient with the deployment process, and in a bit you should see a new window pop up with your app's homepage!
