
Lambda

 

First Lambda function:

Lambda → Functions → Create a function → Use a blueprint → Blueprints (hello-world, Python), select the starter AWS function (hello-world-python) → Configure
→ Function name (demo-lambda), Execution role (create a new role with basic Lambda permissions) → Create function

 

→ Select demo-lambda → Test (create new test event), Event name (demo-event) → Create

→ Now test
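The console test can also be reproduced locally. Below is a minimal sketch of a hello-world-style handler — an approximation, not the exact blueprint source — exercised with an assumed key1/key2 test payload like the demo-event above:

```python
import json

def lambda_handler(event, context):
    # Echo the incoming test event, as the hello-world blueprint does
    print("Received event: " + json.dumps(event))
    return event.get('key1')

# Simulate the console test with a demo-event-style payload
result = lambda_handler({'key1': 'value1', 'key2': 'value2'}, None)
print(result)  # value1
```

The console's Test button does essentially this: it passes the saved test event as `event` and shows the return value and logs.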

 

Settings:
Select Lambda function → Configuration (General configuration) → Edit → change settings (memory, timeout, etc.) accordingly → Save
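The same general-configuration edit can be scripted with boto3. A sketch only: `update_settings` is a hypothetical helper name, and the injectable `client` parameter exists so the function can be exercised without real AWS credentials:

```python
def update_settings(function_name, memory_mb, timeout_s, client=None):
    """Edit a function's general configuration: memory size and timeout."""
    if client is None:
        import boto3  # real call path; requires AWS credentials configured
        client = boto3.client('lambda')
    return client.update_function_configuration(
        FunctionName=function_name,
        MemorySize=memory_mb,   # megabytes
        Timeout=timeout_s,      # seconds (Lambda max is 900)
    )
```

Usage would be, for example, `update_settings('demo-lambda', 256, 30)`.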
 

Logs:
Select Lambda function → Monitor → Logs → select a log stream (you're taken to the CloudWatch Logs console)

 

Changing code:
Select Lambda function → Code → modify the code → Deploy → Test


Installing Serverless

Serverless Framework:
- The Serverless Framework is used for creating, deploying, managing, and debugging Lambda functions.
- It integrates well with CI/CD tools.
- It has CloudFormation support.



Installing Serverless:
Install the AWS CLI.
Install Node.js:

PS C:\WINDOWS\system32> choco install nvs

Install the Serverless Framework:

PS C:\WINDOWS\system32> npm install -g serverless

 

Configure AWS credentials under a serverless-admin profile:

PS C:\WINDOWS\system32> serverless config credentials --provider aws --key XXXXXX --secret YYY --profile serverless-admin
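With credentials in place, a service is described in a serverless.yml file. A minimal sketch — the service, function, and handler names here are illustrative, not from these notes:

```yaml
service: demo-lambda-service

provider:
  name: aws
  runtime: python3.9
  profile: serverless-admin   # the profile configured above

functions:
  hello:
    handler: handler.hello    # hello() in handler.py
```

Running `serverless deploy` then packages the service and deploys it through CloudFormation.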






AWS S3

Using an Amazon S3 trigger to invoke a Lambda function:

 
https://docs.aws.amazon.com/lambda/latest/dg/with-s3-example.html
Create an S3 bucket and copy its ARN:
arn:aws:s3:::my-bucket-s3-logs

Create a role with the AmazonS3FullAccess, AWSLambdaFullAccess, and CloudWatchFullAccess policies:
role name: role-Lambda-S3access

Create Lambda function:
Blueprint → select the Python s3-get-object blueprint → select the role created above

import json
import urllib.parse
import boto3

print('Loading function')

s3 = boto3.client('s3')


def lambda_handler(event, context):
    #print("Received event: " + json.dumps(event, indent=2))

    # Get the object from the event and show its content type
    bucket = event['Records'][0]['s3']['bucket']['name']
    key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'], encoding='utf-8')
    try:
        response = s3.get_object(Bucket=bucket, Key=key)
        print("CONTENT TYPE: " + response['ContentType'])
        return response['ContentType']
    except Exception as e:
        print(e)
        print('Error getting object {} from bucket {}. Make sure they exist and your bucket is in the same region as this function.'.format(key, bucket))
        raise e
Select the S3 bucket under Triggers in the Lambda function.

Test in the console:
create a test event
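The shape of the test event matters: the handler above reads the bucket and key out of `Records[0].s3` and URL-decodes the key. A standalone sketch of that parsing, with a made-up object key for illustration:

```python
import urllib.parse

# Minimal S3 put-style test event; the key value is made up for illustration
event = {
    'Records': [{
        's3': {
            'bucket': {'name': 'my-bucket-s3-logs'},
            'object': {'key': 'logs%2Fmy+file.txt'}
        }
    }]
}

bucket = event['Records'][0]['s3']['bucket']['name']
# S3 delivers keys URL-encoded, so decode before calling get_object
key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'],
                                encoding='utf-8')
print(bucket, key)  # my-bucket-s3-logs logs/my file.txt
```

If the key is not decoded, `get_object` fails for any object whose name contains spaces or slashes, which is why the blueprint calls `unquote_plus`.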


S3 Trigger Lambda Function To Copy File from Source S3 bucket to Destination S3 Bucket:

(Using an Amazon S3 trigger to create thumbnail images)

create source and Destination buckets.
my-aws-source-bucket: arn:aws:s3:::my-aws-source-bucket
my-aws-destinaton-bucket: arn:aws:s3:::my-aws-destinaton-bucket

Create an IAM policy to read from the source bucket and write to the destination bucket:
my-IAM-s3-source-target
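The policy can be sketched roughly as follows — a minimal illustration scoped to the two buckets above, not necessarily the exact policy document used:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject"],
      "Resource": "arn:aws:s3:::my-aws-source-bucket/*"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject"],
      "Resource": "arn:aws:s3:::my-aws-destinaton-bucket/*"
    }
  ]
}
```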

Create a role:
my-Role-S3-source-2-Destinaation:
select the policy created above.

Create a Lambda function:
my-S3-source-2-Destination
Choose the role created above
Create function

Node.js:
var AWS = require("aws-sdk");
exports.handler = (event, context, callback) => {
    var s3 = new AWS.S3();
    var sourceBucket = "my-aws-source-bucket";
    var destinationBucket = "my-aws-destinaton-bucket";
    var objectKey = event.Records[0].s3.object.key;
    var copySource = encodeURI(sourceBucket + "/" + objectKey);
    var copyParams = { Bucket: destinationBucket, CopySource: copySource, Key: objectKey };

    s3.copyObject(copyParams, function(err, data) {
        if (err) {
            console.log(err, err.stack);
            callback(err);              // report the failure to Lambda
        } else {
            console.log("S3 object copy successful.");
            callback(null, data);       // signal success
        }
    });
};
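For comparison, an equivalent Python handler can be sketched as below (assuming the same bucket names). In boto3, `copy_object` accepts `CopySource` as a dict, so there is no need for the manual URI encoding the Node.js version does; the injectable `s3` argument is only there so the handler can be exercised without real AWS credentials:

```python
SOURCE_BUCKET = 'my-aws-source-bucket'
DESTINATION_BUCKET = 'my-aws-destinaton-bucket'

def lambda_handler(event, context, s3=None):
    if s3 is None:
        import boto3  # real client path; requires AWS credentials
        s3 = boto3.client('s3')
    key = event['Records'][0]['s3']['object']['key']
    # CopySource as a dict avoids the encodeURI step from the Node.js handler
    s3.copy_object(
        CopySource={'Bucket': SOURCE_BUCKET, 'Key': key},
        Bucket=DESTINATION_BUCKET,
        Key=key,
    )
    return key
```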

(or, the Python thumbnail variant:)

import boto3
import os
import sys
import uuid
from urllib.parse import unquote_plus
from PIL import Image
import PIL.Image

s3_client = boto3.client('s3')

def resize_image(image_path, resized_path):
    with Image.open(image_path) as image:
        image.thumbnail(tuple(x / 2 for x in image.size))
        image.save(resized_path)

def lambda_handler(event, context):
    for record in event['Records']:
        bucket = record['s3']['bucket']['name']
        key = unquote_plus(record['s3']['object']['key'])
        tmpkey = key.replace('/', '')
        download_path = '/tmp/{}{}'.format(uuid.uuid4(), tmpkey)
        upload_path = '/tmp/resized-{}'.format(tmpkey)
        s3_client.download_file(bucket, key, download_path)
        resize_image(download_path, upload_path)
        s3_client.upload_file(upload_path, '{}-resized'.format(bucket), key)

Select Deploy to save the code.

Adding Triggers to Lambda Function:
select S3
select source bucket

Test Lambda Function:
upload file to source bucket.

You can find the file in Destination bucket.
