Automatic Resizing of Uploaded Images


When customers upload images to your website, they can come in all sorts of sizes. To improve the performance of your website, you often want to make sure the images are served at the right resolution.

In this video, we’ll learn to add a hook to the S3 bucket upload to trigger a Lambda which resizes the image and saves it back to S3.

We’re going to be starting with the code from the previous image upload API video. If you want to follow along then you can start with the code from this repo:
https://github.com/SamWSoftware/ServerlessYoutubeSeries/tree/l32-image-upload

Updating the Existing Code

We’re going to start by making a few modifications to the existing code. The first thing is to update the S3.write function slightly, adding some extra parameters and updating how it handles the data.

We’re going to start by adding two extra parameters, ACL and ContentType, and attaching them directly to the params.

async write(data, fileName, bucket, ACL, ContentType) {
    const params = {
        Bucket: bucket,
        Body: JSON.stringify(data),
        Key: fileName,
        ACL,
        ContentType,
    };

Next, we’re going to update the Body field on the params to allow us to send up a buffer as well as JSON. To do this we need to check whether the data is a buffer or JSON and handle it accordingly.

const params = {
    Bucket: bucket,
    Body: Buffer.isBuffer(data) ? data : JSON.stringify(data),
    Key: fileName,
    ACL,
    ContentType,
};
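For context, here’s how the rest of the write function uses those params. This is a sketch, assuming the common/S3 module wraps the AWS SDK v2 S3 client as in the earlier videos; the error message is illustrative:

import AWS from 'aws-sdk';

const s3Client = new AWS.S3();

const S3 = {
    async write(data, fileName, bucket, ACL, ContentType) {
        const params = {
            Bucket: bucket,
            // send buffers (e.g. images) as-is, stringify everything else
            Body: Buffer.isBuffer(data) ? data : JSON.stringify(data),
            Key: fileName,
            ACL,
            ContentType,
        };

        // write the object and surface any failure to the caller
        const newData = await s3Client.putObject(params).promise();
        if (!newData) {
            throw Error('there was an error writing that data');
        }
        return newData;
    },
    // ...get and the other helpers live here too
};

export default S3;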

We also need to make a very small change to the imageUpload.js file. In this video we want both the uploaded image and the resized images to be in the same bucket. To do this effectively we want to have a folder for each. This simply involves adding the folder name at the start of the key in the image upload.

const key = `uploads/${name}.${detectedExt}`;

Image Resize Lambda

With all of that done, we can move on to creating the image resize Lambda. Create a new file at lambdas/endpoints/imageResize.js and open it up. We’ll start with a normal Lambda handler, importing the Responses from our common folder.

import Responses from '../common/API_Responses';

exports.handler = async event => {
};

The way we’re going to trigger this Lambda is different from normal: we’re going to use S3 hooks instead of API Gateway. That means the event shape is different too. In this case it has a Records field that we need to use. We also want to wrap the next bit of logic in a try/catch for good measure.

const { Records } = event;

try {

} catch (error) {
    console.log('error in try catch', error);
    return Responses._400();
}
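For reference, a trimmed-down S3 ObjectCreated event looks something like this (the bucket name and key here are placeholders — yours will be different):

{
    "Records": [
        {
            "eventName": "ObjectCreated:Put",
            "s3": {
                "bucket": { "name": "my-image-upload-bucket" },
                "object": { "key": "uploads/profile-picture.jpg" }
            }
        }
    ]
}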

We can now start on the code in that try block. We’re going to map over each record, getting some details from it before calling a resizeImage function that we’ll write next. Each call returns a Promise, so we end up with an array of promises and wait for all of them to complete.

const promArray = Records.map(record => {
    const bucket = record.s3.bucket.name;
    const file = record.s3.object.key;
    const width = 300;
    const height = 300;
    return resizeImage({ bucket, file, width, height });
});

await Promise.all(promArray);

return Responses._200();

We’ve also defined a width and height, which will be the maximum width and height of the image once we’ve resized it.

Next, we make the resizeImage function. This function needs to get the image from S3, convert it to a format that we can resize, do the resizing and then save back to the S3 bucket.

Getting the image buffer from S3 is pretty easy.

const resizeImage = async ({ bucket, file, width, height }) => {
    const imageBuffer = await S3.get(file, bucket);

};
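If you’re wondering what S3.get does, it’s most likely just a thin wrapper around getObject — again a sketch, assuming it lives in the same common/S3 module as the write function above:

async get(fileName, bucket) {
    const params = {
        Bucket: bucket,
        Key: fileName,
    };

    // returns the full response object; the file contents are on the Body property
    return s3Client.getObject(params).promise();
}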

Now we need to use an NPM package called Jimp which is a JavaScript image processing library. Make sure to install it with npm install --save jimp and then import it at the top of the file, along with the S3 object from common.

import S3 from '../common/S3';
import jimp from 'jimp';

With this jimp package installed we can use it to resize the image.

const jimpImage = await jimp.read(imageBuffer.Body);
const mime = jimpImage.getMIME();

const resizedImageBuffer = await jimpImage.scaleToFit(width, height).getBufferAsync(mime);

As I said before, the width and height are the maximum dimensions that the image can be. The scaleToFit function keeps the aspect ratio of the image consistent. For example, an 800×532 image scaled to fit within 300×300 comes out at 300×200: the limiting scale factor is 300/800 = 0.375, and 532 × 0.375 ≈ 200.

The last thing to do is to write this back to the S3 bucket. To do this we need a new key. We’ll take the last part of the original file key and add some details about the resized image in the new folder structure.

const shortFileName = file.split('/')[1];
const newFileName = `resized/${width}x${height}/${shortFileName}`;

await S3.write(resizedImageBuffer, newFileName, bucket, 'public-read', mime);
return newFileName;

This turns uploads/profile-picture.jpg into resized/300x300/profile-picture.jpg.

The changes we made to S3.write at the start of this article are used here: passing a buffer as the data, along with the ACL and content type.
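Putting all of those pieces together, the complete imageResize.js looks like this:

import Responses from '../common/API_Responses';
import S3 from '../common/S3';
import jimp from 'jimp';

const resizeImage = async ({ bucket, file, width, height }) => {
    // get the original image buffer from S3
    const imageBuffer = await S3.get(file, bucket);

    // read it into jimp, resize it and get a buffer back in the same format
    const jimpImage = await jimp.read(imageBuffer.Body);
    const mime = jimpImage.getMIME();
    const resizedImageBuffer = await jimpImage.scaleToFit(width, height).getBufferAsync(mime);

    // write the resized image back to the bucket under the resized/ folder
    const shortFileName = file.split('/')[1];
    const newFileName = `resized/${width}x${height}/${shortFileName}`;
    await S3.write(resizedImageBuffer, newFileName, bucket, 'public-read', mime);
    return newFileName;
};

exports.handler = async event => {
    const { Records } = event;

    try {
        const promArray = Records.map(record => {
            const bucket = record.s3.bucket.name;
            const file = record.s3.object.key;
            const width = 300;
            const height = 300;
            return resizeImage({ bucket, file, width, height });
        });

        await Promise.all(promArray);

        return Responses._200();
    } catch (error) {
        console.log('error in try catch', error);
        return Responses._400();
    }
};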

Updating serverless.yml

To add this as a function with an S3 trigger we need to go back to our serverless.yml file and find the end of the list of functions.

This function config starts in a very similar way to normal, but the events section doesn’t contain an http event like usual. That’s because this Lambda is triggered by S3, not an API endpoint.

imageResize:
    handler: lambdas/endpoints/imageResize.handler
    events:
        - s3:
              bucket: ${self:custom.imageUploadBucket}
              event: s3:ObjectCreated:*
              rules:
                  - prefix: uploads/
              existing: true

The new s3 event only really needs the name of the bucket. The rest of the config limits when the trigger is fired. We only want to resize new images as they are uploaded, so we listen for the s3:ObjectCreated:* event and specify that the file has to be uploaded to uploads/.

The last thing is the existing: true flag. If you’re using an existing bucket and you don’t add this, the deployment will try to create a new bucket with the same name. This obviously fails and causes your deployment to fail too.
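As a side note, the rules also support suffix matching. If, hypothetically, you only wanted to trigger on JPEG uploads, you could add a suffix rule alongside the prefix (not something we do in this video):

rules:
    - prefix: uploads/
    - suffix: .jpg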

Testing it all out

Once we’ve deployed it with sls deploy we can test it out. Make sure to get the image upload URL from the outputs. We can then head over to the image upload React app that we made in a previous video. If you don’t have your own then you can use this one I’ve made.

When we paste the URL into the file uploader and select an image, we should see the image displayed. If we go into our AWS console and find our S3 bucket, we should see an uploads/ folder and a resized/ folder. In the uploads folder we can find our original uploaded image; in my case it’s 82KB and 800×532 pixels.

We also find the image in the resized/300x300 folder. This one is slightly smaller at 76.8KB, and if we download it we can see that the image has been resized.
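If you prefer the terminal, you can also verify that the resized object exists with the AWS CLI (substituting your own bucket name for the placeholder):

aws s3 ls s3://my-image-upload-bucket/resized/300x300/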

Review

There is a load more you could do with this. You could use jimp to do other image manipulations such as recolouring, rotating or cropping.

You could also use the S3 Lambda hook to do much more; it doesn’t have to be images that you alter with this method. You could convert .txt files to PDF, automatically zip large files, or much more.
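As a rough illustration of those other manipulations (not part of this video), jimp lets you chain operations before getting the buffer back; the numbers here are arbitrary:

const image = await jimp.read(imageBuffer.Body);

const editedBuffer = await image
    .rotate(90) // rotate clockwise by 90 degrees
    .greyscale() // drop the colour information
    .crop(0, 0, 200, 200) // keep the top-left 200x200 square
    .getBufferAsync(image.getMIME());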

Sam Williams

Sam is a Serverless Obsessive who runs Complete Coding, helping developers learn Serverless and companies make the most of the competitive advantage that Serverless gives them. Previous projects include:

- Designing a chat platform that currently resolves over 250,000 customer inquiries a month for international retailers and local government
- Architecting a backend system to support a 3D clothing visualisation tool
- Building a solution to manage millions of dollars of communication software
- Designing a "Video Editing in the Cloud" platform, enabling everyone from movie studios to indie film makers to harness the power of the Cloud, without needing to be cloud experts
- Launching a profitable Serverless Startup in under 30 days

He has also been teaching cloud-based software development for 5 years and has taught Serverless development to thousands of people. The Complete Coding YouTube channel now has over 15,000 subscribers and over 1 million views.
