
Uploading and Deleting Files with Amazon S3 and Spring Boot

Today I am going to share how to upload and delete files in Amazon S3 with Spring Boot. Amazon Simple Storage Service (S3) is an object storage platform that stores files as objects and lets us store and retrieve any amount of data from anywhere. Each file stored in Amazon S3 is an object identified by a key.

We are going to use the AWS Java SDK, which provides the APIs for working with files stored in an S3 bucket.

Go to https://start.spring.io/ and create a new Spring Boot project with the Artifact set to AmazonS3 (or whatever you prefer), then open your pom.xml and add this dependency.




<!-- https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk -->
<dependency>
   <groupId>com.amazonaws</groupId>
   <artifactId>aws-java-sdk</artifactId>
   <version>1.11.598</version>
</dependency>
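
If you prefer a lighter dependency, the AWS SDK is also published as per-service modules, so you could pull in just the S3 module instead of the whole SDK (an optional alternative, not required for this tutorial):

<!-- https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk-s3 -->
<dependency>
   <groupId>com.amazonaws</groupId>
   <artifactId>aws-java-sdk-s3</artifactId>
   <version>1.11.598</version>
</dependency>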


Amazon S3 Account Configuration
Please follow the instructions given in the Amazon S3 official documentation for creating and configuring the S3 account and bucket.

The official documentation is available here: https://docs.aws.amazon.com/AmazonS3/latest/gsg/GetStartedWithS3.html


Sign up with Amazon
First, go to https://aws.amazon.com/s3/ and sign up for the Amazon S3 service. Follow the instructions in the sign-up flow and complete the registration.

AWS will notify you by email when your account is active and ready to use.

Creating S3 bucket
Log in to https://console.aws.amazon.com/s3 and click “Create bucket” on the screen below.


Don't forget to turn Block all public access OFF; otherwise the PublicRead ACL we set later when uploading files will be rejected.



The newly created bucket will then be listed as follows.



Now let's create an IAM user

Go to https://console.aws.amazon.com/iam/ . In the navigation pane, choose Users and then choose Add user.

Enter a user name and choose Programmatic access as the Access type:


Press the Next: Permissions button to go to Set permissions for UserBucket.

Now choose Attach existing policies directly, filter policies by “s3”, and check AmazonS3FullAccess. Then click “Next: Review”.




Once you have reviewed the details, press “Create user“. Now you have successfully created the user.


Click Download .csv to download the credentials. What we need from this file are the Access key ID and the Secret access key.


Now let's open the Spring Boot Amazon S3 project which we generated earlier at https://start.spring.io/.


Final Project Structure

Your final project structure should look like the one below.



Application.properties
Add the AWS S3 bucket related configuration to application.properties as follows. You will find the credential values in the .csv file you downloaded earlier. Change the properties accordingly.


There are also Max file size and Max Request Size settings; you can change them if you want to upload big files. And do not forget your region: in my case it is N. Virginia (us-east-1).


spring.servlet.multipart.enabled=true
# Threshold after which files are written to disk.
spring.servlet.multipart.file-size-threshold=2KB
# Max file size.
spring.servlet.multipart.max-file-size=200MB
# Max Request Size
spring.servlet.multipart.max-request-size=215MB

aws.access.key.id = your_access_key_id
aws.access.key.secret = your_secret_key
aws.region = us-east-1
aws.s3.audio.bucket = buckettesters


Let's get started by creating AmazonS3Config.java inside the Model folder.



package com.example.AmazonS3.Model;

import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

import com.amazonaws.auth.AWSCredentialsProvider;
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.regions.Region;
import com.amazonaws.regions.Regions;

@Configuration
public class AmazonS3Config
{
   @Value("${aws.access.key.id}")
   private String awsKeyId;

   @Value("${aws.access.key.secret}")
   private String awsKeySecret;

   @Value("${aws.region}")
   private String awsRegion;

   @Value("${aws.s3.audio.bucket}")
   private String awsS3AudioBucket;

   @Bean(name = "awsKeyId")
   public String getAWSKeyId() {
       return awsKeyId;
   }

   @Bean(name = "awsKeySecret")
   public String getAWSKeySecret() {
       return awsKeySecret;
   }

   @Bean(name = "awsRegion")
   public Region getAWSRegion() {
       return Region.getRegion(Regions.fromName(awsRegion));
   }

   @Bean(name = "awsCredentialsProvider")
   public AWSCredentialsProvider getAWSCredentials() {
       BasicAWSCredentials awsCredentials = new BasicAWSCredentials(this.awsKeyId, this.awsKeySecret);
       return new AWSStaticCredentialsProvider(awsCredentials);
   }

   @Bean(name = "awsS3AudioBucket")
   public String getAWSS3AudioBucket() {
       return awsS3AudioBucket;
   }
}


Region, AWSCredentialsProvider, and the other AWS types come from the AWS SDK dependency; the fields in this class are just representations of the properties from our application.properties file, exposed as beans so they can be injected into other components.

The @Value annotation will bind application properties directly to class fields during application initialization.

Let's create AmazonS3ClientService.java in the Service folder



package com.example.AmazonS3.Service;

import org.springframework.web.multipart.MultipartFile;
public interface AmazonS3ClientService
{
   void uploadFileToS3Bucket(MultipartFile multipartFile, boolean enablePublicReadAccess);

   void deleteFileFromS3Bucket(String fileName);
}

AmazonS3ClientServiceImpl.java




package com.example.AmazonS3.Service;

import com.amazonaws.services.s3.model.CannedAccessControlList;
import com.amazonaws.services.s3.model.DeleteObjectRequest;
import com.amazonaws.services.s3.model.PutObjectRequest;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.scheduling.annotation.Async;
import org.springframework.stereotype.Component;

import com.amazonaws.AmazonServiceException;
import com.amazonaws.auth.AWSCredentialsProvider;
import com.amazonaws.regions.Region;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import org.springframework.web.multipart.MultipartFile;

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;

@Component
public class AmazonS3ClientServiceImpl implements AmazonS3ClientService
{
   private String awsS3AudioBucket;
   private AmazonS3 amazonS3;
   private static final Logger logger = LoggerFactory.getLogger(AmazonS3ClientServiceImpl.class);

   @Autowired
   public AmazonS3ClientServiceImpl(Region awsRegion, AWSCredentialsProvider awsCredentialsProvider, String awsS3AudioBucket)
   {
       this.amazonS3 = AmazonS3ClientBuilder.standard()
               .withCredentials(awsCredentialsProvider)
               .withRegion(awsRegion.getName()).build();
       this.awsS3AudioBucket = awsS3AudioBucket;
   }

   @Async
   public void uploadFileToS3Bucket(MultipartFile multipartFile, boolean enablePublicReadAccess)
   {
       String fileName = multipartFile.getOriginalFilename();
       //creating the file in the server (temporarily)
       File file = new File(fileName);

       try {
           try (FileOutputStream fos = new FileOutputStream(file)) {
               fos.write(multipartFile.getBytes());
           }

           PutObjectRequest putObjectRequest = new PutObjectRequest(this.awsS3AudioBucket, fileName, file);

           if (enablePublicReadAccess) {
               putObjectRequest.withCannedAcl(CannedAccessControlList.PublicRead);
           }
           this.amazonS3.putObject(putObjectRequest);
       } catch (IOException | AmazonServiceException ex) {
           logger.error("error [" + ex.getMessage() + "] occurred while uploading [" + fileName + "] ");
       } finally {
           //removing the file created in the server, even when the upload fails
           file.delete();
       }
   }

   @Async
   public void deleteFileFromS3Bucket(String fileName)
   {
       try {
           amazonS3.deleteObject(new DeleteObjectRequest(awsS3AudioBucket, fileName));
       } catch (AmazonServiceException ex) {
           logger.error("error [" + ex.getMessage() + "] occurred while removing [" + fileName + "] ");
       }
   }
}

The two methods responsible for uploading and deleting files are annotated with @Async to make them asynchronous. That means those methods are executed in a background thread other than the main thread (the thread responsible for serving the request).
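
Keep in mind that Spring only runs @Async methods in the background when asynchronous processing is enabled. If it isn't enabled yet, add @EnableAsync to the main class; a minimal sketch, assuming the class generated by start.spring.io is called AmazonS3Application:

package com.example.AmazonS3;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.scheduling.annotation.EnableAsync;

//enables background execution of the @Async upload and delete methods
@EnableAsync
@SpringBootApplication
public class AmazonS3Application {

   public static void main(String[] args) {
       SpringApplication.run(AmazonS3Application.class, args);
   }
}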

When uploading files, the code segment below is responsible for adding the PublicRead permission to the file being uploaded. It means that anyone who has the file URL can access the file.

putObjectRequest.withCannedAcl(CannedAccessControlList.PublicRead);
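
If you also want to hand that URL back to the caller, the SDK can build it for you. Here is a minimal sketch of a helper you could add to AmazonS3ClientServiceImpl (the method returnPublicUrl is hypothetical; amazonS3 and awsS3AudioBucket are the fields already defined in that class):

   //hypothetical helper: builds the public URL of an object uploaded with the PublicRead ACL
   public String returnPublicUrl(String fileName) {
       return this.amazonS3.getUrl(this.awsS3AudioBucket, fileName).toString();
   }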

FileHandlerController.java



package com.example.AmazonS3.Controller;



import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.*;
import org.springframework.web.multipart.MultipartFile;

import java.io.IOException;

import java.util.HashMap;
import java.util.Map;

import com.example.AmazonS3.Service.AmazonS3ClientService;

@RestController
@RequestMapping("/s3bucket")
public class FileHandlerController {

   @Autowired
   private AmazonS3ClientService amazonS3ClientService;

   @PostMapping("uploadone")
   public Map<String, String> uploadSingleFile(@RequestPart(value = "file") MultipartFile file)
   {

       System.out.println("get data..");
       this.amazonS3ClientService.uploadFileToS3Bucket(file, true);

       Map<String, String> response = new HashMap<>();
       response.put("message", "file [" + file.getOriginalFilename() + "] uploading request submitted successfully.");

       return response;
   }

   @PostMapping("uploadmultiple")
   public String uploadMultipleFiles(@RequestParam("files") MultipartFile[] files) throws IOException
   {
       for(MultipartFile uploadedFile : files) {
           System.out.println(uploadedFile.getOriginalFilename());
           uploadSingleFile(uploadedFile);
       }

       return "OK";
   }

   @DeleteMapping
   public Map<String, String> deleteFile(@RequestParam("file_name") String fileName)
   {
       this.amazonS3ClientService.deleteFileFromS3Bucket(fileName);

       Map<String, String> response = new HashMap<>();
       response.put("message", "file [" + fileName + "] removing request submitted successfully.");

       return response;
   }


   @DeleteMapping("delete")
   public String deleteFiles(@RequestBody Map<String, Object> userMap)
   {
       String metaInfoId = (String) userMap.get("metaInfoId");

       deleteFile(metaInfoId);
      
       return "OK -->"+metaInfoId;
   }
}

Let's test it with Postman by uploading just one file.
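
For reference, assuming the application runs on the default port 8080, these are the endpoints exposed by FileHandlerController:

POST   http://localhost:8080/s3bucket/uploadone       (form-data, key "file")
POST   http://localhost:8080/s3bucket/uploadmultiple  (form-data, key "files")
DELETE http://localhost:8080/s3bucket?file_name=...   (removes a single file)
DELETE http://localhost:8080/s3bucket/delete          (JSON body like {"metaInfoId": "..."})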







You can download the project from GitHub here: https://github.com/juandavidmarin368/SpringBoot-AmazonS3-AWS



