Using AWS S3 with Python boto3

Ansaba · Last Updated: 29 Dec, 2022

This article was published as a part of the Data Science Blogathon.

[Image: AWS S3. Source: https://python.plainenglish.io/rename-and-move-s3-object-using-python-boto3-80bd07075e47]

Introduction

AWS S3 is one of the object storage services offered by Amazon Web Services (AWS). It allows users to store and retrieve files quickly and securely from anywhere, and it can be combined with other services to build numerous scalable applications. Boto is the Python SDK (software development kit) for AWS. The family includes the original hand-coded Boto, the low-level Botocore library, and Boto3, the current SDK and a ground-up rewrite of Boto. Through the boto3 Python library, users can connect to Amazon services, including S3, and use those resources from within AWS. It helps developers create, configure, and manage AWS services, making it easy to integrate them with Python applications, libraries, or scripts. This article covers how boto3 works and how it helps interact with S3 operations such as creating, listing, and deleting buckets and objects.

What is boto3?

Boto3 is a Python SDK or library that can manage and access various AWS services, such as Amazon S3, EC2, DynamoDB, SQS, CloudWatch, etc., through Python scripts. Boto3 takes a data-driven approach, generating its classes at runtime from JSON description files that are shared across AWS SDKs. Because Boto3 is generated from these shared JSON files, users get fast updates to the latest services and a consistent API across services. It provides an object-oriented, easy-to-use API as well as low-level direct service access.

Key Features of boto3

  • It is built on top of botocore, a Python library used to send API requests to AWS and receive responses from the service.
  • Supports modern Python 3 natively (earlier releases also supported Python 2.7 and 3.4+).
  • Provides sessions and per-session credentials and configuration, along with essential components like authentication, parameter handling, and response handling.
  • Offers a consistent and up-to-date interface across services.


Working with AWS S3 and Boto3

[Image: AWS S3. Source: https://dashbird.io/blog/boto3-aws-python/]

Using the Boto3 library or SDK with Amazon S3 allows users to create, delete, and update S3 buckets, objects, bucket policies, etc., from Python programs or scripts quickly. Boto3 has two abstractions, namely client and resource. Clients provide a low-level interface whose methods map one-to-one to the underlying AWS API operations, whereas resources are a higher-level, object-oriented abstraction built on top of clients, as sketched below.
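For illustration, here is how the same listing task looks with each abstraction. This is a minimal sketch, and the bucket name my-example-bucket is a placeholder for one of your own.

# Low-level client: methods map one-to-one to S3 API operations
import boto3
s3_client = boto3.client('s3')
response = s3_client.list_objects_v2(Bucket='my-example-bucket')
for obj in response.get('Contents', []):
    print(obj['Key'])

# Higher-level resource: buckets and objects behave like Python objects
s3_resource = boto3.resource('s3')
bucket = s3_resource.Bucket('my-example-bucket')
for obj in bucket.objects.all():
    print(obj.key)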

Installing boto3 and Building an AWS S3 Client

Installing boto3 to your application:
On the terminal, run:
pip list
The above command lists the installed packages. If boto3 is not installed, install it with:
pip install boto3
Build an S3 client to access the service methods:
Create an S3 client to access objects stored in the S3 environment, supplying credentials via aws_access_key_id and aws_secret_access_key. An Access Key and Secret Key are required to access the S3 bucket and to run the following code.
# Import the necessary packages
import boto3

# Build a low-level S3 client
# Note: hard-coded keys are shown for illustration only; prefer
# environment variables or the shared credentials file in practice
s3 = boto3.client(
    's3',
    aws_access_key_id='your_aws_access_key_id',
    aws_secret_access_key='your_aws_secret_access_key',
    region_name='your_aws_region_name'
)
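Hard-coded keys can leak easily, so boto3 can also resolve credentials automatically. A minimal sketch, assuming credentials have already been configured via environment variables, the ~/.aws/credentials file, or an IAM role (the profile name below is illustrative):

import boto3

# No keys in code: boto3 searches the default credential chain
# (environment variables, shared credentials file, instance role)
s3 = boto3.client('s3')

# Alternatively, pick a named profile from ~/.aws/credentials
session = boto3.Session(profile_name='default')
s3 = session.client('s3')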


AWS S3 Operations With boto3


Creating buckets:

To create an S3 bucket, use the create_bucket() method with the Bucket and ACL parameters. ACL stands for Access Control List, which manages access to S3 buckets and objects. Note that bucket names must be globally unique across all of AWS.
my_bucket = "enter your s3 bucket name that has to be created"
bucket = s3.create_bucket(
ACL='private',
Bucket= my_bucket
)
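One caveat: in any region other than us-east-1, the S3 API also requires a CreateBucketConfiguration argument naming the target region. A sketch, with the region name purely illustrative:

# Outside us-east-1, the target region must be stated explicitly
bucket = s3.create_bucket(
    ACL='private',
    Bucket=my_bucket,
    CreateBucketConfiguration={'LocationConstraint': 'eu-west-1'}
)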
Listing buckets:
To list all the available buckets, use the list_buckets() method.
bucket_response = s3.list_buckets()
# Output the bucket names
print('Existing buckets are:')
for bucket in bucket_response['Buckets']:
    print(f'  {bucket["Name"]}')
Deleting Buckets:
A bucket in S3 can be deleted using the delete_bucket() method. The bucket must be empty (contain no objects) before it can be deleted.
my_bucket = "enter your s3 bucket name that has to be deleted"
response = s3.delete_bucket(Bucket= my_bucket)
print("Bucket has been deleted successfully !!!")
Listing the files from a bucket:
Files or objects in an S3 bucket can be listed using the list_objects() or list_objects_v2() methods.
my_bucket = "enter your s3 bucket name from which objects or files has to be listed out"
response = s3.list_objects(Bucket= my_bucket,
                           MaxKeys=10, 
                           Preffix="only_files_starting_with_this_string")
The MaxKeys argument caps the number of objects returned, and the Prefix argument restricts the listing to objects whose keys (names) start with the given string.
Another way to list objects:
s3 = boto3.client("s3")
my_bucket = "enter your s3 bucket name from which objects or files have to be listed"
response = s3.list_objects_v2(Bucket=my_bucket)
# Use a default of [] so an empty bucket does not raise an error
files = response.get("Contents", [])
for file in files:
    print(f"file_name: {file['Key']}, size: {file['Size']}")
Uploading files:
To upload a file to an S3 bucket, use the upload_file() method with the following parameters:
  • Filename: the path of the local file to be uploaded
  • Key: the unique identifier for the object within the bucket
  • Bucket: the name of the bucket to which the file is uploaded
my_bucket = "enter your bucket name to which files has to be uploaded"
file_name = "enter your file path name to be uploaded"
key_name = "enter unique identifier"
s3.upload_file(Filename= file_name, Bucket= my_bucket, Key= key_name)
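upload_file() also accepts an optional ExtraArgs dictionary for per-object settings; the values below are illustrative, not required:

# Set the content type and request server-side encryption on upload
s3.upload_file(
    Filename=file_name,
    Bucket=my_bucket,
    Key=key_name,
    ExtraArgs={'ContentType': 'text/csv', 'ServerSideEncryption': 'AES256'}
)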
Downloading files:
To download a file or object from a bucket to local storage, use the download_file() method with the Key, Bucket, and Filename parameters, where Filename is the local path at which to save the object.
my_bucket = "enter your s3 bucket name from which object or files has to be downloaded"
file_name = "enter file to be downloaded"
key_name = "enter unique identifier"
s3.download_file(Filename= file_name, Bucket= my_bucket, Key= key_name)
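To skip the filesystem and download an object straight into memory, download_fileobj() writes into any file-like object; a minimal sketch:

import io

# Download the object into an in-memory buffer
buffer = io.BytesIO()
s3.download_fileobj(Bucket=my_bucket, Key=key_name, Fileobj=buffer)
data = buffer.getvalue()  # raw bytes of the object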
Deleting files:
To delete a file or object from a bucket, use the delete_object() method with Key and Bucket parameters.
my_bucket = "enter your s3 bucket name from which objects or files has to be deleted"
key_name = "enter unique identifier"
s3.delete_object(Bucket= my_bucket, Key= key_name)
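To remove several objects in one round trip, delete_objects() accepts up to 1,000 keys per request; the key names here are placeholders:

# Batch-delete multiple objects with a single API call
response = s3.delete_objects(
    Bucket=my_bucket,
    Delete={'Objects': [{'Key': 'file1.txt'}, {'Key': 'file2.txt'}]}
)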
Get the object’s metadata:
To get a file or object's details, such as last modification time, storage class, and content length (size in bytes), use the head_object() method with the Key and Bucket parameters.
my_bucket = "enter your s3 bucket name from which objects or file's metadata has to be obtained"
key_name = "enter unique identifier"
response = s3.head_object(Bucket= my_bucket, Key= key_name)
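The response is a plain dictionary; a few commonly read fields are shown below (StorageClass may be absent for the default STANDARD class, hence the .get()):

print('Size in bytes :', response['ContentLength'])
print('Last modified :', response['LastModified'])
print('Content type  :', response['ContentType'])
print('Storage class :', response.get('StorageClass', 'STANDARD'))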

Conclusion

AWS S3 is one of the most reliable, flexible, and durable object storage systems, allowing users to store and retrieve data at scale. AWS defines boto3 as a Python library or SDK (Software Development Kit) to create, manage, and configure AWS services, including S3. Boto3 lets your applications and services operate AWS services programmatically.
Key Takeaways:
  • AWS S3 is an object storage service that helps store and retrieve files quickly and securely.
  • Boto3 is a Python SDK or library that can manage Amazon S3, EC2, DynamoDB, SQS, CloudWatch, etc.
  • Boto3 clients provide a low-level interface to AWS services, whereas resources are a higher-level abstraction than clients.
  • Using the Boto3 library with Amazon S3 allows users to create, list, delete, and update S3 buckets, objects, and bucket policies from Python programs or scripts.

The media shown in this article is not owned by Analytics Vidhya and is used at the Author’s discretion.

I'm a cloud enthusiast with a degree in computer science. My skills include C, C++, PHP, Python, cloud services (AWS, GCP), shell scripting, and MySQL. I have hands-on experience through several projects and certifications in Python, PHP, MySQL, cloud (GCP, AWS), and C++.

