Vulnerable Cloud File Storage on Amazon S3


Lab Scenario

In this lab, you will take on the role of a developer tasked with creating a web application that allows users to upload and store files securely using Amazon S3. As part of the project, you will set up a vulnerable S3 bucket to serve as the file storage repository for the application.

However, in the rush to meet deadlines and without adequate knowledge of AWS security best practices, developers often end up misconfiguring the S3 bucket. This misconfiguration can lead to serious security vulnerabilities, potentially exposing sensitive data and making it susceptible to unauthorized access or data breaches.

Your challenge in this lab is to navigate through the misconfiguration pitfalls and identify the security weaknesses in the AWS S3 bucket. You will learn about common misconfigurations that can occur and the potential risks associated with each. By doing so, you will gain a deeper understanding of how critical it is to follow proper security measures when deploying Cloud File Storage solutions on AWS S3.

Remember, this lab is designed to be a learning experience. Take the opportunity to explore AWS security features and best practices, ensuring that you develop secure applications and protect your users' data effectively.

Project Objective

This project demonstrates the implementation of vulnerable cloud file storage on AWS S3. Its purpose is to assist developers and cloud security engineers in comprehending security vulnerabilities when creating a web application that enables users to upload and store files on Amazon S3. By the end of this project, you will have acquired knowledge on preventing misconfigurations and implementing secure coding practices.


Let's dive in 🚀.

PART A: Setting up an AWS S3 Bucket

Step 1: Setup IAM User Account

In many cases, developers unintentionally set up vulnerable AWS S3 buckets, putting customer or user data at risk. In this lab, you will create an S3 bucket that is publicly accessible to anyone on the Internet.

Create an AWS free tier account if you don't have one already.

  • Sign in to your AWS Management Console, then navigate to the S3 service under Services, or search for it from the search bar at the top.

  • Create a new S3 bucket that will be used to store the uploaded files.

Create an IAM User Account with Full Access to S3 bucket and Cloud Shell

  • Navigate to IAM on your Console to create a User

  • On the Add user page, enter any user name of your choice (here, James), then check the "Provide user access to the AWS Management Console" box and select the "I want to create an IAM user" radio button. Then uncheck the box "Users must create a new password at next sign-in", retain the other default settings, and click Next.

  • On the Set permissions page, click the radio button “Attach policies directly” and search and select AmazonS3FullAccess and AWSCloudShellFullAccess, then click Next

  • On the Review and create page, review the configuration and click Create user.

  • Once the IAM user has been successfully created, click on Download .csv file.

Step 2: Creating a Programmatic Access Key for the IAM User

  • On the User page, click on the just-created user (here, James)

  • On the User Info Page (here, James), click on the Security Credentials Tab

  • Next, scroll down to Access keys and click on Create access key

  • On the Access Key Best Practices & Alternatives page, select Command Line Interface (CLI), then scroll down to check the confirmation box “I understand the above recommendation and want to proceed to create an access key”. Then click Next

  • You can set a description tag, but it is optional. Next, click on the button Create Access Key.

  • Access keys have been successfully created. Click on Download.csv file.

Step 3: Creating a Vulnerable S3 Bucket with Public Access

  • Open an incognito window and sign in to the AWS Management Console with the IAM user details from the previous steps (or from the downloaded .csv file).

Copy the console sign-in URL and paste it into the incognito window.

  • Once Signed in to the management console, navigate to S3 by searching from the top search bar

  • On the S3 Page, click on Create Bucket.

  • On Create Bucket page, enter any bucket name of your choice (here, my-d3m0-files)

  • Under Object Ownership, select the radio button for ACLs Enabled and Object writer

  • Under Block Public Access settings for this bucket, uncheck Block all public access and check the box “I acknowledge that the current settings might result in this bucket and the objects within becoming public.”

  • Retain other default settings, scroll down and click on Create Bucket.

  • The S3 bucket (my-d3m0-files) has been successfully created. Next, click on the bucket name my-d3m0-files to view the bucket information.

  • On the bucket (my-d3m0-files) info page, click on Permissions Tab and scroll down to the Access Control List (ACL)

  • On the Access control list (ACL) Section, click on the Edit button

  • On the Edit Access Control List (ACL) page, select the Read checkboxes for Objects and Object ACL next to Everyone (public access). Then Scroll down and select the acknowledgment checkbox.

NOTE: Copy the Group URL for Everyone (public access)

  • Then scroll down and click on Save Changes.

Step 4: Enabling Full Access Permission to the Bucket

  • Navigate to the CloudShell icon at the top-right corner of the AWS console and Execute the command below.

NB: the Group URL for Everyone (public access) that you previously copied is used as the "uri=" value in the command below

NB: REPLACE “my-d3m0-files” in the command below with your bucket name

aws s3api put-bucket-acl --bucket my-d3m0-files --grant-write-acp uri=http://acs.amazonaws.com/groups/global/AllUsers

The above command, when executed in CloudShell, performs the put-bucket-acl operation on the S3 bucket named my-d3m0-files. Specifically, it grants the WRITE_ACP permission (the ability to modify the bucket's access control list) to the predefined group AllUsers, which includes all AWS accounts and anonymous users. This means that anyone, including the public, can modify the bucket's ACL.

aws s3api put-bucket-acl --bucket my-d3m0-files --grant-full-control uri=http://acs.amazonaws.com/groups/global/AllUsers

NB: REPLACE “my-d3m0-files” in the above command with your bucket name

The above command, when executed in CloudShell, sets an ACL on the S3 bucket named my-d3m0-files that grants full control over the bucket and its objects to all users, including anonymous users (public access).
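The effect of these commands can also be audited programmatically: `aws s3api get-bucket-acl` returns a JSON document of grants, and any grant to the AllUsers group means public access. Below is a small sketch that scans such a response for public grants; the sample response is illustrative, shaped like real get-bucket-acl output.

```python
ALL_USERS_URI = "http://acs.amazonaws.com/groups/global/AllUsers"

def public_grants(acl_response):
    # Return the permissions granted to Everyone (public access)
    # in a get-bucket-acl style response.
    return [
        grant["Permission"]
        for grant in acl_response.get("Grants", [])
        if grant.get("Grantee", {}).get("URI") == ALL_USERS_URI
    ]

# Illustrative response shaped like the output of get-bucket-acl:
sample_acl = {
    "Grants": [
        {"Grantee": {"Type": "CanonicalUser", "ID": "owner-id"},
         "Permission": "FULL_CONTROL"},
        {"Grantee": {"Type": "Group", "URI": ALL_USERS_URI},
         "Permission": "FULL_CONTROL"},
    ]
}

print(public_grants(sample_acl))  # -> ['FULL_CONTROL']
```

Any non-empty result from a real response indicates the bucket is publicly accessible.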

  • Navigate back to the bucket (my-d3m0-files) info page, select the Permissions tab, then scroll down to the Access Control List (ACL) and click on Edit.

  • On the Edit page, you will notice Full public access has been granted under Grantee for Everyone (public access)

API Keys Table

The table below shows the values used to access the AWS S3 bucket in this lab:

  Placeholder                  Value
  your-access-key-id           AKIASUL6TUBYR5BLLX33
  your-secret-access-key       1IpAaADF8g9NZkJwAtmY0uwTZUFcKFLGieFzXUsI
  your-s3-bucket-name          my-d3m0-files
  your-aws-region              us-east-1
  Bucket ARN                   arn:aws:s3:::my-d3m0-files

These values replace the placeholders in the configuration snippet below:
# AWS S3 configuration
AWS_ACCESS_KEY_ID = 'your-access-key-id'
AWS_SECRET_ACCESS_KEY = 'your-secret-access-key'
AWS_REGION = 'your-aws-region'
S3_BUCKET_NAME = 'your-s3-bucket-name'
S3_BASE_URL = f'https://{S3_BUCKET_NAME}.s3.{AWS_REGION}.amazonaws.com/'

PART B: Setting up Flask in VS Code

In this lab, you will deploy Flask, a Python framework for the backend API, and set up the Flask app instance by importing the necessary libraries.

Step 1: Install Python

  • Make sure you have Python installed on your system. You can download Python from the official website (Download here) and install it. Latest Python release at the time of writing: 3.11.4

Note: Ensure you add Python to your system's PATH during the installation. Watch this video on how to add Python to Path Environment on Windows after installation.

OR You can install Python from Microsoft Store

Navigate to Your Microsoft Store, search for Python and install the version you want

  • Navigate to the command prompt and type python --version to check the installed version of Python

Step 2: Install VS Code

If you haven't already, download and install Visual Studio Code (VS Code) from the official website (Download Here).

Step 3: Create a Flask project folder

Create a new folder for your Flask project on your computer. This folder will contain your Flask application code. (in this lab, the Folder is called “Cloud API Project.”)

Step 4: Open the project folder in VS Code

Launch VS Code and open the Flask project folder you created by selecting "File" -> "Open Folder" from the menu. Select the Folder that you created in step 3 (here, Cloud API Project)

  • On the Left pane, click on Explorer or press (Ctrl + Shift + E) to view the folder opened in previous steps.

NB: inside the "Cloud API Project" folder, create a sub-folder that will contain the virtual environment files (here, it is called "Venv-files")

Step 5: Create a virtual environment

  • In VS Code's integrated terminal, navigate to your project folder by typing:

Press (Ctrl + J) to open Terminal or click on the Icon at the top

cd /path/to/your/project/folder

Replace /path/to/your/project/folder with the actual path to your project folder.

NB: The sub-folder “Venv-files” is where the virtual environment files will be stored, using Python's built-in venv module

  • In the terminal, type the command to create the virtual environment and press the ENTER key

For Windows:

python -m venv venv

For macOS and Linux:

python3 -m venv venv

NB: before running the above command, change into the sub-folder created in the previous steps ("Venv-files"). This is where the virtual environment will be created.

Step 6: Activate the virtual environment

  • To activate the virtual environment, use the following command:

For Windows (command prompt):

venv\Scripts\activate

For Windows (PowerShell):

venv\Scripts\Activate.ps1

For macOS and Linux:

source venv/bin/activate

In this lab, we are using Windows. Thus,

  • Navigate to the subfolder on your command prompt and run the command for Windows venv\Scripts\activate

NB: you have to change into the sub-folder (here, "Venv-files") where the virtual environment was created in the previous steps before executing the activation command venv\Scripts\activate

Step 7: Install Flask

  • You will notice that after running step 6, the virtual environment is activated with (venv) in front of the directory (C:\Users. . .). You can now install Flask using pip command. Enter the command:
pip install Flask

NB: To upgrade pip, use the command: python.exe -m pip install --upgrade pip

Step 8: Confirm Installation

  • Navigate to VS code and open the Explorer to view the Folder

The content in the sub-folder "Venv-files" shows that the virtual environment was successfully created

  • On VS code terminal type the command to switch to the virtual Environment: venv\Scripts\activate

PART C: Setting up the Web Page

Step 1: Create an Html file

  • Create a sub-folder called templates inside the Venv-files folder. Then Create an HTML file index.html inside the templates subfolder. The web page serves as a file-uploading page.

  • Then, copy and paste the following code into the index.html file.
<!DOCTYPE html>
<html>
<head>
    <title>Cloud File Storage</title>
    <!-- Bootstrap CSS link (you can replace this with your own local CSS file) -->
    <link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/5.3.0/css/bootstrap.min.css">
    <!-- Custom CSS (place this in your local CSS file if you have one) -->
    <style>
        body {
            padding: 20px;
        }

        .container {
            max-width: 600px;
        }

        .file-upload-btn {
            margin-top: 10px;
        }

        .file-list-item {
            margin-top: 5px;
            list-style-type: none;
        }

        .success-message {
            color: green;
            font-weight: bold;
        }

        .error-message {
            color: red;
            font-weight: bold;
        }
    </style>
</head>
<body>
    <div class="container">
        <h1 class="mt-3">Cloud File Storage</h1>
        <!-- Display success message if file is successfully uploaded -->
        {% if success_message %}
            <div class="success-message">{{ success_message }}</div>
        {% endif %}
        <!-- Display error message if there's an error during file upload -->
        {% if error_message %}
            <div class="error-message">{{ error_message }}</div>
        {% endif %}
        <div class="mb-3">
            <form action="/upload" method="post" enctype="multipart/form-data">
                <div class="input-group">
                    <input type="file" class="form-control" name="file" accept=".txt, .pdf, .docx">
                    <button type="submit" class="btn btn-primary file-upload-btn">Upload</button>
                </div>
            </form>
        </div>

        <h2>Uploaded Files</h2>
        <ul class="list-group">
            <!-- Use Flask template code to loop through the uploaded files -->
            {% for filename in uploaded_files %}
                <li class="list-group-item file-list-item">
                    <a href="/download/{{ filename }}">{{ filename }}</a>
                </li>
            {% endfor %}
        </ul>
    </div>

    <!-- Bootstrap JS and Popper.js (you can replace these with your own local files if needed) -->
    <script src="https://cdnjs.cloudflare.com/ajax/libs/popper.js/1.16.0/umd/popper.min.js"></script>
    <script src="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/5.3.0/js/bootstrap.min.js"></script>
</body>
</html>

Step 2: Create a Python File

  • Create a new Python file e.g. app.py inside Venv-files folder

  • Then, copy and paste the following code in the app.py file
from flask import Flask, render_template, request, redirect, url_for, flash, send_file
import boto3
import os

app = Flask(__name__)

# AWS S3 configuration
AWS_ACCESS_KEY_ID = 'your-access-key-id'
AWS_SECRET_ACCESS_KEY = 'your-secret-access-key'
AWS_REGION = 'your-aws-region'
S3_BUCKET_NAME = 'your-s3-bucket-name'
S3_BASE_URL = f'https://{S3_BUCKET_NAME}.s3.{AWS_REGION}.amazonaws.com/'

# Configure AWS SDK
s3 = boto3.client(
    's3',
    aws_access_key_id=AWS_ACCESS_KEY_ID,
    aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
    region_name=AWS_REGION
)

@app.route('/')
def index():
    return render_template('index.html')

@app.route('/upload', methods=['POST'])
def upload_file():
    if 'file' not in request.files:
        flash('No file part')
        return redirect(request.url)

    file = request.files['file']
    if file.filename == '':
        flash('No selected file')
        return redirect(request.url)

    # Upload the file to S3
    try:
        s3.upload_fileobj(file, S3_BUCKET_NAME, file.filename)
        flash('File successfully uploaded!')
    except Exception as e:
        flash('An error occurred while uploading the file: ' + str(e))

    return redirect(url_for('index'))

@app.route('/download/<filename>')
def download_file(filename):
    try:
        file_url = f"{S3_BASE_URL}{filename}"
        return redirect(file_url)
    except Exception as e:
        flash('An error occurred while downloading the file: ' + str(e))
        return redirect(url_for('index'))

if __name__ == '__main__':
    app.secret_key = 'your-secret-key'  # Change this to a random secret key
    app.run(debug=True)

NOTE: From the above code, replace the following placeholders:
AWS_ACCESS_KEY_ID = 'your-access-key-id'
AWS_SECRET_ACCESS_KEY = 'your-secret-access-key'
AWS_REGION = 'your-aws-region'
S3_BUCKET_NAME = 'your-s3-bucket-name'
app.secret_key = 'your-secret-key'

(S3_BASE_URL is derived from S3_BUCKET_NAME and AWS_REGION, so it does not need to be edited.)

NB: Use the API Keys Table at the End of PART A to replace your code

Step 3: Code Executing and Testing

  • On your VS Code terminal (Ctrl + J), navigate to the project folder and ensure that the virtual environment is activated using the command venv\Scripts\activate

NB: In this lab, the folder Venv-files is where all the source code and the virtual environment live

The virtual environment is activated when the prompt shows (venv) PS C:\Users\{your-username}\Desktop\{folder-you-created}>

  • Execute the code by typing the following command
# run the code one after the other
set FLASK_APP=app.py
set FLASK_ENV=development
flask run

OR

python app.py

  • Click on Choose File to upload a File. Then click on the Upload button once you select a file.

  • Navigate to the IAM user (here, James) Console and select the S3 bucket name (here, my-d3m0-files) to view the Objects (files) uploaded.

  • On the bucket info page, under the Objects tab, you will see a list of files uploaded to the S3 bucket (click on the refresh icon)

  • Try Uploading Another file

  • Navigate to the bucket info page (here, my-d3m0-files) and click on the refresh Icon

  • The file “Cloud Security Alliance 2021.pdf” has been successfully Uploaded

Note: While uploading from the browser, the upload status is logged in your VS Code terminal.

To confirm that the file was uploaded via the web page and not the AWS console, compare the timestamp in the VS Code terminal with the object's upload time shown in AWS.
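Public exposure can also be confirmed from outside the console entirely. The sketch below builds the public object URL from the bucket, region, and file name; actually fetching it is left commented out, since that needs network access.

```python
from urllib.parse import quote

def public_object_url(bucket, region, key):
    # Virtual-hosted-style S3 URL; resolvable by anyone only because
    # the bucket ACL grants public read access.
    return f"https://{bucket}.s3.{region}.amazonaws.com/{quote(key)}"

url = public_object_url("my-d3m0-files", "us-east-1",
                        "Cloud Security Alliance 2021.pdf")
print(url)
# import urllib.request; urllib.request.urlopen(url)  # 200 OK for a public object
```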

Secure Implementation of Cloud File Upload on S3

As a Developer, you can implement restrictions on access to AWS S3 Buckets using ACLs and Bucket policies instead of making the S3 bucket publicly available to Everyone. Check out the Security Recommendation Section below.

NB: It is also recommended to store the AWS credentials as environment variables on your server and access them using os.environ.get(). This way, you can avoid hardcoding sensitive information in the codebase.
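A minimal sketch of that environment-variable pattern is shown below; the variable names are illustrative, so match them to whatever your deployment actually exports.

```python
import os

def load_aws_config():
    # Read AWS settings from the environment instead of hardcoding them.
    # A missing bucket name is surfaced early rather than failing deep in boto3.
    config = {
        "region": os.environ.get("AWS_REGION", "us-east-1"),
        "bucket": os.environ.get("S3_BUCKET_NAME"),
    }
    if not config["bucket"]:
        raise RuntimeError("S3_BUCKET_NAME is not set")
    return config

# Simulate the server environment for this example:
os.environ["S3_BUCKET_NAME"] = "my-d3m0-files"
print(load_aws_config())
```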

from flask import Flask, render_template, request, redirect, url_for, flash, send_file
import os
import boto3
import botocore.exceptions

app = Flask(__name__)

# AWS S3 configuration
AWS_REGION = 'your-aws-region'
S3_BUCKET_NAME = 'your-s3-bucket-name'
S3_BASE_URL = f'https://{S3_BUCKET_NAME}.s3.{AWS_REGION}.amazonaws.com/'

# Configure AWS SDK using environment variables or IAM roles
s3 = boto3.client('s3', region_name=AWS_REGION)

@app.route('/')
def index():
    return render_template('index.html')

@app.route('/upload', methods=['POST'])
def upload_file():
    if 'file' not in request.files:
        flash('No file part')
        return redirect(request.url)

    file = request.files['file']
    if file.filename == '':
        flash('No selected file')
        return redirect(request.url)

    # File validation (example: limit to 10MB)
    if not allowed_file(file.filename):
        flash('Invalid file type')
        return redirect(request.url)
    if not allowed_file_size(file):
        flash('File size exceeds the limit (10MB)')
        return redirect(request.url)

    # Upload the file to S3
    try:
        s3.upload_fileobj(file, S3_BUCKET_NAME, file.filename)
        flash('File successfully uploaded!')
    except botocore.exceptions.ClientError as e:
        flash('An error occurred while uploading the file')
        app.logger.error(str(e))
    except Exception as e:
        flash('An error occurred while uploading the file')
        app.logger.error(str(e))

    return redirect(url_for('index'))

@app.route('/download/<filename>')
def download_file(filename):
    try:
        file_url = f"{S3_BASE_URL}{filename}"
        return redirect(file_url)
    except Exception as e:
        flash('An error occurred while downloading the file')
        app.logger.error(str(e))
        return redirect(url_for('index'))

# Whitelist of allowed file extensions
def allowed_file(filename):
    return '.' in filename and filename.rsplit('.', 1)[1].lower() in {'txt', 'pdf', 'docx'}

def allowed_file_size(file):
    # 10 MB limit (10 * 1024 * 1024 bytes).
    # Use seek()/tell() instead of read() so the stream is not consumed
    # before upload_fileobj sends it to S3.
    file.seek(0, os.SEEK_END)
    size = file.tell()
    file.seek(0)
    return size <= 10 * 1024 * 1024

if __name__ == '__main__':
    app.secret_key = os.environ.get('FLASK_SECRET_KEY', 'your-secret-key')  # Flask session secret from an environment variable, with a fallback default
    app.run(debug=True)

NOTE:

From the above code, replace the following placeholders:
AWS_REGION = 'your-aws-region'
S3_BUCKET_NAME = 'your-s3-bucket-name'

(The AWS access key ID and secret access key are no longer hardcoded; the boto3 client picks them up from environment variables or an IAM role. S3_BASE_URL is derived from S3_BUCKET_NAME and AWS_REGION, so it does not need to be edited.)

NB: Use the API Keys Table at the End of PART A to replace your code

Step 1: Create a new Python file

  • Create a new Python file called Sec-app.py and paste the Above Code

Step 2: Execute Code and Test

  • Execute the Code by typing the following command on your VS code terminal
set FLASK_APP=Sec-app.py
set FLASK_ENV=development
python .\Sec-app.py

  • Try to upload a file that is more than 10MB (here, a file of 11.2 MB). Since the code sets the upload limit to 10MB, you get an error message.

Security Recommendation (Key Lessons)

PART A Solution (S3 Bucket Exposure)

When creating a Cloud API for an S3 bucket in AWS to allow users to upload files via a web page, it is generally not recommended to give public access to the bucket. Public access to an S3 bucket can lead to security vulnerabilities and unintended data exposure, which is a significant concern for sensitive or private data.

Instead, you should follow AWS best practices for securing your S3 bucket while allowing users to upload files:

  1. AWS Identity and Access Management (IAM) Roles: Create an IAM role that grants the necessary permissions for users to upload files to the S3 bucket. The IAM role should be attached to the web application or web page hosting the upload functionality. This way, users can upload files using the web application, and the IAM role will control access to the S3 bucket based on the policies attached to the role.

  2. Pre-signed URLs: One common method to allow users to upload files directly to an S3 bucket without exposing your AWS credentials is by using pre-signed URLs. A pre-signed URL is a time-limited URL generated by your backend server (using the IAM role) that grants temporary access to upload a specific file to the S3 bucket. The web application can then provide this pre-signed URL to the user, allowing them to upload the file directly to S3 without the need for public access.

  3. Bucket Policies and Access Control Lists (ACLs): You can set up specific bucket policies or ACLs to control access to the bucket. With the appropriate permissions and conditions, you can restrict access to only specific IP addresses, IAM users, or IAM roles, further enhancing security.

  4. Cross-Origin Resource Sharing (CORS): If your web application is hosted on a different domain than the S3 bucket, you might need to set up CORS to allow cross-origin requests. This configuration ensures that the web application can securely interact with the S3 bucket while still maintaining access controls.

PART B & C Solution (Application Security)

When setting up a Flask project in Visual Studio Code, developers should take various security precautions to protect their application and its data.

  1. Securely Manage AWS Credentials:

    Instead of hardcoding the AWS access key ID and secret access key in the code, you can use environment variables or AWS IAM roles for better security. By using environment variables, the sensitive credentials won't be exposed in the codebase or logs.

  2. Apply Proper Error Handling:

    Ensure you handle exceptions more securely and do not display detailed error messages to end users (e.g. Username exists but wrong password). Instead, log errors for debugging purposes and provide generic error messages to users.

  3. Implement File Validation:

    Before uploading a file to AWS S3, ensure you validate its size, type, and other attributes to prevent uploading malicious or oversized files.

  4. Rate Limiting: Implement rate-limiting mechanisms to prevent abuse and brute-force attacks on your APIs or login endpoints.

  5. Secure File Uploads: If your application allows file uploads, ensure that files are scanned for malware, stored in a secure location, and served with appropriate content types to prevent code execution vulnerabilities.

  6. HTTPS: Use HTTPS to encrypt communication between clients and your Flask application. Obtain an SSL certificate and configure your web server (e.g., Gunicorn, uWSGI) to serve your application securely.
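Recommendation 4 (rate limiting) is usually handled by an extension such as Flask-Limiter, but the core idea fits in a few lines. Below is an illustrative sliding-window limiter, not a production-grade implementation (shared state across workers and persistence are ignored):

```python
import time
from collections import defaultdict

class SlidingWindowLimiter:
    """Allow at most `limit` calls per `window` seconds per client key."""

    def __init__(self, limit=5, window=60.0):
        self.limit = limit
        self.window = window
        self.hits = defaultdict(list)  # client key -> request timestamps

    def allow(self, key, now=None):
        now = time.monotonic() if now is None else now
        # Drop timestamps that have aged out of the window.
        self.hits[key] = [t for t in self.hits[key] if now - t < self.window]
        if len(self.hits[key]) >= self.limit:
            return False
        self.hits[key].append(now)
        return True

limiter = SlidingWindowLimiter(limit=3, window=60.0)
results = [limiter.allow("10.0.0.1", now=0.0) for _ in range(4)]
print(results)  # -> [True, True, True, False]
```

In a Flask app, `allow()` would be called at the top of a route with the client IP as the key, returning HTTP 429 when it reports False.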

Deleting Created Resources

  • Navigate to the IAM User S3 Bucket Object Page, select all the Objects (files) uploaded, and Click on Delete

  • Type the words permanently delete and click on the Delete Objects button

  • Navigate to your Main AWS account (root account) where you created the IAM user (here, James). Check the box for the user (here, James) and click on Delete at the top right.

  • A message box pops up to confirm the deletion, Type the User name (here, James)

  • Still, on your Main AWS Account (root account), navigate to the S3 Bucket page and Delete the bucket created (in this lab, my-d3m0-files).

The bucket is already empty, as shown in the previous steps.

Related Resources

  1. Exploiting Misconfigured S3 Bucket in AWS Lab Series by flaws.cloud

  2. Creating a secure S3 bucket

  3. Restricting Access to AWS S3 Buckets using ACL and Bucket Policy

Source Code

Check my GitHub page for the source code
