S3 upload timeout on dockerized Digital Ocean setup

I have an S3-compatible storage and a Droplet server on Digital Ocean. The dockerized Django app I am running tries to sync its static assets to the storage. This fails from the Docker container on the Droplet server, but not when accessing the same S3 storage from my local setup. I can also run the upload directly on the server (outside the dockerized app), and that works, too. So something about the Docker setup is making the S3 requests fail.

I made a simple test case in s3upload.py, with foobar.txt present in the same directory:

from boto3.s3.transfer import S3Transfer
import boto3
import logging

logging.basicConfig(level=logging.DEBUG)  # basicConfig attaches a handler, so DEBUG output is actually printed

client = boto3.client('s3', 
                      aws_access_key_id="…", 
                      aws_secret_access_key="…", 
                      region_name="ams3", 
                      endpoint_url="https://ams3.digitaloceanspaces.com")
transfer = S3Transfer(client)
bucket_name = "…"
transfer.upload_file("foobar.txt", bucket_name, "foobar.txt")
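
As an aside, boto3 also has a built-in helper for this kind of logging; pointing it at the root logger name should capture botocore, s3transfer and the HTTP layer in one go:

import logging
import boto3

# boto3 shortcut: attach a stream handler to the root logger at DEBUG level,
# which also captures the botocore and s3transfer internals
boto3.set_stream_logger('', logging.DEBUG)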

The error I am seeing when calling this from the Docker container is:

Traceback (most recent call last):
  File "/usr/local/lib/python3.13/site-packages/boto3/s3/transfer.py", line 372, in upload_file
    future.result()
    ~~~~~~~~~~~~~^^
  File "/usr/local/lib/python3.13/site-packages/s3transfer/futures.py", line 103, in result
    return self._coordinator.result()
           ~~~~~~~~~~~~~~~~~~~~~~~~^^
  File "/usr/local/lib/python3.13/site-packages/s3transfer/futures.py", line 264, in result
    raise self._exception
  File "/usr/local/lib/python3.13/site-packages/s3transfer/tasks.py", line 135, in __call__
    return self._execute_main(kwargs)
           ~~~~~~~~~~~~~~~~~~^^^^^^^^
  File "/usr/local/lib/python3.13/site-packages/s3transfer/tasks.py", line 158, in _execute_main
    return_value = self._main(**kwargs)
  File "/usr/local/lib/python3.13/site-packages/s3transfer/upload.py", line 796, in _main
    client.put_object(Bucket=bucket, Key=key, Body=body, **extra_args)
    ~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.13/site-packages/botocore/client.py", line 569, in _api_call
    return self._make_api_call(operation_name, kwargs)
           ~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.13/site-packages/botocore/client.py", line 1023, in _make_api_call
    raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (RequestTimeout) when calling the PutObject operation (reached max retries: 4): None

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/app/s3upload.py", line 14, in <module>
    transfer.upload_file("foobar.txt", bucket_name, "foobar.txt")
    ~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.13/site-packages/boto3/s3/transfer.py", line 378, in upload_file
    raise S3UploadFailedError(
    ...<3 lines>...
    )
boto3.exceptions.S3UploadFailedError: Failed to upload foobar.txt to [bucketname]/foobar.txt: An error occurred (RequestTimeout) when calling the PutObject operation (reached max retries: 4): None
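
Each failing run takes a while because of the four retries. To make it fail faster, and to see whether it is the connect or the read phase that times out, the client can be rebuilt with an explicit botocore Config. The values here are arbitrary, just a debugging sketch:

from botocore.config import Config
import boto3

# Shorter timeouts and a single attempt so the failure shows up quickly.
# connect_timeout/read_timeout are in seconds (botocore defaults to 60s each).
cfg = Config(
    connect_timeout=5,
    read_timeout=15,
    retries={"max_attempts": 1},
)

client = boto3.client('s3',
                      aws_access_key_id="…",
                      aws_secret_access_key="…",
                      region_name="ams3",
                      endpoint_url="https://ams3.digitaloceanspaces.com",
                      config=cfg)

If this raises a botocore ConnectTimeoutError, the endpoint is presumably not reachable from the container at all; if it still ends in the same RequestTimeout, the connection is being made but the request body is not getting through.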

At first I wondered about a possible firewall setting on the service side, but the call from outside the Docker container (directly on the server Droplet) works. This also confirms that the credentials and the upload code as such are fine.
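
To also rule out a plain reachability problem from inside the container (DNS resolution, blocked outbound port), a boto3-free probe along these lines should be enough; just a sketch:

import socket
import ssl

# TCP connect plus TLS handshake against the Spaces endpoint, no boto3 involved.
# If this succeeds from inside the container, the endpoint is resolvable and
# reachable, and the problem sits somewhere in the actual request exchange.
host = "ams3.digitaloceanspaces.com"
ctx = ssl.create_default_context()
with socket.create_connection((host, 443), timeout=5) as sock:
    with ctx.wrap_socket(sock, server_hostname=host) as tls:
        print("connected, TLS version:", tls.version())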

Since this seems to be related to Docker rather than the Digital Ocean Droplet or S3, here is my Docker setup:

The simple compose.yml for the app looks something like this:

services:

  db:
    image: postgres:17
    environment:
      POSTGRES_DB: ${POSTGRES_DB}
      POSTGRES_USER: ${POSTGRES_USER}
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
    ports:
      - "5432:5432"
    volumes:
      - postgres_data:/var/lib/postgresql/data
    env_file:
      - .env

  django-web:
    build: .
    container_name: django-docker
    ports:
      - "80:80"
    expose:
      - "80"
    depends_on:
      - db
    environment:
      SECRET_KEY: ${SECRET_KEY}
      DEBUG: ${DEBUG}

      POSTGRES_DB: ${POSTGRES_DB}
      POSTGRES_USER: ${POSTGRES_USER}
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}

      DATABASE_HOST: ${DATABASE_HOST}
      DATABASE_PORT: ${DATABASE_PORT}
    env_file:
      - .env

volumes:
  postgres_data:

And the Dockerfile:

# Stage 1: Build

# Use the official Python runtime image
FROM python:3.13-slim AS builder 
 
# Create the app directory
RUN mkdir /app
 
# Set the working directory inside the container
WORKDIR /app
 
# Set environment variables 
# Prevents Python from writing pyc files to disk
ENV PYTHONDONTWRITEBYTECODE=1
#Prevents Python from buffering stdout and stderr
ENV PYTHONUNBUFFERED=1 
 
# Upgrade pip
RUN pip install --upgrade pip 
 
# Copy the requirements file and install dependencies
COPY requirements.txt /app/

# Install all dependencies
RUN pip install --no-cache-dir -r requirements.txt



# Stage 2: Run production code

FROM python:3.13-slim

RUN useradd -m -r appuser && mkdir /app && chown -R appuser /app

# Copy the Python dependencies from the builder stage
COPY --from=builder /usr/local/lib/python3.13/site-packages/ /usr/local/lib/python3.13/site-packages/
COPY --from=builder /usr/local/bin/ /usr/local/bin/

# Set the working directory
WORKDIR /app

# Copy application code
COPY --chown=appuser:appuser . .
 
# Set environment variables to optimize Python
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1 
 
# Switch to non-root user
USER appuser

# Expose the Django port
EXPOSE 80

# # Migrate the DB
# RUN ["python", "manage.py", "migrate"]

# Gather static assets

# ---> This is the S3 command that is not working
# RUN ["python", "manage.py", "collectstatic", "--no-input"]

# Run the app via the gunicorn server
CMD ["gunicorn", "--bind", "0.0.0.0:80", "--workers", "3", "fonts.wsgi"]
# python manage.py migrate && python manage.py collectstatic --no-input && gunicorn myapp.wsgi
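
Since the only difference between the working and the failing case is the container's network, one low-level thing I can compare between the Droplet and the container is the network interface configuration. A Linux-only sketch that just prints each interface's MTU from /sys (run once on the host, once inside the container):

from pathlib import Path

# Print the MTU of every network interface; comparing the output on the
# host and inside the container shows whether the Docker network is set up
# differently from the Droplet's own interface.
for iface in sorted(Path("/sys/class/net").iterdir()):
    print(iface.name, (iface / "mtu").read_text().strip())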

So to summarize:

In my local test environment:

  • python s3upload.py works
  • docker compose exec -T django-web python s3upload.py works

On my Digital Ocean server droplet:

  • python s3upload.py works
  • docker compose exec -T django-web python s3upload.py times out with the above error

The error message is also very unhelpful. I have seen other reports of the boto3.exceptions.S3UploadFailedError: Failed to upload [filename] to [bucketname]/[filename]: An error occurred (RequestTimeout) when calling the PutObject operation (reached max retries: 4): error, but the final None detail of the message is a bit vexing.
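
My guess (I am not sure) is that the trailing None is simply the Message field of the parsed error response. Bypassing S3Transfer and calling put_object directly makes it possible to catch the underlying ClientError and inspect what actually came back; a sketch reusing the client and bucket_name from the test script above:

import botocore.exceptions

# Call put_object directly so the underlying ClientError is raised here
# instead of being wrapped into S3UploadFailedError by S3Transfer.
try:
    with open("foobar.txt", "rb") as f:
        client.put_object(Bucket=bucket_name, Key="foobar.txt", Body=f)
except botocore.exceptions.ClientError as e:
    print(e.response.get("Error"))             # parsed error code and message
    print(e.response.get("ResponseMetadata"))  # HTTP status, headers, retry count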

Any clues how to debug this further or what could be causing this?
