# S3 Configuration
S3Storage connects to Amazon S3 or any S3-compatible service (MinIO, DigitalOcean Spaces, Backblaze B2, etc.). Configuration can be provided via constructor parameters or environment variables.
## Constructor Parameters
| Parameter | Type | Default | Description |
|---|---|---|---|
| `bucket` | `str` | `None` | S3 bucket name. Required (via parameter or environment variable). |
| `region_name` | `str` | `None` | AWS region (e.g. `us-east-1`). |
| `endpoint_url` | `str` | `None` | Custom endpoint URL for S3-compatible services. |
| `access_key_id` | `str` | `None` | AWS access key ID. |
| `secret_access_key` | `str` | `None` | AWS secret access key. |
| `security_token` | `str` | `None` | AWS session token (for temporary credentials). |
| `object_prefix` | `str` | `""` | Prefix added to all object keys (namespacing). |
| `upload_extra_args` | `dict[str, Any]` | `None` | Extra arguments passed to boto3's `upload_fileobj()` (e.g. encryption settings, ACL, metadata). |
| `presign_ttl` | `int` | `3600` | Presigned URL expiration time in seconds. |
## Environment Variables
When a constructor parameter is not provided, S3Storage falls back to environment variables:
| Parameter | Primary Env Var | Fallback Env Var |
|---|---|---|
| `bucket` | `AWS_S3_BUCKET_NAME` | — |
| `region_name` | `AWS_S3_REGION_NAME` | `AWS_REGION` |
| `endpoint_url` | `AWS_S3_ENDPOINT_URL` | — |
| `access_key_id` | `AWS_S3_ACCESS_KEY_ID` | `AWS_ACCESS_KEY_ID` |
| `secret_access_key` | `AWS_S3_SECRET_ACCESS_KEY` | `AWS_SECRET_ACCESS_KEY` |
| `security_token` | `AWS_SESSION_TOKEN` | `AWS_SECURITY_TOKEN` |
Priority: constructor parameter > primary env var > fallback env var.
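This fallback chain can be illustrated with a small standalone helper. Note that `resolve_setting` is a hypothetical sketch written for this page, not part of the library's API — it only demonstrates the resolution order described above:

```python
import os

def resolve_setting(value, primary_env, fallback_env=None):
    """Return the first available value: explicit constructor
    parameter, then primary env var, then fallback env var, else None."""
    if value is not None:
        return value
    if primary_env in os.environ:
        return os.environ[primary_env]
    if fallback_env and fallback_env in os.environ:
        return os.environ[fallback_env]
    return None

# Example: no explicit region, primary env var unset, fallback set.
os.environ['AWS_REGION'] = 'eu-west-1'
region = resolve_setting(None, 'AWS_S3_REGION_NAME', 'AWS_REGION')
```

Here `region` resolves to `'eu-west-1'` via the fallback variable, while passing an explicit `region_name` to the constructor would win over both.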
## Usage Examples
### Basic Setup (Sync)

```python
from amsdal_storages.s3 import S3Storage

storage = S3Storage(
    bucket='my-app-files',
    region_name='us-east-1',
    access_key_id='AKIA...',
    secret_access_key='...',
)
```
### Async Setup
When AMSDAL is running in async mode, S3Storage automatically uses aioboto3 for non-blocking I/O. No changes are needed in how you create the storage — just make sure the async dependency is installed:
```bash
pip install amsdal_storages[s3]  # includes both boto3 and aioboto3
```
### With Object Prefix (Namespacing)
Use object_prefix to store files under a common key prefix. This is useful for multi-tenant setups or separating environments:
```python
storage = S3Storage(
    bucket='shared-bucket',
    object_prefix='tenant-42/uploads',
)
# Files are stored as: tenant-42/uploads/<filename>
```
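The resulting key layout can be sketched with the standard library; the library's exact key-joining logic may differ, but the effect is a plain prefix on every object key:

```python
from posixpath import join  # S3 keys use '/' separators regardless of OS

object_prefix = 'tenant-42/uploads'
key = join(object_prefix, 'invoice.pdf')
# -> 'tenant-42/uploads/invoice.pdf'
```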
### Custom Endpoint (MinIO, DigitalOcean Spaces)
For S3-compatible services, set the endpoint_url:
```python
# MinIO
storage = S3Storage(
    bucket='my-bucket',
    endpoint_url='http://localhost:9000',
    access_key_id='minioadmin',
    secret_access_key='minioadmin',
)

# DigitalOcean Spaces
storage = S3Storage(
    bucket='my-space',
    region_name='nyc3',
    endpoint_url='https://nyc3.digitaloceanspaces.com',
)
```
### Presigned URLs with Custom TTL
`S3Storage.url()` generates presigned URLs for secure, time-limited access. The default TTL is 3600 seconds (1 hour):
```python
storage = S3Storage(
    bucket='my-bucket',
    presign_ttl=900,  # 15 minutes
)
```
### Upload Extra Args
Use upload_extra_args to pass additional arguments to every S3 upload. This is commonly used for server-side encryption, custom ACLs, or object metadata:
```python
import os

from amsdal_storages.s3 import S3Storage
from amsdal.storages import set_default_storage

storage = S3Storage(
    bucket='my-bucket',
    upload_extra_args={
        'ServerSideEncryption': 'aws:kms',
        'SSEKMSKeyId': os.environ.get('AWS_S3_KMS_KEY_ARN'),
    },
)
set_default_storage(storage)
```
Any key supported by boto3's `S3.Client.upload_fileobj` `ExtraArgs` parameter can be used here.
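Beyond encryption, other standard `ExtraArgs` keys work the same way. The sketch below uses key names from boto3's documented upload arguments (`Metadata`, `ACL`, `ContentType`); the specific values are illustrative only:

```python
# A dict suitable for upload_extra_args: object metadata
# (stored as x-amz-meta-* headers), a canned ACL, and an
# explicit content type for every uploaded object.
extra_args = {
    'Metadata': {'uploaded-by': 'billing-service'},
    'ACL': 'private',
    'ContentType': 'application/pdf',
}
```

Passing this dict as `upload_extra_args=extra_args` applies these settings to every upload made through the storage instance.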
## Setting as Default Storage
There are two ways to make S3 the default storage for all `File` fields.
### Via Environment Variable
```bash
export AMSDAL_DEFAULT_FILE_STORAGE='amsdal_storages.s3.storage.S3Storage'
```
### Via Code
```python
from amsdal_storages.s3 import S3Storage
from amsdal.storages import set_default_storage

storage = S3Storage(bucket='my-bucket')
set_default_storage(storage)
```
## Per-Field Storage Override
You can assign a specific storage instance to individual `File` fields on your models, overriding the default:
```python
from amsdal_models.classes.fields.file import FileField
from amsdal_models.classes.model import Model
from amsdal.models.core.file import File
from amsdal_storages.s3 import S3Storage

class Document(Model):
    attachment: File = FileField(storage=S3Storage(
        bucket='documents-bucket',
        object_prefix='attachments',
    ))
```