Operations metrics
Track KMS and S3 API calls alongside your DynamoDB operations. When you use encryption or large object storage, pydynox also measures the time spent on, and the number of calls made to, those services.
Why track operations metrics
DynamoDB metrics alone don't tell the full story. If you use:
- EncryptedAttribute - KMS calls add latency and cost
- S3Attribute - S3 uploads/downloads add latency and data transfer costs
Without tracking these, you might wonder why a "simple save" takes 200ms when DynamoDB only took 10ms.
KMS metrics
When you use EncryptedAttribute, pydynox calls AWS KMS to encrypt and decrypt data. Each call adds latency (typically 10-50ms) and costs money.
```python
import asyncio

from pydynox import Model, ModelConfig
from pydynox.attributes import EncryptedAttribute, StringAttribute


class SecureUser(Model):
    model_config = ModelConfig(table="users")

    pk: str = StringAttribute(partition_key=True)
    sk: str = StringAttribute(sort_key=True)
    ssn: str = EncryptedAttribute(key_id="alias/my-app-key")


async def main():
    # Save with encrypted field
    user = SecureUser(pk="USER#1", sk="PROFILE", ssn="123-45-6789")
    await user.save()

    # Check KMS metrics
    total = SecureUser.get_total_metrics()
    print(f"KMS duration: {total.kms_duration_ms}ms")
    print(f"KMS calls: {total.kms_calls}")


asyncio.run(main())
```
What's tracked
| Field | Type | Description |
|---|---|---|
| `kms_duration_ms` | float | Total time spent on KMS calls |
| `kms_calls` | int | Number of KMS API calls |
When metrics are collected
- `save()` - encrypts fields before writing to DynamoDB
- `get()` - decrypts fields after reading from DynamoDB (see the sketch below)
- `query()` / `scan()` - decrypts fields in each returned item
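For example, reading the item back should add one `Decrypt` call to the totals. A minimal sketch, assuming `get()` accepts the key attributes as keyword arguments and is awaitable like `save()` in the example above:

```python
async def read_and_check():
    # Assumed signature: get() takes the key attributes as keyword arguments.
    user = await SecureUser.get(pk="USER#1", sk="PROFILE")

    total = SecureUser.get_total_metrics()
    # Expect one more KMS call than after the save() alone.
    print(f"KMS calls: {total.kms_calls}")
    print(f"KMS duration: {total.kms_duration_ms}ms")
```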
Envelope encryption
pydynox uses envelope encryption. This means:
- One `GenerateDataKey` call per encrypt operation
- Local AES-256-GCM encryption (no KMS call per field)
- One `Decrypt` call per decrypt operation
So even with multiple encrypted fields, you typically see 1 KMS call per save/get.
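The pattern itself is small. Here is a minimal sketch of envelope encryption using boto3 and the `cryptography` package, purely to show where the single KMS call comes from; it illustrates the technique, not pydynox's internal code (the key alias is the same placeholder used above):

```python
import os

import boto3
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

kms = boto3.client("kms")


def envelope_encrypt(plaintext: bytes, key_id: str = "alias/my-app-key"):
    # One KMS call: get a fresh data key (plaintext copy + encrypted copy).
    data_key = kms.generate_data_key(KeyId=key_id, KeySpec="AES_256")

    # Local AES-256-GCM encryption - no further KMS calls, even for many fields.
    nonce = os.urandom(12)
    ciphertext = AESGCM(data_key["Plaintext"]).encrypt(nonce, plaintext, None)

    # Store the encrypted data key alongside the ciphertext.
    return data_key["CiphertextBlob"], nonce, ciphertext


def envelope_decrypt(encrypted_key: bytes, nonce: bytes, ciphertext: bytes) -> bytes:
    # One KMS call: recover the plaintext data key, then decrypt locally.
    plaintext_key = kms.decrypt(CiphertextBlob=encrypted_key)["Plaintext"]
    return AESGCM(plaintext_key).decrypt(nonce, ciphertext, None)
```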
S3 metrics
When you use S3Attribute, pydynox uploads files to and downloads them from S3. Track these operations to understand data transfer costs and latency.
"""Example: S3 metrics in Model observability.
Shows how to track S3 upload/download metrics when using S3Attribute.
"""
import asyncio
from pydynox import DynamoDBClient, Model, ModelConfig
from pydynox._internal._s3 import S3File
from pydynox.attributes import S3Attribute, StringAttribute
# Create client
client = DynamoDBClient(region="us-east-1")
class Document(Model):
"""Document model with S3 content."""
model_config = ModelConfig(table="documents", client=client)
pk = StringAttribute(partition_key=True)
sk = StringAttribute(sort_key=True)
name = StringAttribute()
content = S3Attribute(bucket="my-bucket", prefix="docs/")
async def main():
# Reset metrics
Document.reset_metrics()
# Save document with S3 content
doc = Document(pk="DOC#1", sk="v1", name="report.pdf")
doc.content = S3File(b"PDF content here...", name="report.pdf")
await doc.save()
# Check S3 metrics
metrics = Document.get_total_metrics()
print("=== S3 Metrics ===")
print(f"S3 duration: {metrics.s3_duration_ms:.2f} ms")
print(f"S3 API calls: {metrics.s3_calls}")
print(f"Bytes uploaded: {metrics.s3_bytes_uploaded}")
print(f"Bytes downloaded: {metrics.s3_bytes_downloaded}")
# Combined with DynamoDB metrics
print("\n=== All Metrics ===")
print(f"Total duration: {metrics.total_duration_ms:.2f} ms")
print(f"DynamoDB operations: {metrics.operation_count}")
print(f"RCU consumed: {metrics.total_rcu}")
print(f"WCU consumed: {metrics.total_wcu}")
asyncio.run(main())
What's tracked
| Field | Type | Description |
|---|---|---|
| `s3_duration_ms` | float | Total time spent on S3 calls |
| `s3_calls` | int | Number of S3 API calls |
| `s3_bytes_uploaded` | int | Total bytes uploaded to S3 |
| `s3_bytes_downloaded` | int | Total bytes downloaded from S3 |
When metrics are collected
- `save()` - uploads `S3File` values to S3
- `delete()` - deletes associated S3 objects (see the sketch below)
- `async_save()` / `async_delete()` - async versions
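As a quick check, deleting a document should register additional S3 calls, since the associated object is removed along with the item. A minimal sketch, assuming `delete()` is awaitable on an instance built from the key attributes, mirroring `save()` in the example above:

```python
async def delete_and_check():
    # Assumption: an instance constructed from the key attributes can be deleted directly.
    doc = Document(pk="DOC#1", sk="v1", name="report.pdf")
    await doc.delete()  # deletes the item and its S3 object

    metrics = Document.get_total_metrics()
    print(f"S3 calls after delete: {metrics.s3_calls}")
    print(f"S3 time after delete: {metrics.s3_duration_ms:.2f} ms")
```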
Multipart uploads
For files larger than 5MB, S3 uses multipart upload. The `s3_calls` field counts:
- Each part upload (one per 5MB chunk)
- The `CreateMultipartUpload` call
- The `CompleteMultipartUpload` call
A 15MB file would show ~5 S3 calls (create + 3 parts + complete).
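For a rough estimate of the call count at a given file size, a back-of-the-envelope sketch (the exact numbers depend on the part size actually used):

```python
import math


def estimated_s3_upload_calls(size_bytes: int, part_size: int = 5 * 1024 * 1024) -> int:
    """Estimate S3 API calls for one upload."""
    if size_bytes <= part_size:
        return 1  # single PutObject
    parts = math.ceil(size_bytes / part_size)
    return 1 + parts + 1  # CreateMultipartUpload + each part + CompleteMultipartUpload


print(estimated_s3_upload_calls(15 * 1024 * 1024))  # 5, as in the example above
```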
Combined metrics
All metrics are available through get_total_metrics():
```python
metrics = MyModel.get_total_metrics()

# DynamoDB metrics
print(f"DynamoDB duration: {metrics.total_duration_ms}ms")
print(f"RCU: {metrics.total_rcu}, WCU: {metrics.total_wcu}")

# KMS metrics
print(f"KMS duration: {metrics.kms_duration_ms}ms")
print(f"KMS calls: {metrics.kms_calls}")

# S3 metrics
print(f"S3 duration: {metrics.s3_duration_ms}ms")
print(f"S3 calls: {metrics.s3_calls}")
print(f"S3 uploaded: {metrics.s3_bytes_uploaded} bytes")
print(f"S3 downloaded: {metrics.s3_bytes_downloaded} bytes")
```
Reset metrics
In long-running processes, reset metrics at the start of each request:
```python
# At request start
MyModel.reset_metrics()

# ... do work ...

# At request end
metrics = MyModel.get_total_metrics()
log_metrics(metrics)
```
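One way to keep this consistent is to wrap handlers in a small decorator, so the reset and the final read always happen in the same place. A sketch, assuming a `log_metrics` helper like the one used above and async handlers:

```python
import functools


def with_model_metrics(model_cls):
    """Reset the model's metrics before the handler runs and log them after."""
    def decorator(func):
        @functools.wraps(func)
        async def wrapper(*args, **kwargs):
            model_cls.reset_metrics()
            try:
                return await func(*args, **kwargs)
            finally:
                log_metrics(model_cls.get_total_metrics())
        return wrapper
    return decorator


@with_model_metrics(MyModel)
async def handle_request(payload):
    ...
```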
Use cases
Cost analysis
Track KMS and S3 costs per operation:
```python
metrics = User.get_total_metrics()

# KMS costs ~$0.03 per 10,000 requests
kms_cost = metrics.kms_calls * 0.000003

# S3 PUT costs ~$0.005 per 1,000 requests
s3_put_cost = metrics.s3_calls * 0.000005

# S3 data transfer ~$0.09 per GB
s3_transfer_cost = (metrics.s3_bytes_uploaded + metrics.s3_bytes_downloaded) / 1e9 * 0.09
```
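These pieces add up to a rough per-batch estimate (the prices above are illustrative; check current AWS pricing for your region and usage tier):

```python
estimated_cost = kms_cost + s3_put_cost + s3_transfer_cost
print(f"Estimated KMS + S3 cost: ${estimated_cost:.6f}")
```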
Performance debugging
Find where time is spent:
```python
metrics = Document.get_total_metrics()

total = metrics.total_duration_ms + metrics.kms_duration_ms + metrics.s3_duration_ms
print(f"DynamoDB: {metrics.total_duration_ms / total * 100:.1f}%")
print(f"KMS: {metrics.kms_duration_ms / total * 100:.1f}%")
print(f"S3: {metrics.s3_duration_ms / total * 100:.1f}%")
```
Lambda optimization
In Lambda, every millisecond counts. Track all services:
```python
def handler(event, context):
    Document.reset_metrics()

    # ... process request ...

    metrics = Document.get_total_metrics()

    # Log for analysis
    logger.info({
        "dynamodb_ms": metrics.total_duration_ms,
        "kms_ms": metrics.kms_duration_ms,
        "s3_ms": metrics.s3_duration_ms,
        "s3_bytes": metrics.s3_bytes_uploaded + metrics.s3_bytes_downloaded,
    })
```
Next steps
- Encryption - Field-level encryption with KMS
- S3 attribute - Store large objects in S3
- Observability - DynamoDB metrics and logging