#430 App: File Backup Manager (backup_manager)
## Overview
Scheduled file and database backups to S3 with retention policies and status reporting.
## App Metadata
- **App Name:** backup_manager
- **Publisher:** platform
- **Version:** 1.0.0
- **Priority:** Medium
- **Complexity:** Medium
- **Phase:** 3 (Advanced)
## Actions
### 1. backup_directory
**Description:** Create tarball of directory and upload to S3
**Parameters:**
- source_path (str, required): Directory path to back up
- backup_name (str, required): Name for the backup
- s3_prefix (str, default='backups/'): S3 key prefix
- compression (str, default='gzip'): Compression (gzip|bzip2|none)
- exclude_patterns (list, optional): Glob patterns to exclude
- encryption (bool, default=True): Encrypt backup at rest
**Returns:**
```json
{
  "success": true,
  "backup_id": "backup_20250101_120000",
  "s3_key": "backups/myapp/backup_20250101_120000.tar.gz",
  "size_bytes": 1024000,
  "files_count": 150,
  "duration_seconds": 45
}
```
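The exclusion and compression handling described above can be sketched as follows. `should_exclude` and `create_backup_archive` are illustrative names, not part of the app's API; the mapping of the `compression` parameter to `tarfile` modes is an assumption:

```python
import fnmatch
import tarfile
from pathlib import Path

def should_exclude(relpath, patterns):
    """Return True if the relative path matches any exclude glob."""
    return any(fnmatch.fnmatch(relpath, p) for p in patterns)

def create_backup_archive(source_path, archive_path,
                          compression="gzip", exclude_patterns=None):
    """Create an optionally compressed tarball of source_path.

    Returns the number of files added. Assumed mode mapping:
    gzip -> 'w:gz', bzip2 -> 'w:bz2', none -> 'w'.
    """
    modes = {"gzip": "w:gz", "bzip2": "w:bz2", "none": "w"}
    patterns = exclude_patterns or []
    count = 0
    with tarfile.open(archive_path, modes[compression]) as tar:
        for path in Path(source_path).rglob("*"):
            rel = str(path.relative_to(source_path))
            if should_exclude(rel, patterns):
                continue
            if path.is_file():
                tar.add(path, arcname=rel)
                count += 1
    return count
```

The returned file count would feed the `files_count` field of the action's result payload.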
### 2. backup_database
**Description:** Dump database and upload to S3
**Parameters:**
- database_type (str, required): Database type (postgresql|mysql)
- database_alias (str, required): Database connection alias
- backup_name (str, required): Name for backup
- s3_prefix (str, default='backups/db/'): S3 prefix
- include_schema (bool, default=True): Include schema definitions
- tables (list, optional): Specific tables to back up (all if omitted)
**Returns:**
```json
{
  "success": true,
  "backup_id": "db_backup_20250101",
  "s3_key": "backups/db/mydb_20250101.sql.gz",
  "size_bytes": 5000000,
  "tables_count": 25,
  "duration_seconds": 120
}
```
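A minimal sketch of the PostgreSQL path, assuming connection details are resolved from the database secret. The flag mapping for `include_schema` (data-only dump when false) and the streaming-to-gzip approach are assumptions, not the app's confirmed behavior:

```python
import gzip
import os
import subprocess

def build_pg_dump_command(host, port, database, user,
                          tables=None, include_schema=True):
    """Assemble a pg_dump argument list; the password travels via PGPASSWORD."""
    cmd = ["pg_dump", "--host", host, "--port", str(port),
           "--username", user, "--no-password", "--format", "plain"]
    if not include_schema:
        cmd.append("--data-only")  # assumption: include_schema=False means data only
    for table in tables or []:
        cmd += ["--table", table]
    cmd.append(database)
    return cmd

def dump_to_gzip(cmd, password, out_path):
    """Run the dump, streaming stdout through gzip to bound memory use."""
    env = dict(os.environ, PGPASSWORD=password)
    with gzip.open(out_path, "wb") as out:
        proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, env=env)
        for chunk in iter(lambda: proc.stdout.read(64 * 1024), b""):
            out.write(chunk)
        if proc.wait() != 0:
            raise RuntimeError("pg_dump exited non-zero")
```

The MySQL branch would follow the same shape with `mysqldump` arguments.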
### 3. cleanup_old
**Description:** Remove backups older than the retention period
**Parameters:**
- s3_prefix (str, required): S3 prefix to scan
- retention_days (int, default=30): Keep backups newer than this
- dry_run (bool, default=True): Preview without deleting
- min_keep (int, default=3): Minimum backups to keep regardless of age
**Returns:**
```json
{
  "deleted_count": 5,
  "deleted_size_bytes": 25000000,
  "retained_count": 10,
  "dry_run": false
}
```
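The retention rule (age cutoff plus a `min_keep` floor) is a pure decision over the listed objects, which keeps it easy to preview under `dry_run`. A sketch, with `select_backups_to_delete` as a hypothetical helper; the caller would only issue S3 deletes when `dry_run` is false:

```python
from datetime import datetime, timedelta, timezone

def select_backups_to_delete(objects, retention_days=30, min_keep=3, now=None):
    """objects: list of (key, last_modified datetime) pairs.

    Returns keys to delete: anything older than the cutoff, except the
    newest min_keep backups, which are always retained regardless of age.
    """
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=retention_days)
    ordered = sorted(objects, key=lambda o: o[1], reverse=True)  # newest first
    protected = {key for key, _ in ordered[:min_keep]}
    return [key for key, ts in ordered if ts < cutoff and key not in protected]
```

With five backups aged 1, 10, 40, 50, and 60 days and the defaults above, the newest three are protected, so only the 50- and 60-day backups are eligible for deletion.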
### 4. send_report
**Description:** Email backup status report
**Parameters:**
- backup_results (list, required): List of backup results
- recipient_email (str, required): Report recipient
- include_errors_only (bool, default=False): Only report failures
- cleanup_results (dict, optional): Cleanup operation results
**Returns:** `{"sent": true, "report_type": "summary|errors_only"}`
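Report assembly can be kept separate from SMTP delivery. A sketch of the filtering implied by `include_errors_only`, reading the field names from the backup result payloads above; `build_report` is a hypothetical helper:

```python
def build_report(backup_results, include_errors_only=False):
    """Render a plain-text status report from a list of backup result dicts."""
    failures = [r for r in backup_results if not r.get("success")]
    rows = failures if include_errors_only else backup_results
    report_type = "errors_only" if include_errors_only else "summary"
    lines = [f"Backup report ({report_type}): {len(failures)} failure(s) "
             f"of {len(backup_results)} backup(s)"]
    for r in rows:
        status = "OK" if r.get("success") else "FAILED"
        lines.append(f"- {r.get('backup_id', 'unknown')}: {status}, "
                     f"{r.get('size_bytes', 0)} bytes")
    return "\n".join(lines)
```

The rendered text would then be handed to the shared SMTP sender, and `report_type` echoed back in the action's return value.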
## Example Workflow - Daily Database Backup
```python
builder = WorkflowBuilder(name='daily_db_backup', version='1.0.0')
builder.task(
    task_id='backup_main_db',
    function='apps.platform.backup_manager.backup_database',
    kwargs={
        'database_type': 'postgresql',
        'database_alias': 'production',
        'backup_name': 'production_daily'
    },
    result_key='backup_result'
)
builder.task(
    task_id='cleanup_old_backups',
    function='apps.platform.backup_manager.cleanup_old',
    kwargs={
        's3_prefix': 'backups/db/production_daily',
        'retention_days': 30,
        'dry_run': False
    },
    result_key='cleanup_result'
)
builder.task(
    task_id='send_status',
    function='apps.platform.backup_manager.send_report',
    kwargs={
        'backup_results': ['{{backup_result}}'],
        'cleanup_results': '{{cleanup_result}}',
        'recipient_email': 'ops@example.com'
    }
)
```
## Implementation Notes
- Use existing S3 integration (DataShard)
- For directory backup: tarfile module + boto3 multipart upload
- For PostgreSQL: pg_dump via subprocess or container
- For MySQL: mysqldump via container
- Implement encryption using AWS KMS or local AES-256
- Stream large files to avoid memory issues
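The streaming note can be made concrete: read and upload in bounded chunks rather than loading the archive into memory. The boto3 portion is shown as comments since it needs live credentials; the `aws:kms` encryption argument follows the KMS note above and is otherwise an assumption:

```python
import io

CHUNK_SIZE = 8 * 1024 * 1024  # 8 MiB parts keep memory bounded

def iter_chunks(fileobj, chunk_size=CHUNK_SIZE):
    """Yield fixed-size chunks so large archives never load fully into memory."""
    while True:
        chunk = fileobj.read(chunk_size)
        if not chunk:
            return
        yield chunk

# With boto3, upload_fileobj handles multipart assembly and streaming:
#
# import boto3
# from boto3.s3.transfer import TransferConfig
#
# s3 = boto3.client("s3")
# config = TransferConfig(multipart_threshold=CHUNK_SIZE,
#                         multipart_chunksize=CHUNK_SIZE)
# with open(archive_path, "rb") as f:
#     s3.upload_fileobj(f, bucket, key, Config=config,
#                       ExtraArgs={"ServerSideEncryption": "aws:kms"})
```

The same chunked reading applies to the local AES-256 alternative, encrypting each chunk before upload.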
## Configuration Schema
```json
{
  "s3_bucket": "configured_via_installation",
  "default_retention_days": 30,
  "max_backup_size_gb": 50,
  "encryption_enabled": true
}
```
## Secrets Config
```json
{
  "s3_access_key": "vault:shared/s3:access_key",
  "s3_secret_key": "vault:shared/s3:secret_key",
  "db_production": "vault:tenant/databases:production",
  "encryption_key": "vault:tenant/backup:encryption_key"
}
```
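The `vault:<path>:<key>` references above suggest a small parser for secret resolution; the exact convention is an assumption inferred from the entries shown:

```python
def parse_vault_ref(ref):
    """Split a 'vault:<secret_path>:<key>' reference into (secret_path, key)."""
    scheme, path, key = ref.split(":", 2)
    if scheme != "vault":
        raise ValueError(f"not a vault reference: {ref}")
    return path, key
```

Resolution would then read `key` from the secret at `path` in the tenant's or shared vault mount.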
## Dependencies
- boto3 for S3 operations
- tarfile (stdlib)
- subprocess for pg_dump/mysqldump
- Email sending (reuse SMTP config)