Community Example - Not Officially Supported
This guide is provided as a community example and is not officially supported by Rhombus Systems. While Rhombus does not officially support on-premise backup, this guide demonstrates how to implement a local backup solution using the Rhombus API. Use this implementation at your own discretion and ensure it meets your organization’s backup and compliance requirements.

Overview

This guide demonstrates how to back up video and audio footage from Rhombus cameras to local storage devices such as Network Attached Storage (NAS), external hard drives, or local servers. The solution uses a Python script that leverages the Rhombus API to download footage in parallel across multiple cameras. The implementation supports:
  • Multi-camera downloads with threading for improved performance
  • Video and audio synchronization with automatic merging
  • Flexible scheduling using cron jobs or task schedulers
  • Customizable time ranges for historical footage backup
  • Location-based filtering to back up specific sites

How It Works

1. Camera Discovery
   The script queries the Rhombus API to enumerate all cameras in your organization, filtering by connection status, location, or specific camera UUIDs.
2. Session Authentication
   A federated session token is generated for each camera, providing temporary (1-hour) credentials for media access without exposing your API key in download URLs.
3. Media Download
   For each camera, the script:
   • Requests MPD (MPEG-DASH) playlist URIs for video and audio streams
   • Downloads the initialization segment (seg_init.mp4)
   • Downloads sequential 2-second media segments
   • Writes segments to local storage
4. Audio-Video Merging
   If audio is available, the script uses FFmpeg to merge video and audio streams into a single file, then cleans up temporary files.
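
The sketch below illustrates the first two steps (camera discovery and session-token generation) using the endpoints documented later in this guide. It is a minimal example rather than the full script, and the response field names (cameraStates, federatedSessionToken), the durationSec payload field, and the GREEN connection-status check are assumptions that should be verified against the actual API responses.

# Minimal sketch of camera discovery and federated session-token generation.
# Field names such as "cameraStates", "federatedSessionToken", and the
# "durationSec" payload are assumptions; verify against the real API schema.
import requests

API_BASE = "https://api2.rhombussystems.com/api"
API_KEY = "YOUR_API_KEY"

HEADERS = {
    "x-auth-scheme": "api-token",
    "x-auth-apikey": API_KEY,
    "Content-Type": "application/json",
}

def list_cameras():
    """Enumerate cameras with their connection status and location."""
    resp = requests.post(f"{API_BASE}/camera/getMinimalCameraStateList",
                         headers=HEADERS, json={})
    resp.raise_for_status()
    return resp.json().get("cameraStates", [])

def create_session_token(duration_sec=3600):
    """Request temporary (1-hour) credentials for media downloads."""
    resp = requests.post(f"{API_BASE}/org/generateFederatedSessionToken",
                         headers=HEADERS,
                         json={"durationSec": duration_sec})  # assumed field name
    resp.raise_for_status()
    return resp.json().get("federatedSessionToken")

if __name__ == "__main__":
    cameras = [c for c in list_cameras()
               if c.get("connectionStatus") == "GREEN"]  # assumed status value
    print(f"Found {len(cameras)} connected cameras")
    token = create_session_token()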

    Prerequisites

    Before implementing local backup, ensure you have:
    • Python 3.7 or higher installed on your backup system
    • Rhombus API key from the Rhombus Console
    • Network connectivity to Rhombus cameras (LAN or WAN)
    • Sufficient storage space on your backup device (estimate 1-2 GB per camera per day)
    • FFmpeg installed for audio-video merging
    Calculate your storage requirements based on camera count, retention period, and recording quality. A typical Rhombus camera generates approximately 40-60 GB per month of footage.

    Installation

    Install Required Software

    # Update package list
    sudo apt update
    
    # Install Python 3 and pip
    sudo apt install python3 python3-pip
    
    # Install FFmpeg
    sudo apt install ffmpeg
    
    # Install Git (to clone the repository)
    sudo apt install git
    

    Download the Backup Script

    Clone the Rhombus API examples repository:
    # Clone the repository
    git clone https://github.com/RhombusSystems/api-examples-python.git
    
    # Navigate to the NAS backup directory
    cd api-examples-python/NAS-Backup-v2
    
    # Install Python dependencies
    pip3 install -r requirements.txt
    

    Verify Installation

    Test that all components are installed correctly:
    # Check Python version
    python3 --version
    
    # Check FFmpeg installation
    ffmpeg -version
    
    # Check pip packages
    pip3 list | grep -E "(requests|ffmpeg|urllib3|xmltodict)"
    

    Configuration

    Command-Line Parameters

    The backup script supports the following parameters:
    --api_key (string, required)
    Your Rhombus API key for authentication. Required unless using certificate-based authentication.
    Example: --api_key YOUR_API_KEY_HERE

    --start_time (integer)
    Unix epoch timestamp for the backup start time. Defaults to 1 hour ago if not specified.
    Example: --start_time 1693526400 (September 1, 2023 at 00:00:00 UTC)

    --duration (integer)
    Duration in seconds to back up from the start time. Defaults to 3600 seconds (1 hour).
    Example: --duration 7200 (2 hours)

    --location_uuid (string)
    Filter the backup to cameras at a specific location. Useful for multi-site deployments.
    Example: --location_uuid location-uuid-here

    --camera_uuid (string)
    Back up footage from a specific camera only.
    Example: --camera_uuid camera-uuid-here

    --usewan (boolean)
    Use WAN addresses instead of LAN. Enable when the backup system is outside your local network.
    Example: --usewan

    --debug (boolean)
    Enable debug logging for troubleshooting.
    Example: --debug

    --cert (string)
    Path to the client certificate for mTLS authentication (advanced use case).
    Example: --cert /path/to/cert.pem

    --private_key (string)
    Path to the private key for mTLS authentication (advanced use case).
    Example: --private_key /path/to/key.pem

    Usage Examples

    Basic Backup (Last Hour)

    Back up the last hour of footage from all cameras:
    python3 copy_footage_script_threading.py \
      --api_key YOUR_API_KEY
    

    Specific Time Range

    Back up footage from a specific 2-hour window:
    python3 copy_footage_script_threading.py \
      --api_key YOUR_API_KEY \
      --start_time 1693526400 \
      --duration 7200
    

    Single Camera Backup

    Back up footage from one specific camera:
    python3 copy_footage_script_threading.py \
      --api_key YOUR_API_KEY \
      --camera_uuid camera-uuid-here \
      --duration 3600
    

    Location-Based Backup

    Back up all cameras at a specific location:
    python3 copy_footage_script_threading.py \
      --api_key YOUR_API_KEY \
      --location_uuid location-uuid-here \
      --duration 3600
    

    WAN Access (Remote Backup)

    Back up footage from outside your local network:
    python3 copy_footage_script_threading.py \
      --api_key YOUR_API_KEY \
      --usewan \
      --duration 3600
    

    Debug Mode

    Enable detailed logging for troubleshooting:
    python3 copy_footage_script_threading.py \
      --api_key YOUR_API_KEY \
      --debug \
      --duration 3600
    

    Output Files

    Downloaded footage files follow this naming convention:
    {CameraName}_{CameraUUID}_{Timestamp}_{Type}.{Extension}
    
    Examples:
    FrontDoor_abc123def456_1693526400_video.mp4
    Lobby_def789ghi012_1693526400_merged.mp4
    Warehouse_ghi345jkl678_1693526400_video.mp4
    
    File Types:
    • Video-only files: .mp4 format (when no audio is available)
    • Merged files: .mp4 format (video + audio combined via FFmpeg)
    • Temporary files: .webm format (automatically deleted after merging)
    The script automatically cleans up temporary files after successful merging. Only final .mp4 files remain in your backup directory.
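
    If you post-process backups (for example, to index footage or apply per-camera retention), the naming convention can be parsed programmatically. This is a small illustrative helper, not part of the backup script; it assumes the UUID, timestamp, and type fields never contain underscores.

    # Parse the backup filename convention into its parts.
    # Splitting from the right tolerates underscores inside the camera name.
    from pathlib import Path

    def parse_backup_filename(filename: str) -> dict:
        path = Path(filename)
        camera_name, camera_uuid, timestamp, kind = path.stem.rsplit("_", 3)
        return {
            "camera_name": camera_name,
            "camera_uuid": camera_uuid,
            "timestamp": int(timestamp),
            "type": kind,                      # "video" or "merged"
            "extension": path.suffix.lstrip("."),
        }

    print(parse_backup_filename("FrontDoor_abc123def456_1693526400_video.mp4"))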

    Scheduling Automated Backups

    Using Cron (Linux/macOS/NAS)

    Create automated backups using cron jobs:
    1. Edit Crontab
       Open your crontab configuration:

       crontab -e

    2. Add Backup Schedule
       Add one of the following examples based on your needs:

       Hourly Backup
       # Backup every hour at minute 0
       0 * * * * cd /path/to/api-examples-python/NAS-Backup-v2 && /usr/bin/python3 copy_footage_script_threading.py --api_key YOUR_API_KEY --duration 3600 >> /var/log/rhombus_backup.log 2>&1

       Daily Backup (Midnight)
       # Backup last 24 hours at midnight
       0 0 * * * cd /path/to/api-examples-python/NAS-Backup-v2 && /usr/bin/python3 copy_footage_script_threading.py --api_key YOUR_API_KEY --duration 86400 >> /var/log/rhombus_backup.log 2>&1

       Every 4 Hours
       # Backup every 4 hours
       0 */4 * * * cd /path/to/api-examples-python/NAS-Backup-v2 && /usr/bin/python3 copy_footage_script_threading.py --api_key YOUR_API_KEY --duration 14400 >> /var/log/rhombus_backup.log 2>&1

       Business Hours Only
       # Backup every 2 hours during business hours (8 AM - 6 PM, Mon-Fri)
       0 8-18/2 * * 1-5 cd /path/to/api-examples-python/NAS-Backup-v2 && /usr/bin/python3 copy_footage_script_threading.py --api_key YOUR_API_KEY --duration 7200 >> /var/log/rhombus_backup.log 2>&1

    3. Save and Verify
       Save the crontab and verify it’s scheduled:

       # List current cron jobs
       crontab -l

       # Check cron service status
       sudo systemctl status cron

    Cron Schedule Reference:
    • 0 * * * * - Every hour at minute 0
    • */30 * * * * - Every 30 minutes
    • 0 */4 * * * - Every 4 hours
    • 0 0 * * * - Daily at midnight
    • 0 2 * * 0 - Weekly on Sunday at 2 AM

    Using Task Scheduler (Windows)

    1. Open Task Scheduler
       Press Win + R, type taskschd.msc, and press Enter.
    2. Create New Task
       • Click “Create Task” in the right panel
       • Name it “Rhombus Footage Backup”
       • Select “Run whether user is logged on or not”
    3. Set Trigger
       • Go to the Triggers tab
       • Click New
       • Choose frequency (Daily, Weekly, etc.)
       • Set start time and recurrence
    4. Configure Action
       • Go to the Actions tab
       • Click New
       • Action: Start a program
       • Program: C:\Python39\python.exe (adjust path)
       • Arguments: copy_footage_script_threading.py --api_key YOUR_API_KEY --duration 3600
       • Start in: C:\path\to\NAS-Backup-v2
    5. Save and Test
       • Click OK to save
       • Right-click the task and select “Run” to test

    API Endpoints Used

    The backup script interacts with the following Rhombus API endpoints:

    Camera Enumeration

    POST https://api2.rhombussystems.com/api/camera/getMinimalCameraStateList
    
    Retrieves list of cameras with connection status and location information.

    Audio Gateway List

    POST https://api2.rhombussystems.com/api/audiogateway/getMinimalAudioGatewayStateList
    
    Fetches audio devices associated with cameras.

    Session Token Generation

    POST https://api2.rhombussystems.com/api/org/generateFederatedSessionToken
    
    Creates temporary session credentials (1-hour validity) for secure media access.

    Video Media URIs

    POST https://api2.rhombussystems.com/api/camera/getMediaUris
    
    Obtains MPD (MPEG-DASH) playlist templates for video streams.

    Audio Media URIs

    POST https://api2.rhombussystems.com/api/audiogateway/getMediaUris
    
    Obtains MPD templates for audio streams.
    All API calls require authentication: send the x-auth-scheme: api-token header together with your API key in the x-auth-apikey header, or use certificate-based mTLS authentication.
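
    Once the media URIs are known, the download itself is a plain sequence of HTTP GETs: the initialization segment followed by the 2-second media segments, as described in How It Works. The sketch below shows only that loop. The base URI and exact segment naming come from the MPD template returned by getMediaUris (the seg_{n}.mp4 pattern here is an assumption), and the authentication headers or cookies derived from the federated session token are supplied by the caller.

    # Sketch of the segment-download loop. The segment naming pattern is an
    # assumption; take the real template from the getMediaUris response.
    import requests

    def download_segments(base_uri: str, auth_headers: dict, out_path: str,
                          duration_sec: int = 3600, segment_len_sec: int = 2):
        with open(out_path, "wb") as out:
            # Initialization segment first...
            init = requests.get(f"{base_uri}/seg_init.mp4", headers=auth_headers)
            init.raise_for_status()
            out.write(init.content)
            # ...then sequential 2-second media segments.
            for n in range(duration_sec // segment_len_sec):
                seg = requests.get(f"{base_uri}/seg_{n}.mp4", headers=auth_headers)
                if seg.status_code == 404:  # no more footage available
                    break
                seg.raise_for_status()
                out.write(seg.content)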

    Performance Optimization

    Threading Configuration

    The script uses Python’s ThreadPoolExecutor with a maximum of 4 concurrent workers. This balances download speed with API rate limits and system resources. To adjust thread count, modify the script:
    # In copy_footage_script_threading.py
    with ThreadPoolExecutor(max_workers=4) as executor:  # Change 4 to desired value
    
    Recommendations:
    • 2-4 workers: Standard NAS or low-end systems
    • 4-8 workers: High-performance NAS or servers
    • 8-16 workers: Enterprise servers with high bandwidth
    Increasing thread count beyond recommended values may trigger rate limiting or overload your network/storage.
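
    If you would rather not edit the constant before each run, one local modification (not a flag or setting the script ships with) is to read the worker count from an environment variable and fall back to the default of 4:

    # Local modification sketch: worker count from an environment variable.
    # RHOMBUS_BACKUP_WORKERS is an illustrative name, not a setting the script provides.
    import os
    from concurrent.futures import ThreadPoolExecutor

    max_workers = int(os.environ.get("RHOMBUS_BACKUP_WORKERS", "4"))

    with ThreadPoolExecutor(max_workers=max_workers) as executor:
        ...  # submit per-camera download tasks as the script already does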

    Storage Considerations

    Calculate Required Space:
    Storage (GB) = Cameras × Days × 1.5 GB/day
    
    Example:
    • 10 cameras × 30 days × 1.5 GB = 450 GB required
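
    The same estimate as a small helper (1.5 GB per camera per day is this guide's rough average; substitute your own measured rate):

    # Storage estimate based on the formula above.
    def required_storage_gb(cameras: int, retention_days: int,
                            gb_per_camera_per_day: float = 1.5) -> float:
        return cameras * retention_days * gb_per_camera_per_day

    print(required_storage_gb(10, 30))  # 450.0, matching the example above
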
    Best Practices:
    • Maintain at least 20% free space on backup device
    • Implement retention policies to delete old footage
    • Monitor disk usage regularly
    • Use compression if long-term archival is needed

    Network Optimization

    LAN vs WAN:
    • LAN Mode (default): Faster downloads, uses local network addresses
    • WAN Mode (--usewan): Required for remote backup, slower but accessible from anywhere
    Bandwidth Requirements:
    • Approximately 2-4 Mbps per concurrent camera download
    • 4 workers = 8-16 Mbps recommended bandwidth

    Retention and Cleanup

    Implement a retention policy to manage storage usage:
    # Create cleanup script
    cat > cleanup_old_footage.sh << 'EOF'
    #!/bin/bash
    BACKUP_DIR="/path/to/backup/directory"
    RETENTION_DAYS=30
    
    find "$BACKUP_DIR" -name "*.mp4" -type f -mtime +$RETENTION_DAYS -delete
    find "$BACKUP_DIR" -name "*.webm" -type f -mtime +$RETENTION_DAYS -delete
    
    echo "Deleted footage older than $RETENTION_DAYS days"
    EOF
    
    chmod +x cleanup_old_footage.sh
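
    On systems without find (for example, a Windows host scheduled through Task Scheduler), an equivalent retention sweep can be written in Python. The backup directory and retention period below are placeholders:

    # Python equivalent of the cleanup script above; paths are placeholders.
    import time
    from pathlib import Path

    BACKUP_DIR = Path("/path/to/backup/directory")
    RETENTION_DAYS = 30
    cutoff = time.time() - RETENTION_DAYS * 86400

    deleted = 0
    for pattern in ("*.mp4", "*.webm"):
        for f in BACKUP_DIR.rglob(pattern):
            if f.is_file() and f.stat().st_mtime < cutoff:
                f.unlink()
                deleted += 1

    print(f"Deleted {deleted} files older than {RETENTION_DAYS} days")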
    

    Troubleshooting

    Authentication Errors

    Symptoms:
    • 401 Unauthorized errors
    • Invalid API key messages
    Solutions:
    1. Verify API key is correct in Rhombus Console
    2. Ensure API key has camera access permissions
    3. Check that x-auth-scheme header is set correctly
    4. Regenerate API key if compromised
    # Test API authentication
    curl -X POST https://api2.rhombussystems.com/api/camera/getMinimalCameraStateList \
      -H "x-auth-scheme: api-token" \
      -H "x-auth-apikey: YOUR_API_KEY"
    
    Slow Downloads or Timeouts

    Symptoms:
    • Downloads take longer than expected
    • Network timeout errors
    Solutions:
    1. Use LAN mode instead of WAN if backing up locally
    2. Reduce thread count in ThreadPoolExecutor
    3. Check network bandwidth and camera connectivity
    4. Verify storage device write speed
    5. Consider scheduling backups during off-peak hours
    # Test with fewer threads and debug mode
    python3 copy_footage_script_threading.py \
      --api_key YOUR_API_KEY \
      --debug \
      --duration 600  # Start with 10 minutes
    
    Audio-Video Merge Failures

    Symptoms:
    • Temporary .webm files remain
    • No merged .mp4 output
    • FFmpeg error messages
    Solutions:
    1. Verify FFmpeg is installed and in PATH
    2. Check that both video and audio files were downloaded
    3. Ensure sufficient disk space for temporary files
    4. Update FFmpeg to latest version
    # Check FFmpeg installation
    which ffmpeg
    ffmpeg -version
    
    # Manual merge test
    ffmpeg -i video.webm -i audio.webm -c copy output.mp4
    
    No Cameras Found

    Symptoms:
    • Script reports 0 cameras to backup
    • Empty camera list
    Solutions:
    1. Verify cameras are online in Rhombus Console
    2. Check location UUID filter if specified
    3. Ensure API key has access to cameras
    4. Review camera connection status filters in script
    # List all cameras via API
    curl -X POST https://api2.rhombussystems.com/api/camera/getMinimalCameraStateList \
      -H "x-auth-scheme: api-token" \
      -H "x-auth-apikey: YOUR_API_KEY" \
      | python3 -m json.tool
    
    Insufficient Disk Space

    Symptoms:
    • Script fails with write errors
    • No space left on device messages
    Solutions:
    1. Check available disk space: df -h
    2. Implement retention policy to delete old footage
    3. Reduce backup duration or frequency
    4. Add additional storage capacity
    5. Use compression for archived footage
    # Check disk usage
    df -h /path/to/backup
    
    # Find largest files
    du -sh /path/to/backup/* | sort -hr | head -10
    
    Scheduled Backups Not Running

    Symptoms:
    • Scheduled backups don’t execute
    • No new footage files created
    Solutions:
    1. Check cron service status: systemctl status cron
    2. Verify crontab syntax: crontab -l
    3. Check cron logs: grep CRON /var/log/syslog
    4. Ensure script paths are absolute
    5. Verify user permissions
    # Test cron job manually
    cd /path/to/NAS-Backup-v2
    /usr/bin/python3 copy_footage_script_threading.py --api_key YOUR_API_KEY
    
    # Check cron logs
    tail -f /var/log/syslog | grep CRON
    

    Best Practices

    Secure API Keys

    Never hardcode API keys in scripts. Use environment variables or secure key management systems. Rotate API keys regularly.

    Monitor Backups

    Set up monitoring and alerting for backup failures. Review logs regularly to ensure backups complete successfully.

    Test Restores

    Regularly test restoring footage from backups to verify integrity and confirm your recovery process works.

    Implement Retention

    Define and enforce retention policies to manage storage costs and comply with data protection regulations.

    Use Redundancy

    Consider multiple backup locations (onsite + offsite) for critical footage. Implement the 3-2-1 backup rule.

    Document Procedures

    Maintain documentation of your backup configuration, schedules, and recovery procedures for your team.

    Security Considerations

    Follow these security best practices when implementing local backup:

    API Key Protection

    1. Store securely: Use environment variables or credential managers
    2. Restrict access: Limit file permissions on scripts containing keys
    3. Rotate regularly: Change API keys periodically
    4. Audit usage: Monitor API key activity in Rhombus Console
    # Store API key in environment variable
    export RHOMBUS_API_KEY="your-api-key-here"
    
    # Use in script
    python3 copy_footage_script_threading.py --api_key "$RHOMBUS_API_KEY"
    
    # Restrict script permissions
    chmod 600 backup_script.sh
    

    Network Security

    • Use LAN mode when possible to avoid WAN exposure
    • Implement firewall rules to restrict access
    • Consider VPN for remote backup scenarios
    • Enable mTLS authentication for enhanced security

    Storage Security

    • Encrypt backup storage devices
    • Restrict file system permissions
    • Implement access controls on NAS/server
    • Regularly audit who has access to backup files

    Compliance Considerations

    When implementing local backup, consider:

    Data Retention:
    • Follow your organization’s data retention policies
    • Comply with industry regulations (HIPAA, GDPR, etc.)
    • Document retention periods and deletion procedures
    Access Control:
    • Maintain audit logs of who accesses backup footage
    • Implement role-based access controls
    • Document authorized personnel
    Data Protection:
    • Encrypt data at rest and in transit
    • Implement secure deletion procedures
    • Regular security assessments

    Advanced Configuration

    Custom Output Directory

    Modify the script to save files to a specific directory:
    # In copy_footage_script_threading.py
    import datetime
    import os

    OUTPUT_DIR = "/mnt/nas/rhombus_backups"

    # Create a directory structure by date
    date_dir = datetime.datetime.now().strftime("%Y-%m-%d")
    output_path = os.path.join(OUTPUT_DIR, date_dir)
    os.makedirs(output_path, exist_ok=True)
    

    Email Notifications

    Add email alerts for backup completion or failures:
    import smtplib
    from email.mime.text import MIMEText
    
    def send_notification(status, message):
        msg = MIMEText(message)
        msg['Subject'] = f'Rhombus Backup {status}'
        msg['From'] = '[email protected]'
        msg['To'] = '[email protected]'
    
        with smtplib.SMTP('smtp.gmail.com', 587) as server:
            server.starttls()
            server.login('your-email', 'your-password')
            server.send_message(msg)
    
    # Use after backup completes
    try:
        # ... backup code ...
        send_notification("Success", "All cameras backed up successfully")
    except Exception as e:
        send_notification("Failed", f"Backup failed: {str(e)}")
    

    Webhook Integration

    Trigger webhooks after backup completion:
    import datetime

    import requests
    
    def trigger_webhook(status, cameras_backed_up):
        webhook_url = "https://your-webhook-endpoint.com/backup"
        payload = {
            "status": status,
            "timestamp": datetime.datetime.now().isoformat(),
            "cameras": cameras_backed_up
        }
        requests.post(webhook_url, json=payload)
    

    Additional Resources

    For questions, issues, or to share your backup implementation, visit the Rhombus Developer Community and post in the Guides & Resources section.