How to Interact with AWS CLI from a Batch Script
Amazon Web Services (AWS) provides the AWS Command Line Interface (AWS CLI), a unified tool to manage your AWS services. From downloading files from S3 to spinning up EC2 instances, the AWS CLI can be completely controlled via Windows Batch scripts, allowing you to orchestrate cloud operations natively from your local environment.
In this guide, we will demonstrate how to authenticate and execute common AWS CLI commands directly from a Batch script.
Setup: AWS CLI Configuration
Before running a script, you must configure your credentials. Unlike the Azure CLI, which typically uses Service Principals in scripts, the AWS CLI relies on pre-configured profiles.
- Install the AWS CLI for Windows.
- Open your command prompt and run `aws configure` (or `aws configure --profile myProfile`).
- Enter your AWS Access Key ID, Secret Access Key, Region, and Output format (typically `json` or `text`).
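For unattended setups, a profile can also be created non-interactively with `aws configure set`, which writes the same files as the interactive prompt. A minimal sketch; the profile name and credential values are placeholders, not real keys:

```batch
@echo off
:: Create or update a named profile without interactive prompts.
:: Replace the placeholder values below with real credentials.
set "awsProfile=myProfile"
call aws configure set aws_access_key_id AKIAEXAMPLEKEY --profile "%awsProfile%"
call aws configure set aws_secret_access_key EXAMPLESECRETKEY --profile "%awsProfile%"
call aws configure set region us-east-1 --profile "%awsProfile%"
call aws configure set output json --profile "%awsProfile%"
```

This is handy when provisioning a build server, since the script can be run once during setup instead of typing credentials at a prompt.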
Implementation Script
@echo off
setlocal enabledelayedexpansion
:: 1. Define AWS Profile and Region
set "awsProfile=default"
set "awsRegion=us-east-1"
:: 2. Verify AWS CLI is installed
where aws >nul 2>nul
if !errorlevel! neq 0 (
echo [ERROR] AWS CLI is not installed or not in PATH.
pause
exit /b 1
)
echo Verifying AWS Identity...
:: Command: Get Caller Identity
call aws sts get-caller-identity --profile "%awsProfile%" >nul 2>nul
if !errorlevel! neq 0 (
echo [ERROR] Authentication failed. Check your credentials.
pause
exit /b 1
)
echo [OK] Identity verified.
echo.
:: 3. Example: List all S3 Buckets
echo --- S3 BUCKETS ---
call aws s3 ls --profile "%awsProfile%"
if !errorlevel! neq 0 (
echo [ERROR] Failed to list S3 buckets.
pause
exit /b 1
)
echo.
:: 4. Example: Upload a local file to S3
set "localFile=C:\Backups\Database.bak"
set "s3Bucket=s3://my-company-backup-bucket/daily/"
if exist "%localFile%" (
echo Uploading "%localFile%" to %s3Bucket%...
call aws s3 cp "%localFile%" "%s3Bucket%" --profile "%awsProfile%"
if !errorlevel! neq 0 (
echo [ERROR] Failed to upload "%localFile%" to S3.
pause
exit /b 1
)
echo [OK] Upload complete.
) else (
echo [SKIP] Upload skipped, file "%localFile%" not found.
)
:: 5. Example: Querying EC2 Instances (JSON Parsing)
echo.
echo --- EC2 INSTANCES (Running^) ---
:: Use --output text or table for easy Batch reading
call aws ec2 describe-instances --filters "Name=instance-state-name,Values=running" --query "Reservations[*].Instances[*].[InstanceId, PublicIpAddress]" --output table --profile "%awsProfile%" --region "%awsRegion%"
if !errorlevel! neq 0 (
echo [ERROR] Failed to query EC2 instances.
pause
exit /b 1
)
echo.
echo [SUCCESS] AWS Operations Complete!
endlocal
pause
exit /b 0
Parsing AWS Output into Batch Variables
A common requirement is extracting a specific piece of information, such as an instance's public IP or an S3 object URL, and saving it to a variable for the next step of the script. The AWS CLI supports the --query parameter (which uses JMESPath), and combining it with --output text produces results that a FOR /F loop can read directly.
@echo off
setlocal enabledelayedexpansion
set "bucketName=my-public-assets"
:: Get the total size of all objects in a bucket
set "totalSizeBytes="
for /f "tokens=*" %%A in ('call aws s3api list-objects --bucket "%bucketName%" --query "[sum(Contents[].Size)]" --output text 2^>nul') do set "totalSizeBytes=%%A"
if not defined totalSizeBytes (
echo [ERROR] Failed to retrieve bucket size for "%bucketName%".
pause
exit /b 1
)
echo Bucket "%bucketName%" consumes !totalSizeBytes! bytes.
endlocal
pause
exit /b 0
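The same FOR /F pattern captures a single value, such as the instance public IP mentioned above. A sketch, assuming a hypothetical instance ID:

```batch
@echo off
setlocal enabledelayedexpansion
:: Hypothetical instance ID -- replace with your own.
set "instanceId=i-0123456789abcdef0"
set "publicIp="
:: Note: the CLI prints the literal string "None" if the instance has no public IP.
for /f "tokens=*" %%A in ('call aws ec2 describe-instances --instance-ids "%instanceId%" --query "Reservations[0].Instances[0].PublicIpAddress" --output text 2^>nul') do set "publicIp=%%A"
if not defined publicIp (
    echo [ERROR] Could not retrieve the public IP for %instanceId%.
    exit /b 1
)
echo Instance %instanceId% is reachable at !publicIp!.
endlocal
```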
Why Interact with AWS CLI from Batch?
- Automated Offsite Backups: Using `aws s3 sync` inside a scheduled Batch script to push local directories to Amazon S3 incrementally.
- Environment Orchestration: A script that automatically starts a development EC2 instance (`aws ec2 start-instances`), waits 30 seconds, and then SSHs into it.
- Cross-Cloud Operations: A script that pulls data from Azure, processes it locally, and pushes the result to an AWS Glue job.
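The backup use case needs very little code, because `aws s3 sync` only uploads files that are new or changed. A sketch suitable for Windows Task Scheduler; the local path and bucket name are placeholders:

```batch
@echo off
:: Incremental offsite backup: s3 sync uploads only new or changed files.
:: The local path and bucket name below are placeholders.
call aws s3 sync "C:\Data\Reports" "s3://my-backup-bucket/reports/" --profile default
if errorlevel 1 (
    echo [ERROR] Backup sync failed.
    exit /b 1
)
echo [OK] Backup complete.
```

For the orchestration case, `aws ec2 wait instance-running --instance-ids <id>` is a more reliable alternative to a fixed 30-second pause, since it blocks until the instance actually reports the running state.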
Important Considerations
- Use `call`: Similar to the Azure CLI, `aws` commands on Windows are sometimes routed through batch wrappers. Using `call aws s3 ls` ensures your script doesn't unexpectedly terminate after the first AWS command executes.
- Credential Security: The `aws configure` command stores your credentials in plaintext in `~/.aws/credentials` (`%USERPROFILE%\.aws\credentials` on Windows). If running on a shared server, ensure only the authorized service account has read access to this file. Alternatively, use IAM Roles if running the script directly on an EC2 instance.
- Output Formats: By default, the AWS CLI outputs detailed JSON. Use `--output text` or `--output table` when interacting with Batch to avoid the nightmare of parsing multi-line JSON natively.
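On a shared Windows server, the credentials file can be locked down with the built-in `icacls` tool. A sketch, assuming the default credential path and that the script runs as the account that owns the credentials:

```batch
@echo off
:: Strip inherited permissions and grant read-only access to the current user.
:: Run this as the service account that owns the credentials.
set "credFile=%USERPROFILE%\.aws\credentials"
icacls "%credFile%" /inheritance:r /grant:r "%USERNAME%:R"
```

Grant `:F` (full control) instead of `:R` if the same account also needs to rotate keys with `aws configure`.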
Conclusion
Interfacing with the AWS CLI brings the full power of Amazon Web Services to your local Windows automation. By utilizing configured profiles, the call command, and text-based output formatting, you can construct elegant Batch scripts that provision cloud resources, sync terabytes of data, and manage massive EC2 fleets with a single double-click.