How to Convert JSON to CSV in Batch Script

JavaScript Object Notation (JSON) is the de facto standard format for web APIs, log streaming, and NoSQL datastores. However, its nested hierarchy is difficult for non-technical users to analyze in Excel or Power BI. To make this data accessible in tabular form, you must convert the JSON array into a Comma-Separated Values (CSV) file.

Since Batch cannot natively parse or flatten nested JSON without fragile findstr and for /f string manipulation, using PowerShell as a bridge is the most robust and reliable approach.

The Strategy: The PowerShell Bridge

  1. Identify the source .json file containing an array of objects.
  2. Use PowerShell's Get-Content to read the file.
  3. Use the ConvertFrom-Json cmdlet to cast the JSON text into native .NET objects.
  4. Optionally flatten nested structures, then use Export-Csv to format the objects into rows and write the result to disk.
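The four steps above collapse into a single pipeline. Before wrapping it in Batch, you can sanity-check it directly from a PowerShell prompt (the paths below assume the files sit in the current directory):

```powershell
# Read the whole file as one string, parse it into objects, write tabular CSV
Get-Content -LiteralPath .\servers.json -Raw |
    ConvertFrom-Json |
    Export-Csv -LiteralPath .\servers.csv -NoTypeInformation
```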

Setup: A Sample JSON Payload

To test the script, create a sample JSON file named servers.json:

[
  {
    "Hostname": "web-prod-01",
    "IPAddress": "192.168.1.10",
    "OS": "Ubuntu 22.04"
  },
  {
    "Hostname": "db-prod-01",
    "IPAddress": "192.168.1.20",
    "OS": "Windows Server 2022"
  }
]

Implementation Script

@echo off
setlocal

:: Define file paths (using script directory for portability)
set "jsonFile=%~dp0servers.json"
set "csvFile=%~dp0servers.csv"

echo Processing JSON Payload: "%jsonFile%"...
if not exist "%jsonFile%" (
    echo [ERROR] The JSON file "%jsonFile%" was not found.
    pause
    exit /b 1
)

echo Converting to Tabular CSV Format...

:: Execute via PowerShell bridge safely using environment variables
:: -LiteralPath handles special characters like [] in filenames
powershell -NoProfile -Command ^
    "$data = Get-Content -LiteralPath $env:jsonFile -Raw -ErrorAction Stop | ConvertFrom-Json; " ^
    "$data | Export-Csv -LiteralPath $env:csvFile -NoTypeInformation -Encoding UTF8"

:: Capture the exit code immediately
set "psResult=%errorlevel%"

if %psResult% neq 0 (
    echo [ERROR] PowerShell conversion failed with exit code %psResult%.
    pause
    exit /b 1
)

:: Verify the output file was created
if not exist "%csvFile%" (
    echo [ERROR] Output file "%csvFile%" was not created.
    pause
    exit /b 1
)

echo.
echo ==========================================
echo CONVERSION SUCCESSFUL
echo Output saved at: "%csvFile%"
echo ==========================================

endlocal
pause
exit /b 0
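With the sample servers.json above, the generated servers.csv should look similar to the following (Windows PowerShell 5.1 quotes every field; PowerShell 7 quotes only when necessary):

```csv
"Hostname","IPAddress","OS"
"web-prod-01","192.168.1.10","Ubuntu 22.04"
"db-prod-01","192.168.1.20","Windows Server 2022"
```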

Advanced: Flattening Nested JSON Data

A common problem when converting JSON to CSV is nested lists. Consider a server object that has "Disks": ["C:", "D:"]. The standard Export-Csv cmdlet will write System.Object[] in that column, because it calls .ToString() on the array rather than expanding it, and CSV has no notion of a list inside a single cell.
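For the script below, assume servers.json has been extended with such a nested array, for example:

```json
[
  {
    "Hostname": "web-prod-01",
    "IPAddress": "192.168.1.10",
    "OS": "Ubuntu 22.04",
    "Disks": ["C:", "D:"]
  }
]
```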

You can solve this by selecting the properties explicitly and using a calculated property to join the array:

@echo off
setlocal

:: Define paths relative to script location for portability
set "jsonFile=%~dp0servers.json"
set "csvFile=%~dp0servers_flat.csv"

if not exist "%jsonFile%" (
    echo [ERROR] The JSON file "%jsonFile%" was not found.
    pause
    exit /b 1
)

echo Flattening nested JSON to CSV...

:: Uses $env to safely pass paths and -LiteralPath for robust handling.
:: The calculated 'Disks' property joins the nested array with semicolons.
powershell -NoProfile -Command ^
    "$data = Get-Content -LiteralPath $env:jsonFile -Raw -ErrorAction Stop | ConvertFrom-Json; " ^
    "$data | Select-Object Hostname, IPAddress, OS, @{Name='Disks';Expression={$_.Disks -join ';'}} | " ^
    "Export-Csv -LiteralPath $env:csvFile -NoTypeInformation -Encoding UTF8"

:: Capture the exit code immediately
set "psResult=%errorlevel%"

if %psResult% neq 0 (
    echo [ERROR] PowerShell conversion failed with exit code %psResult%.
    pause
    exit /b 1
)

if not exist "%csvFile%" (
    echo [ERROR] Output file "%csvFile%" was not created.
    pause
    exit /b 1
)

echo.
echo [SUCCESS] Flattened CSV generated at:
echo "%csvFile%"

endlocal
pause
exit /b 0

Why Convert JSON to CSV?

  1. Reporting Engines: A nightly Batch script can poll the Docker API for stopped containers (returned as JSON), flatten the response to CSV, and email the resulting file to operations managers.
  2. Dashboard Analytics: Most Business Intelligence (BI) tools ingest flat .csv files far faster than they can resolve hierarchical relationships in raw JSON.
  3. Auditing Interfaces: Moving user metadata pulled from Azure AD out of the cloud console into a standard local file for compliance auditing.

Important Considerations

  1. Uniformity: CSV files expect every row to have the same columns. Export-Csv takes its column set from the first object in the pipeline, so if your JSON array has one object with 15 fields and another with only 3, any fields missing from that first object are silently dropped. Using Select-Object Property1, Property2... forces a uniform schema across the entire export.
  2. Raw Input: Always use the -Raw switch with Get-Content. Without it, Get-Content emits one string per line, and ConvertFrom-Json in Windows PowerShell 5.1 will try to parse each line separately and fail on multi-line JSON. Loading the entire file as a single string avoids this.
  3. Large Files: If your JSON file is multiple gigabytes, materializing the whole structure as .NET objects in memory can exhaust RAM and abort the script. Stream such files through dedicated tools like jq.exe instead.
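For the large-file case in point 3, here is a minimal sketch using jq (a separate download, assumed to be available on PATH as jq.exe; the field names match the earlier sample):

```batch
:: Convert each object in the top-level array to one CSV row.
:: jq uses far less memory than materializing .NET objects;
:: for truly enormous inputs, see jq's --stream mode.
jq -r ".[] | [.Hostname, .IPAddress, .OS] | @csv" servers.json > servers.csv
```

Note this emits no header row; prepend one with echo if your BI tool requires it.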

Conclusion

Converting dense, hierarchical JSON into a flat, business-readable CSV file makes automation data accessible to non-technical stakeholders. While native Command Shell logic cannot cleanly interpret multi-layered JSON arrays, invoking the ConvertFrom-Json and Export-Csv cmdlets through PowerShell handles the conversion reliably.