RESERVE achieves efficient batch data retrieval through controlled concurrency.
RESERVE uses concurrency to improve performance when working with multiple FRED® series. Rather than fetching data one request at a time, RESERVE allows multiple network requests to be in flight simultaneously—reducing total execution time for batch operations.
Concurrency is intentionally scoped to network-bound workloads, where latency—not CPU—is the primary bottleneck.
How Concurrency Works
Concurrency in RESERVE is controlled via a config.json setting or a global flag:
# Global flag example
--concurrency int max parallel requests for batch operations (default: 8)
# Configuration in config.json
{
"api_key": "5fc037###################",
"default_format": "table",
"timeout": "30s",
"concurrency": 8,
"rate": 5,
"base_url": "https://api.stlouisfed.org/fred/",
"db_path": ""
}

If not specified, RESERVE defaults to 8 concurrent requests.
When multiple series are requested, RESERVE:
- launches one goroutine per series
- limits active work using a bounded semaphore
- waits for all operations to complete
- preserves output in the original input order
This ensures that concurrency improves performance without introducing unpredictability.
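The four steps above can be sketched with Go's standard concurrency primitives. This is a minimal illustration, not RESERVE's actual implementation: `fetchAll` and the `fetch` callback are stand-ins for the real HTTP client, but the pattern (one goroutine per series, a channel used as a bounded semaphore, results written to the index of their input ID so order is preserved) matches the behavior described.

```go
package main

import (
	"fmt"
	"sync"
)

// fetchAll retrieves one result per series ID with at most `limit`
// goroutines doing work at once. Each result is written to the index
// of its input ID, so output order matches input order regardless of
// which request finishes first. (Illustrative sketch: fetch is a
// stand-in for a real HTTP call.)
func fetchAll(ids []string, limit int, fetch func(string) string) []string {
	sem := make(chan struct{}, limit) // bounded semaphore
	results := make([]string, len(ids))
	var wg sync.WaitGroup
	for i, id := range ids {
		wg.Add(1)
		go func(i int, id string) {
			defer wg.Done()
			sem <- struct{}{}        // acquire a slot
			defer func() { <-sem }() // release it when done
			results[i] = fetch(id)
		}(i, id)
	}
	wg.Wait() // wait for all operations to complete
	return results
}

func main() {
	got := fetchAll([]string{"GDP", "CPIAUCSL", "UNRATE"}, 2, func(id string) string {
		return id + ":ok"
	})
	fmt.Println(got) // [GDP:ok CPIAUCSL:ok UNRATE:ok]
}
```

The buffered channel is the semaphore: sends block once `limit` slots are taken, which caps active work without needing a worker pool.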
Where Concurrency Is Used
Concurrency is applied in batch data retrieval workflows, including:
Series Metadata
- reserve meta series GDP CPIAUCSL UNRATE
- reserve fetch series GDP CPIAUCSL --with-meta
Observations (Time Series Data)
- reserve obs get GDP CPIAUCSL UNRATE
- reserve fetch series GDP CPIAUCSL --with-obs
- reserve fetch query "inflation" --with-obs
# The FRED API does not return a combined payload. The command below uses concurrency to make the series request and the observations request in parallel, then stitches the responses into a single JSON payload.
reserve fetch series GDP --with-obs
{
"series_id": "GDP",
"title": "Gross Domestic Product",
"frequency": "Quarterly",
"units": "Billions of Dollars",
"observations": [
{"date": "2023-01-01", "value": 26813.6},
{"date": "2023-04-01", "value": 27063.0},
{"date": "2023-07-01", "value": 27357.8}
]
}
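The stitching step can be sketched as two goroutines racing in parallel and a merge once both finish. This is a hedged illustration: the `Merged` field set mirrors the payload shown above, but `getMeta` and `getObs` are hypothetical stand-ins for the two FRED endpoint calls.

```go
package main

import (
	"fmt"
	"sync"
)

// Merged mirrors the combined payload shown above: series metadata
// plus its observations in one document. (Field set is illustrative.)
type Merged struct {
	SeriesID     string
	Title        string
	Observations []string
}

// fetchMerged issues the series request and the observations request
// concurrently, then returns the stitched result once both complete.
func fetchMerged(id string, getMeta func(string) string, getObs func(string) []string) Merged {
	m := Merged{SeriesID: id}
	var wg sync.WaitGroup
	wg.Add(2)
	go func() { defer wg.Done(); m.Title = getMeta(id) }()      // series endpoint
	go func() { defer wg.Done(); m.Observations = getObs(id) }() // observations endpoint
	wg.Wait()
	return m
}

func main() {
	m := fetchMerged("GDP",
		func(string) string { return "Gross Domestic Product" },
		func(string) []string { return []string{"2023-01-01:26813.6"} })
	fmt.Printf("%+v\n", m)
}
```

Because each goroutine writes a different field and the merge happens only after `wg.Wait()`, no extra locking is needed; total latency is roughly the slower of the two requests instead of their sum.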
# A request like this clearly benefits from concurrent requests and background data merging.
reserve fetch series GDP CPIAUCSL UNRATE FEDFUNDS PCE --with-obs --concurrency 8

Mixed Workflows
- reserve fetch series GDP CPIAUCSL --store
In these cases, each series is fetched independently and concurrently.
Where Concurrency Is Not Used
Concurrency is not applied to:
- Single-series requests
- Category, release, source, and tag browsing
- Local transformations (transform, window)
- Statistical analysis (analyze)
- Terminal rendering (chart)
- Local database writes (intentionally batched)
These operations are either:
- not network-bound
- already efficient
- or designed to remain deterministic and simple
Rate Limiting and Safety
Concurrency is combined with client-side rate limiting. The following config.json values translate to 8 concurrent requests limited to 5 requests per second:
"concurrency": 8,
"rate": 5,

Even if concurrency is increased, requests are still throttled to avoid overwhelming the FRED API.
This means:
- concurrency controls parallelism
- rate limiting controls request pace
Together, they ensure safe and reliable API usage.
Why Concurrency Improves Performance
The performance gain comes from overlapping network latency.
Without concurrency
Fetching 10 series:
- 10 sequential HTTP requests
- total time = sum of all request latencies
With concurrency
Fetching 10 series:
- multiple requests in flight simultaneously
- total time ≈ slowest group of requests
This can significantly reduce wall-clock time for batch operations.
Example
reserve fetch series GDP CPIAUCSL UNRATE FEDFUNDS --with-obs

With concurrency:
- all series requests overlap
- total execution time is reduced
Design Principles
RESERVE uses a bounded concurrency model:
- Fixed number of concurrent workers
- No unbounded fan-out
- Controlled resource usage
- Deterministic output ordering
Failures are handled gracefully:
- partial results are returned
- warnings are collected per series
- one failure does not stop the entire batch
When to Adjust Concurrency
Increase concurrency when:
- fetching many series
- working over high-latency networks
Keep defaults when:
- working with small batches
- using rate-limited environments
Higher values do not always mean faster results, especially when rate limits apply.
Summary
Concurrency in RESERVE is:
- targeted at network-bound operations
- bounded and controlled
- safe through rate limiting
- deterministic in output
It provides meaningful performance improvements for batch data workflows without sacrificing reliability or reproducibility.