Implementation: Apache Druid Task Submission API
| Knowledge Sources | |
|---|---|
| Domains | Data_Ingestion, Task_Management |
| Last Updated | 2026-02-10 00:00 GMT |
Overview
Concrete API call for submitting a complete batch ingestion specification to the Druid Overlord task endpoint.
Description
The handleSubmitTask method in LoadDataView POSTs the complete ingestion spec to /druid/indexer/v1/task via the Api singleton (Axios-based HTTP client). On success, it extracts the returned task ID and navigates the user to the Tasks view filtered by the target datasource. On failure, it displays an error toast notification.
Usage
Call this function when the user clicks the "Submit" button after reviewing the complete ingestion spec in the final wizard step.
Code Reference
Source Location
- Repository: Apache Druid
- File: web-console/src/views/load-data-view/load-data-view.tsx
- Lines: L3700-L3719
Signature
```typescript
// LoadDataView class method
private handleSubmitTask = async (): Promise<void> => {
  const { spec } = this.state;
  // Submits spec to POST /druid/indexer/v1/task
  // On success: navigates to Tasks view
  // On error: shows AppToaster error notification
};

// Underlying API call:
Api.instance.post('/druid/indexer/v1/task', spec)
// Returns: { task: string } — the task ID
```
Import
```typescript
import { Api } from '../../singletons';
```
I/O Contract
Inputs
| Name | Type | Required | Description |
|---|---|---|---|
| spec | Partial<IngestionSpec> | Yes | Complete ingestion spec with all configuration (source, format, schema, partitioning, tuning, publication) |
Outputs
| Name | Type | Description |
|---|---|---|
| task | string | Task ID returned from the Druid Overlord |
| navigation | URL change | Browser navigates to Tasks view filtered by datasource name |
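The navigation output depends on the datasource name embedded in the spec. A hedged helper for pulling it out of a native batch spec shape (the field path matches the usage example below; `getDatasourceName` itself is illustrative, not a console function):

```typescript
// Illustrative helper: extracts the datasource name from a native batch spec
// (spec.spec.dataSchema.dataSource), returning undefined if the spec is
// incomplete. Not part of the Druid console codebase.
function getDatasourceName(spec: any): string | undefined {
  return spec?.spec?.dataSchema?.dataSource;
}
```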
Usage Examples
Submitting a Batch Task
```typescript
import { Api } from '../../singletons';

const spec = {
  type: 'index_parallel',
  spec: {
    dataSchema: {
      dataSource: 'my_events',
      timestampSpec: { column: 'ts', format: 'iso' },
      dimensionsSpec: { dimensions: ['user', 'action'] },
      metricsSpec: [{ type: 'count', name: 'count' }],
      granularitySpec: { segmentGranularity: 'DAY', queryGranularity: 'HOUR', rollup: true },
    },
    ioConfig: {
      type: 'index_parallel',
      inputSource: { type: 's3', uris: ['s3://bucket/data.json'] },
      inputFormat: { type: 'json' },
    },
    tuningConfig: {
      type: 'index_parallel',
      partitionsSpec: { type: 'dynamic', maxRowsPerSegment: 5000000 },
    },
  },
};

const response = await Api.instance.post('/druid/indexer/v1/task', spec);
const taskId = response.data.task;
// Navigate to: #tasks?datasource=my_events
```
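As a follow-up (not part of `handleSubmitTask` itself), the returned task ID can be polled against the Overlord's task status endpoint, `GET /druid/indexer/v1/task/{taskId}/status`. The sketch below assumes that endpoint; `checkStatus` and the injected `get` client are illustrative names:

```typescript
// Illustrative follow-up: poll the Overlord for the submitted task's status.
// The endpoint path is from the Druid Overlord API; everything else is a sketch.
type StatusCode = 'RUNNING' | 'SUCCESS' | 'FAILED';
type Getter = (url: string) => Promise<{ data: { status: { status: StatusCode } } }>;

async function checkStatus(get: Getter, taskId: string): Promise<StatusCode> {
  const resp = await get(`/druid/indexer/v1/task/${encodeURIComponent(taskId)}/status`);
  return resp.data.status.status;
}
```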