1. Upload Your Workload
Drag and drop your workload file here, or browse to select one. Supported formats: Spark event logs, Argo Workflow YAML, Airflow DAG JSON, and Dystrio JSON.
What file formats are supported?
Dystrio JSON
{
  "tasks": ["task_a", "task_b", "task_c"],
  "edges": [
    ["task_a", "task_b", 100000000],
    ["task_b", "task_c", 50000000]
  ]
}
Each edge is: [source, destination, bytes_transferred]
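If you generate workload files programmatically, a helper along these lines keeps the tasks list and the edge format consistent. This is a minimal Python sketch; build_workload and the task names are illustrative, not part of any Dystrio API.

import json

def build_workload(edges):
    # Build a Dystrio-style workload dict from (source, destination, bytes) tuples.
    tasks = []
    for src, dst, _ in edges:
        for t in (src, dst):
            if t not in tasks:
                tasks.append(t)  # keep first-seen order
    return {
        "tasks": tasks,
        "edges": [[src, dst, int(nbytes)] for src, dst, nbytes in edges],
    }

workload = build_workload([
    ("task_a", "task_b", 100_000_000),  # 100 MB from task_a to task_b
    ("task_b", "task_c", 50_000_000),
])

with open("workload.json", "w") as f:
    json.dump(workload, f, indent=2)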
Spark Event Log
Upload your Spark event log directly. We auto-detect stages and shuffle bytes.
Event logs are written to the directory set by spark.eventLog.dir, or can be downloaded from the Spark History Server.
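For reference, the detection boils down to summing shuffle write metrics per stage. The Python sketch below reads the standard JSON-lines event log format; the field names follow Spark's SparkListenerTaskEnd events, and the aggregation is a simplification of what the uploader actually does.

import json
from collections import defaultdict

def shuffle_bytes_per_stage(path):
    # Sum "Shuffle Bytes Written" per stage from a Spark event log
    # (one JSON event object per line).
    totals = defaultdict(int)
    with open(path) as f:
        for line in f:
            event = json.loads(line)
            if event.get("Event") != "SparkListenerTaskEnd":
                continue
            metrics = event.get("Task Metrics") or {}  # absent for failed tasks
            shuffle = metrics.get("Shuffle Write Metrics") or {}
            totals[event["Stage ID"]] += shuffle.get("Shuffle Bytes Written", 0)
    return dict(totals)

print(shuffle_bytes_per_stage("application_1700000000000_0001"))  # example log file name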
Argo Workflow YAML
Export your Argo workflow and upload it directly. We extract DAG dependencies.
kubectl get workflow <name> -o yaml > workflow.yaml
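In the exported YAML, dependencies live under each template's dag.tasks entries. The Python sketch below shows the extraction, assuming the standard Argo Workflow schema and PyYAML; it reads only dependencies lists, not depends expressions.

import yaml  # PyYAML

def argo_edges(path):
    # Collect (upstream, downstream) task pairs from an Argo Workflow YAML export.
    with open(path) as f:
        wf = yaml.safe_load(f)
    edges = []
    for template in wf.get("spec", {}).get("templates", []):
        tasks = (template.get("dag") or {}).get("tasks") or []
        for task in tasks:
            for dep in task.get("dependencies") or []:
                edges.append((dep, task["name"]))
    return edges

print(argo_edges("workflow.yaml"))

Note that the YAML carries dependency structure only; Argo workflows do not record bytes transferred between tasks, so transfer sizes have to come from elsewhere.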
Airflow DAG JSON
Export your Airflow DAG and upload it. We extract tasks and dependencies.
airflow dags show <dag_id> --output json > dag.json
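If you have direct access to the Airflow environment, the same dependency information is available through Airflow's Python API. A sketch, assuming a working Airflow installation; the dag_id "my_etl_dag" is illustrative.

from airflow.models import DagBag

def airflow_edges(dag_id, dags_folder=None):
    # List (upstream, downstream) task pairs for a DAG loaded from the DAGs folder.
    bag = DagBag(dag_folder=dags_folder)  # defaults to the configured DAGS_FOLDER
    dag = bag.get_dag(dag_id)
    if dag is None:
        raise ValueError(f"DAG {dag_id!r} not found")
    edges = []
    for task in dag.tasks:
        for downstream in sorted(task.downstream_task_ids):
            edges.append((task.task_id, downstream))
    return edges

print(airflow_edges("my_etl_dag"))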
2. Set Your Goals
Budget ($): leave empty for no budget constraint.
3