This is a subsystem of the Cloud-Barista platform that provides workflow management for cloud migration.
- Creates and manages workflows through Airflow.
- Creates workflows based on gusty.
- Tested operating systems (OSs):
  - Ubuntu 24.04, Ubuntu 22.04, Ubuntu 18.04
- Language:
  - Go: 1.23.0
Build and run the binary with the Airflow server:

```shell
make run
```

Or, you can run it within Docker with this command:

```shell
make run_docker
```

To stop the binary and the Airflow server, run this command:

```shell
make stop
```
- The configuration file name is 'cm-cicada.yaml'.
- The configuration file must be placed in one of the following directories:
  - '.cm-cicada/conf' directory under the user's home directory
  - 'conf' directory under the directory where the binary is running
  - 'conf' directory under the path set in the 'CMCICADA_ROOT' environment variable
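The lookup described above can be sketched as follows. The helper name and the exact precedence of the three directories are assumptions for illustration, not taken from the cm-cicada source:

```python
import os

def candidate_config_paths(home, cwd, env=None):
    """Return candidate locations of 'cm-cicada.yaml' (assumed search order)."""
    env = env or {}
    paths = [
        os.path.join(home, ".cm-cicada", "conf"),  # under the user's home directory
        os.path.join(cwd, "conf"),                 # where the binary is running
    ]
    root = env.get("CMCICADA_ROOT")
    if root:                                       # only if the variable is set
        paths.append(os.path.join(root, "conf"))
    return [os.path.join(p, "cm-cicada.yaml") for p in paths]

print(candidate_config_paths("/home/ubuntu", "/opt/cm-cicada",
                             {"CMCICADA_ROOT": "/srv/cicada"}))
```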
- Configuration options
  - task_component
    - load_examples : Load task component examples if true.
    - examples_directory : Directory where task component examples are located. Must be set if 'load_examples' is true.
  - workflow_template
    - templates_directory : Directory where workflow templates are located.
  - airflow-server
    - address : The Airflow server's address ({IP or Domain}:{Port}).
    - use_tls : Must be true if the Airflow server uses HTTPS.
    - skip_tls_verify : Skip TLS/SSL certificate verification. Only takes effect if 'use_tls' is true.
    - init_retry : Retry count for initializing the Airflow server connection used by cm-cicada.
    - timeout : HTTP timeout value in seconds.
    - username : Airflow login username.
    - password : Airflow login password.
    - connections : Pre-defined Airflow connections. (Multiple connections can be set.)
      - id : ID of the connection
      - type : Type of the connection
      - description : Description of the connection
      - host : Host address or URL of the connection
      - port : Port number of the connection
      - schema : Schema of the connection
      - login : Username of the connection
      - password : Password of the connection
    - dag_directory_host : DAG directory on the host. (The mounted DAG directory used by the Airflow container.)
    - dag_directory_container : DAG directory inside the Airflow container.
  - listen
    - port : Listen port of the API.
- Configuration file example

```yaml
cm-cicada:
  task_component:
    load_examples: true
    examples_directory: "./lib/airflow/example/task_component/"
  workflow_template:
    templates_directory: "./lib/airflow/example/workflow_template/"
  airflow-server:
    address: 127.0.0.1:8080
    use_tls: false
    # skip_tls_verify: true
    init_retry: 5
    timeout: 10
    username: "airflow"
    password: "airflow_pass"
    connections:
      - id: honeybee_api
        type: http
        description: HoneyBee API
        host: 127.0.0.1
        port: 8081
        schema: http
      - id: beetle_api
        type: http
        description: Beetle API
        host: 127.0.0.1
        port: 8056
        schema: http
        login: default
        password: default
      - id: tumblebug_api
        type: http
        description: TumbleBug API
        host: 127.0.0.1
        port: 1323
        schema: http
        login: default
        password: default
    dag_directory_host: "./_airflow/airflow-home/dags"
    dag_directory_container: "/usr/local/airflow/dags" # Use dag_directory_host for dag_directory_container, if this value is empty
  listen:
    port: 8083
```
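As a small illustration of how 'address' and 'use_tls' combine, the sketch below derives the Airflow base URL that cm-cicada would contact. The helper is hypothetical, not part of the cm-cicada codebase:

```python
def airflow_base_url(address, use_tls=False):
    """Build the Airflow server base URL from the configured address.

    'use_tls' selects the scheme, matching the option semantics above.
    """
    scheme = "https" if use_tls else "http"
    return f"{scheme}://{address}"

print(airflow_base_url("127.0.0.1:8080"))  # http://127.0.0.1:8080
```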
Check the workflow template list:

```shell
curl -X 'GET' \
  'http://127.0.0.1:8083/cicada/workflow_template' \
  -H 'accept: application/json'
```
Get a workflow template and copy its content:

```shell
curl -X 'GET' \
  'http://127.0.0.1:8083/cicada/workflow_template/81bbeb23-2c48-4536-9f01-55796e0fa394' \
  -H 'accept: application/json'
```
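Every 'sgId' path parameter in the copied template must be replaced with your own source group ID. A minimal sketch of that rewrite, assuming the template shape shown in the request body below (data → task_groups[] → tasks[] → path_params):

```python
import json

def set_sg_id(workflow, sg_id):
    """Replace every 'sgId' path parameter in a copied workflow template."""
    for group in workflow["data"]["task_groups"]:
        for task in group["tasks"]:
            params = task.get("path_params") or {}
            if "sgId" in params:
                params["sgId"] = sg_id
    return workflow

# Trimmed-down template with the same shape as the README example.
template = {"data": {"task_groups": [{"name": "migrate_infra", "tasks": [
    {"name": "infra_import", "path_params": {"sgId": "old-id"}},
    {"name": "infra_recommend", "path_params": None},
]}]}}
print(json.dumps(set_sg_id(template, "ba695bbb-d673-4092-9821-c9cb05676228")))
```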
Create the workflow by pasting the copied workflow template content, modifying all of the 'sgId' parameters to the ID of a source group created in Honeybee. Note the ID of the created workflow from the response:
```shell
curl -X 'POST' \
  'http://127.0.0.1:8083/cicada/workflow' \
  -H 'accept: application/json' \
  -H 'Content-Type: application/json' \
  -d '{
  "name": "migrate_infra_workflow",
  "data": {
    "description": "Migrate Server",
    "task_groups": [
      {
        "name": "migrate_infra",
        "description": "Migrate Server",
        "tasks": [
          {
            "name": "infra_import",
            "task_component": "honeybee_task_import_infra",
            "request_body": "",
            "path_params": {
              "sgId": "ba695bbb-d673-4092-9821-c9cb05676228"
            },
            "dependencies": []
          },
          {
            "name": "infra_get",
            "task_component": "honeybee_task_get_infra_refined",
            "request_body": "",
            "path_params": {
              "sgId": "ba695bbb-d673-4092-9821-c9cb05676228"
            },
            "dependencies": [
              "infra_import"
            ]
          },
          {
            "name": "infra_recommend",
            "task_component": "beetle_task_recommend_infra",
            "request_body": "infra_get",
            "path_params": null,
            "dependencies": [
              "infra_get"
            ]
          },
          {
            "name": "infra_migration",
            "task_component": "beetle_task_infra_migration",
            "request_body": "infra_recommend",
            "path_params": null,
            "dependencies": [
              "infra_recommend"
            ]
          },
          {
            "name": "register_target_to_source_group",
            "task_component": "honeybee_register_target_info_to_source_group",
            "request_body": "infra_migration",
            "path_params": {
              "sgId": "ba695bbb-d673-4092-9821-c9cb05676228"
            },
            "dependencies": [
              "infra_migration"
            ]
          }
        ]
      }
    ]
  }
}'
```
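The five tasks above form a dependency chain through their 'dependencies' lists. A minimal sketch of how such lists determine execution order (task names copied from the request body; plain topological sorting, not cm-cicada's actual scheduler, which delegates to Airflow):

```python
from graphlib import TopologicalSorter

# Task -> dependencies, copied from the workflow request body above.
deps = {
    "infra_import": [],
    "infra_get": ["infra_import"],
    "infra_recommend": ["infra_get"],
    "infra_migration": ["infra_recommend"],
    "register_target_to_source_group": ["infra_migration"],
}

# A task becomes runnable only after all of its dependencies finish.
order = list(TopologicalSorter(deps).static_order())
print(order)
```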
Run the created workflow using its ID:

```shell
curl -X 'POST' \
  'http://127.0.0.1:8083/cicada/workflow/4420da6c-c50f-4d8b-bc2e-f02d8b557fad/run' \
  -H 'accept: application/json' \
  -d ''
```
Check if CM-Cicada is running:

```shell
curl http://127.0.0.1:8083/cicada/readyz

# Output if it's running successfully
# {"message":"CM-Cicada API server is ready"}
```