Running and scheduling workflows
Datamin’s workflows can be triggered in 5 different ways:
- Manually
- On a schedule
- Via API
- By third-party orchestration tools such as Apache Airflow or Jenkins (through the API)
- Via the external_trigger task, from data streaming platforms such as Kafka

Both buttons, to schedule a workflow or to run it manually, are located in the top right corner of the workflow's canvas.
A manual trigger is typical for:
- Test workflows that are used for debugging or testing data sources and destinations
- Workflows that don't have a periodic schedule but need to be run on demand when a user needs it
After clicking the Run workflow button, a user gets an output that shows the status of every task.
Scheduling is the most common way of triggering workflows. It can be configured using either a visual interface or, if you are familiar with it, the cron format.
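For illustration, a few schedules expressed in standard 5-field cron syntax are shown below. This assumes Datamin accepts the common cron field order; use the visual interface if you prefer not to deal with the notation.

```
# minute  hour  day-of-month  month  day-of-week
0 9 * * 1-5      # every weekday at 09:00
*/15 * * * *     # every 15 minutes
0 0 1 * *        # at midnight on the first day of each month
```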


More detailed examples of metrics can be found in the list of our use cases. Our library of templates also contains multiple templates that you can use for configuring your own metrics.
Workflows can also be triggered via the API, for example by Apache Airflow, Jenkins, or any other software that is already used in your company for workflow or pipeline orchestration.
More information about triggering workflows via API can be found in the OAuth Clients and API Endpoints sections.
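As a rough sketch of what an API-based trigger can look like from any orchestrator, the snippet below obtains an OAuth access token and then calls a trigger endpoint. The host, endpoint paths, payload fields, and workflow identifier are placeholders, not Datamin's actual API; consult the OAuth Clients and API Endpoints documentation for the real values.

```python
import requests

BASE_URL = "https://datamin.example.com"  # placeholder host, not the real API URL
WORKFLOW_UUID = "00000000-0000-0000-0000-000000000000"  # hypothetical workflow identifier

# Exchange OAuth client credentials for an access token (endpoint path is illustrative).
token_resp = requests.post(
    f"{BASE_URL}/oauth/token",
    data={
        "grant_type": "client_credentials",
        "client_id": "YOUR_CLIENT_ID",
        "client_secret": "YOUR_CLIENT_SECRET",
    },
    timeout=30,
)
token_resp.raise_for_status()
access_token = token_resp.json()["access_token"]

# Trigger the workflow run (path is an illustrative placeholder).
run_resp = requests.post(
    f"{BASE_URL}/api/v1/workflows/{WORKFLOW_UUID}/run",
    headers={"Authorization": f"Bearer {access_token}"},
    timeout=30,
)
run_resp.raise_for_status()
print(run_resp.status_code, run_resp.text)
```

The same call can be wrapped in an Airflow PythonOperator, a Jenkins pipeline step, or any other scheduler task, which is all that triggering from an external orchestrator amounts to.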
When placed as the first task in a workflow, external_trigger allows you to trigger the workflow from your data streaming platforms, making it 100% real-time.

The first data streaming platform we integrate with is Kafka. The open-source library that triggers workflows from it is hosted on our GitHub: https://github.com/datamin-io/kafka-trigger
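For illustration, once a kafka-trigger instance is deployed and subscribed to a topic, any service that produces a message to that topic can start a workflow. The topic name and message shape below are assumptions made for this sketch; the message format kafka-trigger actually expects is documented in its repository.

```python
import json
from kafka import KafkaProducer  # pip install kafka-python

# Hypothetical topic that a kafka-trigger instance is assumed to be subscribed to.
TOPIC = "datamin-workflow-triggers"

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda payload: json.dumps(payload).encode("utf-8"),
)

# Illustrative payload only; see the kafka-trigger repository for the real format.
producer.send(TOPIC, {"workflow_uuid": "00000000-0000-0000-0000-000000000000"})
producer.flush()
```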