Getting Started with Tavio
Creating Your First Workflow
Creating your first workflow in the Tavio Studio is where your integration strategy becomes reality. You will start by dragging nodes, the fundamental building blocks of the platform, onto the design canvas to map out your process visually. Whether you are connecting to an external API using a Smart Connector or defining complex business logic with conditional branches, the intuitive interface allows you to assemble powerful functionality without writing boilerplate code.

Beyond the drag-and-drop mechanics, this section guides you through the essential configuration steps required to bring your logic to life. You will learn how to define a node's inputs and outputs, configure node properties, and link operations together to establish the flow of execution. By the end of this guide, you will have a high-level view of the steps required to create, deploy, support, and maintain your integration workflows. More detailed courses, which provide step-by-step examples, are available in the Tavio platform at https://thecloudconnectors.io/learn/tutorials/01-creating-your-first-workflow. Please contact us at https://www.tavio.io/contact-us/ if you would like to have access to that course content.

Defining Your Use Case

Tavio offers immense flexibility, yet maintaining a scalable integration strategy requires that individual workflows remain narrowly defined. As a general rule, a workflow should focus on a single object type moving in one direction, from a source system to a destination. While the platform can support complex multi-stage processes, keeping the use case narrow ensures that each workflow within a solution is significantly easier to build, test, and maintain. Furthermore, this modular design simplifies long-term ownership, allowing teams to extend functionality or debug and fix issues with minimal impact on the broader integration ecosystem.

For the purposes of this guide, our example will be the task of creating a job requisition via a webhook, performing a transformation into the required schema, and finally pushing that data to a destination system using a purpose-built Smart Connector.

Laying the Groundwork

To begin, log into your development environment and navigate to the Workflows menu. From there, choose New Workflow, selecting the option to create a workflow from the resulting dialog. Provide a unique name and a descriptive, human-readable summary that clearly defines the workflow's purpose, such as "Job Requisition Webhook to Training System".

The fundamental units of logic in Tavio are nodes, which are organized into functional families called packages, typically abbreviated as packs. Tavio has built packs for logical flow, file management, data transformation, and connectivity, including packs for specific external platforms and use cases such as EDI. To equip your workflow with the necessary logic, you must install the relevant packages. Expand the left-hand panel by clicking the \[+] tab and select the Packages view. From here, search for and install the latest versions of the following:

- Triggers: This pack provides the essential nodes required to initiate your workflow, such as the Generic REST Trigger, which acts as a webhook listener awaiting the incoming job requisition data.
- \[One or more system-specific packs]: Each system for which we support connectivity has its own pack, which contains one or more nodes, including the purpose-built Smart Connector needed to pull from, or create and update records in, that system. Since we are creating job requisitions, the connector for any ATS would make sense here.
- Transform: This enables the use of clouddata, Tavio's proprietary transformation language, to remodel your incoming structured data into the required destination schema.
- Data Health: This framework allows you to capture record-by-record success and error telemetry, ensuring that you have the database-driven visibility required to monitor and troubleshoot your integrations at scale.

Defining Execution Flow

To define your integration's logic, you will arrange and link nodes, the fundamental units of logic, on the design canvas. They can be dragged into the visual editor from the Nodes view in the editor's left-hand panel and arranged in a way that allows you to map out the workflow's functionality.

Anatomy of a Port

Execution flow is established by connecting ports, which are the contact points on each node:

- Input port (left side): Receives the incoming payload and data scope from the previous step.
- Standard out port (top right): Triggers the next node when the current operation completes successfully.
- Error out port (bottom right): Triggers a different path if the node encounters a failure or exception.

Specialized endpoints, such as the Standard In node (which has only an output) and the Standard Out node (which has only an input), define the absolute start and finish of a workflow. In more complex scenarios they can also define the start and finish of scopes within workflow executions, such as loops and subflows.

Building the Circuit

In our example job requisition use case, your execution flow would proceed logically through each required step. For example, you might:

- Link the standard out port from the In node to the Tavio Training Smart Connector to allow for initial manual testing.
- Connect the webhook's success port to the clouddata node to perform the data transformation on incoming data.
- Wire the clouddata success port to your chosen Smart Connector to push the transformed requisition data into the associated system.
- To capture telemetry, link the success port of the Smart Connector to the Data Success node, and its error port to a Data Error node.
- Finally, terminate both Data Health nodes at the Standard Out or Error nodes, respectively, to close the circuit and ensure an accurate execution status.

Configuring Your Nodes

The next step is to configure each node in succession by double-clicking the selected node and choosing the appropriate options in the left-hand configuration panel. Again, for our example use case:

- In the Generic REST Trigger node, you'll configure the endpoint URL and the appropriate authentication; supported schemes include basic authentication, bearer tokens, API keys, OAuth, and signature-based authentication.
- In the clouddata node, you'll configure the necessary transformation with our easy-to-use but powerful declarative transformation language.
- In your chosen Smart Connector, you'll set up authentication, choose an endpoint (such as creating a job, for this example), and configure the query to include your transformed data.
- Finally, the Data Health nodes capture success and error transactions, and you'll configure them to record any data necessary for reporting and troubleshooting. In addition to basic system-level data such as object type, workflow name, and execution ID, the Data Health nodes support dynamic message logging, multiple primary keys for searchable parameters, and up to 50 fields of custom metadata in cases where granular metadata is a requirement.
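To make the walkthrough concrete, here is a sketch of the kind of JSON payload the webhook listener might receive for a job requisition, along with a minimal validation pass. The field names and structure are illustrative assumptions, not Tavio's actual requisition schema.

```python
import json

# An assumed (hypothetical) payload a webhook listener might receive
# for a job requisition; these field names are not Tavio's schema.
incoming = json.loads("""
{
  "req_id": "JR-1042",
  "title": "Data Engineer",
  "department": "Analytics",
  "opened_on": "2024-05-01",
  "hiring_manager": {"name": "A. Rivera", "email": "arivera@example.com"}
}
""")

REQUIRED_FIELDS = {"req_id", "title", "opened_on"}

def validate_requisition(payload: dict) -> list:
    """Return a sorted list of missing required fields (empty means valid)."""
    return sorted(REQUIRED_FIELDS - payload.keys())

missing = validate_requisition(incoming)
print("missing fields:", missing)  # → missing fields: []
```

Validating required fields as early as possible, at the trigger, keeps malformed records from reaching the transformation and delivery steps.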
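clouddata's syntax is proprietary and out of scope here, but the remodeling step it performs can be sketched in plain Python: rename fields, flatten nesting, and reformat values to match the destination schema. Both the source and destination field names below are assumptions for illustration.

```python
from datetime import datetime

def transform_requisition(src: dict) -> dict:
    """Remodel an incoming requisition into a hypothetical destination
    schema: rename fields, flatten nesting, and reformat the date."""
    return {
        "externalId": src["req_id"],
        "jobTitle": src["title"],
        "orgUnit": src.get("department", "Unassigned"),
        # ISO date → US-style date assumed by the destination system.
        "openedDate": datetime.strptime(src["opened_on"], "%Y-%m-%d")
                              .strftime("%m/%d/%Y"),
        "ownerEmail": src.get("hiring_manager", {}).get("email"),
    }

record = transform_requisition({
    "req_id": "JR-1042",
    "title": "Data Engineer",
    "opened_on": "2024-05-01",
    "hiring_manager": {"name": "A. Rivera", "email": "arivera@example.com"},
})
print(record["openedDate"])  # → 05/01/2024
```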
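The Smart Connector handles authentication and delivery for you; under the hood, the push step resembles an authenticated HTTP POST to the destination's API. The endpoint URL and bearer token below are placeholders, and the request is built but deliberately not sent.

```python
import json
import urllib.request

# Hypothetical destination endpoint and token -- placeholders only.
ENDPOINT = "https://ats.example.com/api/v1/jobs"
TOKEN = "example-bearer-token"

payload = {"externalId": "JR-1042", "jobTitle": "Data Engineer"}

# Build the authenticated POST the connector would perform.
request = urllib.request.Request(
    ENDPOINT,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/json",
    },
    method="POST",
)

# Build-only sketch: urllib.request.urlopen(request) would send it.
print(request.get_method(), request.full_url)
```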
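The record-by-record telemetry that the Data Health nodes capture can be modeled as a small data structure carrying the fields this section mentions: object type, workflow name, execution ID, a dynamic message, searchable primary keys, and custom metadata capped at 50 fields. The structure and field names are an illustrative assumption, not Tavio's storage format.

```python
from dataclasses import dataclass, field

MAX_METADATA_FIELDS = 50  # the platform caps custom metadata at 50 fields

@dataclass
class HealthRecord:
    """One success or error telemetry entry, modeled loosely on the
    Data Health fields described above (names are assumptions)."""
    status: str                # "success" or "error"
    object_type: str
    workflow_name: str
    execution_id: str
    message: str               # dynamic message logging
    primary_keys: dict = field(default_factory=dict)  # searchable params
    metadata: dict = field(default_factory=dict)      # custom metadata

    def __post_init__(self):
        if len(self.metadata) > MAX_METADATA_FIELDS:
            raise ValueError("custom metadata exceeds 50 fields")

entry = HealthRecord(
    status="success",
    object_type="JobRequisition",
    workflow_name="Job Requisition Webhook to Training System",
    execution_id="exec-20240501-001",
    message="Requisition JR-1042 created in destination",
    primary_keys={"req_id": "JR-1042"},
    metadata={"department": "Analytics"},
)
print(entry.status, entry.primary_keys["req_id"])
```

Capturing a structured entry per record, rather than a single status per run, is what makes execution history searchable by primary key when troubleshooting at scale.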