1. Kaholo Getting Started

The Kaholo Platform is the control center for rapid development, management, and monitoring of your DevOps pipelines. Before we jump into using Kaholo, here are the main concepts that make up the solution.

The Kaholo platform interface currently supports Google Chrome and Mozilla Firefox.

Pipelines

Pipelines are the central concept of the Kaholo Platform. A pipeline is an automation workflow consisting of a number of Actions orchestrated in a flow, from one Action to the next, to achieve a final result. You simply drag and drop Actions onto the canvas and link them together to create a functional pipeline.

Hundreds of Plugins

Plugins are a collection of coded extensions that allow you to integrate with all your favorite external services and tools. They enable you to work with various cloud platforms, source code repositories, configuration management, deployment tools, etc., using configuration instead of code. Each plugin has one or more methods, and each method is referred to in Kaholo as an “Action”.

Actions

An Action is a single step in the pipeline, based on one of the selected plugin’s methods. Each Action has a collection of configurable parameters that determine precisely what that Action does.

For example, when using the “Launch VM” method of the Google Cloud Compute plugin, the parameters include credentials, machine details, and the subnet in which to deploy the new virtual machine. Thanks to Kaholo’s flexible plugin architecture, nearly any method of any platform or tool can be supported with minimal or no code.


The Vault

Some parameters are secrets, such as SSH keys or passwords. The Kaholo Vault securely encrypts these secrets so they may be referenced in parameters without being exposed. Your secrets are safe in Kaholo.


Hosted and Self-Hosted Agents

Agents are container instances where the pipeline Actions are actually executed. They can either be hosted and managed by Kaholo, or you can self-host them. Each account has at least one dedicated Agent. Multiple Agents can be deployed to handle higher workloads or meet enterprise-class scalability requirements. Agents also provide security isolation between separate accounts on the platform.

 

Code Layer

For more advanced needs, Kaholo provides a fully capable JavaScript code layer with access to pipeline, execution, and Action data. With it you can:

  • Conditionally run an Action
  • Trigger functions before and after Actions are executed (Hooks)
  • Pass data between Actions
  • Transform data before passing it between Actions (see the sketch below)
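
As one illustration of passing and transforming data between Actions, here is a hedged sketch of a helper defined on the Main Code Page (both described in the list below) that reshapes one Action’s output before a later Action consumes it. The Action name fetchServers and the shape of its result are invented for the example; only the actions.<name>.result pattern comes from this page.

    // Main Code Page: hypothetical helper that reshapes a previous Action's result.
    // "fetchServers" and its result fields are illustrative only.
    function runningPrivateIps() {
        return actions.fetchServers.result
            .filter(function (server) { return server.status === 'running'; })
            .map(function (server) { return server.privateIp; });
    }

A later Action’s parameter field (with its Code toggle on) can then call runningPrivateIps(). Wrapping the lookup in a function defers evaluation until that parameter is resolved, at which point the earlier Action’s result is available.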

There are multiple places where JS code can be used:

  1. Main Code Page: For adding larger, more complex functions, initializing variables, and performing any needed data manipulations. It has built-in automatic code completion (IntelliSense), so you can see everything the SDK provides under the kaholo and actions objects. For example, declaring and assigning a variable that is then used in an Action Parameter field with its Code option activated (toggle above the field):
    var publicIp = kaholo.execution.configuration.publicIp;
  2. Action Parameter Fields: Activate the code feature by turning on the toggle switch above a parameter field. This is used to set values dynamically when the pipeline is executed. Here you can access the Kaholo SDK directly to insert values, invoke a function, or read a variable defined on the Main Code Page. Here’s an example of accessing the result of a previously executed Action: actions.myaction.result
  3. Condition Fields: Determine whether an Action should execute. For example, basing it on a previous Action’s status: actions.myaction.status == 'skipped'
    Another example of using the Condition field is to set a condition on a loop: action2Iterations < 5
  4. Hook Fields: Run a function before or after an Action is executed. A common example is to increment a variable in a Post-execution Hook, in conjunction with the Loop and Conditional code features: action2Iterations++ (these pieces are combined in the sketch after this list)
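
Putting these pieces together, here is a minimal sketch of a simple loop around an Action. The Action name action2 is illustrative; the individual expressions are the ones shown in the list above. The counter is initialized on the Main Code Page, the Condition field keeps the Action re-running while the counter is below 5, and a Post-execution Hook increments it after each run.

    // Main Code Page: initialize the loop counter once per execution.
    var action2Iterations = 0;

    // Condition field on the looped Action: run while under 5 iterations.
    action2Iterations < 5

    // Post-execution Hook on the same Action: advance the counter after each run.
    action2Iterations++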

Configurations

Configurations allow you to add multiple JSON configuration documents so that the same pipeline can be reused for different scenarios. Each scenario has its own set of values to use as inputs for the pipeline.

One example of when to use Configurations is running the same CI/CD pipeline against different environments, such as Production and Staging, which have different values for things like regions and IPs. In this case, you create two Configurations, one for the Production environment and another for Staging. When the pipeline is executed, you choose which Configuration is used, and a coded parameter such as kaholo.execution.configuration.publicIp is evaluated differently depending on which environment’s Configuration is active.
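
As a minimal sketch (the keys and values here are invented for illustration), the two Configuration documents and a coded parameter that reads from them might look like this:

    // Hypothetical "Production" configuration document (JSON):
    //   { "publicIp": "203.0.113.10", "region": "us-east1" }
    //
    // Hypothetical "Staging" configuration document (JSON):
    //   { "publicIp": "198.51.100.7", "region": "europe-west1" }

    // The same coded parameter resolves against whichever Configuration
    // is selected at execution time:
    kaholo.execution.configuration.publicIp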

Triggers can also reference specific configurations.

Triggers

Kaholo automations can be triggered manually within the user interface, externally through an event-driven trigger such as a commit to a specific GitHub branch, or from a scheduled trigger that executes them on a regular basis.

In order to extend your automation pipeline for multiple scenarios, you also have the option of using a specific configuration within a Trigger.

Like all of our plugins, Triggers can be found in our GitHub repository, which contains additional details on how to set them up.

Execution Results

Execution results show you the current as well as previous execution statuses. Once you click on an Execution, you’ll see a list of the Actions that ran along with their statuses. Clicking on a particular Action shows its Activity Log and, separately, its Final Result.

This makes it easier to troubleshoot your pipelines, and you can also execute a pipeline simply to see what results an Action returns. You can then access those results through the code layer in subsequent Actions of the same pipeline.
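
For example, after inspecting an Action’s Final Result in the Execution Results view, you can reference a field you found there from a later Action’s coded parameter. The Action identifier myaction follows the pattern used earlier on this page, and the field publicIp is hypothetical; the actual shape of the result depends on the plugin.

    // In a subsequent Action's parameter field with the Code toggle on:
    actions.myaction.result.publicIp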
