Quickstart for the Developer plan
We're still working on this one! Expect changes and note that standard SLAs don't apply, so don't rely on this for production workloads yet. Refer to our release stages to learn more.
Introduction
Your Ascend Instance comes pre-configured with everything you need to start building data pipelines: a Repository, Otto's Expeditions Project, a Workspace, Connections, and more.
This quickstart guide focuses on the essentials of delightful Ascend development. You'll learn how to:
- Create a Flow to organize your data pipeline
- Build a Read Component with data quality tests
- Create a Transform Component to process your data
- Implement a Task for custom operations
- Run a Flow in a Workspace and in a Deployment
Create a Flow
A Flow is the foundation of your data pipeline. Let's create one to organize your Components:
You can create the Flow either with the form in the UI or from the Files panel.

Form

- Right-click anywhere in the Super Graph (the overview of all Flows in your Project) and select Create Flow
- Fill out the form with these details (other fields are optional):
  - Flow name: developer-quickstart
  - Version: 1.0.0
- Scroll down and click Save to create your Flow

Files panel

- Right-click the flows folder and select New File
- Name your file developer-quickstart.yaml
- Copy the following configuration into your new file:

```yaml
flow:
  version: 0.1.0
```
Create a Read Component
Throughout this quickstart, we'll be recreating Components from Otto's Expeditions, where Otto the data pipeline goat creates unforgettable mountain adventures. Let's start by building a Read Component that pulls route closure data from an existing Local File Connection.
Typically, you'd create a Connection before building Components, but we'll be leveraging an existing Local File Connection.
You can build the Read Component either from the UI or from the Files panel.

UI

- Navigate to your developer-quickstart Flow using the Build panel or by double-clicking its node in the Super Graph
- Right-click anywhere in the Flow Graph background (the graph showing your current Flow's Components)
- Hover over Create Component, then Read, and choose From Scratch
- Name your Component read_route_closures and ensure YAML is selected as the file type
- Copy the following configuration:

Files panel

- In the Files panel, right-click the components folder inside your developer-quickstart Flow and select New File
- Name your file read_route_closures.yaml
- Copy the following configuration into your new file:

```yaml
component:
  read:
    connection: read_local_files
    local_file:
      parser: auto
      path: route_closures.csv
```
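If you'd like to sanity-check the source data before running the Flow, you can preview the same CSV on your own machine. The sketch below is optional and not part of the quickstart: it uses plain ibis (with its default DuckDB backend) outside Ascend, and it assumes route_closures.csv is reachable at the relative path shown.

```python
import ibis

# Optional local sanity check, run outside Ascend. Assumes route_closures.csv
# is available at this relative path and that ibis' default DuckDB backend is
# installed (pip install "ibis-framework[duckdb]").
closures = ibis.read_csv("route_closures.csv")

print(closures.schema())              # column names and inferred types
print(closures.head(5).to_pandas())   # preview the first few rows
```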
Create a Transform
Next, let's create a Transform that uses our Read Component as input and cleans the column names in our data:
- In the Files panel, right-click the components folder inside your developer-quickstart Flow and select New File
- Name your file route_closures.py
- Copy the following code into your new file:
```python
import ibis

import ottos_expeditions.lib.transform as T
from ascend.application.context import ComponentExecutionContext
from ascend.resources import ref, transform


@transform(inputs=[ref("read_route_closures")])
def route_closures(
    read_route_closures: ibis.Table, context: ComponentExecutionContext
) -> ibis.Table:
    route_closures = T.clean(read_route_closures)
    return route_closures
```
The custom cleaning function used here is shared code from the ottos_expeditions/lib folder.
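The quickstart doesn't show the body of that shared helper. Purely as an illustration of the kind of cleanup it performs, a column-name cleaner could look something like the sketch below; the implementation is an assumption, not the actual code in ottos_expeditions/lib, so keep using T.clean in your Component.

```python
import ibis


def clean(t: ibis.Table) -> ibis.Table:
    """Hypothetical column-name cleaner, not the real ottos_expeditions helper.

    Lowercases each column name and replaces spaces with underscores so that
    downstream Components can reference columns consistently.
    """
    renamed = [
        t[name].name(name.strip().lower().replace(" ", "_"))
        for name in t.columns
    ]
    return t.select(renamed)
```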
Create a Task
Finally, let's create a Task that demonstrates Ascend's logging capabilities by logging each route that gets updated:
You can create the Task either from the UI or from the Files panel.

UI

- Right-click anywhere in the Flow Graph background (the graph showing your current Flow's Components)
- Hover over Create Component, then choose Task
- Name your Component task_update_route_closures_calendar and ensure Python is selected as the file type
- Copy the following code:

Files panel

- In the Files panel, right-click the components folder inside your developer-quickstart Flow and select New File
- Name your file task_update_route_closures_calendar.py
- Copy the following code into your new file:
```python
import ibis

from ascend.application.context import ComponentExecutionContext
from ascend.common.events import log
from ascend.resources import ref, task


@task(
    dependencies=[
        ref("route_closures"),
    ]
)
def task_update_route_closures_calendar(
    route_closures: ibis.Table,
    context: ComponentExecutionContext,
) -> None:
    for route in route_closures["route_id"].to_pyarrow().to_pylist():
        log(f"Updating route {route}")
```
Run your Flow in a Workspace
Now it's time to see your pipeline in action! Your completed Flow should display all three Components connected in sequence:
Click Run Flow in the build info panel in the top left, and select Run in the modal that appears.

Watch as Ascend processes your data through each Component - from reading the route closure data, to cleaning it with your Transform, to logging updates with your Task.
Workspaces are ideal for development: code is editable and Automations are disabled. Deployments are ideal for production: code is read-only and Automations are enabled. Refer to our concept docs to learn more.
Run your Flow in a Deployment
Now, let's promote your code to the Development Deployment and run it in an isolated environment.
- From your Workspace, navigate to the Source Control panel in the left sidebar and click Open Git Log & Actions.
- In the Git log window, select Merge to Deployment and choose development. Click Merge in the confirmation dialog to promote your code to the Development Deployment.
- From the Ascend homepage, locate the Development Deployment and click on it to open the Deployments Dashboard.

  Note: the screenshots below show the Demo Development Deployment, but you should use the regular Development Deployment.

- In the Deployments Dashboard, select the developer-quickstart Flow from the left sidebar.
- Click the Run Flow button in the top right corner to run your pipeline:
- In the run configuration modal, click Run to start the execution.
- Monitor the execution progress through the Flow runs matrix, which displays the status of both the overall Flow and its individual Components:
Congratulations! You've successfully built and deployed your first Ascend Flow, complete with a Read Component, Transform, and Task!
Next steps
Ready to dive deeper? Here's what to explore next:
- Explore core Ascend concepts to build a deeper understanding of the platform
- Master DataOps best practices for robust production workflows
- Add data quality tests to validate and monitor your data pipelines
- Learn more about Deployments to promote your pipelines from development to production