Hands-on lab: Agentic Data Engineering on Ascend
At TDWI Transform in Orlando, Florida.
The plan
Use Otto, your Agentic Data Engineering assistant in Ascend, to:
- Build your first data pipeline with an agent
- Create your first agentic Automation
- Explore next steps in the platform
The prizes: $$$
You can win prizes by completing the activities in Ascend! We have prizes for:
- $250 Amazon gift card for the coolest DAG built in Ascend
- 3 x $100 Amazon gift cards via random raffle
- $100 Amazon gift card each for the first Flow run on Databricks and on Snowflake
See the contest details section below for more information.
Part 0: Start your free trial & login
Go to https://app.ascend.io/signup and fill out the form to start your free Developer plan trial.
You must use a work email address to sign up for Ascend; common personal email domains like @gmail.com and @yahoo.com are disallowed.
Your free trial lasts for 14 days and includes full access to Ascend! You can continue to submit work for prizes until Monday, November 24, 2025.

After submitting the form, head to your email inbox and accept the invite.

From there, create a password or login via Google SSO. Otto should greet you on the homepage!
Part 1: Building your first data pipeline
From the screen below, click the Not sure where to start? button!

This will send a prompt to Otto giving him initial direction to build your first data pipeline in Ascend.
Lost? Navigate into your Workspace, open the Otto assistant in the sidebar (Ctrl + I or the button toward the top right), and use the prompt:
TDWI!
to trigger the first pipeline build with Otto.

Sit back and observe the agentic data engineering loop in action!
Part 2: Building your first agentic Automation
Recall in Ascend, developers (and their agents) develop in a Workspace and deploy & monitor in a Deployment. Automations allow you to run Flows or other actions on a schedule or in response to events.

In this part of the lab, you'll need to:
- Learn the basics of customizing Otto (rules)
- Give Otto access to Slack tools via MCP setup
- Test that Otto can use Slack through the chat interface
- Create an Automation to run Otto and send a Slack message
- Run a Flow in a Deployment and observe the Automation's resulting Slack message
Learning the rules of agentic data engineering
We recommend starting by reviewing how Otto built the data pipeline above -- what context was used and how? Exploring the platform, you may notice the otto/ directory in the filesystem. Open it up!

You'll then see otto/rules -- rules allow you to conditionally inject context into Otto's system. They are written as Markdown files with some frontmatter. Try to find the rule used to build the first data pipeline! Then, try creating your own rule to get a feel for customizing Otto.
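To get a feel for the shape of a rule file before you open one: rules are Markdown with frontmatter, so a sketch might look like the following. The frontmatter keys here are illustrative assumptions, not Ascend's documented schema -- the real fields are in the files under otto/rules.

```markdown
---
# Hypothetical frontmatter; check otto/rules in your Workspace for the real fields
description: Prefer incremental loads for large source tables
trigger: "when building ingestion Components"
---

When a source table is large, default to an incremental read strategy and
document the chosen watermark column in the Component description.
```

Comparing a sketch like this against the rule Otto actually used for the first pipeline is a quick way to learn the real schema.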
Give Otto access to Slack via MCP
Use:
to give Otto access to Slack via MCP.
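Ascend's exact MCP setup is shown in the lab; for orientation, many MCP clients register servers with a JSON block along these lines. Every value below is a placeholder (JSON doesn't allow comments, so nothing here should be copied verbatim):

```json
{
  "mcpServers": {
    "slack": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-slack"],
      "env": {
        "SLACK_BOT_TOKEN": "xoxb-REPLACE_ME",
        "SLACK_TEAM_ID": "T_REPLACE_ME"
      }
    }
  }
}
```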
Send a test Slack message via chat
Use the chat interface to test that Otto can send a Slack message.
Send a Slack message via Automation
Now, adjust the Automation in your Project to send a Slack message with context generated by Otto when the data pipeline succeeds OR fails!
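Conceptually, the Automation pairs a trigger (a Flow run finishing, successfully or not) with an action (asking Otto to summarize the result and post it to Slack). As a purely illustrative sketch, with every key and the channel name hypothetical:

```yaml
# Entirely hypothetical schema -- ask Otto or check the Ascend docs for the real Automation spec
automation:
  name: notify-slack-on-flow-run
  triggers:
    - flow_run:
        statuses: [success, failure]   # fire on success OR failure
  actions:
    - otto_prompt: >
        Summarize the Flow run result and post it to the #tdwi-lab
        Slack channel using the Slack MCP tools.
```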
Run a Flow in a Deployment
Finally, head to your Deployment and run a Flow to see the Automation in action!
Part 3: Choose your own adventure
Explore our documentation & platform to learn more about Agentic Data Engineering!
The end
Congratulations!
You can reach any of us at:
- Cody: cody.peterson@ascend.io
- Jenny: jenny@ascend.io
- Sean: sean@ascend.io
- Tessa: tessa@ascend.io
General Ascend email aliases:
- Sales: sales@ascend.io
- Support: support@ascend.io
Contest details
Speed prizes
One $100 Amazon gift card each for:
- the first Flow run on Databricks
- the first Flow run on Snowflake
Random raffle prizes
Three $100 Amazon gift cards will be awarded via a random drawing from all eligible Coolest DAG entries.
Coolest DAG prize
One $250 Amazon gift card for the coolest DAG built in Ascend:
- judged by the Ascend team
- What counts as a pipeline? At least 4 Components (data ingestion + transformation)
- What we're looking for: Innovative use cases, cool data sources, clever transformations, or elegant solutions to real data problems
Winner announced Tuesday, November 25, 2025.
How to enter
Enter by:
- Raising your hand for speed prizes during the lab
- Submitting your entries:
  - Take screenshots of your pipeline graph(s) -- the Ascend team will calculate the number of entries for you
  - Submit via email to data-eng@ascend.io and include:
    - Your contact information
    - Which data sources you connected to
    - Screenshots of your pipeline graphs
Contest rules & FAQs
- Can I win multiple prizes?
- Yes! You can win a speed prize + be entered in the random drawing and Coolest DAG judging.
- What counts as a “component” in my pipeline?
- Read Components, Transformation Components, Tasks, and Write Components all count. To qualify for entries, your pipeline needs at least 4 total.
- Can I submit pipelines after the in-person lab?
- Yes! You have until Monday, November 24, 2025 at 11:59 PM PT to submit your entries.
- What if I connect to a Data Plane not listed (not Snowflake or Databricks)?
- Great! You’ll still get 1 entry for that connection. Note that speed prizes only apply to the two listed data planes.
- How many times can I enter?
- You get 1 entry per data source connected + 1 entry per pipeline built (max 10 pipelines). Maximum possible entries: 3 sources + 10 pipelines = 13 entries.
- What happens if multiple people connect at the same time for speed prizes?
- First person to raise their hand and show the connection wins. Lab facilitators will make the call.
- What makes a DAG “cool”?
- We’re looking for creativity, technical complexity, and real-world applicability. Think: Does this solve an interesting problem? Would other data engineers want to use this?
Technical requirements
- All pipelines must be built in the Ascend.io platform during or after the lab
- Screenshots must clearly show the pipeline graph with Components visible
- Pipelines should be functional (not just placeholder Components)
Happy building!



