Hands-on workshop: Agentic Data Prep & Visualization

· 4 min read
Cody
Product @ Ascend
Jenny
Chief of Staff @ Ascend

Transform messy data and generate visualizations using AI agents in a single workflow.

Step 1: Sign up for Ascend

Go to https://app.ascend.io/signup and fill out the form to start your free Developer plan trial.

tip

Your free trial lasts for 14 days or 100 Ascend Credits, whichever comes first. To continue using Ascend beyond these limits, you can select a plan to subscribe to.

Sign up page

danger

You must use a work email address. Common personal email domains like @gmail.com are disallowed.

Sign up form

Check your email for a verification message from support@ascend.io. Click to accept the invite and create a password or sign in with Google SSO.

tip

Ask us live or email support if you don't receive the email within a few minutes.

Email invite

Step 2: Build, prep, and visualize with Otto

Once signed in, open Otto with Ctrl + I (or Cmd + I on Mac) and send the following prompt:

Hi Otto! Help me build a data prep and visualization pipeline in Ascend.
Follow these steps in order:

1. Run the sales Flow and wait for it to complete.
2. Query the sales Components to understand the schema and data.
3. In the sales Flow, add a Read Component that reads the built-in goats.csv
from the Local Files Connection.
4. Add a SQL Transform called "sales_sample" that selects 10,000 random rows
from the sales data.
5. Add a Python Transform called "messy_sales" that:
- Combines the sampled sales data with the goat data
- Introduces realistic data quality issues: some null values, a few type
mismatches (e.g., numbers stored as strings), and inconsistent date
formatting
   - Keeps the logic simple and easy to follow — no complex joins or
     explosions, just straightforward messiness
6. Add a SQL Transform called "clean_sales" that:
- Cleans up the messy data from the previous step
- Casts types correctly, standardizes dates, and handles nulls
   - Keeps the SQL simple and readable
7. Add data quality tests to validate the cleaning — no nulls in required
fields, no duplicate records, and correct types.
8. Run the Flow and make sure everything passes.
9. Query the clean_sales data, then create an artifact that serves as a rich,
interactive dashboard for visualization and analysis of our goats and sales
data over time. The dashboard should include:
- Charts showing trends, breakdowns, and key metrics
- Filters to slice the data (e.g., filter out nulls, filter by category
or date range)
- A light/dark mode toggle
- Summary statistics and KPIs at the top

Sit back and watch Otto work through the entire workflow — ingestion, data prep, cleaning, and visualization — in a single agentic loop.
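If you're curious what step 5's "straightforward messiness" might look like under the hood, here's a hypothetical pandas sketch — the column names and data are made up for illustration, and Otto generates its own transform code:

```python
import pandas as pd
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical stand-in for the sampled sales data from "sales_sample".
sales = pd.DataFrame({
    "order_id": range(1, 11),
    "amount": rng.uniform(10, 500, size=10).round(2),
    "order_date": pd.date_range("2024-01-01", periods=10),
})

messy = sales.copy()

# A few null values...
messy.loc[messy.sample(frac=0.2, random_state=1).index, "amount"] = None

# ...a type mismatch: one amount stored as a string...
messy["amount"] = messy["amount"].astype(object)
messy.loc[0, "amount"] = str(messy.loc[0, "amount"])

# ...and inconsistent date formatting.
messy["order_date"] = messy["order_date"].dt.strftime("%Y-%m-%d")
messy.loc[1, "order_date"] = "01/02/2024"
```

Each kind of messiness maps directly to one of the data quality tests in step 7, which is what makes the cleaning step easy to validate.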

tip

Otto may make mistakes along the way. Agentic development loops tend to iterate toward success, so let Otto fix issues on its own. If it gets stuck, give it a nudge!

Wrap up

You've seen how a single prompt to an AI agent can handle the full data prep and visualization workflow:

  • Ingest raw data from a file
  • Combine datasets with SQL & Python transforms
  • Clean and standardize for downstream consumption
  • Build a dashboard

all without writing code manually!
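For reference, the clean-up step (cast types, standardize dates, handle nulls, drop duplicates) boils down to a few operations. This pandas sketch is illustrative only — Otto writes its own SQL, the column names are hypothetical, and `format="mixed"` assumes pandas 2.0+:

```python
import pandas as pd

# Hypothetical messy input mirroring the "messy_sales" output.
messy = pd.DataFrame({
    "order_id": [1, 2, 2, 3],
    "amount": ["19.99", 42.5, 42.5, None],   # mixed str/float, one null
    "order_date": ["2024-01-01", "01/02/2024", "01/02/2024", "2024-01-03"],
})

clean = (
    messy
    .drop_duplicates()                        # no duplicate records
    .assign(
        # Coerce stringly-typed numbers back to numeric.
        amount=lambda d: pd.to_numeric(d["amount"], errors="coerce"),
        # Parse inconsistent date strings (pandas >= 2.0).
        order_date=lambda d: pd.to_datetime(d["order_date"], format="mixed"),
    )
    .dropna(subset=["amount"])                # no nulls in required fields
)
```

The data quality tests from step 7 then simply assert these invariants hold on the output.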

Keep exploring

These are exciting times. Here are more prompts to try on your own:

  • Add a new data source to the analytics Flow — pull in weather data from the Open-Meteo API and correlate it with sales trends.
  • Create a rule that enforces a visualization style guide so all future dashboards use consistent colors, fonts, and chart types.
  • Schedule this pipeline to run daily at midnight, then deploy it to production.
  • Send me an email with a link to the dashboard artifact.
  • Build a second dashboard that compares data quality before and after cleaning — show counts of nulls, duplicates, and type mismatches at each stage.
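The before/after comparison in that last prompt reduces to counting issues at each stage. A minimal sketch, with hypothetical column names and sample data:

```python
import pandas as pd

def quality_report(df: pd.DataFrame, numeric_cols: list[str]) -> dict:
    """Count common data quality issues in a DataFrame."""
    # Numbers stored as strings count as type mismatches.
    type_mismatches = sum(
        int(df[c].map(lambda v: isinstance(v, str)).sum()) for c in numeric_cols
    )
    return {
        "nulls": int(df.isna().sum().sum()),
        "duplicates": int(df.duplicated().sum()),
        "type_mismatches": type_mismatches,
    }

# Hypothetical messy-stage data: one null, one duplicate row, one string number.
messy = pd.DataFrame({"amount": ["10", 20.0, None, 20.0], "qty": [1, 2, 2, 2]})
report = quality_report(messy, ["amount"])
```

Running the same report on the cleaned stage and charting both side by side gives exactly the comparison dashboard the prompt describes.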

Resources

Questions? Reach out: