Lab 1: Agentic Analysis

Track 2: Agentic Analytics · Day 2 breakout lab

In this lab you'll use Otto to build a comprehensive analysis from raw data sources. The business case: GreenTech Manufacturing faces a critical sustainability challenge across its five UK production facilities. The company consumes over 42 million kWh annually running energy-intensive operations, and with UK carbon offset costs at £50/ton CO2, its environmental impact is becoming a significant financial burden.
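To get a sense of the scale involved, here is a back-of-envelope sketch of the annual offset cost. The 42M kWh and £50/ton figures come from the lab brief; the average grid intensity is an assumed illustrative number, not lab data — the pipeline you build derives real values from the Carbon Intensity API.

```python
# Back-of-envelope carbon offset cost estimate for GreenTech's footprint.
ANNUAL_KWH = 42_000_000          # from the lab brief: 42M kWh/year
AVG_INTENSITY_G_PER_KWH = 200    # ASSUMED UK grid average, gCO2/kWh (illustrative)
OFFSET_COST_PER_TON = 50         # £50/ton CO2, from the lab brief

annual_tons_co2 = ANNUAL_KWH * AVG_INTENSITY_G_PER_KWH / 1_000_000  # grams -> tons
annual_offset_cost = annual_tons_co2 * OFFSET_COST_PER_TON

print(f"{annual_tons_co2:,.0f} tons CO2/year -> about £{annual_offset_cost:,.0f} in offsets")
```

Even at a modest assumed intensity, the offset bill runs into hundreds of thousands of pounds a year, which is why shifting load into cleaner hours matters.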

The key insight: the carbon intensity of the UK electricity grid swings dramatically throughout the day, opening windows of unusually clean power. Your mission is to build an intelligent forecasting system that learns from 30 days of historical weather and carbon data to predict these low-carbon windows up to 7 days in advance, then identifies which flexible operations can be strategically shifted from high-carbon hours to clean-energy windows.
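The core of "finding a low-carbon window" is a simple search over an hourly intensity series. A minimal sketch, using made-up illustrative intensity values (gCO2/kWh) rather than real forecast data:

```python
# Sketch: find the lowest-carbon contiguous window in an hourly forecast.
# The intensity values below are made-up illustrative numbers (gCO2/kWh).
def best_window(intensities, window_hours):
    """Return (start_hour, avg_intensity) of the cleanest contiguous window."""
    best_start, best_avg = 0, float("inf")
    for start in range(len(intensities) - window_hours + 1):
        avg = sum(intensities[start:start + window_hours]) / window_hours
        if avg < best_avg:
            best_start, best_avg = start, avg
    return best_start, best_avg

hourly = [310, 295, 240, 180, 120, 95, 90, 110, 160, 230, 300, 320]
start, avg = best_window(hourly, 3)
print(f"Cleanest 3h window starts at hour {start} (avg {avg:.0f} gCO2/kWh)")
```

The pipeline Otto builds does the same thing at larger scale, with predicted rather than observed intensities and per-machine scheduling constraints layered on top.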

How this works

You're not writing code — Otto generates it for you. Your job is to ask good questions, inspect what Otto produces, and redirect when it's not quite right. Think of Otto as an analyst who works at the speed of thought — your job is to direct the work, not execute it.

Before you start

  • Complete Hands-On Lab: Getting Agentic on Day 1
  • You should have an Ascend account and an existing Project
  • No SQL experience required — though it helps if you want to go deeper

The plan

  1. Build the full carbon + operations optimization pipeline with a single prompt
  2. Verify the pipeline runs successfully end-to-end

The dataset

You're working with five data sources that together tell the story of your manufacturing energy footprint:

  • UK Carbon Intensity API: real-time and historical carbon intensity of the UK electricity grid
  • Open-Meteo Weather API: historical and forecast weather data for predictive modeling
  • Facilities (CSV): 5 UK manufacturing sites with annual energy consumption and location
  • Machines (CSV): 125 machines with energy draw, type, and whether they can be scheduled
  • Production Schedule (CSV): weekly shift patterns showing when each machine runs
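The machines CSV is the one you'll lean on most, because its schedulable flag decides which operations can move. A minimal sketch of how that filter works; the column names and sample rows here are assumed for illustration, so check the actual CSV for the real headers.

```python
import csv
import io

# Sketch: filter machines down to the ones that can be rescheduled.
# Column names (machine_id, energy_kw, schedulable) and rows are ASSUMED
# for illustration -- inspect the real machines CSV for actual headers.
sample_csv = """machine_id,energy_kw,schedulable
M-001,45.0,true
M-002,120.5,false
M-003,12.3,true
"""

reader = csv.DictReader(io.StringIO(sample_csv))
movable = [row for row in reader if row["schedulable"].lower() == "true"]
print(f"{len(movable)} of 3 sample machines can be rescheduled")
```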

Step 1: Build the data pipeline

Data Pipelines

Data pipelines read and transform data from one or more sources into a single destination. They're a core concept in data engineering, used to build complex data products. In this lab, we're building a pipeline that reads from the 5 sources above and uses SQL and Python to transform and analyze the data.
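To make the "read" step concrete, here's a sketch of parsing a UK Carbon Intensity API response. The payload is hardcoded (matching the API's documented JSON shape) so it runs offline; the real pipeline would fetch live data from https://api.carbonintensity.org.uk instead.

```python
import json

# Sketch: parse a UK Carbon Intensity API response into (timestamp, gCO2/kWh)
# pairs. The payload is hardcoded so this runs offline; the live pipeline
# fetches the same shape from https://api.carbonintensity.org.uk.
payload = json.loads("""{
  "data": [
    {"from": "2024-01-01T00:00Z", "to": "2024-01-01T00:30Z",
     "intensity": {"forecast": 266, "actual": 263, "index": "high"}},
    {"from": "2024-01-01T00:30Z", "to": "2024-01-01T01:00Z",
     "intensity": {"forecast": 240, "actual": 235, "index": "moderate"}}
  ]
}""")

readings = [(entry["from"], entry["intensity"]["actual"])
            for entry in payload["data"]]
print(readings)
```

Each API returns half-hourly readings like these; the SQL transforms downstream aggregate them to hourly and join them against weather and operations data.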

You're going to give Otto a single prompt, and it will build the entire optimization pipeline, combining your operations data with live weather and carbon intensity data to find the scheduling windows that deliver the biggest savings.

Open Otto with Ctrl + I (or Cmd + I on Mac), start a new thread, and paste the following prompt:

We want to create a brand new flow. In this flow we want to:

Use custom Python to read weather data (from the OpenMeteo API) and
carbon data (UK Carbon Intensity API - https://api.carbonintensity.org.uk)
for the past 30 days, creating a model to help us understand the impact
of weather & time of day on carbon.

Create a predictive model for carbon impact based on the weather forecast
for the next 7 days.

Layer in our own operations data to determine optimal operations windows
and cost savings (both in carbon offsets and energy cost savings) based
on rescheduling. The relevant data is in the data folder
(production_schedule, machines, facilities).

Important context: Not all operations can be rescheduled — some machines
are fixed by production flow dependencies, and some run 24/7. The machines
CSV has a schedulable column that indicates which operations we can
actually move. Also, we can only reschedule within shift boundaries
(±8 hours) to keep things realistic.

Please run the flow end to end and iterate until it succeeds.
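The ±8-hour shift-boundary rule in the prompt above is worth understanding, since it's what keeps the recommendations realistic. A minimal sketch of that constraint check (the function and its name are illustrative, not part of Otto's generated code):

```python
# Sketch of the prompt's ±8-hour rule: a schedulable machine may only move
# to a new start hour within 8 hours of its scheduled start. This helper
# is illustrative, not part of Otto's generated pipeline.
def within_shift_bounds(scheduled_hour, proposed_hour, max_shift=8):
    """Hours are 0-23; the distance wraps around midnight."""
    diff = abs(proposed_hour - scheduled_hour) % 24
    return min(diff, 24 - diff) <= max_shift

print(within_shift_bounds(9, 14))   # 5-hour shift: allowed
print(within_shift_bounds(9, 22))   # 11 hours away even wrapping: rejected
print(within_shift_bounds(22, 2))   # 4 hours across midnight: allowed
```

When you review Otto's SQL in Lab 2, look for logic equivalent to this: each candidate clean-energy window should be rejected if it falls outside the machine's shift boundary.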

Watch Otto work

Sit back and watch. Otto will:

  1. Create Python Read Components for weather and carbon intensity APIs
  2. Build a weather forecast component for the next 7 days
  3. Write SQL transforms to join all five data sources
  4. Build a predictive model for carbon intensity
  5. Calculate optimal scheduling windows for every machine
  6. Run the flow and iterate through any failures

Otto isn't generating code in isolation — it's looking at the actual data coming back from the APIs and cross-referencing it with your operations data. If something fails, Otto reads the actual error message and fixes the issue based on what it observed.

This takes 10–15 minutes

This is a complex pipeline with five data sources and a predictive model. Otto will likely hit errors and fix them over 5+ iterations. Each round, Otto reads the actual error logs and adjusts its approach. If Otto is still stuck after 15 minutes, try giving it a specific hint about the error you see in the flow run logs.

Step 2: Verify the pipeline

Once the flow runs successfully, take a minute to spot-check the results before moving on to Lab 2.

Give me a quick summary of the optimization results. How many machines 
can we reschedule? What are the projected savings?

[TODO: screenshot of a successful flow run in the Ascend UI]

What just happened?

You just did in 20 minutes what would typically take days — connecting to two live APIs, ingesting 30 days of historical data, building a predictive model, cross-referencing with your operations data, and producing scheduling recommendations with cost savings. Otto handled the Python, SQL, and API calls. You directed the analysis. That's the division of labor that scales.

You've completed Lab 1!

By the end of this lab, you should have:

  • Built the carbon + operations optimization pipeline from a single prompt
  • Verified the pipeline runs end-to-end
  • Spot-checked the optimization results

Need help? Ask a bootcamp instructor or reach out in the Ascend Community Slack.

Next steps

Continue to Lab 2: Verifying Agent Output to explore the data, verify the optimization logic, and build visualizations you'd be confident presenting to an operations VP.

Questions?

Reach out to your bootcamp instructors or support@ascend.io.