Create a Simple Snowpark Transform
This guide shows you how to build a Simple Snowpark Transform that processes data without using incremental or smart partitioning strategies.
Snowpark is Snowflake's developer framework that enables processing data directly where it's stored using familiar programming languages like Python.
Let's keep it Simple!
Note that Snowpark is only available to Ascend Instances running on Snowflake. Check out our Quickstart to set up a Snowflake instance.
Prerequisites
- Ascend Flow
Create a Transform
You can create a Transform in two ways: through the form UI or directly in the Files panel.
Using the Component Form

- Double-click the Flow where you want to add your Transform
- Right-click on an existing component (typically a Read component or another Transform) that will provide input data
- Select Create Downstream → Transform
- Complete the form with these details:
  - Select your Flow
  - Enter a descriptive name for your Transform (e.g., sales_aggregation)
  - Choose the appropriate file type for your Transform logic

Using the Files Panel

- Open the files panel in the top left corner
- Navigate to and select your desired Flow
- Right-click on the components directory and choose New file
- Name your file with a descriptive name that reflects its purpose (e.g., sales_aggregation)
- Choose the appropriate file extension based on your Transform type:
  - .py for Python Transforms
  - .sql for SQL Transforms
Create your Simple Snowpark Transform
Structure your Snowpark Transform using these steps:
- Import required packages:
  - Ascend resources (snowpark, ref)
  - Snowpark objects (DataFrame, Session)
- Define your transform function:
  - Create a function that processes your input data
  - The example below simply returns the data unchanged
- Apply the @snowpark() decorator:
  - Specify your inputs using refs
  - Set event_time and cluster_by parameters to control how Snowpark organizes your data
- Return structured data:
  - Your function must return a DataFrame
The @snowpark() decorator handles all conversions between Snowpark and Ascend's internal format, allowing your Transform to integrate seamlessly with other Components in your Flow.
Example
Here's a basic example of a Snowpark Transform:
from snowflake.snowpark import DataFrame as SnowparkDataFrame
from ascend.resources import ref, snowpark


@snowpark(
    inputs=[
        ref("cab_rides"),
    ],
    event_time="pickup_datetime",
    cluster_by=["cab_type"],
)
def cab_rides_simple_snowpark(cab_rides: SnowparkDataFrame, context):
    return cab_rides
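The pass-through example above can be extended with ordinary Snowpark DataFrame operations before the return. The sketch below is a hypothetical variation, assuming the cab_rides input also has fare and passenger_count columns (adjust the names to your schema): it filters out rides with no passengers and adds a derived fare_per_passenger column.

from snowflake.snowpark import DataFrame as SnowparkDataFrame
from snowflake.snowpark.functions import col
from ascend.resources import ref, snowpark


@snowpark(
    inputs=[
        ref("cab_rides"),
    ],
    event_time="pickup_datetime",
    cluster_by=["cab_type"],
)
def cab_rides_filtered_snowpark(cab_rides: SnowparkDataFrame, context):
    # Hypothetical columns: fare and passenger_count are assumed to exist in cab_rides
    return (
        cab_rides.filter(col("passenger_count") > 0)  # drop rides with no passengers
        .with_column("fare_per_passenger", col("fare") / col("passenger_count"))
    )

Because the function still returns a Snowpark DataFrame that keeps the pickup_datetime and cab_type columns, the event_time and cluster_by settings continue to apply, and downstream components can reference this Transform by name just like any other component.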
Check out our reference guide for complete parameter options, advanced configurations, and additional examples.
🎉 Congratulations! You've successfully created a Simple Snowpark Transform in Ascend.