Edge Functions
Write serverless functions in Python or Rust that run on the edge, close to your users, with built-in access to your database, storage, and AI libraries.
Overview
DYPAI Edge Functions let you deploy custom server-side logic without managing infrastructure. Functions are deployed to edge locations worldwide and execute in isolated environments with low cold-start times. Each function receives an HTTP request and returns a response, making them suitable for API endpoints, webhooks, background tasks, and data pipelines.
You can write functions in Python or Rust. Both runtimes have full access to your project's database, storage, and environment variables. Choose the language that best fits your use case, or mix both within the same project.
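To make that request/response shape concrete, here is a minimal sketch of a Python function that reads a JSON body and returns a JSON response. It follows the same `handler` signature and return format as the full example later on this page; the file name and greeting logic are purely illustrative.

```python
# functions/hello.py -- minimal illustrative handler
async def handler(request):
    body = await request.json()        # parse the incoming JSON payload
    name = body.get("name", "world")   # fall back to a default value
    return {
        "statusCode": 200,
        "body": {"message": f"Hello, {name}!"},
    }
```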
When to Use Edge Functions
Edge Functions are the right choice whenever you need to run trusted code on the server side. Common use cases include:
- Business logic: Validation rules, pricing calculations, and workflow orchestration that should not run on the client
- Third-party integrations: Connect to payment providers, email services, CRMs, and other external APIs securely
- Data processing: Transform, aggregate, or filter data before returning it to the client
- Webhooks: Receive and process incoming webhook events from Stripe, GitHub, Slack, and other services (see the signature-verification sketch after this list)
- AI and ML: Run inference, call LLM APIs, process embeddings, or execute machine learning pipelines
- Cryptography: Generate tokens, verify signatures, hash passwords, and encrypt sensitive data
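To make the webhook and cryptography cases concrete, here is a hedged sketch that verifies an HMAC signature on an incoming event before processing it. It assumes the request object exposes a `headers` mapping and an `await request.text()` accessor for the raw body; neither is documented on this page, so treat those names, along with the header and secret names, as placeholders.

```python
# functions/incoming-webhook.py -- sketch: verify an HMAC-signed webhook
# Assumes request.headers and request.text() exist; adjust to the actual runtime API.
import hashlib
import hmac
import json
import os

async def handler(request):
    secret = os.environ.get("WEBHOOK_SIGNING_SECRET", "")
    signature = request.headers.get("X-Signature", "")  # placeholder header name
    raw_body = await request.text()                     # raw payload as a string

    expected = hmac.new(secret.encode(), raw_body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, signature):
        return {"statusCode": 401, "body": {"error": "invalid signature"}}

    event = json.loads(raw_body)
    # ... process the verified event ...
    return {"statusCode": 200, "body": {"received": event.get("type", "unknown")}}
```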
Python Functions
Python is the recommended runtime for AI workloads, data processing, and rapid prototyping. The Python runtime includes a curated set of packages and provides async support out of the box.
Here is a complete Python function that parses a JSON request body, queries the database, and returns a structured response:
```python
# functions/analyze-orders.py
from datetime import datetime, timedelta

async def handler(request):
    # Parse the incoming request
    body = await request.json()
    days = body.get("days", 30)

    # Query the database using the built-in client
    cutoff = (datetime.now() - timedelta(days=days)).isoformat()
    orders = await request.db.query(
        "SELECT status, COUNT(*) as count, SUM(total) as revenue "
        "FROM orders WHERE created_at >= $1 GROUP BY status",
        [cutoff]
    )

    # Process results
    summary = {
        "period_days": days,
        "generated_at": datetime.now().isoformat(),
        "statuses": [
            {
                "status": row["status"],
                "count": row["count"],
                "revenue": float(row["revenue"] or 0),
            }
            for row in orders
        ],
        "total_orders": sum(row["count"] for row in orders),
        "total_revenue": sum(float(row["revenue"] or 0) for row in orders),
    }

    return {
        "statusCode": 200,
        "body": summary,
    }
```

The following packages are pre-installed in the Python runtime and available for immediate use:
| Package | Version | Purpose |
|---|---|---|
| httpx | 0.27+ | Async HTTP client for external API calls |
| anthropic | latest | Official Anthropic SDK for Claude models |
| openai | latest | Official OpenAI SDK for GPT models |
| pandas | 2.x | Data manipulation and analysis |
| numpy | 1.x | Numerical computing and array operations |
| pillow | 10.x | Image processing and manipulation |
| pydantic | 2.x | Data validation and serialization |
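For example, the pre-installed `pydantic` package can validate a request body before the handler acts on it. The following is a minimal sketch using the same `handler` signature as above; the model fields and status codes are illustrative rather than a prescribed pattern.

```python
# functions/create-user.py -- sketch: validate a request body with pydantic
from pydantic import BaseModel, Field, ValidationError

class CreateUser(BaseModel):
    name: str = Field(min_length=1, max_length=100)
    email: str
    age: int = Field(ge=13, le=120)

async def handler(request):
    payload = await request.json()
    try:
        user = CreateUser(**payload)
    except ValidationError as exc:
        return {"statusCode": 422, "body": {"errors": exc.errors()}}

    # ... persist the validated record, e.g. via request.db ...
    return {"statusCode": 201, "body": user.model_dump()}
```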
Rust Functions
Rust functions offer near-native performance with minimal memory overhead. They are ideal for CPU-intensive tasks, latency-sensitive endpoints, and scenarios where predictable performance matters. The Rust runtime uses a pre-compiled WASM target for fast cold starts.
Here is a complete Rust function with typed request and response handling:
```rust
// functions/validate-payment.rs
use serde::{Deserialize, Serialize};
use sha2::{Sha256, Digest};

#[derive(Deserialize)]
struct PaymentRequest {
    amount: f64,
    currency: String,
    card_token: String,
    idempotency_key: String,
}

#[derive(Serialize)]
struct PaymentResponse {
    approved: bool,
    transaction_id: String,
    message: String,
}

pub async fn handler(request: Request) -> Response {
    // Parse and validate the typed request body
    let payment: PaymentRequest = match request.json().await {
        Ok(data) => data,
        Err(_) => return Response::bad_request("Invalid request body"),
    };

    // Validate the payment amount
    if payment.amount <= 0.0 || payment.amount > 50_000.0 {
        return Response::bad_request("Amount must be between 0 and 50,000");
    }

    // Generate a deterministic transaction ID
    let mut hasher = Sha256::new();
    hasher.update(payment.idempotency_key.as_bytes());
    let transaction_id = format!("txn_{:x}", hasher.finalize());

    // Return the typed response
    Response::json(&PaymentResponse {
        approved: true,
        transaction_id,
        message: format!("Payment of {} {} approved", payment.amount, payment.currency),
    })
}
```

The following crates are available in the Rust runtime:
| Crate | Purpose |
|---|---|
| serde / serde_json | Serialization and deserialization of JSON data |
| reqwest | HTTP client for external API requests |
| tokio | Async runtime for concurrent operations |
| rayon | Parallel data processing across threads |
| sha2 | SHA-256 and SHA-512 hashing |
| uuid | UUID generation (v4 and v7) |
| chrono | Date and time handling with timezone support |
Comparison
Use the table below to decide which runtime fits your workload. You can mix both languages within a single project and route different endpoints to different runtimes.
| Feature | Python | Rust |
|---|---|---|
| Development speed | Fast -- concise syntax, no compilation step | Moderate -- type system adds safety but verbosity |
| Execution performance | Good -- suitable for most workloads | Excellent -- near-native speed |
| Cold start time | ~50 ms | ~5 ms |
| Memory usage | Higher -- Python interpreter overhead | Minimal -- no garbage collector |
| AI libraries | Extensive -- Anthropic, OpenAI, pandas, numpy | Limited -- HTTP clients for API calls only |
| Package ecosystem | Large -- 7 pre-installed packages | Focused -- 7 pre-installed crates |
| Type safety | Optional -- via Pydantic models | Enforced -- compile-time guarantees |
| Deployment size | ~15 MB (runtime + packages) | ~2 MB (compiled WASM) |
Deployment
Functions are deployed automatically when you push to your connected Git repository or manually through the DYPAI CLI. Each deployment creates an immutable version, allowing you to roll back instantly if needed.
- Place your function files in the `functions/` directory at the project root
- Python files use the `.py` extension, Rust files use `.rs`
- Each file with a `handler` function becomes an independently callable endpoint
- The function URL is derived from the file name: `functions/hello.py` becomes `/v1/functions/hello`
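As an illustration of this naming convention, the sketch below invokes the analyze-orders function from earlier on this page over plain HTTP using `httpx`. The project domain is a placeholder; substitute your project's own URL from the dashboard.

```python
# Sketch: call a deployed function at /v1/functions/<name> with httpx.
# "your-project.example.com" is a placeholder domain, not a real endpoint.
import asyncio
import httpx

async def main():
    async with httpx.AsyncClient() as client:
        resp = await client.post(
            "https://your-project.example.com/v1/functions/analyze-orders",
            json={"days": 7},
        )
        resp.raise_for_status()
        print(resp.json())

asyncio.run(main())
```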
Environment Variables
Edge Functions can access environment variables set in your project dashboard or via the CLI. Use them for API keys, configuration values, and secrets that should not be committed to source control.
```python
# Python: access environment variables
import os

async def handler(request):
    api_key = os.environ.get("STRIPE_SECRET_KEY")
    webhook_secret = os.environ.get("STRIPE_WEBHOOK_SECRET")
    # ...
```

```rust
// Rust: access environment variables
pub async fn handler(request: Request) -> Response {
    let api_key = std::env::var("STRIPE_SECRET_KEY")
        .expect("STRIPE_SECRET_KEY must be set");
    // ...
}
```

Environment variables are encrypted at rest and injected into the function runtime at execution time. They are never included in deployment artifacts or logs.
Next steps
- Python guide -- Deep dive into Python functions with advanced examples
- Rust guide -- Build high-performance functions with Rust
- AI integration -- Use edge functions to power intelligent agents