Endpoints
Endpoints are the specific URLs where your frontend sends requests to perform actions or retrieve data.
Request Types
DYPAI supports all standard HTTP methods to handle different types of operations:
- GET (Read): Used to fetch data from your database (e.g., listing products or getting a user profile).
- POST (Create/Action): Used to save new information or trigger a process (e.g., creating an order or starting an AI task).
- PATCH/PUT (Update): Used to modify existing records (e.g., changing a user's status).
- DELETE (Remove): Used to delete information securely.
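To make the mapping concrete, here is a minimal sketch of how a frontend might build fetch options for these operations. The URL, paths, and payload fields are placeholders, not actual DYPAI values:

```typescript
// Sketch: building fetch options per operation type.
type HttpMethod = "GET" | "POST" | "PATCH" | "PUT" | "DELETE";

interface RequestOptions {
  method: HttpMethod;
  headers: Record<string, string>;
  body?: string;
}

function requestOptions(method: HttpMethod, payload?: unknown): RequestOptions {
  const options: RequestOptions = {
    method,
    headers: { "Content-Type": "application/json" },
  };
  // GET requests carry no body; all other methods serialize the payload.
  if (payload !== undefined && method !== "GET") {
    options.body = JSON.stringify(payload);
  }
  return options;
}

// Read:   fetch("<your-endpoint-url>/products", requestOptions("GET"))
// Create: fetch("<your-endpoint-url>/orders", requestOptions("POST", { productId: "p_1", qty: 2 }))
```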
Managing Endpoints
In the API Builder section, you can see a list of all your endpoints. From here, you can:
Edit with the Canvas
Click on any endpoint to open the Visual Canvas. This is where you can:
- Review the logic flow.
- Modify existing nodes or add new ones.
- Change the input and output structure.
Test in Real-time
Use the Test panel to verify that your endpoint works as expected. You can simulate different scenarios and see exactly what the API returns.
Start, Triggers, and Webhooks
Every endpoint in DYPAI starts from a single Start node. That Start node defines how the workflow is triggered.
Available Trigger Types
| Trigger | Best for | What happens |
|---|---|---|
| HTTP API | Frontend and backend requests | DYPAI exposes a standard API endpoint that your app can call directly |
| Webhook | External services like Stripe, GitHub, or payment gateways | DYPAI exposes a webhook URL that third-party services can call |
| Schedule | Recurring jobs and automations | DYPAI runs the workflow automatically on a time-based rule |
| Telegram | Bot-driven workflows | DYPAI starts the workflow when your Telegram bot receives a message |
When to Use a Webhook
Use a Webhook when an external service needs to notify your project about an event.
Common examples:
- Stripe payment succeeded
- GitHub push event
- Form submission from an external tool
- Delivery status callback from a messaging provider
Unlike a normal HTTP API endpoint, a webhook is designed to be called by another platform, not by your own frontend UI.
How Webhooks Work in DYPAI
When you configure the Start node as a webhook:
- DYPAI exposes a unique webhook URL for that endpoint
- the workflow starts automatically when that URL receives a request
- you can protect the webhook with the configured authentication method (system secret, header token, HMAC, or basic auth)
- you can inspect webhook status and recent runs from the endpoint detail view
This makes it easy to connect DYPAI workflows to third-party event systems without building a separate backend receiver.
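Since HMAC is one of the available webhook authentication methods, here is a sketch of how an external sender might sign a payload before calling the webhook URL. The header name, event shape, and signing scheme are assumptions for illustration; match whatever authentication method you actually configured in DYPAI:

```typescript
import { createHmac } from "node:crypto";

// Sketch: HMAC-SHA256 signature over the raw request body.
function signPayload(secret: string, body: string): string {
  return createHmac("sha256", secret).update(body).digest("hex");
}

const body = JSON.stringify({ event: "payment.succeeded", id: "evt_123" });
const signature = signPayload("whsec_demo", body);
// The sender would then POST `body` with a header such as "X-Signature: <signature>".
```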
HTTP API Authentication Modes
When using the HTTP API trigger, you must select an Authentication Mode:
- `jwt`: The endpoint requires a valid user session. You can further restrict access using Allowed Roles.
- `api_key`: The endpoint requires the project's API Key in the `X-API-KEY` header. This bypasses role checks and is ideal for server-to-server communication.
Public (unauthenticated) endpoints are not supported. Every request must be identified by either a user or the project key.
For most app screens and user actions, use HTTP API. Use Webhook when the caller is an external system such as Stripe, GitHub, or another backend.
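As a quick illustration, a caller might select headers per authentication mode like this. The `X-API-KEY` header name comes from the docs above; sending the JWT as a Bearer token is an assumption, and the token values are placeholders:

```typescript
// Sketch: request headers per authentication mode.
type AuthMode = "jwt" | "api_key";

function authHeaders(mode: AuthMode, token: string): Record<string, string> {
  return mode === "jwt"
    ? { Authorization: `Bearer ${token}` } // user session token (assumed Bearer scheme)
    : { "X-API-KEY": token };              // project API key, server-to-server
}
```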
Building with AI
Remember, you don't have to build these manually. You can use MCP to manage your endpoints using natural language:
- "List all my current endpoints."
- "Explain the logic of the 'process-payment' endpoint."
- "Update the 'get-user' endpoint to also return the user's role."
Placeholder Contract (workflow_code)
To avoid ambiguity, use one strict format in workflows:
- `${input.field}`: endpoint input data
- `${vars.<variable>.field}`: output from a previous node by its `variable` alias (recommended)
- `${nodes.<node_id>.field}`: output from a previous node by its technical `id` (also supported)
- `${current_user.field}` and `${current_user_id}`: authenticated user context
Do not use flat placeholders like `${name}`, `${email}`, `${id}`.
The engine validator rejects ambiguous placeholders. If a workflow does not follow this contract, it fails validation before execution.
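The contract above can be approximated as a regex check. This is an illustrative sketch, not the engine's actual validator, and it only handles single-level field names:

```typescript
// Sketch: accept only the placeholder shapes from the contract above.
const ALLOWED =
  /^\$\{(?:(?:input|current_user)\.\w+|(?:vars|nodes)\.\w+\.\w+|current_user_id)\}$/;

function isAllowedPlaceholder(p: string): boolean {
  return ALLOWED.test(p);
}
```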
MCP Preflight Before Creating/Updating Workflows
Before generating a workflow_code with AI:
- Use `search_nodes` to discover available nodes.
- If you need technical details, call `search_nodes` with `node_id`.
- Build workflows only with existing `node_type` values.
- Run workflow validation before saving.
Credentials in Integration Nodes
For nodes that require external providers (AI, messaging, email, sheets, payments), include `credential_id` in the node parameters.
Recommended:
`credential_id: "${input.<provider>_credential_id}"`
Do not hardcode secrets inside workflow JSON.
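For example, an integration node reading its credential from the endpoint input might look like this. The node type and other parameter names are illustrative; confirm real values with `search_nodes`:

```json
{
  "node_type": "ai_chat",
  "params": {
    "credential_id": "${input.ai_credential_id}",
    "prompt": "${input.prompt}"
  }
}
```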
Insert Data Rule
For dypai_database with operation = insert:
- `input_mode: "explicit"` (recommended): requires `data` with defined columns.
- `input_mode: "passthrough"`: if `data` is empty, the endpoint request body is used.
This enables fast prototyping while keeping production behavior predictable.
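A sketch of an explicit-mode insert node follows; the table and column names are placeholders for your own schema:

```json
{
  "node_type": "dypai_database",
  "params": {
    "operation": "insert",
    "table": "orders",
    "input_mode": "explicit",
    "data": {
      "product_id": "${input.product_id}",
      "quantity": "${input.quantity}"
    }
  }
}
```

With `input_mode: "passthrough"` and an empty `data`, the same node would instead take its columns from the endpoint request body.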
Custom Query (SQL)
In the dypai_database node with operation = custom_query, the query field uses the same official syntax as the rest of the workflow:
- Format: `${variable}`
- Available variables: `${input.field}`, `${params.field}`, `${current_user_id}`, `${record_id}`, `${limit}`, `${offset}`
- No quotes: Do not wrap placeholders in quotes. The engine converts them to safe bind params (`$1`, `$2`, ...).
Wrong: `WHERE id = '${current_user_id}'` → quotes turn the value into a literal and cause type errors (e.g. UUID).
Correct: `WHERE id = ${current_user_id}` → the engine replaces it with a bind param of the correct type.
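Putting it together, a custom-query node might look like the sketch below. The table name and selected columns are placeholders for your own schema:

```json
{
  "node_type": "dypai_database",
  "params": {
    "operation": "custom_query",
    "query": "SELECT id, total FROM orders WHERE user_id = ${current_user_id} LIMIT ${limit}"
  }
}
```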