Build AI-powered web apps with WeWeb and OpenAI

Generate content, analyze data, power assistants, and automate workflows while keeping full control over logic, data, and user experience.


Call OpenAI models from your web apps

You can connect user inputs, database records, and workflows to AI-driven logic without hardcoding the frontend:

  • Design custom AI-driven interfaces visually in the WeWeb Editor
  • Send prompts to OpenAI models from WeWeb workflows
  • Generate text dynamically based on user input or app data
  • Analyze and summarize structured or unstructured content
  • Power AI assistants, chat interfaces, and help tools
  • Use AI responses to drive conditional logic and UI states
  • Combine OpenAI with your backend data (Supabase, Xano, Airtable, APIs)
  • Control when and how OpenAI is called using workflows and conditions
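Under the hood, each of these workflow steps ultimately produces a chat-completion request. A minimal TypeScript sketch of what that request looks like (the model name, parameters, and helper function are illustrative, not WeWeb internals):

```typescript
// Sketch of the request body a workflow step would send to OpenAI's
// chat completions endpoint. Model name and temperature are illustrative.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

function buildChatRequest(userInput: string, systemPrompt: string) {
  const messages: ChatMessage[] = [
    { role: "system", content: systemPrompt }, // secured, server-side instructions
    { role: "user", content: userInput },      // bound from a UI input or variable
  ];
  return {
    model: "gpt-4o-mini", // illustrative model name
    temperature: 0.7,     // lower = more deterministic output
    messages,
  };
}

// The actual call is made server-side with the stored API key, e.g.:
// await fetch("https://api.openai.com/v1/chat/completions", {
//   method: "POST",
//   headers: {
//     "Content-Type": "application/json",
//     Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
//   },
//   body: JSON.stringify(buildChatRequest(input, prompt)),
// });
```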

What WeWeb supports natively

| Category | Feature | What it does |
| --- | --- | --- |
| Plugin setup | OpenAI extension plugin | Adds the OpenAI plugin from Plugins → Extensions and loads a secured OpenAI integration. |
| Credentials | Secure API key storage | Stores your OpenAI API key in plugin settings so it isn't exposed in frontend calls. |
| Secured prompts | Plugin-level prompt definitions | Defines named prompts with variables at plugin level, stored securely server-side. |
| Text completion | Create completion action | Calls text models with configurable prompt, tokens, and temperature. |
| Chat completion | Create chat completion action | Calls chat models with messages and a secured system prompt. |
| Image generation | Create image action | Generates images from text prompts and returns image URLs. |
| Prompt variables | Bind UI data into prompts | Injects user input into secured prompts using variables. |
| Frontend safety | Hide system prompts | Keeps system prompts out of the browser Network tab. |
| Workflow integration | Use responses in UI | Stores OpenAI responses in variables and binds them to UI. |
| Backend support | Backend OpenAI integration | Uses @weweb/openai in backend workflows for embeddings or moderation. |

Why use WeWeb with OpenAI

OpenAI provides the intelligence. WeWeb provides the structure, control, and UI needed to turn AI into a real product feature.

Perfect for:

  • AI-powered internal tools for summarizing tickets, reports, or documents
  • Customer-facing chat assistants connected to your product data
  • Smart CRMs that generate notes, follow-ups, or insights
  • AI-driven dashboards that explain or analyze metrics

And more…

Integrating OpenAI with WeWeb

Integrating OpenAI with WeWeb is straightforward:

  1. Create an OpenAI API key from your OpenAI dashboard
  2. Add OpenAI as an external API in the WeWeb Dashboard
  3. Configure authentication using your API key
  4. Create workflows to send prompts and parameters to OpenAI
  5. Bind AI responses to variables, UI elements, or logic flows
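For step 5, binding the response means pulling the assistant's text out of the API result before storing it in a variable. A sketch assuming OpenAI's documented chat completions response shape (the helper name is hypothetical):

```typescript
// Extract the assistant's reply from a chat completion response so it
// can be stored in a workflow variable and bound to the UI.
interface ChatCompletionResponse {
  choices: { message: { role: string; content: string } }[];
}

function extractReply(response: ChatCompletionResponse): string {
  // Guard against an empty choices array so a failed call
  // doesn't break the UI binding.
  return response.choices[0]?.message.content ?? "";
}
```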

Learn more about WeWeb x OpenAI integration

Best practices 

  • Be mindful of OpenAI rate limits and token usage
  • Avoid sending sensitive or unnecessary data in prompts
  • Use clear, structured prompts for more predictable results
  • Cache or store AI responses when appropriate to reduce repeated calls
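The last point can be sketched as a simple prompt-keyed cache sitting in front of the model call. This is illustrative only; in a real app the cache might live in your backend (Supabase, Xano) with a TTL:

```typescript
// In-memory cache keyed by prompt, so identical prompts don't trigger
// repeated API calls (and repeated token costs).
const responseCache = new Map<string, string>();

async function cachedCompletion(
  prompt: string,
  callModel: (p: string) => Promise<string>, // the actual OpenAI call
): Promise<string> {
  const hit = responseCache.get(prompt);
  if (hit !== undefined) return hit; // reuse the earlier answer, zero tokens spent
  const answer = await callModel(prompt);
  responseCache.set(prompt, answer);
  return answer;
}
```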

FAQs

1. How do I connect OpenAI to a WeWeb project?

Install the OpenAI extension and add your OpenAI API key in the plugin settings so calls are routed through WeWeb rather than from the browser directly. Once configured, OpenAI actions become available in workflows (for text, chat, and image generation).

2. What is the advantage of plugin‑level “secured prompts”?

Secured prompts let you define system instructions and base prompts on the server side, referenced only by an ID from the client. This keeps prompt logic and secrets out of the browser network inspector, reducing leakage of sensitive instructions.

3. Can I dynamically inject user context (plan, language, input) into prompts?

Yes, prompts can contain variables like {{question}} or {{language}} that are bound to WeWeb variables at runtime when the workflow runs. This allows personalized responses while the underlying prompt template remains secured in the plugin.
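The substitution itself can be sketched as a small template helper (the template text, variable names, and function are illustrative, not the plugin's internal implementation):

```typescript
// Fill {{variable}} placeholders in a secured prompt template from
// runtime values. Unknown placeholders resolve to an empty string.
function fillPrompt(template: string, vars: Record<string, string>): string {
  return template.replace(/\{\{(\w+)\}\}/g, (_m: string, name: string) => vars[name] ?? "");
}
```

For example, `fillPrompt("Answer {{question}} in {{language}}.", { question: "What is WeWeb?", language: "French" })` yields a personalized prompt while the template itself stays server-side.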

4. How do I handle and display OpenAI responses in the UI?

Workflow actions can save the OpenAI response into variables, which you then bind to text components, rich‑text blocks, or collections. For multi‑step flows, you typically append new responses to an array variable representing a chat history or list of suggestions.
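Appending a turn to such a chat-history variable can be sketched like this (the types and helper name are illustrative):

```typescript
// Append one user/assistant exchange to a chat-history array that a
// repeating UI element is bound to.
type ChatMessage = { role: "user" | "assistant"; content: string };

function appendTurn(
  history: ChatMessage[],
  userText: string,
  assistantText: string,
): ChatMessage[] {
  // Return a new array so reactive bindings detect the change.
  return [
    ...history,
    { role: "user", content: userText },
    { role: "assistant", content: assistantText },
  ];
}
```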

5. What are the main limitations of the native OpenAI plugin?

The plugin focuses on mainstream endpoints (completions, chat, images) and common parameters. More advanced features such as Assistants, function/tool calling orchestration, and fine‑tuning often require custom backend logic or direct API integrations outside the predefined actions.

Start building for free

Sign up now, pay when you're ready to publish.