Official OpenAI API provider plugin for 302 AI Studio.
- Support for all GPT models (GPT-3.5, GPT-4, GPT-4 Turbo, GPT-4o)
- Support for reasoning models (o1, o3)
- Vision capabilities (GPT-4 Vision, GPT-4o)
- Function calling support
- Custom base URL support (for proxies)
- Organization ID support
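As an illustration of what these capabilities cover, the sketch below sends a vision request with a function-calling tool through the official `openai` Node SDK. This is the kind of request the provider forwards on behalf of 302 AI Studio, not code from this plugin; the model name, image URL, and tool definition are placeholders.

```ts
import OpenAI from "openai";

// Placeholder credentials; the plugin reads these from its provider settings.
const client = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

async function main() {
  const response = await client.chat.completions.create({
    model: "gpt-4o", // vision-capable model
    messages: [
      {
        role: "user",
        content: [
          { type: "text", text: "What's the weather in the city shown in this photo?" },
          { type: "image_url", image_url: { url: "https://example.com/city.jpg" } },
        ],
      },
    ],
    // Function calling: the model may answer with a tool call instead of text.
    tools: [
      {
        type: "function",
        function: {
          name: "get_weather",
          description: "Get the current weather for a city",
          parameters: {
            type: "object",
            properties: { city: { type: "string" } },
            required: ["city"],
          },
        },
      },
    ],
  });

  console.log(response.choices[0].message);
}

main();
```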
- Open 302 AI Studio
- Go to Settings → Plugins → Marketplace
- Search for "OpenAI Provider"
- Click "Install"
- Download the latest .zip file from Releases
- In 302 AI Studio, go to Settings → Plugins
- Click "Install from File" and select the downloaded ZIP
After installation, configure the plugin:
- Go to Settings → Providers
- Select "OpenAI" provider
- Enter your OpenAI API key
- (Optional) Configure custom base URL or organization ID
- API Key (required): Your OpenAI API key from platform.openai.com/api-keys
- Base URL (optional): Custom API endpoint (default: https://api.openai.com/v1)
- Organization ID (optional): Your OpenAI organization ID for multi-org accounts
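For reference, these three settings map directly onto the constructor options of the official `openai` Node SDK, which is roughly how a provider would apply them. This is a sketch, not the plugin's actual code, and the environment variable names are placeholders.

```ts
import OpenAI from "openai";

// Sketch: mapping the plugin's settings to an OpenAI client.
// The environment variable names are placeholders, not the plugin's own.
const client = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,                              // API Key (required)
  baseURL: process.env.OPENAI_BASE_URL ?? "https://api.openai.com/v1", // Base URL (optional), e.g. a proxy
  organization: process.env.OPENAI_ORG_ID,                         // Organization ID (optional)
});
```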
This plugin automatically fetches and supports all available GPT models from your OpenAI account, including:
- GPT-4o and GPT-4o mini
- GPT-4 Turbo and GPT-4
- GPT-3.5 Turbo
- o1 and o3 (reasoning models)
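As a rough sketch of how such a model list can be fetched with the official `openai` SDK's models endpoint (the prefix filter below is an illustrative heuristic, not necessarily the plugin's exact logic):

```ts
import OpenAI from "openai";

const client = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

// List the models the account can access and keep the chat-capable families.
async function listChatModels(): Promise<string[]> {
  const models = await client.models.list();
  return models.data
    .map((m) => m.id)
    .filter((id) => id.startsWith("gpt-") || id.startsWith("o1") || id.startsWith("o3"))
    .sort();
}

listChatModels().then((ids) => console.log(ids));
```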
- Node.js 20+
- pnpm 8+
# Install dependencies
pnpm install
# Build plugin
pnpm run build
# Package as ZIP
pnpm run package

openai-plugin/
├── src/
│   └── index.ts        # Plugin implementation
├── dist/               # Compiled output (generated)
├── plugin.json         # Plugin metadata
├── package.json
├── tsconfig.json
├── tsup.config.ts
└── scripts/
    └── package.js      # ZIP packaging script
MIT License - see LICENSE file for details.