Community Tool Routes ChatGPT OAuth Tokens to OpenAI-Compatible API Endpoint

Open Source · 1 source · Mar 16

Summary

  • Unofficial tool proxies ChatGPT account OAuth tokens to mimic OpenAI API access
  • Supports gpt-5.4 and gpt-5.3-codex via the Codex CLI endpoint; no API key needed
  • Carries significant ToS and security risks: tokens are password-equivalent credentials
  • Works as a localhost proxy or Vercel AI SDK provider via npx openai-oauth

Details

1. New Tech

openai-oauth proxies ChatGPT OAuth tokens to a local OpenAI-compatible API endpoint

Running 'npx openai-oauth' spins up a localhost proxy at 127.0.0.1:10531/v1 that accepts standard OpenAI SDK calls. It reuses the auth tokens created by OpenAI's own Codex CLI, meaning no separate API key or billing account is required.
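Because the proxy speaks the standard OpenAI wire format, calling it looks like any other OpenAI-compatible request. A minimal sketch, assuming the base URL and model name from the write-up above; the Authorization value is a placeholder, since the proxy is assumed to inject the real OAuth credentials itself:

```typescript
// Build a standard chat-completions request aimed at the local proxy.
function buildChatRequest(model: string, content: string) {
  return {
    url: "http://127.0.0.1:10531/v1/chat/completions",
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: "Bearer unused", // placeholder; the proxy handles auth
      },
      body: JSON.stringify({
        model,
        messages: [{ role: "user", content }],
      }),
    },
  };
}

// With `npx openai-oauth` running locally, this is an ordinary fetch:
// const { url, init } = buildChatRequest("gpt-5.3-codex", "Hello");
// const res = await fetch(url, init);
```

Any OpenAI SDK can be pointed at the same base URL by overriding its base-URL option, which is what makes the proxy a drop-in substitute for the hosted API.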

2. Tech Info

Underlying mechanism uses Codex CLI's special rate-limited endpoint at chatgpt.com/backend-api/codex/responses

OpenAI's Codex CLI uses OAuth-authenticated access to a backend endpoint with rate limits tied to ChatGPT subscriptions rather than API credit purchases. This tool reuses those same tokens to access the same endpoint outside the official CLI.
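The mechanism can be sketched as two steps: read the token the Codex CLI already saved, then attach it to upstream requests. The auth.json field names below are illustrative assumptions, not the CLI's documented schema:

```typescript
import { readFileSync } from "node:fs";

// Load an OAuth access token from the Codex CLI's local auth.json.
// Hypothetical shape: { tokens: { access_token: "..." } } or a flat field.
function loadAccessToken(authJsonPath: string): string {
  const auth = JSON.parse(readFileSync(authJsonPath, "utf8"));
  const token = auth.tokens?.access_token ?? auth.access_token;
  if (!token) {
    throw new Error(`no access token found in ${authJsonPath}`);
  }
  return token;
}

// The proxy would then forward requests upstream with that token attached:
// await fetch("https://chatgpt.com/backend-api/codex/responses", {
//   method: "POST",
//   headers: { Authorization: `Bearer ${loadAccessToken(path)}` },
//   body: JSON.stringify(payload),
// });
```

This is also why the rate limits follow the ChatGPT subscription: the upstream endpoint meters the account behind the token, not an API key.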

3. Product Launch

Available models include gpt-5.4 and gpt-5.3-codex, limited to Codex-supported models

The model list is constrained to whatever OpenAI exposes through the Codex backend API. Users cannot access models outside that allowlist, and the available set may change without notice as OpenAI controls the upstream endpoint.
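Since the allowlist can change without notice, the practical way to check what is currently available is the proxy's /v1/models endpoint. A sketch, assuming the response follows the standard OpenAI listing convention of `{ data: [{ id: "..." }, ...] }`:

```typescript
// Extract model IDs from an OpenAI-style model-listing response.
function modelIds(listResponse: { data: { id: string }[] }): string[] {
  return listResponse.data.map((m) => m.id);
}

// With the proxy running:
// const res = await fetch("http://127.0.0.1:10531/v1/models");
// console.log(modelIds(await res.json()));
```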

4. Tech Info

Working features include streaming, tool calls, reasoning traces, and three API endpoints

The proxy supports /v1/responses, /v1/chat/completions, and /v1/models. It also ships as a Vercel AI SDK provider for direct integration into Next.js and similar workflows. Stateful replay on /v1/responses is not yet implemented.

5. Security Alert

OAuth tokens used are password-equivalent and must not be shared, hosted, or pooled

The tool reads from a local auth.json file created by the Codex CLI. The project's own disclaimer explicitly warns against running this as a hosted service or sharing access, as doing so would expose credentials that grant full ChatGPT account access.

6. Legal

Project is unofficial and places full ToS compliance responsibility on the user

The disclaimer states users are solely responsible for complying with OpenAI's Terms of Service. The project is not affiliated with, endorsed by, or sponsored by OpenAI.

7. Context

Package is split into three components: core transport, Vercel AI SDK provider, and CLI proxy

The modular structure (openai-oauth-core, openai-oauth-provider, openai-oauth) allows developers to use just the provider layer in application code or the full CLI proxy for local tooling, sharing the same OAuth transport and token-refresh logic.

New Tech = novel capability or tool; Tech Info = how it works; Product Launch = publicly released software; Security Alert = credential/access risk; Legal = ToS/compliance concern; Context = background framing

What This Means

This tool surfaces a practical workaround for developers who want to experiment with OpenAI's newest models, including gpt-5.4 and gpt-5.3-codex, without buying API credits: it reuses the OAuth tokens that ChatGPT and the Codex CLI already generate locally. It supports streaming, tool calls, and reasoning traces, making it functionally comparable to a paid API integration for personal experimentation. The project places full ToS compliance responsibility on the user, and its own disclaimer flags the tokens as password-equivalent credentials, limiting recommended use to local, non-shared environments. For developers evaluating model capabilities before committing to API spend, the tool offers a low-friction entry point, though the ToS compliance question remains the user's to navigate.
