Gemini-OpenAI-Proxy runs a service that converts OpenAI API calls into Gemini Pro API calls, so you can use an existing ChatGPT client to try Gemini Pro.
OpenAI to Google Gemini:
- https://gemini-openai-proxy.deno.dev
- https://gemini-openai-proxy.zuisong.workers.dev
Gemini-OpenAI-Proxy is a proxy that converts OpenAI API protocol calls into the Google Gemini Pro protocol, so that software built on the OpenAI protocol can use the Gemini Pro model transparently.
If you're interested in using Google Gemini but don't want to modify your software, Gemini-OpenAI-Proxy is a great option: it lets you integrate Gemini's capabilities without any complex development work.
Get an API key from https://makersuite.google.com/app/apikey
✅ Gemini Pro
curl -s http://localhost:8000/v1/chat/completions \
  -H "Authorization: Bearer $YOUR_GEMINI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello, who are you?"}],
    "temperature": 0.7
  }'
✅ Gemini Pro Vision
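A vision request follows the same OpenAI chat-completions format, with the image passed as an `image_url` content part. A sketch of such a request body (the prompt and image URL below are placeholders, not from the project):

```shell
# Sketch of a Gemini Pro Vision request body in the OpenAI chat format.
# The image is sent as an "image_url" content part; URL and prompt are placeholders.
PAYLOAD='{
  "model": "gpt-4-vision-preview",
  "messages": [{
    "role": "user",
    "content": [
      {"type": "text", "text": "What is in this image?"},
      {"type": "image_url", "image_url": {"url": "https://example.com/cat.png"}}
    ]
  }]
}'
printf '%s\n' "$PAYLOAD"
```

It can be sent with the same curl invocation as the text example, e.g. `curl -s http://localhost:8000/v1/chat/completions -H "Authorization: Bearer $YOUR_GEMINI_API_KEY" -H "Content-Type: application/json" -d "$PAYLOAD"`.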
Supported API:
- /v1/chat/completions
  - stream
  - complete
| OpenAI Model | Gemini Model |
| --- | --- |
| gpt-3.5-turbo | gemini-1.0-pro-latest |
| gpt-4 | gemini-1.5-pro-latest |
| gpt-4-vision-preview | gemini-1.0-pro-vision-latest |
| gpt-4-turbo | gemini-1.5-pro-latest |
| gpt-4o | gemini-1.5-flash-latest |
| gpt-4-turbo-preview | gemini-1.5-pro-latest |
| ...others | gemini-1.0-pro-latest |
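The table above can be sketched as a shell function (a hypothetical illustration of the mapping, not the proxy's actual code, which does this conversion internally):

```shell
# Map an OpenAI model name to the Gemini model the proxy substitutes for it.
map_model() {
  case "$1" in
    gpt-3.5-turbo)                          echo "gemini-1.0-pro-latest" ;;
    gpt-4-vision-preview)                   echo "gemini-1.0-pro-vision-latest" ;;
    gpt-4|gpt-4-turbo|gpt-4-turbo-preview)  echo "gemini-1.5-pro-latest" ;;
    gpt-4o)                                 echo "gemini-1.5-flash-latest" ;;
    *)                                      echo "gemini-1.0-pro-latest" ;;  # ...others
  esac
}

map_model gpt-4o   # → gemini-1.5-flash-latest
```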
Deploy to Cloudflare Workers:
- Build command: npm run build:cf_worker
- Copy main_cloudflare-workers.mjs to Cloudflare Workers
Deploy to Deno Deploy:
- Build command: npm run build:deno
- Copy main_deno.mjs to Deno Deploy
Deploy to Vercel:
- Build command: npm run build:cf_worker
- Alternatively, deploy with the CLI: vercel deploy
- Serve locally: vercel dev
- Note: Vercel Functions limitations apply (with the Edge runtime)
Run locally:
- Deno: deno task start:deno
- Node: npm install && npm run start:node
- Bun: bun run start:bun
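Because the proxy speaks the OpenAI protocol, most OpenAI-compatible clients can be pointed at it by overriding the base URL; recent OpenAI SDKs read the standard OPENAI_BASE_URL and OPENAI_API_KEY environment variables. A sketch, assuming the proxy is running locally on port 8000:

```shell
# Point OpenAI-compatible clients at the local proxy instead of api.openai.com.
# YOUR_GEMINI_API_KEY is a placeholder for the key from makersuite.google.com.
export OPENAI_BASE_URL="http://localhost:8000/v1"
export OPENAI_API_KEY="$YOUR_GEMINI_API_KEY"
```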
Source: https://github.com/zuisong/gemini-openai-proxy