[GH-ISSUE #8302] [FR] Update the AI docker container to support generic OpenAI-compatible integrations for self-hosted AI #3712

Open
opened 2026-03-23 21:32:39 +00:00 by mirror · 0 comments
Owner

Originally created by @vendornet on GitHub (Oct 19, 2025).
Original GitHub issue: https://github.com/AppFlowy-IO/AppFlowy/issues/8302

Description

Hi,

I have an Ollama instance running on my server, exposed through an OpenAI-compatible API endpoint secured with an access token. The current AppFlowy AI docker container only accepts OpenAI and Azure integrations.

Could you also make it support generic OpenAI-compatible endpoints?

Thanks!
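For context, Ollama (like most self-hosted LLM servers) speaks the standard OpenAI chat-completions wire format, so a generic integration would only need a configurable base URL and bearer token. The sketch below illustrates that protocol with the Python standard library; the base URL, token, and model name are placeholder assumptions, not AppFlowy configuration:

```python
import json
import urllib.request

# Assumed for illustration: Ollama's OpenAI-compatible prefix and a placeholder token.
OLLAMA_BASE = "http://localhost:11434/v1"
API_TOKEN = "sk-example"


def build_chat_request(base_url: str, token: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request in the OpenAI chat-completions wire format.

    Any server that accepts this shape (OpenAI, Azure, Ollama, vLLM, ...)
    could be targeted by swapping base_url and token.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )


# Sending it requires a reachable server, e.g.:
# with urllib.request.urlopen(build_chat_request(OLLAMA_BASE, API_TOKEN, "llama3", "Hello")) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the request shape is identical across providers, supporting one extra "custom base URL + token" option in the container would cover all such backends at once.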

Impact

All self-hosted users running a self-hosted LLM behind an API server.

All self-hosted users who want to integrate with a third-party LLM provider that exposes an OpenAI-compatible interface, which virtually all major providers do at the moment.

Additional Context

No response
