mirror of
https://github.com/AppFlowy-IO/AppFlowy.git
synced 2026-03-24 04:46:56 +00:00
[GH-ISSUE #7686] [FR] Support OpenAI compatible API endpoint for local AI inference #3401
Labels
No labels
No milestone
No project
No assignees
1 participant
Reference
AppFlowy-IO/AppFlowy#3401
Originally created by @rampa3 on GitHub (Apr 4, 2025).
Original GitHub issue: https://github.com/AppFlowy-IO/AppFlowy/issues/7686
Description
I suggest adding support for an OpenAI-compatible API endpoint for local AI inference. The main reasoning is that while Ollama support is useful for some users, the majority of the target audience of local AI users will likely already have a local AI inference API running for use with other apps, and unless they have been Ollama users from the beginning, their local API will expose an OpenAI-compatible endpoint, as it is the most widespread AI API type. I think incompatibility with the most common AI API type is a major obstacle slowing adoption of the newly added free local AI option.
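To illustrate why the OpenAI-compatible endpoint is the common denominator: the same client request shape works against any server that exposes `POST {base_url}/chat/completions`, differing only in the base URL. The sketch below is illustrative only; the base URLs, ports, and model names are assumptions, not AppFlowy code.

```python
# Sketch: one OpenAI-style chat completions request builder works for any
# compatible local server (LocalAI, Ollama's /v1 endpoint, llama.cpp server,
# etc.) by swapping only the base URL. URLs/models below are hypothetical.
import json
from urllib.request import Request


def build_chat_request(base_url: str, model: str, prompt: str) -> Request:
    """Build an OpenAI-compatible chat completions HTTP request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return Request(
        url=f"{base_url.rstrip('/')}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# Same call shape, different local servers (example base URLs):
localai_req = build_chat_request("http://localhost:8080/v1", "mistral", "Hello")
ollama_req = build_chat_request("http://localhost:11434/v1", "llama3", "Hello")
```

If AppFlowy spoke this protocol with a configurable base URL, any of these servers could back its local AI feature without requiring a second model store.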
Impact
Additional Context
Links to related AppFlowy Discord conversations (+ comments on the messages):
https://discord.com/channels/903549834160635914/903553722804748309/1351760764569911367
Comment: the Ollama API is Ollama-specific; also, UX-wise, Ollama as a fully CLI-operated app is unfriendly to less computer-savvy users
https://discord.com/channels/903549834160635914/903553722804748309/1351912249404559362
Comment: running two inference API servers side by side is problematic due to storage and system resource constraints
https://discord.com/channels/903549834160635914/903553722804748309/1351890916230823966
Extra context:
https://ollama.com/blog/openai-compatibility
Quote from the linked blog post: "Ollama now has built-in compatibility with the OpenAI Chat Completions API, making it possible to use more tooling and applications with Ollama locally."
Personal case to illustrate the storage constraints problem:
As a user of local AI models, I had been looking forward to AppFlowy's local AI implementation since it was announced, but its being limited to Ollama as the inference API server is sadly a dealbreaker: I already run an instance of the LocalAI API and don't want to run two APIs side by side due to storage constraints. I have quite a large number of models installed in my LocalAI instance, so its model folder is already quite big. Installing my text generation models and embedders a second time in Ollama would take up the space for those models twice: once in LocalAI for OpenAI-compatible tools, and once in Ollama for AppFlowy. Doing this would mean keeping, at minimum (counting only one text model plus an embedder), around 5 GiB of duplicate files in my storage, cutting into free space for installing apps, storing documents, and system files.