[GH-ISSUE #4981] [FR] Add local AI capability (Windows/MacOS/Linux) #2218

Closed
opened 2026-03-23 21:20:42 +00:00 by mirror · 5 comments
Owner

Originally created by @cabusar on GitHub (Mar 25, 2024).
Original GitHub issue: https://github.com/AppFlowy-IO/AppFlowy/issues/4981

Description

Hello,

Great work on this project, many thanks. :)

Since OpenAI capability is already here and the OpenAI client library can be used for local AI, it would be nice to be able to change the destination server so we can use our own LLM instead.

In Python this would mean having access to `openai.base_url`, as shown here:

```python
import openai

# optional; defaults to `os.environ['OPENAI_API_KEY']`
openai.api_key = '...'

# all client options can be configured just like the `OpenAI` instantiation counterpart
openai.base_url = "https://..."
```

In this case the OpenAI API key is not relevant and could be anything except null.
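The idea above, reusing the OpenAI wire format against a self-hosted server, can be sketched with only the standard library. This is a minimal illustration, not AppFlowy code; the base URL and model name are assumptions (Ollama's defaults), and the placeholder API key reflects the point that local servers typically ignore it:

```python
import json
import urllib.request

def build_chat_request(base_url, model, prompt, api_key="not-needed"):
    """Build an OpenAI-format chat completion request for any compatible server.

    Local servers (Ollama, text-generation-webui's OpenAI plugin, etc.)
    usually ignore the API key, but the header must still be non-empty.
    """
    url = base_url.rstrip("/") + "/chat/completions"
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {api_key}",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(url, data=body, headers=headers)

# Example: point the request at a local Ollama server instead of api.openai.com.
req = build_chat_request("http://localhost:11434/v1", "llama3.2", "Hello")
# resp = urllib.request.urlopen(req)  # uncomment when a local server is running
```

Only the base URL changes between the hosted and self-hosted cases; the request body is identical, which is why a single configurable endpoint would cover both.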

Impact

Users would be able to use any AI service that uses the OpenAI request format, including local LLMs.

Additional Context

No response

mirror 2026-03-23 21:20:42 +00:00
Author
Owner

@annieappflowy commented on GitHub (Mar 29, 2024):

What LLM models/services would you like to use in AppFlowy?
Which AI features would you like to run on these models?

Author
Owner

@basilkorompilias commented on GitHub (Apr 21, 2024):

Local AI seems like a great suggestion, but for those with older machines who rely on external processing for good AI, it would also be great to support services other than OpenAI, perhaps via a small list of providers.
The most universal would be something like:
https://openrouter.ai/
https://replicate.com/

Author
Owner

@Braintelligence commented on GitHub (Sep 5, 2024):

If I understand correctly, this feature already exists for macOS clients, right? Is there any ETA or roadmap for making it available on Microsoft Copilot-capable Windows devices or others? Leveraging local AI is a really cool feature!

Author
Owner

@RobertTheBrucey commented on GitHub (Jan 14, 2025):

I have https://github.com/oobabooga/text-generation-webui running with an OpenAI API plugin active; please allow us to change the AI endpoint so we can self-host all of AppFlowy's components.

Author
Owner

@annieappflowy commented on GitHub (Mar 18, 2025):

We recently added support for Ollama on macOS, Windows, and Linux.
The feature will be available in the upcoming release (v0.8.7).
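For readers setting this up, a minimal local-server sketch, assuming Ollama is installed (the model name is illustrative). Ollama exposes an OpenAI-compatible endpoint, which is what makes the base-URL approach discussed in this issue work:

```shell
# Download an illustrative model and start the local API server
# (serves on http://localhost:11434 by default).
ollama pull llama3.2
ollama serve

# Ollama's OpenAI-compatible endpoint accepts standard OpenAI-format requests:
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "llama3.2", "messages": [{"role": "user", "content": "Hello"}]}'
```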
