Mirror of https://github.com/AppFlowy-IO/AppFlowy.git, synced 2026-03-24 12:56:59 +00:00
[GH-ISSUE #4981] [FR] Add local AI capability (Windows/MacOS/Linux) #2218
Originally created by @cabusar on GitHub (Mar 25, 2024).
Original GitHub issue: https://github.com/AppFlowy-IO/AppFlowy/issues/4981
Description
Hello,
Great work on this project, many thanks. :)
Since OpenAI support is already in place and the OpenAI client library can also talk to local AI servers, it would be nice to be able to modify the destination server so we can use our own LLM instead.
In Python, this would mean having access to `openai.base_url`.
In this case the OpenAI API key is not relevant and could be anything except null.
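As a hedged sketch of the idea (not AppFlowy's actual implementation): because the OpenAI request format is plain HTTP + JSON, pointing a client at a local server only requires swapping the base URL; the payload stays the same. The endpoint and model name below are placeholders. In the official `openai` Python client, the equivalent is passing `base_url` (and any non-empty `api_key`) to the `OpenAI(...)` constructor.

```python
import json

def build_chat_request(base_url: str, model: str, prompt: str):
    """Build an OpenAI-format chat completion request for any base URL.

    Only the base URL changes between api.openai.com and a local LLM
    server; the request body keeps the same shape.
    """
    url = base_url.rstrip("/") + "/chat/completions"
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, json.dumps(payload)

# Placeholder local endpoint and model name (assumptions, not AppFlowy config).
url, body = build_chat_request("http://localhost:8000/v1", "local-model", "Hello")
print(url)  # http://localhost:8000/v1/chat/completions
```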
Impact
Users would be able to use any AI service that uses the OpenAI request format, including locally hosted LLMs.
Additional Context
No response
@annieappflowy commented on GitHub (Mar 29, 2024):
What LLM models/services would you like to use in AppFlowy?
Which AI features would you like to run on these models?
@basilkorompilias commented on GitHub (Apr 21, 2024):
Local AI seems like a great suggestion, but for those with older machines who rely on external processing for good AI, it would also be great to be able to use services other than OpenAI, perhaps from a small list of providers.
The most universal would be something like:
https://openrouter.ai/
https://replicate.com/
@Braintelligence commented on GitHub (Sep 5, 2024):
If I understand correctly, this feature already exists for macOS clients, right? Is there any ETA or roadmap for making it available on Microsoft Copilot-capable Windows devices or others? Leveraging local AI is a really cool feature!
@RobertTheBrucey commented on GitHub (Jan 14, 2025):
I have https://github.com/oobabooga/text-generation-webui running with an OpenAI API plugin active; please allow us to change the AI endpoint so we can self-host all the components of AppFlowy.
@annieappflowy commented on GitHub (Mar 18, 2025):
We recently added support for Ollama on macOS, Windows, and Linux.
The feature will be available in the upcoming release (v0.8.7).
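For readers following along: Ollama also exposes an OpenAI-compatible API (by default under http://localhost:11434/v1), so it fits the endpoint-swapping pattern requested in this issue. A minimal stdlib sketch that builds, but does not send, such a request; the model name is a placeholder, and a running Ollama instance is assumed only for actually sending it:

```python
import json
import urllib.request

# Default Ollama address; /v1 is its OpenAI-compatible API surface.
OLLAMA_BASE = "http://localhost:11434/v1"

def ollama_chat_request(prompt: str, model: str = "llama3") -> urllib.request.Request:
    """Build (but do not send) an OpenAI-format request to a local Ollama."""
    body = json.dumps({
        "model": model,  # placeholder; use whatever model is pulled locally
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        OLLAMA_BASE + "/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )

req = ollama_chat_request("Hello")
print(req.full_url)            # the endpoint a client would hit
# urllib.request.urlopen(req)  # would send it, if Ollama is running
```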