[PR #7834] [MERGED] Support switch local models #8148

opened 2026-03-23 23:22:49 +00:00 by mirror · 0 comments

📋 Pull Request Information

Original PR: https://github.com/AppFlowy-IO/AppFlowy/pull/7834
Author: @appflowy
Created: 4/25/2025
Status: Merged
Merged: 4/26/2025
Merged by: @appflowy

Base: `main` ← Head: `support_switch_local_models`


📝 Commits (7)

  • 7dd8d06 chore: display all local models
  • 86e6845 chore: update ui
  • 90000ad chore: update ui
  • 3bc0cc7 chore: rename
  • 549e8ae chore: clippy
  • f374ca1 chore: update
  • 28a4303 Merge branch 'main' into support_switch_local_models

📊 Changes

25 files changed (+896 additions, -554 deletions)

View changed files

📝 frontend/appflowy_flutter/lib/ai/service/ai_model_state_notifier.dart (+85 -109)
📝 frontend/appflowy_flutter/lib/ai/service/select_model_bloc.dart (+1 -1)
📝 frontend/appflowy_flutter/lib/ai/widgets/prompt_input/select_model_menu.dart (+62 -64)
📝 frontend/appflowy_flutter/lib/plugins/ai_chat/presentation/message/ai_message_action_bar.dart (+1 -1)
📝 frontend/appflowy_flutter/lib/plugins/ai_chat/presentation/message/ai_message_bubble.dart (+1 -1)
📝 frontend/appflowy_flutter/lib/workspace/application/settings/ai/ollama_setting_bloc.dart (+163 -131)
📝 frontend/appflowy_flutter/lib/workspace/application/settings/ai/settings_ai_bloc.dart (+5 -4)
📝 frontend/appflowy_flutter/lib/workspace/presentation/home/menu/sidebar/shared/sidebar_setting.dart (+6 -3)
📝 frontend/appflowy_flutter/lib/workspace/presentation/settings/pages/setting_ai_view/ollama_setting.dart (+65 -0)
📝 frontend/resources/translations/en.json (+1 -0)
📝 frontend/rust-lib/Cargo.lock (+71 -27)
➕ frontend/rust-lib/flowy-ai-pub/src/persistence/local_model_sql.rs (+54 -0)
📝 frontend/rust-lib/flowy-ai-pub/src/persistence/mod.rs (+2 -0)
📝 frontend/rust-lib/flowy-ai/Cargo.toml (+3 -3)
📝 frontend/rust-lib/flowy-ai/src/ai_manager.rs (+153 -121)
📝 frontend/rust-lib/flowy-ai/src/entities.rs (+12 -6)
📝 frontend/rust-lib/flowy-ai/src/event_handler.rs (+19 -10)
📝 frontend/rust-lib/flowy-ai/src/event_map.rs (+17 -10)
📝 frontend/rust-lib/flowy-ai/src/local_ai/controller.rs (+149 -62)
📝 frontend/rust-lib/flowy-ai/src/local_ai/resource.rs (+0 -1)

...and 5 more files

📄 Description

Feature Preview


PR Checklist

  • My code adheres to AppFlowy's Conventions
  • I've listed at least one issue that this PR fixes in the description above.
  • I've added a test(s) to validate changes in this PR, or this PR only contains semantic changes.
  • All existing tests are passing.

Summary by Sourcery

Introduce support for selecting a default local AI model (Ollama) from a list of available models discovered on the user's machine.

New Features:

  • Allow users to select a default chat model from available local Ollama models in AI settings.
  • Display available local models fetched from the configured Ollama instance in settings.
  • Enable contextual switching between available local and cloud models within the chat interface.
  • Persist the user's selected default local model preference.
  • Fetch and display available models from the configured local AI provider (Ollama).
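The fallback behavior implied by the features above (use the persisted default model if it is still installed, otherwise fall back to whatever is available locally) can be sketched as follows. This is a minimal illustration, not the PR's actual API: the names `LocalModel` and `resolve_chat_model` are hypothetical.

```rust
// Hypothetical sketch of the default-model fallback described above.
// `resolve_chat_model` is an illustrative name, not the PR's API.

#[derive(Debug, Clone, PartialEq)]
struct LocalModel {
    name: String,
}

/// Pick the user's persisted default model if it is still installed,
/// otherwise fall back to the first discovered local model.
fn resolve_chat_model(preferred: Option<&str>, available: &[LocalModel]) -> Option<String> {
    if let Some(name) = preferred {
        if available.iter().any(|m| m.name == name) {
            return Some(name.to_string());
        }
    }
    available.first().map(|m| m.name.clone())
}

fn main() {
    let models = vec![
        LocalModel { name: "llama3:latest".into() },
        LocalModel { name: "mistral:latest".into() },
    ];
    // Persisted preference still present: use it.
    assert_eq!(
        resolve_chat_model(Some("mistral:latest"), &models),
        Some("mistral:latest".to_string())
    );
    // Preference was uninstalled: fall back to the first discovered model.
    assert_eq!(
        resolve_chat_model(Some("gone:latest"), &models),
        Some("llama3:latest".to_string())
    );
    println!("ok");
}
```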

Enhancements:

  • Integrate the Ollama API client (ollama-rs) to interact with the local instance for model discovery.
  • Refactor AI model management logic in both frontend and backend to handle lists of local models and a user-selected default.
  • Improve the UI for selecting models in the chat input and AI settings pages, including animations.
  • Determine and cache whether a discovered local model is for chat or embedding.
  • Rename internal fields like selected_model to global_model and chat_model_name to global_chat_model for better clarity.
  • Update the AI model selection dropdown in settings to list available local models.
  • Fetch local models and settings concurrently on settings page load.
  • Restart local AI provider connection only when the server URL changes.
  • Add animations to the model selection button in the chat input action bar.
  • Make the model selection list in the chat input scrollable.
  • Add a dropdown in Ollama settings to select the default local chat model.
  • Refactor Ollama settings BLoC for better state management and event handling.
  • Refactor backend AI Manager to handle model selection logic more robustly, considering local vs cloud modes and model availability.
  • Refactor local AI controller to use Ollama client for fetching models and checking types.
  • Adapt mobile AI settings UI to reflect model selection changes.
  • Adapt AI model state notifier to use the renamed global_model field.
  • Adapt workspace AI model selection UI to use the renamed global_model field.
  • Add error handling for Ollama API errors.
  • Add SQLite persistence layer for caching local AI model types.
  • Add database schema and migrations for the new local AI model table.
  • Update event handlers and mappings for new local model events.
  • Update default local model names to include ':latest' tag.
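Two of the enhancements above (caching whether a discovered model is for chat or embedding, and normalizing default model names to carry a ':latest' tag) can be sketched together. This is an assumption-laden illustration: the PR caches model types in SQLite via `local_model_sql.rs` and determines the type through the Ollama client, whereas here a `HashMap` stands in for the table and a name-based heuristic stands in for the real check.

```rust
use std::collections::HashMap;

// Illustrative sketch only. The in-memory HashMap stands in for the PR's
// SQLite `local_ai_model_table`, and the substring heuristic stands in for
// the PR's actual chat-vs-embedding detection via the Ollama client.

#[derive(Debug, Clone, Copy, PartialEq)]
enum ModelType {
    Chat,
    Embedding,
}

/// Normalize a model name to carry an explicit ":latest" tag, mirroring the
/// PR's change to the default local model names.
fn with_latest_tag(name: &str) -> String {
    if name.contains(':') {
        name.to_string()
    } else {
        format!("{name}:latest")
    }
}

struct ModelTypeCache {
    cached: HashMap<String, ModelType>,
}

impl ModelTypeCache {
    fn new() -> Self {
        Self { cached: HashMap::new() }
    }

    /// Return the cached type, or classify and cache it on first sight.
    fn get_or_classify(&mut self, name: &str) -> ModelType {
        let key = with_latest_tag(name);
        *self.cached.entry(key.clone()).or_insert_with(|| {
            // Heuristic stand-in: real code would ask the local provider.
            if key.contains("embed") {
                ModelType::Embedding
            } else {
                ModelType::Chat
            }
        })
    }
}

fn main() {
    let mut cache = ModelTypeCache::new();
    assert_eq!(cache.get_or_classify("nomic-embed-text"), ModelType::Embedding);
    assert_eq!(cache.get_or_classify("llama3"), ModelType::Chat);
    assert_eq!(with_latest_tag("llama3"), "llama3:latest");
    println!("ok");
}
```

Caching the classification matters because the type check otherwise requires a round trip to the local provider for every discovered model on each settings load.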

🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.
