[GH-ISSUE #7722] [Bug] LAI doesn't get detected by the flatpak package #3412

Closed
opened 2026-03-23 21:29:58 +00:00 by mirror · 5 comments
Owner

Originally created by @Serendeep on GitHub (Apr 10, 2025).
Original GitHub issue: https://github.com/AppFlowy-IO/AppFlowy/issues/7722

Originally assigned to: @appflowy on GitHub.

Bug Description

AppFlowy Local AI says the app was not installed correctly, even though the plugin is installed.

How to Reproduce

Install the flatpak version of the app using: flatpak install flathub io.appflowy.AppFlowy

Install the LAI plugin following the instructions for Ollama, using this command: curl -fsSL https://raw.githubusercontent.com/AppFlowy-IO/AppFlowy-LAI/main/install.sh | sudo bash

Expected Behavior

AppFlowy Local AI gets installed correctly

Operating System

Arch Linux

AppFlowy Version(s)

0.8.8

Screenshots

![Image](https://github.com/user-attachments/assets/c7904af9-fbac-4666-88f4-707d972c6f3a)

![Image](https://github.com/user-attachments/assets/6b13a241-9894-43f3-958e-92fb8c688f40)

Additional Context

sudo lsof -i:11434
COMMAND PID USER FD TYPE DEVICE SIZE/OFF NODE NAME
ollama 2961 ollama 3u IPv4 39942 0t0 TCP localhost:11434 (LISTEN)
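The lsof output above only shows that something is bound to Ollama's default port. As a quick sanity check (assuming a stock Ollama install; `/api/version` is Ollama's public HTTP version endpoint), the API itself can be probed:

```shell
# Ask Ollama for its version; a JSON reply means the server is actually
# responding, not merely that the port is open.
curl -fsS http://localhost:11434/api/version 2>/dev/null \
  || echo "Ollama is not reachable on localhost:11434"
```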

mirror 2026-03-23 21:29:58 +00:00
Author
Owner

@LucasXu0 commented on GitHub (Apr 11, 2025):

@Serendeep Could you share the log files with us? Here's the guide to export the log files.

https://appflowy.com/ccf3ca5a-7ecd-4c35-805c-3a3c9c22e87d/AppFlowy-Q-A

<!-- gh-comment-id:2795680387 -->
Author
Owner

@Serendeep commented on GitHub (Apr 11, 2025):

@LucasXu0

2025-04-11 13:46:22 INFO flowy_ai::local_ai::controller: [AI Plugin] enable: true, thread id: ThreadId(17)
at flowy-ai/src/local_ai/controller.rs:480

{"msg":"[AI Plugin] enable: true, thread id: ThreadId(17)","time":"04-11 13:46:22","target":"flowy_ai::local_ai::controller"}
2025-04-11 13:46:22 INFO flowy_ai::local_ai::controller: [AI Plugin] lack of resource: LackOfAIResourcePB { resource_type: PluginExecutableNotReady, missing_model_names: [] } to initialize plugin, thread: ThreadId(17)
at flowy-ai/src/local_ai/controller.rs:535

{"msg":"[AI Plugin] lack of resource: LackOfAIResourcePB { resource_type: PluginExecutableNotReady, missing_model_names: [] } to initialize plugin, thread: ThreadId(17)","time":"04-11 13:46:22","target":"flowy_ai::local_ai::controller"}
2025-04-11 13:46:22 INFO flowy_ai::ai_manager: Set global active model to local ai: llama3.1
at flowy-ai/src/ai_manager.rs:400

{"msg":"Set global active model to local ai: llama3.1","time":"04-11 13:46:22","target":"flowy_ai::ai_manager"}
2025-04-11 13:46:22 INFO flowy_ai::ai_manager: [Model Selection] update ai_models_global_active_model selected model: AIModel { name: "llama3.1", is_local: true, desc: "" }
at flowy-ai/src/ai_manager.rs:379

{"msg":"[Model Selection] update ai_models_global_active_model selected model: AIModel { name: "llama3.1", is_local: true, desc: "" }","time":"04-11 13:46:22","target":"flowy_ai::ai_manager"}

Relevant logs from running via command line. Also attaching the logs from yesterday when I was trying this out for additional context.

[log.2025-04-10.txt](https://github.com/user-attachments/files/19701469/log.2025-04-10.txt)

<!-- gh-comment-id:2796188807 -->
Author
Owner

@LucasXu0 commented on GitHub (Apr 15, 2025):

Hi @Serendeep, as far as I know, the flatpak application has no access to system paths by default. See [Flatpak Sandbox Permissions](https://docs.flatpak.org/en/latest/sandbox-permissions.html).

flatpak override io.appflowy.appflowy --filesystem=/usr/local/bin

Could you try running the command above and try again? If it works, we can add it to the documentation.

<!-- gh-comment-id:2803545901 -->
Author
Owner

@Serendeep commented on GitHub (Apr 15, 2025):

2025-04-15 14:59:46  INFO flowy_ai::local_ai::controller: [AI Plugin] enable: true, thread id: ThreadId(17)
    at flowy-ai/src/local_ai/controller.rs:480

{"msg":"[AI Plugin] enable: true, thread id: ThreadId(17)","time":"04-15 14:59:46","target":"flowy_ai::local_ai::controller"}
  2025-04-15 14:59:46  INFO flowy_ai::local_ai::controller: [AI Plugin] lack of resource: LackOfAIResourcePB { resource_type: PluginExecutableNotReady, missing_model_names: [] } to initialize plugin, thread: ThreadId(17)
    at flowy-ai/src/local_ai/controller.rs:535

{"msg":"[AI Plugin] lack of resource: LackOfAIResourcePB { resource_type: PluginExecutableNotReady, missing_model_names: [] } to initialize plugin, thread: ThreadId(17)","time":"04-15 14:59:46","target":"flowy_ai::local_ai::controller"}
  2025-04-15 14:59:46  INFO flowy_ai::ai_manager: Set global active model to local ai: llama3.1
    at flowy-ai/src/ai_manager.rs:400

{"msg":"Set global active model to local ai: llama3.1","time":"04-15 14:59:46","target":"flowy_ai::ai_manager"}
  2025-04-15 14:59:46  INFO flowy_ai::ai_manager: [Model Selection] update ai_models_global_active_model selected model: AIModel { name: "llama3.1", is_local: true, desc: "" }
    at flowy-ai/src/ai_manager.rs:379

{"msg":"[Model Selection] update ai_models_global_active_model selected model: AIModel { name: \"llama3.1\", is_local: true, desc: \"\" }","time":"04-15 14:59:46","target":"flowy_ai::ai_manager"}
{"msg":"[🟢 GET_WORKSPACE_SETTINGS - START]","time":"04-15 14:59:46","target":"client_api::http_settings"}
  2025-04-15 14:59:47  INFO client_api::http: request_id: "5de44524ecd5fcfbf75cc2edcea40432"
    at /home/runner/.cargo/git/checkouts/appflowy-cloud-875aed6322f3953d/f7288f4/libs/client-api/src/http.rs:1177
    in client_api::http_settings::get_workspace_settings

{"msg":"[GET_WORKSPACE_SETTINGS - EVENT] request_id: \"5de44524ecd5fcfbf75cc2edcea40432\"","time":"04-15 14:59:47","target":"client_api::http"}
{"msg":"[GET_WORKSPACE_SETTINGS - END]","time":"04-15 14:59:47","target":"client_api::http_settings"}

2025-04-15 15:00:20  INFO flowy_ai::local_ai::controller: [AI Plugin] enable: false, thread id: ThreadId(17)
    at flowy-ai/src/local_ai/controller.rs:480

{"msg":"[AI Plugin] enable: false, thread id: ThreadId(17)","time":"04-15 15:00:20","target":"flowy_ai::local_ai::controller"}
{"msg":"[🟢 DESTROY_PLUGIN - START]","time":"04-15 15:00:20","target":"af_local_ai::ollama_plugin"}
{"msg":"[DESTROY_PLUGIN - END]","time":"04-15 15:00:20","target":"af_local_ai::ollama_plugin"}
  2025-04-15 15:00:20  INFO flowy_ai::ai_manager: Set global active model to default
    at flowy-ai/src/ai_manager.rs:405

{"msg":"Set global active model to default","time":"04-15 15:00:20","target":"flowy_ai::ai_manager"}
{"msg":"[🟢 GET_WORKSPACE_SETTINGS - START]","time":"04-15 15:00:20","target":"client_api::http_settings"}
  2025-04-15 15:00:21  INFO client_api::http: request_id: "82768be8348b8a6729d87ebf72d18318"
    at /home/runner/.cargo/git/checkouts/appflowy-cloud-875aed6322f3953d/f7288f4/libs/client-api/src/http.rs:1177
    in client_api::http_settings::get_workspace_settings

{"msg":"[GET_WORKSPACE_SETTINGS - EVENT] request_id: \"82768be8348b8a6729d87ebf72d18318\"","time":"04-15 15:00:21","target":"client_api::http"}
{"msg":"[GET_WORKSPACE_SETTINGS - END]","time":"04-15 15:00:21","target":"client_api::http_settings"}
  2025-04-15 15:00:21  INFO flowy_ai::ai_manager: [Model Selection] update ai_models_global_active_model selected model: AIModel { name: "Auto", is_local: false, desc: "Auto select the best model" }
    at flowy-ai/src/ai_manager.rs:379

{"msg":"[Model Selection] update ai_models_global_active_model selected model: AIModel { name: \"Auto\", is_local: false, desc: \"Auto select the best model\" }","time":"04-15 15:00:21","target":"flowy_ai::ai_manager"}
{"msg":"[🟢 GET_WORKSPACE_SETTINGS - START]","time":"04-15 15:00:21","target":"client_api::http_settings"}

Still the same. I'll try reinstalling and see whether that solves anything.

<!-- gh-comment-id:2804427942 -->
Author
Owner

@Serendeep commented on GitHub (Apr 15, 2025):

As per the documentation: https://docs.flatpak.org/en/latest/sandbox-permissions.html#reserved-paths

/usr is a reserved path, so overriding access to it has no effect.
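A minimal sketch of why the suggested override cannot work: Flatpak ignores `--filesystem` grants for reserved host paths such as `/usr`, `/bin`, `/lib`, and `/etc`, so a path check like the one below flags the problem before any override is attempted. The plugin path used here is a hypothetical example of where the install script might place the binary.

```shell
# Flatpak ignores --filesystem overrides for reserved host paths
# (/usr, /bin, /lib, /etc, among others), so check where the plugin
# lives before reaching for `flatpak override`.
plugin_path=/usr/local/bin/af_ollama_plugin   # hypothetical install location
case "$plugin_path" in
  /usr/*|/bin/*|/lib/*|/etc/*|/sbin/*)
    echo "reserved path: a --filesystem override here is silently ignored" ;;
  *)
    echo "ok: flatpak override --filesystem=${plugin_path%/*} can expose it" ;;
esac
```

Moving the plugin binary to a non-reserved location (for example, somewhere under the user's home directory) and granting access to that path would be one way around the restriction.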

<!-- gh-comment-id:2804442594 -->