Mirror of https://github.com/AppFlowy-IO/AppFlowy.git, synced 2026-03-24 04:46:56 +00:00
[GH-ISSUE #7600] [Bug] AI features “Ask AI Anything” and “Explain” failing with ExpertType errors #3370
Originally created by @Medve01 on GitHub (Mar 24, 2025).
Original GitHub issue: https://github.com/AppFlowy-IO/AppFlowy/issues/7600
Bug Description
When using the AI features in AppFlowy, I’m encountering errors related to undefined ExpertTypes. Specifically:
• “Ask AI Anything” feature results in: “No ExpertType associated with value 7”
• “Explain” feature results in: “No ExpertType associated with value 6”
These errors prevent the AI features from functioning properly.
Steps to Reproduce
1. Open AppFlowy
2. Attempt to use the “Ask AI Anything” feature
3. Observe the error in logs: “No ExpertType associated with value 7”
4. Attempt to use the “Explain” feature
5. Observe the error in logs: “No ExpertType associated with value 6”
Expected Behavior
The AI features should function correctly without ExpertType errors.
Operating System
Client: macOS 15.3.2; Server: Ubuntu 24.04.2 LTS, Docker 27.5.1
AppFlowy Version(s)
Client: 0.8.7, Server: latest docker images as of today
Additional Context
Logs:
ai-1 | {"timestamp": "2025-03-24 10:44:34,338", "logger": "uvicorn.error", "level": "ERROR", "message": "Exception in ASGI application
ai-1 | "}
ai-1 | Traceback (most recent call last):
ai-1 | File "/usr/local/lib/python3.12/site-packages/starlette/responses.py", line 257, in __call__
ai-1 | await wrap(partial(self.listen_for_disconnect, receive))
ai-1 | File "/usr/local/lib/python3.12/site-packages/starlette/responses.py", line 253, in wrap
ai-1 | await func()
ai-1 | File "/usr/local/lib/python3.12/site-packages/starlette/responses.py", line 230, in listen_for_disconnect
ai-1 | message = await receive()
ai-1 | ^^^^^^^^^^^^^^^
ai-1 | File "/usr/local/lib/python3.12/site-packages/uvicorn/protocols/http/h11_impl.py", line 534, in receive
ai-1 | await self.message_event.wait()
ai-1 | File "/usr/local/lib/python3.12/asyncio/locks.py", line 212, in wait
ai-1 | await fut
ai-1 | asyncio.exceptions.CancelledError: Cancelled by cancel scope 737857eca030
ai-1 |
ai-1 | During handling of the above exception, another exception occurred:
ai-1 |
ai-1 | + Exception Group Traceback (most recent call last):
ai-1 | | File "/usr/local/lib/python3.12/site-packages/uvicorn/protocols/http/h11_impl.py", line 406, in run_asgi
ai-1 | | result = await app( # type: ignore[func-returns-value]
ai-1 | | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ai-1 | | File "/usr/local/lib/python3.12/site-packages/uvicorn/middleware/proxy_headers.py", line 70, in __call__
ai-1 | | return await self.app(scope, receive, send)
ai-1 | | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ai-1 | | File "/usr/local/lib/python3.12/site-packages/fastapi/applications.py", line 1054, in __call__
ai-1 | | await super().__call__(scope, receive, send)
ai-1 | | File "/usr/local/lib/python3.12/site-packages/starlette/applications.py", line 113, in __call__
ai-1 | | await self.middleware_stack(scope, receive, send)
ai-1 | | File "/usr/local/lib/python3.12/site-packages/starlette/middleware/errors.py", line 187, in __call__
ai-1 | | raise exc
ai-1 | | File "/usr/local/lib/python3.12/site-packages/starlette/middleware/errors.py", line 165, in __call__
ai-1 | | await self.app(scope, receive, _send)
ai-1 | | File "/usr/local/lib/python3.12/site-packages/starlette/middleware/exceptions.py", line 62, in __call__
ai-1 | | await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
ai-1 | | File "/usr/local/lib/python3.12/site-packages/starlette/_exception_handler.py", line 62, in wrapped_app
ai-1 | | raise exc
ai-1 | | File "/usr/local/lib/python3.12/site-packages/starlette/_exception_handler.py", line 51, in wrapped_app
ai-1 | | await app(scope, receive, sender)
ai-1 | | File "/usr/local/lib/python3.12/site-packages/starlette/routing.py", line 715, in __call__
ai-1 | | await self.middleware_stack(scope, receive, send)
ai-1 | | File "/usr/local/lib/python3.12/site-packages/starlette/routing.py", line 735, in app
ai-1 | | await route.handle(scope, receive, send)
ai-1 | | File "/usr/local/lib/python3.12/site-packages/starlette/routing.py", line 288, in handle
ai-1 | | await self.app(scope, receive, send)
ai-1 | | File "/usr/local/lib/python3.12/site-packages/starlette/routing.py", line 76, in app
ai-1 | | await wrap_app_handling_exceptions(app, request)(scope, receive, send)
ai-1 | | File "/usr/local/lib/python3.12/site-packages/starlette/_exception_handler.py", line 62, in wrapped_app
ai-1 | | raise exc
ai-1 | | File "/usr/local/lib/python3.12/site-packages/starlette/_exception_handler.py", line 51, in wrapped_app
ai-1 | | await app(scope, receive, sender)
ai-1 | | File "/usr/local/lib/python3.12/site-packages/starlette/routing.py", line 74, in app
ai-1 | | await response(scope, receive, send)
ai-1 | | File "/usr/local/lib/python3.12/site-packages/starlette/responses.py", line 250, in __call__
ai-1 | | async with anyio.create_task_group() as task_group:
ai-1 | | ^^^^^^^^^^^^^^^^^^^^^^^^^
ai-1 | | File "/usr/local/lib/python3.12/site-packages/anyio/_backends/_asyncio.py", line 680, in __aexit__
ai-1 | | raise BaseExceptionGroup(
ai-1 | | ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
ai-1 | +-+---------------- 1 ----------------
ai-1 | | Traceback (most recent call last):
ai-1 | | File "/usr/src/app/server/app/router/completion.py", line 59, in event_stream
ai-1 | | async for output in stream:
ai-1 | | File "/usr/src/app/core/chains/completion/base.py", line 114, in astream
ai-1 | | chain = self._build_chain(text, completion_type, custom_prompt)
ai-1 | | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ai-1 | | File "/usr/src/app/core/chains/completion/base.py", line 74, in _build_chain
ai-1 | | expert = expert_description_from_type(completion_type)
ai-1 | | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ai-1 | | File "/usr/src/app/core/chains/completion/prompt.py", line 39, in expert_description_from_type
ai-1 | | expert_type = get_expert_type_from_int(expert_type)
ai-1 | | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ai-1 | | File "/usr/src/app/core/chains/completion/prompt.py", line 32, in get_expert_type_from_int
ai-1 | | raise ValueError(f"No ExpertType associated with value {value}")
ai-1 | | ValueError: No ExpertType associated with value 6
ai-1 | |
ai-1 | | During handling of the above exception, another exception occurred:
ai-1 | |
ai-1 | | Traceback (most recent call last):
ai-1 | | File "/usr/local/lib/python3.12/site-packages/starlette/responses.py", line 253, in wrap
ai-1 | | await func()
ai-1 | | File "/usr/local/lib/python3.12/site-packages/starlette/responses.py", line 242, in stream_response
ai-1 | | async for chunk in self.body_iterator:
ai-1 | | File "/usr/src/app/server/app/router/completion.py", line 72, in event_stream
ai-1 | | raise HTTPException(
ai-1 | | fastapi.exceptions.HTTPException: 500: An error occurred while streaming: No ExpertType associated with value 6
ai-1 | +------------------------------------
ai-1 | {"timestamp": "2025-03-24 10:45:21,087", "logger": "root", "level": "ERROR", "message": "[AI Completion] stream_message exception: No ExpertType associated with value 7"}
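The traceback points at get_expert_type_from_int in prompt.py rejecting the integer completion types 6 and 7 sent by the newer client. The failure mode can be sketched in isolation; the ExpertType members below are invented for illustration (the real enum lives in the closed-source appflowy_ai image), and only the lookup function's name and error message come from the traceback.

```python
from enum import IntEnum

# Hypothetical stand-in for the server-side enum. The member names and
# values are assumptions; only the lookup behaviour matters here.
class ExpertType(IntEnum):
    WRITER = 0
    GRAMMAR = 1
    SUMMARIZER = 2

def get_expert_type_from_int(value: int) -> ExpertType:
    """Mirrors the function named in the traceback: an older AI image
    raises ValueError when a newer client sends a completion type
    (such as 6 or 7) that its enum does not define."""
    try:
        return ExpertType(value)
    except ValueError:
        raise ValueError(f"No ExpertType associated with value {value}")
```

Under this reading, the error is a version mismatch rather than a configuration problem: the older server image simply has no enum members for the newer completion types.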
@khorshuheng commented on GitHub (Mar 24, 2025):
Is this for self-hosted AppFlowy Cloud? If so, that is expected, as the current publicly available AppFlowy AI image for self-hosting is not the latest version.
@Medve01 commented on GitHub (Mar 24, 2025):
Yes, it is self-hosted. Is there any way to fix it?
@khorshuheng commented on GitHub (Mar 24, 2025):
At the moment, the latest version of AppFlowy AI is not released publicly, due to plans to offer it as part of the commercial self-hosting plan.
Assuming you have no plans to purchase a license, there are two alternatives at the moment:
1. If you have an Ollama deployment (and prefer it over OpenAI), you can use the AppFlowy Local AI plugin, which was recently open-sourced and is available for free. It provides the newer AI features and bypasses the need for the appflowy_ai service, since the client application sends requests to Ollama directly.
2. Since the client for appflowy_ai is open source (while the source code of appflowy_ai itself is not), it is possible to implement your own HTTP service that satisfies the same client. If you need help with that, you can join our official Discord, and I can provide a more detailed explanation there.
@phubinhdang commented on GitHub (Apr 18, 2025):
@Medve01 I can confirm this error with my local setup. A fix for your personal use could be made by looking into the code inside the appflowy_ai container and providing some code, primarily prompts, that follows the contract of the client for appflowy_ai.
If you have the container started, the code can be found in /usr/src/app.