[GH-ISSUE #7600] [Bug] AI features “Ask AI Anything” and “Explain” failing with ExpertType errors #3370

Closed
opened 2026-03-23 21:29:38 +00:00 by mirror · 4 comments
Owner

Originally created by @Medve01 on GitHub (Mar 24, 2025).
Original GitHub issue: https://github.com/AppFlowy-IO/AppFlowy/issues/7600

Bug Description

When using the AI features in AppFlowy, I’m encountering errors related to undefined ExpertType values. Specifically:
• “Ask AI Anything” feature results in: “No ExpertType associated with value 7”
• “Explain” feature results in: “No ExpertType associated with value 6”
These errors prevent the AI features from functioning properly.
Environment

•	AppFlowy version: not specified here (see AppFlowy Version(s) below)
•	AI container version: unknown
•	Deployment method: Docker containers
•	AI API: OpenAI API (key set via environment variable)

Steps to Reproduce
1. Open AppFlowy
2. Attempt to use the “Ask AI Anything” feature
3. Observe the error in logs: “No ExpertType associated with value 7”
4. Attempt to use the “Explain” feature
5. Observe the error in logs: “No ExpertType associated with value 6”
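Judging from the error messages and the traceback in Additional Context, the server maps an integer "completion type" sent by the client onto an ExpertType enum and raises when the value has no member. A minimal sketch of that failure mode follows; the member names and values 1–3 are invented placeholders, and only the failing values 6 and 7 come from the logs — this is not the actual appflowy_ai source, which is closed.

```python
from enum import IntEnum

class ExpertType(IntEnum):
    # Hypothetical members; an older server image simply has no entries
    # for the newer completion types (6 and 7) that the client sends.
    IMPROVE_WRITING = 1
    SPELLING_AND_GRAMMAR = 2
    SUMMARIZE = 3

def get_expert_type_from_int(value: int) -> ExpertType:
    """Map the integer from the client onto the enum, mirroring the
    error message seen in the logs when the value is unmapped."""
    try:
        return ExpertType(value)
    except ValueError:
        raise ValueError(f"No ExpertType associated with value {value}")
```

Under this reading, the bug is a version skew: a 0.8.7 client sends completion types 6 and 7, while the public AI image only knows older values.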


Expected Behavior

The AI features should function correctly without ExpertType errors.

Operating System

Client: macOS (15.3.2), Server: Ubuntu 24.04.2 LTS, Docker version 27.5.1

AppFlowy Version(s)

Client: 0.8.7, Server: latest docker images as of today

Screenshots

No response

Additional Context

Logs:
ai-1 | {"timestamp": "2025-03-24 10:44:34,338", "logger": "uvicorn.error", "level": "ERROR", "message": "Exception in ASGI application
ai-1 | "}
ai-1 | Traceback (most recent call last):
ai-1 | File "/usr/local/lib/python3.12/site-packages/starlette/responses.py", line 257, in __call__
ai-1 | await wrap(partial(self.listen_for_disconnect, receive))
ai-1 | File "/usr/local/lib/python3.12/site-packages/starlette/responses.py", line 253, in wrap
ai-1 | await func()
ai-1 | File "/usr/local/lib/python3.12/site-packages/starlette/responses.py", line 230, in listen_for_disconnect
ai-1 | message = await receive()
ai-1 | ^^^^^^^^^^^^^^^
ai-1 | File "/usr/local/lib/python3.12/site-packages/uvicorn/protocols/http/h11_impl.py", line 534, in receive
ai-1 | await self.message_event.wait()
ai-1 | File "/usr/local/lib/python3.12/asyncio/locks.py", line 212, in wait
ai-1 | await fut
ai-1 | asyncio.exceptions.CancelledError: Cancelled by cancel scope 737857eca030
ai-1 |
ai-1 | During handling of the above exception, another exception occurred:
ai-1 |
ai-1 | + Exception Group Traceback (most recent call last):
ai-1 | | File "/usr/local/lib/python3.12/site-packages/uvicorn/protocols/http/h11_impl.py", line 406, in run_asgi
ai-1 | | result = await app( # type: ignore[func-returns-value]
ai-1 | | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ai-1 | | File "/usr/local/lib/python3.12/site-packages/uvicorn/middleware/proxy_headers.py", line 70, in __call__
ai-1 | | return await self.app(scope, receive, send)
ai-1 | | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ai-1 | | File "/usr/local/lib/python3.12/site-packages/fastapi/applications.py", line 1054, in __call__
ai-1 | | await super().__call__(scope, receive, send)
ai-1 | | File "/usr/local/lib/python3.12/site-packages/starlette/applications.py", line 113, in __call__
ai-1 | | await self.middleware_stack(scope, receive, send)
ai-1 | | File "/usr/local/lib/python3.12/site-packages/starlette/middleware/errors.py", line 187, in __call__
ai-1 | | raise exc
ai-1 | | File "/usr/local/lib/python3.12/site-packages/starlette/middleware/errors.py", line 165, in __call__
ai-1 | | await self.app(scope, receive, _send)
ai-1 | | File "/usr/local/lib/python3.12/site-packages/starlette/middleware/exceptions.py", line 62, in __call__
ai-1 | | await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
ai-1 | | File "/usr/local/lib/python3.12/site-packages/starlette/_exception_handler.py", line 62, in wrapped_app
ai-1 | | raise exc
ai-1 | | File "/usr/local/lib/python3.12/site-packages/starlette/_exception_handler.py", line 51, in wrapped_app
ai-1 | | await app(scope, receive, sender)
ai-1 | | File "/usr/local/lib/python3.12/site-packages/starlette/routing.py", line 715, in __call__
ai-1 | | await self.middleware_stack(scope, receive, send)
ai-1 | | File "/usr/local/lib/python3.12/site-packages/starlette/routing.py", line 735, in app
ai-1 | | await route.handle(scope, receive, send)
ai-1 | | File "/usr/local/lib/python3.12/site-packages/starlette/routing.py", line 288, in handle
ai-1 | | await self.app(scope, receive, send)
ai-1 | | File "/usr/local/lib/python3.12/site-packages/starlette/routing.py", line 76, in app
ai-1 | | await wrap_app_handling_exceptions(app, request)(scope, receive, send)
ai-1 | | File "/usr/local/lib/python3.12/site-packages/starlette/_exception_handler.py", line 62, in wrapped_app
ai-1 | | raise exc
ai-1 | | File "/usr/local/lib/python3.12/site-packages/starlette/_exception_handler.py", line 51, in wrapped_app
ai-1 | | await app(scope, receive, sender)
ai-1 | | File "/usr/local/lib/python3.12/site-packages/starlette/routing.py", line 74, in app
ai-1 | | await response(scope, receive, send)
ai-1 | | File "/usr/local/lib/python3.12/site-packages/starlette/responses.py", line 250, in __call__
ai-1 | | async with anyio.create_task_group() as task_group:
ai-1 | | ^^^^^^^^^^^^^^^^^^^^^^^^^
ai-1 | | File "/usr/local/lib/python3.12/site-packages/anyio/_backends/_asyncio.py", line 680, in __aexit__
ai-1 | | raise BaseExceptionGroup(
ai-1 | | ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
ai-1 | +-+---------------- 1 ----------------
ai-1 | | Traceback (most recent call last):
ai-1 | | File "/usr/src/app/server/app/router/completion.py", line 59, in event_stream
ai-1 | | async for output in stream:
ai-1 | | File "/usr/src/app/core/chains/completion/base.py", line 114, in astream
ai-1 | | chain = self._build_chain(text, completion_type, custom_prompt)
ai-1 | | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ai-1 | | File "/usr/src/app/core/chains/completion/base.py", line 74, in _build_chain
ai-1 | | expert = expert_description_from_type(completion_type)
ai-1 | | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ai-1 | | File "/usr/src/app/core/chains/completion/prompt.py", line 39, in expert_description_from_type
ai-1 | | expert_type = get_expert_type_from_int(expert_type)
ai-1 | | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ai-1 | | File "/usr/src/app/core/chains/completion/prompt.py", line 32, in get_expert_type_from_int
ai-1 | | raise ValueError(f"No ExpertType associated with value {value}")
ai-1 | | ValueError: No ExpertType associated with value 6
ai-1 | |
ai-1 | | During handling of the above exception, another exception occurred:
ai-1 | |
ai-1 | | Traceback (most recent call last):
ai-1 | | File "/usr/local/lib/python3.12/site-packages/starlette/responses.py", line 253, in wrap
ai-1 | | await func()
ai-1 | | File "/usr/local/lib/python3.12/site-packages/starlette/responses.py", line 242, in stream_response
ai-1 | | async for chunk in self.body_iterator:
ai-1 | | File "/usr/src/app/server/app/router/completion.py", line 72, in event_stream
ai-1 | | raise HTTPException(
ai-1 | | fastapi.exceptions.HTTPException: 500: An error occurred while streaming: No ExpertType associated with value 6
ai-1 | +------------------------------------
ai-1 | {"timestamp": "2025-03-24 10:45:21,087", "logger": "root", "level": "ERROR", "message": "[AI Completion] stream_message exception: No ExpertType associated with value 7"}


@khorshuheng commented on GitHub (Mar 24, 2025):

Is this for self-hosted AppFlowy Cloud? If so, that is expected, as the currently publicly available AppFlowy AI image for self-hosting is not the latest version.


@Medve01 commented on GitHub (Mar 24, 2025):

Yes, it is self-hosted. Is there any way to fix it?


@khorshuheng commented on GitHub (Mar 24, 2025):

At the moment, the latest version of AppFlowy AI has not been released publicly, due to plans to offer it as part of the commercial self-hosting plan.

Assuming you have no plans to purchase a license, there are two alternatives at the moment:

  1. If you have an Ollama deployment (and prefer it over OpenAI), you can use the AppFlowy Local AI plugin, which was recently open-sourced and is available for free. It provides the newer AI features and bypasses the need for the appflowy_ai service, because the client application sends requests to Ollama directly.

  2. Since the client for appflowy_ai is open source (while the source code for appflowy_ai itself is not), it is possible to implement your own HTTP service that satisfies the same client. If you need help with that, you can join our official Discord, and I can provide a more detailed explanation there.


@phubinhdang commented on GitHub (Apr 18, 2025):

@Medve01 I can confirm this error with my local setup. For personal use, a fix could be to look into the code inside the [appflowy_ai container](https://hub.docker.com/r/appflowyinc/appflowy_ai/tags) and provide some code, primarily prompts, that follows the contract of the appflowy_ai client.

If you have the container started, the code can be found in `/usr/src/app`.

![Image](https://github.com/user-attachments/assets/3df0b161-b295-47ee-9ca7-0488bd4dbe6f)
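A personal-use patch along the lines this comment describes might look like the sketch below. Only the integers 6 and 7 are known from the logs; the member names, file layout, and description strings are invented placeholders, not the real appflowy_ai source.

```python
from enum import IntEnum

class ExpertType(IntEnum):
    # Existing members of the container's enum are elided here; the point
    # is to add entries for the values newer clients send. Member names
    # are guesses based on which feature triggers which value in the logs.
    EXPLAIN = 6           # "Explain" fails with "value 6"
    ASK_AI_ANYTHING = 7   # "Ask AI Anything" fails with "value 7"

# Placeholder prompt text; the real prompts would need to be written to
# match the quality expected by each feature.
EXPERT_DESCRIPTIONS = {
    ExpertType.EXPLAIN: "You are an expert at explaining a given text clearly.",
    ExpertType.ASK_AI_ANYTHING: "You are a helpful general-purpose assistant.",
}

def expert_description_from_type(value: int) -> str:
    """Resolve the client's integer to a prompt, so the lookup no longer
    raises ValueError for values 6 and 7."""
    return EXPERT_DESCRIPTIONS[ExpertType(value)]
```

Note that any such patch lives inside the container image and would be lost on image updates unless baked into a derived image or mounted as a volume.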
