This issue has been created
 
 
LLM AI Integration / LLMAI-93 Open

The LLM chat doesn't work when XWiki is the root application

 
 
Michael Hamann created this issue on 01/Jul/24 13:56
 
Summary: The LLM chat doesn't work when XWiki is the root application
Issue Type: Bug
Affects Versions: 0.4
Assignee: Unassigned
Created: 01/Jul/24 13:56
Priority: Major
Reporter: Michael Hamann
Description:

Steps to reproduce:

  1. Install the LLM application on an XWiki instance that is deployed as the root application, e.g., the standard XWiki Docker image.
  2. Configure some models.
  3. Open the chat UI by clicking on the "LLM Chat" button at the bottom right of the page.

Expected result:

The configured models are listed.

Actual result:

No models are listed. The network tab of the browser developer tools shows that a request to /xwiki/rest/wikis/xwiki/aiLLM/v1/models?media=json failed with status 404: the /xwiki/ prefix is invalid because XWiki is deployed as the root application, so its context path is empty. The correct URL would have been /rest/wikis/xwiki/aiLLM/v1/models.
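For illustration, a minimal sketch of the likely fix: build the REST URL relative to the actual servlet context path instead of assuming a fixed "/xwiki" prefix. The function name `buildModelsUrl` and its parameters are hypothetical, not taken from the extension's actual code.

```javascript
// Hypothetical helper: construct the models endpoint from the context path.
// For a root deployment the context path is "" and the URL must not start
// with "/xwiki"; for a non-root deployment it is "/xwiki" (or similar).
function buildModelsUrl(contextPath, wiki) {
  return `${contextPath}/rest/wikis/${wiki}/aiLLM/v1/models?media=json`;
}

// Root deployment (the failing case in this report):
buildModelsUrl('', 'xwiki');
// → '/rest/wikis/xwiki/aiLLM/v1/models?media=json'

// Standard non-root deployment:
buildModelsUrl('/xwiki', 'xwiki');
// → '/xwiki/rest/wikis/xwiki/aiLLM/v1/models?media=json'
```

In an XWiki skin context the context path is typically available to client-side code (e.g., via a server-rendered variable), so the chat UI could read it at runtime rather than hard-coding the prefix.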