Steps to reproduce:
- Install the LLM application on an XWiki instance that is configured as root application, e.g., the standard Docker image of XWiki.
- Configure some models.
- Open the chat UI by clicking on the "LLM Chat" button at the bottom right of the page.
Expected result: The configured models are listed.

Actual result: No models are listed. In the network tab of the browser developer tools, one can see that a request to /xwiki/rest/wikis/xwiki/aiLLM/v1/models?media=json failed with status 404. The /xwiki/ prefix is invalid because XWiki is deployed as the root application; the correct URL would have been /rest/wikis/xwiki/aiLLM/v1/models.
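A minimal sketch of the likely fix, assuming the chat UI builds the REST URL in client-side code (the function name and parameters below are hypothetical, not taken from the actual LLM application sources): the URL prefix should come from the deployment's context path rather than a hardcoded "/xwiki", so that it collapses to the empty string on a root-application install.

```typescript
// Hypothetical helper: derive the models endpoint from the context path
// instead of hardcoding "/xwiki". On a root-application deployment the
// context path is the empty string, so no prefix is emitted.
function modelsUrl(contextPath: string, wiki: string): string {
  return `${contextPath}/rest/wikis/${wiki}/aiLLM/v1/models?media=json`;
}

// Root application (the failing case from this report):
console.log(modelsUrl('', 'xwiki'));
// → /rest/wikis/xwiki/aiLLM/v1/models?media=json

// Default deployment under the /xwiki context:
console.log(modelsUrl('/xwiki', 'xwiki'));
// → /xwiki/rest/wikis/xwiki/aiLLM/v1/models?media=json
```

In XWiki's client-side JavaScript API the actual context path is exposed as `XWiki.contextPath`, which would be the natural value to pass here.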