Steps to reproduce:
Configure an LLM server whose API token is longer than 768 characters (for example, IONOS) and save the configuration. A dummy token of comparable length can be used instead of a real one, as sketched below.
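
A minimal way to reproduce without an account at such a provider is to generate a dummy token above the 768-character threshold mentioned in this report; the snippet below is purely illustrative and assumes any string over that length triggers the same failure when pasted into the token field.

```python
import secrets

# Generate a dummy API token longer than 768 characters for reproduction only.
# token_urlsafe(600) yields roughly 800 URL-safe characters.
dummy_token = secrets.token_urlsafe(600)
assert len(dummy_token) > 768
print(f"token length: {len(dummy_token)}")
print(dummy_token)
```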
Expected result:
The configuration is saved and the LLM server works with the full token.
Actual result:
Saving the configuration fails when the token exceeds 768 characters.