When a list of models is provided in the configuration, the LLM app shouldn't query the API for available models. This fixes compatibility with providers such as https://app.fireworks.ai/ that don't expose a models endpoint. It is also a first step toward Azure support in LLMAI-23.
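A minimal sketch of the intended behavior (names like `ProviderConfig`, `models`, and `available_models` are hypothetical, not the app's actual API): when the config lists models, return them directly and never call the provider's models endpoint.

```python
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class ProviderConfig:
    # Hypothetical config shape for illustration only.
    api_base: str
    models: List[str] = field(default_factory=list)


def available_models(
    config: ProviderConfig,
    fetch_remote: Callable[[str], List[str]],
) -> List[str]:
    """Return models from the config if present, else query the API."""
    if config.models:
        # Configured list wins: skip the API call entirely, so providers
        # without a models endpoint (e.g. Fireworks) still work.
        return config.models
    return fetch_remote(config.api_base)
```

With this guard in place, the remote fetch function is never invoked for a provider whose config already enumerates its models.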