Add support for Aleph Alpha Luminous models #60
Labels
Prio: SHOULD
Important feature, but would not block completion of the milestone.
Status: in progress
The task is currently being processed.
Aleph Alpha is a German company that provides a set of services built on their Luminous series of LLMs. The benefits of using these models over OpenAI's are data privacy, a European focus, and the possibility to self-host.
First evaluations indicate that at least the biggest "supreme-control" model may be usable with the "standard" (non-agent) connectors (extract, decide, compose, translate).
This requires adding an option for these models in the connector template and a switch on the Python side to select the right LLM (only ChatOpenAI so far). This is also the time to move away from the vendor- and model-name-agnostic model descriptions in the model select: as more models are added, we need to call them by their names.
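The switch described above could look roughly like the sketch below. The model names, the `LLMConfig` container, and the `select_llm` helper are all assumptions for illustration, not the project's actual identifiers:

```python
# Hypothetical sketch of the Python-side switch: map explicit vendor/model
# names (as shown in the model select) to a configuration that the connector
# code can use to instantiate the right LLM wrapper. All names here are
# placeholders, not the project's real identifiers.
from dataclasses import dataclass


@dataclass
class LLMConfig:
    vendor: str
    model: str


# Explicit vendor/model names instead of vendor-agnostic descriptions.
SUPPORTED_MODELS = {
    "gpt-3.5-turbo": LLMConfig(vendor="openai", model="gpt-3.5-turbo"),
    "luminous-supreme-control": LLMConfig(
        vendor="aleph-alpha", model="luminous-supreme-control"
    ),
}


def select_llm(model_name: str) -> LLMConfig:
    """Return the configuration for the requested model; fail loudly otherwise."""
    try:
        return SUPPORTED_MODELS[model_name]
    except KeyError:
        raise ValueError(f"Unsupported model: {model_name!r}") from None
```

Failing loudly on unknown names keeps the connector template and the Python side in sync: a typo in the template surfaces immediately instead of silently falling back to a default model.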
Furthermore, the models need specific prompts. They seem to follow the "Stanford/Alpaca" prompting scheme, so these changes can be made generic to also support some open-source LLMs in the future.
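A generic prompt builder for that scheme could be a small helper like the one below. The template text is the published Alpaca format; whether Luminous actually responds best to it is exactly what the evaluations mentioned above would need to confirm, and the function name is a placeholder:

```python
# Sketch of a generic Stanford/Alpaca-style prompt builder. The header and
# "### Instruction:" / "### Input:" / "### Response:" section markers follow
# the published Alpaca template; nothing here is specific to Aleph Alpha.
def build_alpaca_prompt(instruction: str, input_text: str = "") -> str:
    """Format an instruction (and optional input context) as an Alpaca prompt."""
    if input_text:
        header = (
            "Below is an instruction that describes a task, paired with an "
            "input that provides further context. Write a response that "
            "appropriately completes the request.\n\n"
        )
    else:
        header = (
            "Below is an instruction that describes a task. Write a response "
            "that appropriately completes the request.\n\n"
        )
    prompt = header + f"### Instruction:\n{instruction}\n\n"
    if input_text:
        prompt += f"### Input:\n{input_text}\n\n"
    return prompt + "### Response:\n"
```

Because the connectors (extract, decide, compose, translate) only differ in instruction and input, a single builder like this can serve all of them and, later, any open-source LLM trained on the same scheme.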