Ollama AI Chat Completion Example / OpenAI Compatibility #191
Given that Ollama's Chat Completion API is officially OpenAI-compatible, this library should work without any changes. Hence it would be great to have a fully self-contained example that works against a local Ollama instance using this crate.
https://ollama.com/blog/openai-compatibility
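A minimal sketch of what such a self-contained example could look like, assuming `async-openai` and `tokio` (with the `macros` and `rt-multi-thread` features) as dependencies. The only change compared with a call against the OpenAI API is the base URL passed to `OpenAIConfig::with_api_base`; the local URL `http://localhost:11434/v1`, the placeholder API key, and the `llama3.2` model name are assumptions that depend on the local Ollama setup:

```rust
use async_openai::{
    config::OpenAIConfig,
    types::{ChatCompletionRequestUserMessageArgs, CreateChatCompletionRequestArgs},
    Client,
};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Point the client at the local Ollama server instead of api.openai.com.
    // Ollama ignores the API key, but the header still gets sent, so any
    // non-empty placeholder value works.
    let config = OpenAIConfig::new()
        .with_api_base("http://localhost:11434/v1")
        .with_api_key("ollama");

    let client = Client::with_config(config);

    // Any model already pulled into the local Ollama instance can be used here.
    let request = CreateChatCompletionRequestArgs::default()
        .model("llama3.2")
        .messages([ChatCompletionRequestUserMessageArgs::default()
            .content("Why is the sky blue?")
            .build()?
            .into()])
        .build()?;

    let response = client.chat().create(request).await?;

    if let Some(content) = &response.choices[0].message.content {
        println!("{content}");
    }

    Ok(())
}
```

With the Ollama server running and the chosen model pulled locally, this is expected to behave the same as a call against the OpenAI API, since only the endpoint configuration changes.

Comments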
This would be great. Is it just a matter of switching the endpoint?
I do not know :) Hence this example calls for being self-contained, so that it works out of the box and answers questions like that :)
I'll give it a shot!
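Regarding the question above about whether switching the endpoint is enough: streaming appears to go through the same path, so a streaming variant would only differ in using `create_stream`. A sketch under the same assumptions as above, additionally assuming the `futures` crate for `StreamExt`:

```rust
use async_openai::{
    config::OpenAIConfig,
    types::{ChatCompletionRequestUserMessageArgs, CreateChatCompletionRequestArgs},
    Client,
};
use futures::StreamExt;
use std::io::{stdout, Write};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Same endpoint switch as in the non-streaming sketch above.
    let config = OpenAIConfig::new()
        .with_api_base("http://localhost:11434/v1")
        .with_api_key("ollama");
    let client = Client::with_config(config);

    let request = CreateChatCompletionRequestArgs::default()
        .model("llama3.2")
        .messages([ChatCompletionRequestUserMessageArgs::default()
            .content("Write a haiku about Rust.")
            .build()?
            .into()])
        .build()?;

    // create_stream yields partial responses as they arrive over SSE.
    let mut stream = client.chat().create_stream(request).await?;

    while let Some(result) = stream.next().await {
        let chunk = result?;
        if let Some(delta) = chunk.choices.first().and_then(|c| c.delta.content.as_ref()) {
            print!("{delta}");
            stdout().flush()?;
        }
    }
    println!();

    Ok(())
}
```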
matthewhaynesonline pushed and added several commits to matthewhaynesonline/async-openai that referenced this issue on Feb 17 and Feb 24, 2025.