Add simple ollama chat example #336
Conversation
force-pushed from d4122cd to 2ded27b
force-pushed from 2ded27b to 36493ac
async-openai/README.md (Outdated)
@@ -59,6 +59,11 @@ $Env:OPENAI_API_KEY='sk-...'
- Visit [examples](https://github.com/64bit/async-openai/tree/main/examples) directory on how to use `async-openai`.
- Visit [docs.rs/async-openai](https://docs.rs/async-openai) for docs.

### Local LLM Usage
This is not required; the Ollama example has its own README.
examples/ollama-chat/README.md (Outdated)
@@ -0,0 +1,18 @@
## Prerequisites
Please consider adding a fully contained example. In this case, since there's a dependency on Ollama, please consider creating a docker-compose file to bring it up in a single command.
I'd prefer to have a quick way to run an example, rather than additional steps that a user may already have done on their own.
Hi @64bit, I've updated my PR with your feedback, though the Docker setup has some caveats I didn't have elegant solutions for.
What do you think?
Thanks for the update, I just tried it and it looks good! Your entrypoint to download the model looks good to me. So the user has to do two steps:
@@ -0,0 +1,9 @@
FROM rust:1.85 AS builder
Not required
    container_name: ollama_chat
    build: .
    command: ollama-chat
    restart: on-failure:3
    environment:
      API_BASE: "http://ollama:11434/v1"
      API_KEY: "ollama"
      MODEL: "llama3.2:1b"
    depends_on:
      ollama:
        condition: service_healthy
not required
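The `depends_on` block above assumes an `ollama` service with a healthcheck, which isn't shown in this excerpt. A minimal sketch of what that service might look like (the image, healthcheck command, and volume name are assumptions, not taken from the PR):

```yaml
  ollama:
    image: ollama/ollama
    container_name: ollama
    ports:
      - "11434:11434"
    volumes:
      # Persist pulled models across container restarts.
      - ollama_data:/root/.ollama
    healthcheck:
      # The ollama binary is available inside the image, so `ollama list`
      # only succeeds once the server is up and responding.
      test: ["CMD", "ollama", "list"]
      interval: 5s
      retries: 10
```

With both services defined, a single `docker compose up --build` should start Ollama, wait for it to become healthy, and then run the chat example.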
sleep 5

echo "Retrieving model $MODEL..."
ollama pull $MODEL
Nice
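For context, the snippet above presumably sits inside a start-then-pull entrypoint. A sketch of what such a script might look like (the serve-in-background pattern is an assumption, not the PR's exact file):

```sh
#!/bin/sh
# Start the Ollama server in the background so the model can be pulled.
ollama serve &

# Give the server a moment to start accepting requests.
sleep 5

echo "Retrieving model $MODEL..."
ollama pull $MODEL

# Hand control back to the server process so the container stays up.
wait
```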
#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
    let api_base = env::var("API_BASE").expect("API_BASE is not set in the environment.");
This can be hard-coded to what docker compose will bring up, so localhost:11434.
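For reference, a client setup along the lines suggested might look like the sketch below, using async-openai's `OpenAIConfig`. This is a sketch, not the PR's actual code; the hard-coded base URL follows the reviewer's suggestion, and the model name and prompt are placeholders:

```rust
use std::error::Error;

use async_openai::{
    config::OpenAIConfig,
    types::{ChatCompletionRequestUserMessageArgs, CreateChatCompletionRequestArgs},
    Client,
};

#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
    // Point the client at the local Ollama server instead of api.openai.com.
    // Ollama ignores the API key, but the client expects one to be set.
    let config = OpenAIConfig::new()
        .with_api_base("http://localhost:11434/v1")
        .with_api_key("ollama");
    let client = Client::with_config(config);

    let request = CreateChatCompletionRequestArgs::default()
        .model("llama3.2:1b")
        .messages([ChatCompletionRequestUserMessageArgs::default()
            .content("Why is the sky blue?")
            .build()?
            .into()])
        .build()?;

    let response = client.chat().create(request).await?;
    println!(
        "{}",
        response.choices[0].message.content.as_deref().unwrap_or("")
    );
    Ok(())
}
```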
Fixes #191.
I used the code examples from #173 as a starting point and tried to simplify them to match the existing chat example.