
_get_local_provider_call should return an AsyncOpenAI client when appropriate #899

Closed
yml opened this issue Mar 9, 2025 · 1 comment · Fixed by #901
yml commented Mar 9, 2025

Description

_get_local_provider_call should return a client appropriate to the execution context (sync or async), as is already the case for the OpenAI provider, for example.

This is probably not the best solution, but it illustrates the idea:

 diff --git a/mirascope/llm/_call.py b/mirascope/llm/_call.py
index 48dc2098..ebec6992 100644
--- a/mirascope/llm/_call.py
+++ b/mirascope/llm/_call.py
@@ -2,6 +2,7 @@
 
 from __future__ import annotations
 
+import asyncio
 from collections.abc import AsyncIterable, Awaitable, Callable, Iterable
 from enum import Enum
 from functools import wraps
@@ -56,9 +57,13 @@ def _get_local_provider_call(
 
         if client:
             return openai_call, client
-        from openai import OpenAI
+        from openai import AsyncOpenAI, OpenAI
 
-        client = OpenAI(api_key="ollama", base_url="http://localhost:11434/v1")
+        try:
+            asyncio.get_running_loop()
+            client = AsyncOpenAI(api_key="ollama", base_url="http://localhost:11434/v1")
+        except RuntimeError:
+            client = OpenAI(api_key="ollama", base_url="http://localhost:11434/v1")
         return openai_call, client
     else:  # provider == "vllm"
         from ..core.openai import openai_call
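For reference, here is a minimal standalone sketch of the detection trick the diff relies on (not the merged fix, just the pattern): asyncio.get_running_loop() raises RuntimeError when called outside a running event loop, so it can distinguish sync from async call sites at client-construction time.

import asyncio

from openai import AsyncOpenAI, OpenAI


def make_ollama_client() -> OpenAI | AsyncOpenAI:
    """Pick a sync or async client based on the calling context."""
    try:
        # Succeeds only when called from inside a running event loop.
        asyncio.get_running_loop()
        return AsyncOpenAI(api_key="ollama", base_url="http://localhost:11434/v1")
    except RuntimeError:
        # No running loop: we are in a synchronous context.
        return OpenAI(api_key="ollama", base_url="http://localhost:11434/v1")


# Sync context: no event loop is running here, so we get the sync client.
assert isinstance(make_ollama_client(), OpenAI)


async def main() -> None:
    # Async context: called from inside asyncio.run's event loop.
    assert isinstance(make_ollama_client(), AsyncOpenAI)


asyncio.run(main())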

Python, Mirascope & OS Versions, related packages (not required)

The issue occurred for me using Ollama, but exactly the same issue likely exists with vLLM.
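For concreteness, a hedged reproduction sketch of the failure mode: the decorator usage follows mirascope's documented llm.call API, but the model name and a locally running Ollama server are assumptions on my part, not part of the report. Before the fix, the async function was handed a sync OpenAI client.

import asyncio

from mirascope import llm


# provider="ollama" routes through _get_local_provider_call, the code path
# patched in the diff above; "llama3.2" is an assumed locally pulled model.
@llm.call(provider="ollama", model="llama3.2")
async def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"


# Runs inside an event loop, so an AsyncOpenAI client is required here;
# prior to the fix this received a sync client and failed.
asyncio.run(recommend_book("fantasy"))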

@yml yml added the bug Something isn't working label Mar 9, 2025
@willbakst willbakst self-assigned this Mar 9, 2025
@willbakst willbakst mentioned this issue Mar 9, 2025
willbakst (Contributor) commented

Fix released in v1.21.1! Thanks for pointing this out.
