
Add improved support for chat history message insertion + documentation #264

Closed
willbakst opened this issue May 21, 2024 · 7 comments · Fixed by #333

Comments

@willbakst
Contributor

Description

Right now users need to manually create the tool messages even though all of the information is generally already present in the tool. A method like tool.message_param would make this far more convenient (similar to how we can already do response.message.model_dump() for the assistant message).

We may also want to provide additional convenience, something like response.user_message_param, for inserting the user message back in. This one likely needs a little more thought than the tool call message feature.
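To illustrate the tool side of the idea, here is a minimal sketch of a tool object that builds its own OpenAI-style message param from data it already holds. The names here (`ToolResult`, `tool_call_id`, `output`) are hypothetical stand-ins, not the actual Mirascope API:

```python
# Hypothetical sketch: a tool result that renders itself as a chat message
# param. The dict shape follows OpenAI's `role: tool` messages; the real
# Mirascope interface may differ.
from dataclasses import dataclass


@dataclass
class ToolResult:
    tool_call_id: str
    output: str

    @property
    def message_param(self) -> dict:
        """Build the tool message from information the tool already has."""
        return {
            "role": "tool",
            "tool_call_id": self.tool_call_id,
            "content": self.output,
        }


tool = ToolResult(tool_call_id="call_123", output="72 degrees")
messages = [tool.message_param]  # insert directly into chat history
```

The point is that the user never assembles the dict by hand; the tool already knows its own call id and output.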

@willbakst willbakst added the enhancement New feature or request label May 21, 2024
@willbakst
Contributor Author

Updating this feature request to be separate from tool call message param since the tool stuff is clear cut and the other stuff needs a little more thought.

Will be tracking the tool stuff in #294

@willbakst willbakst changed the title Add improved support for chat history message insertion (namely tools) + documentation Add improved support for chat history message insertion + documentation Jun 1, 2024
@willbakst willbakst added this to the v0.17 milestone Jun 5, 2024
@willbakst
Contributor Author

One thought I just had for this would be to add a new Mirascope Stream class instead of using the raw Generator. The added bonus of creating our own class (and not just a type alias) is that we can do some nice things around return values.

We can have the generator return a tuple of values and then add those values to the generator object for easy access. For example, a generator that returns assistant_message could expose it as stream.assistant_message_param after the stream completes, so that the assistant message param can be easily inserted into chat history. This holds for all streaming across things like tools and cost tracking as well: if we return a tuple, we can insert each item as a named property of the stream (e.g. stream.cost).

This would provide some much-needed convenience around reinserting messages, which is particularly annoying when streaming. We can provide similar convenience for normal calls as well, e.g. response.assistant_message_param.

Naming needs some thought, but I think the idea is there.

@willbakst
Contributor Author

Worth noting that we should also ensure that mypy and other type checkers don't get upset with these chat history and MESSAGES flows.

@willbakst
Contributor Author

willbakst commented Jun 5, 2024

Plan of attack after further thinking and discussion:

  • Add a response.message_param property
    This will return the proper type since response.message is a ChatCompletionMessage pydantic model and not a message param. Returning the correct type will make the type checker happy.
  • Consider adding response.user_message_param.
    It's easy enough to write {"role": "user", "content": user_message} when the user message is fully templated. However, sometimes the user message is part of the prompt_template. The response.user_message_param convenience property would handle all of these cases for easy insertion, and if there is no user message it'll just return None.
  • Create a Mirascope BaseStream class with provider-extended generators that have useful properties like message_param.
    Right now when streaming, the user has to manually reconstruct the messages; however, we have access to all of that information internally and could reconstruct and return it as part of the generator. With a custom generator, we can then attach the StopIteration return value (i.e. a tuple of return values) to individual properties on the stream object for easy access, like we have for responses, e.g. stream.message_param or stream.history.
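The first two bullets could look roughly like the following hedged sketch; `Response` and everything inside it are stand-ins for illustration, not the shipped implementation:

```python
# Sketch: a response that exposes properly typed message params, including
# an optional user message param that is None when no user message exists.
class Response:
    def __init__(self, assistant_content, user_content=None):
        self._assistant_content = assistant_content
        self._user_content = user_content

    @property
    def message_param(self) -> dict:
        # Return a message *param* dict (not the response model) so that
        # reinsertion into chat history satisfies the type checker.
        return {"role": "assistant", "content": self._assistant_content}

    @property
    def user_message_param(self):
        # Covers both templated and prompt_template-driven user messages;
        # returns None when there is no user message at all.
        if self._user_content is None:
            return None
        return {"role": "user", "content": self._user_content}


response = Response("Hi!", user_content="Hello?")
history = [response.user_message_param, response.message_param]
```

The convenience is symmetric: both ends of a turn come back as insertable params without the user hand-writing any dicts.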

@willbakst willbakst self-assigned this Jun 6, 2024
@willbakst
Contributor Author

I've removed the response.history idea after further thought. It's easy enough to assemble your own array to add, and there are too many flows for a single property to cover properly.

@willbakst
Contributor Author

willbakst commented Jun 7, 2024

Add ProviderChatMessage e.g. OpenAIChatMessage types and coerce to message params under the hood
Users coming from the OpenAI docs or SDK will expect to be able to insert response.message into chat history, because OpenAI makes that work even though their typing doesn't match it. We should just allow this insertion and, under the hood, convert any output pydantic models into their input typed dict counterparts. This way users don't have to learn the difference and Mirascope just makes it easy.

@jbbakst I'm feeling less sold on this having implemented the message_param given that using message_param matches the typing of the SDK (even though it technically accepts the response message type) and provides a common interface across providers where sometimes you can't pass the message in directly.
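A minimal illustration of the coercion idea under discussion, using a dataclass as a stand-in for a provider's pydantic response-message model (e.g. OpenAI's ChatCompletionMessage); the names are hypothetical:

```python
# Sketch: accept either a response-message model or a plain message param
# dict, coercing the model into its input dict counterpart under the hood.
from dataclasses import dataclass, asdict


@dataclass
class ChatMessage:
    """Stand-in for a provider's output message model."""
    role: str
    content: str


def to_message_param(message) -> dict:
    if isinstance(message, ChatMessage):
        return asdict(message)  # output model -> input param dict
    return message  # already a param dict; pass through unchanged


history = [to_message_param(ChatMessage(role="assistant", content="Hi!"))]
print(history[0])  # {'role': 'assistant', 'content': 'Hi!'}
```

The trade-off raised above still applies: message_param already matches the SDK's input typing and works uniformly across providers, whereas this coercion only papers over one provider's leniency.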

@willbakst
Contributor Author

Update: everything here is implemented and merged in. Only missing piece is docs, which I will add as part of the 0.17 release.
