
[PY] feat: streaming support for Tools Augmentation #2215

Draft · wants to merge 4 commits into base: main

Conversation

BMS-geodev (Contributor)

Linked issues

closes: #2212

Details

Provide a list of your changes here. If you are fixing a bug, please provide steps to reproduce the bug.

Change details

Describe your changes, with screenshots and code snippets as appropriate

code snippets:

screenshots:

Attestation Checklist

  • My code follows the style guidelines of this project

  • I have checked for/fixed spelling, linting, and other errors

  • I have commented my code for clarity

  • I have made corresponding changes to the documentation (updating the doc strings in the code is sufficient)

  • My changes generate no new warnings

  • I have added tests that validate my changes and provide sufficient test coverage. I have tested with:

    • Local testing
    • E2E testing in Teams
  • New and existing unit tests pass locally with my changes

Additional information

Feel free to add other relevant information below

traceback.print_exc()

# Send a message to the user
await context.send_activity("The bot encountered an error or bug.")
# await context.send_activity("The bot encountered an error or bug.")
Reviewer (Contributor): forgot to remove commented-out lines in the file

Reviewer (Contributor): @BMS-geodev I think you accidentally uncommented instead of removing for the latest updates

python/packages/ai/teams/ai/models/openai_model.py (outdated, resolved)
message.action_calls[index].function.name += curr_tool_call.function.name
if curr_tool_call.function.arguments:
message.action_calls[index].function.arguments += curr_tool_call.function.arguments
if curr_tool_call.type == "function": # Type must always match the expected value
Reviewer (Contributor): extra comment

Reviewer (Contributor): are we supposed to set the type regardless?

BMS-geodev (Author): I can experiment with leaving it empty; however, to my understanding, if it's anything other than empty, it has to be "function"

Reviewer (Contributor): Yup, did it end up throwing an error?
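The thread above is about merging streamed tool-call deltas: name and argument fragments arrive across chunks and must be concatenated, and OpenAI only accepts "function" as the call type when it is set at all. A minimal sketch of that accumulation logic, using hypothetical `ActionCall`/`ActionFunction` stand-ins for the SDK's real types:

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical stand-ins for the library's action-call types.
@dataclass
class ActionFunction:
    name: str = ""
    arguments: str = ""

@dataclass
class ActionCall:
    id: str = ""
    type: str = ""  # when set, OpenAI requires this to be exactly "function"
    function: ActionFunction = field(default_factory=ActionFunction)

def merge_tool_call_delta(action_calls: List[ActionCall], index: int, delta) -> None:
    """Merge one streamed tool-call delta into the accumulated list."""
    # Grow the list when this delta starts a new tool call.
    while len(action_calls) <= index:
        action_calls.append(ActionCall())
    call = action_calls[index]
    if getattr(delta, "id", None):
        call.id = delta.id
    # Name and argument fragments are concatenated across chunks.
    if delta.function and delta.function.name:
        call.function.name += delta.function.name
    if delta.function and delta.function.arguments:
        call.function.arguments += delta.function.arguments
    # Type must always match the expected value when present.
    if getattr(delta, "type", None) == "function":
        call.type = "function"
```

This is only a sketch of the technique under discussion, not the PR's implementation; the real code operates on the model's chunk objects directly.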

python/packages/ai/teams/ai/models/openai_model.py (outdated, resolved)
if chunk.delta and (
(chunk.delta.action_calls and len(chunk.delta.action_calls) > 0) or
chunk.delta.action_call_id or
getattr(chunk.delta, "tool_calls", None)
Reviewer (Contributor): we should keep it consistent with JS/C# and check chunk.delta.content instead
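The suggestion above is to gate streaming on text content rather than on the presence of tool calls. A minimal sketch of that check, assuming chunks shaped like the OpenAI delta objects (the function name here is hypothetical):

```python
def should_forward_chunk(chunk) -> bool:
    # Forward a streamed chunk to the user-facing streamer only when its
    # delta carries text content; tool-call deltas arrive with content=None.
    # Checking delta.content keeps the Python client consistent with the
    # JS/C# implementations.
    return bool(chunk is not None and chunk.delta
                and getattr(chunk.delta, "content", None))
```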

@@ -284,7 +291,20 @@ def chunk_received(

if streamer is not None:
await streamer.end_stream()

Reviewer (Contributor): need to also remove L281-283 (check for is_streaming and res.status == "success")

@@ -284,7 +291,20 @@ def chunk_received(

if streamer is not None:
await streamer.end_stream()
Reviewer (Contributor): also need to remove this line

BMS-geodev (Author): does this not align with this line from Steve's updates?

await streamer.endStream();

else {
    if (response.status == 'success') {
        // Delete message from response to avoid sending it twice
        delete response.message;
    }

    // End the stream and remove pointer from memory
    // - We're not listening for the response received event because we can't await the completion of events.
    await streamer.endStream();
    memory.deleteValue('temp.streamer');
}

BMS-geodev (Author): oh I may have the wrong place

python/packages/ai/teams/ai/clients/llm_client.py (outdated, resolved)
res.message.content = ""
else:
if res.status == "success":
res.message = None
Reviewer (Contributor): could we also add in the comments that Steve has for JS so we remember the purpose of these edge cases in the future
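The edge case discussed here mirrors the JS snippet quoted earlier: on a successful streamed response, the message must be dropped from the response so it is not sent twice, then the stream is ended and the streamer pointer removed from memory. A hedged Python sketch of that teardown, assuming `end_stream`/`delete_value` methods analogous to the JS `endStream`/`deleteValue` (the function name is hypothetical):

```python
async def finish_streaming(res, streamer, memory) -> None:
    if res.status == "success":
        # Delete message from response to avoid sending it twice:
        # the streamer already delivered it chunk by chunk.
        res.message = None
    # End the stream and remove the pointer from memory.
    # (We're not listening for the response-received event because
    # we can't await the completion of events.)
    await streamer.end_stream()
    memory.delete_value("temp.streamer")
```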
