[Bug]: Agents based on BaseWorkflowAgent don't produce Agent events. #18776


Open
daavoo opened this issue May 19, 2025 · 5 comments
Labels
bug (Something isn't working), triage (Issue needs to be triaged/prioritized)

Comments


daavoo commented May 19, 2025

Bug Description

Maybe I am misunderstanding https://docs.llamaindex.ai/en/stable/module_guides/observability/instrumentation/ but I was expecting the agent classes (https://docs.llamaindex.ai/en/stable/api_reference/agent/) to produce events like AgentToolCallEvent.

It seems that the (legacy?) agent implementations were supposed to produce these:

dispatcher.event(
    AgentToolCallEvent(
        arguments=json.dumps({**reasoning_step.action_input}),
        tool=tool.metadata,
    )
)
Version

0.12.36

Steps to Reproduce

import asyncio

from llama_index.core.agent.workflow import FunctionAgent
from llama_index.core.tools import FunctionTool
from llama_index.core.instrumentation import get_dispatcher
from llama_index.core.instrumentation.events.llm import LLMChatInProgressEvent
from llama_index.core.instrumentation.event_handlers import BaseEventHandler

class DebugEventHandler(BaseEventHandler):
    @classmethod
    def class_name(cls) -> str:
        return "DebugEventHandler"

    def handle(self, event, **kwargs):
        if not isinstance(event, LLMChatInProgressEvent):
            print(type(event))

get_dispatcher().add_event_handler(DebugEventHandler())

def get_current_year() -> int:
    """Get the current year"""
    print("get_current_year call")
    return 2025

agent = FunctionAgent(
    tools=[FunctionTool.from_defaults(get_current_year)],
    system_prompt="Use the available tools.",
)

async def main():
    result = await agent.run("What year is it?")
    print(result)

if __name__ == "__main__":
    asyncio.run(main())

Relevant Logs/Tracebacks

<class 'llama_index.core.instrumentation.events.llm.LLMChatStartEvent'>
<class 'llama_index.core.instrumentation.events.llm.LLMChatEndEvent'>
get_current_year call
<class 'llama_index.core.instrumentation.events.llm.LLMChatStartEvent'>
<class 'llama_index.core.instrumentation.events.llm.LLMChatEndEvent'>
<class 'llama_index.core.instrumentation.events.span.SpanDropEvent'>
The current year is 2025.
daavoo added the bug and triage labels on May 19, 2025

dosubot bot commented May 19, 2025

The issue you're encountering is that the FunctionAgent does not emit AgentToolCallEvent as expected. In LlamaIndex, AgentToolCallEvent is typically dispatched when a tool is called during the reasoning step: reasoning_step.action is looked up in tools_dict, the corresponding tool is retrieved, and the event is fired with the call arguments and tool metadata [1].

In your setup, the FunctionAgent does not appear to emit these specific events, likely because it handles tool calls differently from the legacy implementations. The FunctionAgent executes tools asynchronously and processes the results, but it may not dispatch AgentToolCallEvent in the current version (0.12.36) [2].

To address this, check whether the library's documentation or codebase mentions changes to event handling in the FunctionAgent. Reviewing how the dispatcher interacts with agents to manage event emission could also explain why these events are not being generated [3].


@logan-markewich
Collaborator

Ah yeah, I did not transition these events into the newer agents, mostly because:

  1. It's a workflow, so you can iterate over events as it runs (see the sketch below): https://docs.llamaindex.ai/en/stable/understanding/agent/streaming/
  2. It's a workflow, so by default each step is instrumented as a span to collect events (there are about 6 steps; if you run with something like Arize enabled, they show up).

But if it's helpful, I can probably pull those older instrumentation events back in.
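
A minimal sketch of option 1, assuming the reproduction script above defines agent (the event classes and handler API below follow the streaming docs; adjust names to your setup):

from llama_index.core.agent.workflow import ToolCall, ToolCallResult

async def main():
    handler = agent.run("What year is it?")
    # Consume workflow events as they are emitted during the run.
    async for event in handler.stream_events():
        if isinstance(event, (ToolCall, ToolCallResult)):
            print(type(event).__name__, getattr(event, "tool_name", None))
    result = await handler  # final output once the workflow completes
    print(result)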


daavoo commented May 19, 2025

It's a workflow, so by default each step is instrumented as a span to collect events (there are about 6 steps; if you run with something like Arize enabled, they show up).

Ah, got it.
I was trying to add my own instrumentation, so I think I will implement it with a SpanHandler instead (see the sketch below).

I thought it was a bug because I was getting some events (the LLM ones) but not the tool calls.

Feel free to close this if the older events are meant to be deprecated.
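
For reference, a custom span handler along those lines might look like the following sketch, modeled on the instrumentation docs (DebugSpanHandler and its print statements are placeholders, not anything confirmed in this thread):

from typing import Any, Optional

from llama_index.core.instrumentation import get_dispatcher
from llama_index.core.instrumentation.span import SimpleSpan
from llama_index.core.instrumentation.span_handlers import BaseSpanHandler

class DebugSpanHandler(BaseSpanHandler[SimpleSpan]):
    @classmethod
    def class_name(cls) -> str:
        return "DebugSpanHandler"

    def new_span(self, id_: str, bound_args: Any, instance: Optional[Any] = None,
                 parent_span_id: Optional[str] = None, tags: Optional[dict] = None,
                 **kwargs: Any) -> Optional[SimpleSpan]:
        # Called when a span (e.g. a workflow step) is entered.
        print(f"new span: {id_}")
        return SimpleSpan(id_=id_, parent_id=parent_span_id, tags=tags or {})

    def prepare_to_exit_span(self, id_: str, bound_args: Any, instance: Optional[Any] = None,
                             result: Optional[Any] = None, **kwargs: Any) -> Optional[SimpleSpan]:
        # Called when a span exits normally.
        print(f"exit span: {id_}")
        return self.open_spans.get(id_)

    def prepare_to_drop_span(self, id_: str, bound_args: Any, instance: Optional[Any] = None,
                             err: Optional[BaseException] = None, **kwargs: Any) -> Optional[SimpleSpan]:
        # Called when a span is dropped (e.g. on error or cancellation).
        print(f"drop span: {id_}")
        return self.open_spans.get(id_)

get_dispatcher().add_span_handler(DebugSpanHandler())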

@stdweird

@logan-markewich can you clarify?
I am trying, e.g., https://docs.llamaindex.ai/en/stable/understanding/agent/multi_agent/ but I get nothing (really nothing, not even the LLM traces) while using local Arize Phoenix (which seems to be working, because I can see a trace from, e.g., calling llm.complete directly).

@logan-markewich
Collaborator

@stdweird works fine for me in Arize?
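
For anyone comparing setups, the wiring that usually makes LlamaIndex traces show up in a local Phoenix instance looks roughly like this (a sketch assuming the arize-phoenix and llama-index-callbacks-arize-phoenix packages are installed; not something confirmed in this thread):

import phoenix as px
import llama_index.core

px.launch_app()  # starts a local Phoenix instance (default http://localhost:6006)
llama_index.core.set_global_handler("arize_phoenix")

# Subsequent agent.run(...) or llm.complete(...) calls should then appear as traces.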
