Langflow is an incredibly powerful workflow engine framework, and I’ve already developed a decent understanding of its source code. However, Langflow workflows are currently triggered via API calls (or via the Python `simple_run_flow` function), which means each workflow is executed only once per call.
I'm wondering if there's a way to turn a Langflow workflow into a long-running task — for example, continuously consuming data from Kafka, processing it through a Langflow workflow, and outputting the results back to Kafka — essentially having the workflow run as a streaming processor.
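To make the idea concrete, here is a rough sketch of what I can already build today: a long-running process *outside* Langflow that consumes from Kafka, runs each message through a flow via the documented `/api/v1/run/<flow-id>` REST endpoint, and produces the result back to Kafka. This assumes `kafka-python` and `requests`; the URL, flow ID, API key, and topic names are placeholders.

```python
# Rough sketch of the "streaming processor" idea as an external wrapper:
# consume from Kafka, run each message through a flow via Langflow's REST
# run endpoint, and produce the result back to Kafka.
# The URL, flow ID, API key, and topic names are placeholders.
import json

import requests
from kafka import KafkaConsumer, KafkaProducer

RUN_URL = "http://localhost:7860/api/v1/run/<flow-id>"
HEADERS = {"x-api-key": "<api-key>", "Content-Type": "application/json"}

consumer = KafkaConsumer(
    "input-topic",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: v.decode("utf-8"),
)
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

for record in consumer:
    # One full flow execution per message: exactly the per-call model
    # described above, just driven by Kafka instead of a user.
    response = requests.post(
        RUN_URL,
        headers=HEADERS,
        json={"input_value": record.value, "input_type": "text", "output_type": "text"},
    )
    response.raise_for_status()
    producer.send("output-topic", response.json())
```

This works, but it pays the full flow-startup cost on every message and keeps all the Kafka logic outside the canvas, which defeats the purpose of building the pipeline visually.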
More specifically, here’s what I hope to achieve:
1. Add a Kafka component to Langflow that can be dragged onto the canvas (a rough component sketch follows below).
2. Reuse Langflow's existing data-processing components to build a streaming message pipeline visually.
3. Output the processed messages back to Kafka.
The workflow may involve multiple reads and writes to Kafka, for example `Kafka Component -> Filter Component -> Map Component -> Kafka Component -> Reduce Component -> Redis Component`, and could also include branching and conditional logic.
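For point 1, my reading of the custom component API suggests a Kafka source could look roughly like the sketch below. It is untested; the broker and topic defaults are placeholders, and the `consumer_timeout_ms` setting is only there so a one-shot read can terminate.

```python
# Untested sketch of a Kafka consumer as a Langflow custom component.
# It drains whatever is currently on the topic and returns, i.e. it is
# still a one-shot read rather than a long-running source.
from kafka import KafkaConsumer

from langflow.custom import Component
from langflow.io import MessageTextInput, Output
from langflow.schema import Data


class KafkaConsumerComponent(Component):
    display_name = "Kafka Consumer"
    description = "Reads available messages from a Kafka topic."

    inputs = [
        MessageTextInput(name="topic", display_name="Topic", value="input-topic"),
        MessageTextInput(
            name="bootstrap_servers",
            display_name="Bootstrap Servers",
            value="localhost:9092",
        ),
    ]
    outputs = [
        Output(name="messages", display_name="Messages", method="consume"),
    ]

    def consume(self) -> list[Data]:
        consumer = KafkaConsumer(
            self.topic,
            bootstrap_servers=self.bootstrap_servers,
            auto_offset_reset="earliest",
            # Stop iterating once no new message arrives within a second,
            # so the component (and the flow run) can actually finish.
            consumer_timeout_ms=1000,
        )
        results = [Data(value=msg.value.decode("utf-8")) for msg in consumer]
        self.status = results
        return results
```

A matching producer component would wrap `KafkaProducer.send()` the same way. The unsolved part is that `consume()` returns once per execution, so something in the engine would still need to re-invoke the flow continuously.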
However, since Langflow currently only supports execution via API calls (or `simple_run_flow`), it's not suitable for running continuous streaming tasks.
I’d love to hear if anyone has ideas or suggestions on how to make this work.