Online Flow processing

Can I use Workflow to build a persistent flow of processing, i.e. online learning?



Every input is a tensor (PyTorch or TensorFlow).
Module3 has to be able to distinguish between the inputs from M1 and M2.

I'm asking because the docs say that functions are fire-and-forget, i.e. not persistent.

Sorry, I think I need more details about what you are asking here.

So, does each module represent a workflow step here? For example, we'd have a pipeline

  M1 -> M3 -> M4
  M2 --/

For this, M3 can distinguish the inputs by argument position:

M4.step(M3.step(M1.step(), M2.step()))

where M3 might be:

def m3(m1_input, m2_input): ...
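To make the wiring concrete, here is the same dataflow in plain Python, without any Workflow API. The function bodies are placeholders I made up (lists standing in for tensors); the point is only that `m3` tells its upstream producers apart purely by parameter position:

```python
def m1():
    return [1.0, 2.0]          # stand-in for a torch/TF tensor

def m2():
    return [10.0, 20.0]

def m3(m1_input, m2_input):
    # m1_input always comes from M1, m2_input always from M2
    return [a + b for a, b in zip(m1_input, m2_input)]

def m4(x):
    return sum(x)

# Same shape as M4.step(M3.step(M1.step(), M2.step()))
result = m4(m3(m1(), m2()))
print(result)  # 33.0
```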

Or do you mean module3 will be a workflow job, and module1 and module2 are the places that trigger module3 to update its internal state?

First of all, nothing is shared across runs. You can, however, get a step's output by name, using the named-step support that was added recently.

For example,

def f(): ...

# Assuming f runs as the step named "A" inside the workflow "id",
# fetch that step's output by name:
workflow.get_output("id", name="A")

We don’t support stateful steps directly right now.
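Since nothing is shared across runs, one workaround is to thread the state through explicitly: each run takes the previous state as an input and returns the updated state, which the driver feeds into the next run. A minimal plain-Python sketch (the names and the toy running-mean update are hypothetical, and each loop iteration stands in for one workflow run):

```python
def update_state(state, observation):
    # toy online-learning update: running mean of the observations seen so far
    count, mean = state
    count += 1
    mean += (observation - mean) / count
    return count, mean

state = (0, 0.0)               # initial state, passed into the first run
for obs in [4.0, 8.0, 6.0]:    # each iteration stands in for one run
    state = update_state(state, obs)

print(state)  # (3, 6.0)
```

The state is just another step output here, so it gets the same durability as any other workflow result rather than living in a step's memory.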