
Pipeline overhaul

Dynamic Pipelines

Pipelines must declare how "long" they are, i.e. how many clock cycles it takes them to produce a valid output.

For dynamic pipelines, e.g. where conditional registers like reg[stall] are used, this rule simply doesn't work anymore: you can't infer at compile time when a value is valid to use.

To make conditional registers work, there should be a new pipeline(dyn) syntax. Such pipelines would be instantiated with inst(dyn) my_pipeline().

What is special about this syntax is that it wraps the output type in an Rv<T> and uses stage.valid from the last stage and stage(0).ready (unsure if this must be the first or last stage), both of which are already present. This is similar to Calyx's go-done interface (I think).

This forces the user to handle dynamic pipelines correctly. While most of the time, once one pipeline is dynamic, the next few pipelines up the instantiation chain will be dynamic as well, instantiating a dynamic pipeline doesn't have to happen inside a dynamic pipeline. An example of this would be some sort of timeout/or_default pipeline:

// Return value implicitly wrapped as Rv<int<8>>
pipeline(dyn) my_worker(clk: clock) -> int<8> { .. }

// This lets a dynamic pipeline work for 8 clock cycles and returns a default value if it still hasn't finished.
// TODO: where do we have to use ready here?
pipeline(8) work_or_default(clk: clock, val: int<8>) -> int<8> {
    let res = inst(dyn) my_worker(clk);
    reg*8;
    match stage(-8).res.data { // the stage(-8) feels weird..
        Some(done) => done,
        None => val,
    }
}

Note that this would require moving Rv<T> into the standard library.
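As a rough sketch of what such a stdlib type could look like (the field names and port syntax here are my guesses, not the compiler's actual definition):

struct port Rv<T> {
    // Driven by the producer; becomes Some(_) once the last
    // stage's stage.valid is asserted
    data: &Option<T>,
    // Driven by the consumer; fed back into stage(0).ready
    ready: inv &bool,
}

This would at least match the res.data access pattern in the example above.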

Pipelines are weirdly safe

Ignoring current dynamic pipelining and surprise features, pipelines are weirdly safe. In fact, they provide a safety guarantee that appears nowhere else in the language: the validity of a value.

Imagine a pipeline which needs 5 clock cycles to calculate a value:

pipeline(5) pipe(clk: clock) -> int<8> {
    reg*5;
    42
}

When using this in another pipeline before those 5 cycles have passed, we get an error:

pipeline(3) use_pipe(clk: clock) -> int<8> {
  let res = inst(5) pipe(clk);
  reg*3;
  res
}
error: Use of playground::playground::res before it is ready
  ┌─ src/playground.spade:9:3

9 │   res
  │   ^^^ Is unavailable for another 2 stages

  = note: Requesting playground::playground::res from stage 3
  = note: But it will not be available until stage 5

However, this rule is easy to violate when using an entity instead:

entity pipe_violated(clk: clock) -> int<8> {
    let res = inst(5) pipe(clk);
    res
}

pipeline(3) use_pipe(clk: clock) -> int<8> {
  let res = inst pipe_violated(clk);
  reg*3;
  res
}

This compiles completely fine.

Funnily enough, it is quite simple to restore the violated pipeline rules with another pipeline:

pipeline(5) pipe_restored(clk: clock) -> int<8> {
  let res = inst pipe_violated(clk);
  reg*5;
  stage(-5).res
}

Now the question here is: how easy would it be to bring this compile-time inferred "when is a value valid" guard to the whole language?

What I mean by this is that every value/type should know when it is valid. Pipelines wouldn't specify their length next to the keyword anymore, but with the output, just like entities then could. It is yet unclear to me how this would look. Dynamic pipelines could return an Rv that itself is instantly valid.

To put it stupidly in ugly grammar:

entity multiply_squared_behind_reg(clk: clock, input: int<8>, other: defer(+1) int<8>) -> defer(+1) int<16> {
    reg(clk) res = input * input;
    res * other // other is only used after the first reg, so it can also be declared as defer(+1) - reads as one more than the "base-defer" of this entity
}

Here, a reg statement makes the value deferred by one more than its expression. Every type can implicitly become more deferred, but not the other way around. This means int<8> + defer(+2) int<8> will result in defer(+2) int<8> + defer(+2) int<8>. On the other hand, this means calling e.g. multiply_squared_behind_reg could be done in a pipeline in the stage before other is valid.
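Spelled out with the same hypothetical defer syntax (variable names are made up for illustration), the implicit widening would look like this:

// now_val: int<8> is valid immediately; later_val: defer(+2) int<8>
// only becomes valid two registers later. Mixing them implicitly
// defers the immediate operand, so the sum is only valid once
// both operands are:
let sum: defer(+2) int<8> = trunc(now_val + later_val);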

One might argue that this makes uninitialized stuff impossible, where you currently write

reg(clk) uninit = { uninit };

so that would be blocked on #328

Another question is how this will work with self referential registers in general.
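For reference, this is the kind of self-referential register the defer rules would have to answer for (a free-running counter, written in today's syntax):

entity counter(clk: clock) -> int<8> {
    // count feeds back into its own update: the right-hand side
    // reads the previous cycle's value, so a naive defer analysis
    // would see the value depend on a "younger" version of itself.
    reg(clk) count: int<8> = trunc(count + 1);
    count
}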

But the problem here is that, while pipelines provide this safety, they are often just not used. Here's a quiz for you: which of these two units exists in the standard library?

entity sync2<T>(clk: clock, in: T) -> T {
    reg(clk) sync1 = in;
    reg(clk) sync2 = sync1;
    sync2
}

or

pipeline(2) sync2<T>(clk: clock, in: T) -> T {
    let sync = in;
    reg*2;
    sync
}

Hint: It is not the pipelined one.

Other issues that should be fixed

Edited by DasLixou