Offset to time relationship

Is there a way for the ledger participant node to convert offsets to time, maybe a way to efficiently query for start and stop offsets for subsequent transaction stream requests?


The pruning APIs have a command to get an approximate time → offset mapping, which may be helpful: Console Commands — Daml SDK 2.6.0 documentation. I’m not aware of any other way to map between the two at the moment.


Some of those commands (marked ‘experimental’) look quite useful. Do you know if there is any plan to expose them through the admin API? They’re currently not documented.

I don’t know but I’ll reach out to the team responsible for this.

Hey,
As far as I know, there are no plans to expose such methods in the short term.

One reason for not exposing a map from offset to time is that it is not clear which clock to use. As an example, global offsets are assigned when different streams are merged together. In particular, even if you have perfectly synchronized clocks, it is not guaranteed that o_1 < o_2 implies ts(o_1) < ts(o_2).
Such a property is guaranteed for local offsets but we don’t want to commit to that in the long run.

Raf

Imagine a Party who needs to track what transactions occurred on a given day; for example, they need to answer questions like "what were all the instances of template T between 2023-01-01 and 2023-01-02?" (i.e. between arbitrary dates). For now, pruning is not an option. Should they:

  1. Create (and Archive?) an EndOfDay {date : Date, lastSeenOffset : Text} ... template, and use those contracts to figure out which offsets to use when querying a filtered transaction stream (see the sketch after this post)?
  2. Do the same but in a DB?
  3. ???

Would it be possible to just query for the ledger end?
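
For what it’s worth, here is a minimal Daml sketch of option 1, assuming a single operator party records one marker contract per day and stores the participant’s current ledger end (or the last offset the client has consumed) in it. The operator field and the use of the ledger end are my assumptions; the other fields come from the post above.

```daml
module EndOfDay where

-- Sketch of option 1: one marker contract per business day, recording the
-- last offset the client observed (e.g. the ledger end read at end of day).
-- The `operator` party is an assumption; the other fields follow the post above.
template EndOfDay
  with
    operator       : Party
    date           : Date
    lastSeenOffset : Text  -- absolute offset, stored as opaque text
  where
    signatory operator
```

To answer a question about 2023-01-01, the client would look up the EndOfDay contracts for 2022-12-31 and 2023-01-01 and use their lastSeenOffset values as the begin and end bounds of a filtered transaction stream request.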

I am looking for the historical relationship between offset and real time, as observed by the participant node (or potentially a consumer of the transaction stream).

There are many problems with time in a truly global system; some of them have already been mentioned in this thread, such as the difficulty of synchronizing clocks and the non-monotonicity of time in the stream of transactions with increasing offsets.

I would add to this the fact that information about some of the transactions may be temporarily unavailable due to a network partition. Consider what that would mean if there were a way to query transaction streams by date: a client queries for all templates preceding a certain date-time and receives 10 answers. Later, once the partition is fixed and the missing information becomes available, the same query returns 17 answers.

That is why we would be hesitant to add Ledger API methods that allow querying by time. It is asking for trouble and pretends to give guarantees that are simply not possible to fulfill.

What you are suggesting with an EOD contract is going in the right direction and has in fact been done in the past. Causal ordering is something that can be reliably guaranteed, and if you know the offset of the EOD contract, you can use it to query for all preceding ones.

Here is how you could do it in practice:
The contracts have to be causally related somehow, and we can achieve that with a date contract. At the end of the preceding day you roll the date contract, say to Jan-1-2024. All contracts created throughout the day reference that Jan-1 contract, and it is spent in a date-rolling transaction sometime around midnight, rolling to Jan-2-2024. Now you have a guarantee that no contracts pertaining to Jan-1 will arrive before the start or after the end of the lifespan of the corresponding date contract.
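
Here is a minimal Daml sketch of that pattern, assuming a single operator party owns the date contract; the template and choice names (BusinessDate, RecordTrade, Roll, Trade) are hypothetical and only illustrate the idea, not the exact design used in the past:

```daml
module DateRolling where

-- Sketch only: a single operator rolls the date contract once per day.
template BusinessDate
  with
    operator : Party
    date     : Date
  where
    signatory operator

    -- Creating business contracts through a nonconsuming choice on the
    -- current date contract makes their creation causally dependent on it:
    -- the transaction can only commit while this BusinessDate is active.
    nonconsuming choice RecordTrade : ContractId Trade
      with
        counterparty : Party
        details      : Text
      controller operator
      do
        create Trade with operator; counterparty; details; tradeDate = date

    -- Around midnight the operator rolls the date. Archiving the old
    -- BusinessDate closes the window: no further contracts for that day
    -- can be created afterwards.
    choice Roll : ContractId BusinessDate
      with
        newDate : Date
      controller operator
      do
        create this with date = newDate

template Trade
  with
    operator     : Party
    counterparty : Party
    details      : Text
    tradeDate    : Date
  where
    signatory operator
    observer counterparty
```

A client that notes the offsets at which the Jan-1 BusinessDate was created and archived (both visible on its transaction stream) can then request the filtered transaction stream between those two offsets and be sure it covers everything pertaining to Jan-1.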

@Marcin_Ziolek I am interested in the time at which an offset is committed to a given participant node. Does that ever change?

An offset, once assigned, will not change.
It is determined at the time the transaction is incorporated into the multi-domain event log within the Canton participant.

Then can we have an API that maps those times to offsets?