Template JSON payload identification

Hi there.

I was wondering if there is a way to retrieve the JSON structure of a Template, much like the Navigator does. I want to build a UI to create contracts, but I’d also like the same UI to handle any DAR file (or files) I upload. So I was wondering if there is a way to get the JSON of a Template with empty values, then iterate over the keys and create a dynamic UI based on them, and finally package the correct information together and send it to the HTTP JSON API.

For example, let’s say we use the IOU contract from the documentation. It’s running in a Fabric network, and I’ve got the UI done. Now I upload a second chaincode (DAR file) to the channel, but instead of redoing the UI, the UI just queries the ledger and gets empty JSON structures for all the available templates/assets. The UI can then dynamically build a form and offer it to clients if they want to add details before submitting it.

Kind regards,
Hendre


I’m afraid this is not possible out of the box, but the HTTP JSON API exposes a couple of endpoints to list and download packages, more or less analogous to the Ledger API PackageService endpoints. Using either of them you can get the Daml packages in their compiled form, interpret them, and build a form based on the template signatures. This is more or less what Navigator does to implement its dynamic introspection capabilities.

Please note that Daml-LF (the Daml compiler output) evolves over time, so you might have to adapt your code as new Daml-LF versions are published in order to handle new features (which is why we have a library that we use internally, i.e. with no compatibility guarantees, to read and interpret Daml-LF packages). This is a possible way to go, but I must say it’s not particularly trivial.
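
For concreteness, here is a minimal sketch of driving those endpoints from Scala via Java 11’s HttpClient. The paths (GET /v1/packages to list package IDs, GET /v1/packages/&lt;package ID&gt; to download one) follow the HTTP JSON API documentation; the address and token handling are placeholder assumptions:

```scala
// Minimal sketch: list the packages known to the ledger through the HTTP JSON
// API. Host, port, and the JSON_API_TOKEN environment variable are assumptions.
import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}

object ListPackages extends App {
  val base   = "http://localhost:7575"              // assumed JSON API address
  val token  = sys.env.getOrElse("JSON_API_TOKEN", "") // assumed JWT
  val client = HttpClient.newHttpClient()

  def get(path: String): HttpResponse[String] = {
    val req = HttpRequest.newBuilder(URI.create(base + path))
      .header("Authorization", s"Bearer $token")
      .GET()
      .build()
    client.send(req, HttpResponse.BodyHandlers.ofString())
  }

  // Expected shape: {"result": ["<package ID>", ...], "status": 200}
  println(get("/v1/packages").body())
  // Each package ID can then be downloaded as a binary DALF from
  // GET /v1/packages/<package ID> and fed to a Daml-LF reader.
}
```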


Thanks for the advice. I’ll most likely look at the Daml-LF package (thanks for the link) but not use it. We’ve been thinking of writing a parser for Daml code that adds comment tags to provide a generic document representation of a Daml template as a form/contract: something people can print, see, email, and handle, and if they fill it in, we either OCR it or provide an online form to extract the information. I’ll most likely extend this parser and create a contract that holds all the key-value pairs of the templates and their JSON structure. Querying that contract should then be enough to give me all the templates on the channel and their empty JSON payloads.

I was hoping for a less hacky method.

But again, thank you very much for the help.


Sounds like a fun project. Good luck, and feel free to share your progress, but definitely make sure to heed the advice here!

I definitely will. I’ll most likely be soloing the project anyway, so I’ll share a link when I’ve got something to show. Input and any further advice are always welcome.

And I most definitely appreciate the advice @stefanobaghino-da gave. My boss likes playing with mini Scala projects, so I’ll mention it to him.


Awesome. Considering the difficulty and tedium of parsing Daml, you might want to generate the JSON manually based on the Daml templates. Even for a large set of contracts, it’d be significantly quicker than implementing a parser from scratch (which is really very hard).

Or at the very least, try generating the JSON manually first and write the parser last. Also, I’ve very recently learned that Daml-LF is easier to parse than Daml.


From easiest to most difficult, I would say:

  1. Using the daml-lf interface library to load LF packages (see the sketch after this list). While changes will happen, you are likely to get help from scalac in adapting to them.
  2. Loading the LF packages with protobuf (any protobuf code generator you prefer will work) and basically writing your own version of the interface library on top of that. Not trivial, but way better than…
  3. Parsing Daml code. Seriously not recommended. There’s a reason all our Navigator and codegen tooling is based on interpreting LF packages rather than Daml code directly.
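
To give a feel for option 1, here is a rough sketch of listing template fields from a DAR with the interface library. Every name in it is an assumption based on the library’s layout at the time of writing (as noted, the API moves around between releases with no compatibility guarantees), so check the sources of the version you depend on:

```scala
// Sketch only, under loud assumptions: DarReader, InterfaceReader, and
// InterfaceType.Template below follow the daml-lf interface library layout
// around the time of this thread; exact names and signatures vary by release.
import java.io.File

import com.daml.lf.archive.DarReader
import com.daml.lf.iface.{InterfaceType, Record}
import com.daml.lf.iface.reader.InterfaceReader

object ListTemplateFields extends App {
  // Read every package bundled in the DAR passed as the first argument.
  val dar = DarReader().readArchiveFromFile(new File(args(0))).get

  dar.all.foreach { archive =>
    // Decode the package's type declarations...
    val (_, interface) = InterfaceReader.readInterface(archive)
    interface.typeDecls.foreach {
      // ...and keep the templates: their record fields are exactly what a
      // dynamically generated form needs in order to render inputs.
      case (templateName, InterfaceType.Template(Record(fields), _)) =>
        println(s"template $templateName")
        fields.foreach { case (field, typ) => println(s"  $field : $typ") }
      case _ => () // plain records, variants, enums: skip
    }
  }
}
```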

@Stephen @anthony @stefanobaghino-da Thank you so much for the assistance. It seems the consensus is to use the Daml-LF interface library, so that’s the approach I’ll take.

Quick question: how do I use the interface library? I’ve downloaded Bazel at the correct version (4.0.0). I suppose I can build it using bazel build //:interface?

The library is designed for internal use, in the sense that there are no compatibility guarantees we can provide, but the artifact is available for download from Maven Central.

If for any reason you want to build it locally from source using Bazel, the command to run from the repository root directory is bazel build //daml-lf/interface. This builds the library jar file. You can either integrate it into your own Bazel build or fetch the artifact from the output location printed by the bazel build command.
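
For reference, pulling it from Maven Central instead might look like the following in sbt. The coordinates and version here are illustrative assumptions (search for the com.daml group on Maven Central for the real ones), and remember the no-compatibility-guarantees caveat applies either way:

```scala
// Illustrative only: check Maven Central (group com.daml) for the exact
// artifact name, Scala suffix, and a version matching your SDK.
libraryDependencies += "com.daml" %% "daml-lf-interface" % "1.11.0"
```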


This is a less meaningful difference than it sounds, because if you write an interface based on Daml-LF 1.x (my option 2), there is likewise no guarantee that it will work against 1.(x+1); in fact, the spec rules say that LF consumers must assume there are incompatible changes and not even try to parse newer versions.

As a practical matter, when the interface library changes, you are likely to get compiler errors pointing to what you need to change; when the LF protobuf spec adds a minor version and you are interacting with protobuf directly, you will get no errors, only incompatible runtime semantics. Neither is forward-compatible, but the incompatibilities in the interface library are better for your programs’ health.

For example, suppose you have written a LF parser based on the 1.6 spec. You would assume that an external package ref uses package_id_str. However, in 1.7, this case is never used; instead, you must indirectly look up the package_id_interned_str. And similarly for module names. Some protobuf gens, like Haskell’s, will warn you about the new cases you need to handle. Some…will not, and you will see the “bug” where package names appear to be present, but empty (because they moved somewhere else). But if you simply upgrade your interface library version, it will handle the de-interning for you.
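
To make the interning point concrete, here is roughly what resolving a package reference looks like against the generated protobuf classes. The field names follow the description above (package_id_str vs. package_id_interned_str), but the generated class and package names (e.g. com.daml.daml_lf_dev.DamlLf1) are assumptions that depend on your protobuf toolchain and LF version:

```scala
import com.daml.daml_lf_dev.DamlLf1 // assumed location of the generated classes

// Resolve a PackageRef to a package ID string, given the package's
// interned string table (empty for LF < 1.7).
def packageId(ref: DamlLf1.PackageRef, internedStrings: Vector[String]): String =
  ref.getSumCase match {
    case DamlLf1.PackageRef.SumCase.PACKAGE_ID_STR =>
      ref.getPackageIdStr // LF <= 1.6: the ID is stored inline
    case DamlLf1.PackageRef.SumCase.PACKAGE_ID_INTERNED_STR =>
      internedStrings(ref.getPackageIdInternedStr) // LF >= 1.7: table index
    case other =>
      // Per the spec, an unknown case means a newer, incompatible LF: reject.
      sys.error(s"unsupported PackageRef case: $other")
  }
```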

Edit: this illustrates the unidirectionality of LF compatibility nicely. As the author of an LF 1.6 parser, there is no way you can “gracefully recover” from the introduction of interning. The only correct thing you could have done at the time was to reject any LF >1.6.

In that respect, I think the interface library actually forms a more stable foundation than protobuf for most LF consumers. With the obvious caveat that you must use Scala or Java to use it.


Sure, I just wanted to make clear that one should not expect the same level of backward compatibility enjoyed by the libraries we publish as part of Daml Connect. Thanks for clarifying.


@hendrehayman I’m a bit late to the party, but you may also want to look at the :json command that can be run inside the daml repl. It doesn’t give you a mapping from a template to JSON, but rather a concrete instance of a template (a contract). If you look through the codebase it may give you some ideas.


@Luciano Thanks. We haven’t started on this project yet. We’re still planning, and watching and processing the Hyperledger events.
