Java Bindings Feedback - Use of Final Classes


I wanted to share some feedback about real-world usage/dev when working with the java-bindings / java codegen:

The use of Final Classes!

The java bindings in their current state are essentially the equivalent of OpenAPI spec bindings that generate a “Client”: in this case it is a gRPC client. This means the classes are used on the client side and are not submitted into an untrusted system (from the point of view of the intended use of the codegen output).

The quality-of-life issue that comes up for developers is the current use of final classes… They are everywhere…

Generated template classes are final

but Choice classes are not final?

and common-use classes are final, such as the main client.

The issue is about the dev’s ability to extend and enhance:

The generated classes are essentially DTOs, similar to classes you would generate for a UI or for an HTTP client using OpenAPI codegen. Having final on these generated classes does not “secure” anything, as it is code that would be re-generated on a continual basis as upgrades to the templates occur. We should be able to extend these classes so we can use our own variants. Allowing us to extend them (or having codegen generate equivalent interfaces) would let us easily leverage IDE codegen and other downstream generation utilities as part of the build process.
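To illustrate the interface idea, here is a sketch with entirely hypothetical names (none of this is actual java-bindings output): a codegen-emitted interface would let our own variants sit next to the final generated DTOs instead of being cut off from them.

```java
// Hypothetical sketch: a codegen-emitted interface next to a final DTO.
// None of these names come from the actual java-bindings output.
interface OrgFields {
    String getName();
    String getRoot();
}

// Stand-in for what codegen produces today: a final class we cannot extend.
final class GeneratedOrg implements OrgFields {
    public final String name;
    public final String root;

    GeneratedOrg(String name, String root) { = name;
        this.root = root;
    }

    @Override public String getName() { return name; }
    @Override public String getRoot() { return root; }
}

// Our own variant shares the interface instead of extending the final class,
// so IDE codegen and downstream generation utils can target OrgFields.
final class OrgVariant implements OrgFields {
    private final OrgFields base;

    OrgVariant(OrgFields base) { this.base = base; }

    @Override public String getName() { return base.getName().toUpperCase(); }
    @Override public String getRoot() { return base.getRoot(); }
}
```

With this shape, anything that accepts an `OrgFields` works with both the generated class and our variant.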

Second notable example is the DamlLedgerClient. This client wraps all of the interfaces for interacting with the ledger and provides the thread pool, etc.

Given that some interfaces do not exist yet, such as the Party Allocation service, it would be nice if we could just extend the DamlLedgerClient instead of building one-off classes.

Instead of building classes such as:

class PartyManagementService(
    private val damlConfig: DamlConfig
) {
    private fun getPartyManagementService(accessToken: String? = null): PartyManagementServiceGrpc.PartyManagementServiceFutureStub {
        val channel = ManagedChannelBuilder.forAddress(, damlConfig.port)
            .usePlaintext()
            .build()

        val pms = if (accessToken == null) {
            PartyManagementServiceGrpc.newFutureStub(channel)
        } else {
            LedgerCallCredentials.authenticatingStub(PartyManagementServiceGrpc.newFutureStub(channel), accessToken)
        }
        return pms
    }

    fun getKnownParties(accessToken: String? = null): Uni<List<PartyManagementServiceOuterClass.PartyDetails>> {
        val req = getPartyManagementService(accessToken).listKnownParties(
            PartyManagementServiceOuterClass.ListKnownPartiesRequest.getDefaultInstance()
        )
        return Uni.createFrom().converter(UniRxConverters.fromSingle(), Single.fromFuture(req)).map { it.partyDetailsList }
    }

    fun getParties(parties: List<String>, accessToken: String? = null): Uni<List<PartyManagementServiceOuterClass.PartyDetails>> {
        val req = getPartyManagementService(accessToken).getParties(
            PartyManagementServiceOuterClass.GetPartiesRequest.newBuilder().addAllParties(parties).build()
        )
        return Uni.createFrom().converter(UniRxConverters.fromSingle(), Single.fromFuture(req)).map { it.partyDetailsList }
    }

    /**
     * @throws PartyNotFoundException if the requested party does not exist / could not be found.
     */
    fun getParty(party: String, accessToken: String? = null): Uni<PartyManagementServiceOuterClass.PartyDetails> {
        return getParties(listOf(party), accessToken).map(Unchecked.function { it ->
            if (it.isEmpty()) {
                throw PartyNotFoundException(party)
            } else {
                it.first()
            }
        })
    }

    fun getParticipantId(accessToken: String? = null): Uni<String> {
        val req = getPartyManagementService(accessToken).getParticipantId(
            PartyManagementServiceOuterClass.GetParticipantIdRequest.getDefaultInstance()
        )
        return Uni.createFrom().converter(UniRxConverters.fromSingle(), Single.fromFuture(req)).map { it.participantId }
    }

    fun allocateParty(displayName: String, partyHint: String? = null, accessToken: String? = null): Uni<PartyManagementServiceOuterClass.PartyDetails> {
        val builder = PartyManagementServiceOuterClass.AllocatePartyRequest.newBuilder()
        builder.displayName = displayName
        if (partyHint != null) {
            builder.partyIdHint = partyHint
        }

        val req = getPartyManagementService(accessToken).allocateParty(
        return Uni.createFrom().converter(UniRxConverters.fromSingle(), Single.fromFuture(req)).map { it.partyDetails }
    }
}

we could have just extended the client, integrated the code, and re-used the thread pool and the other configs.

Recap: the use of final on classes in the java-bindings feels very heavy-handed on developer productivity when trying to build on top of the generated classes and the common classes.



Thnx for the feedback @StephenOTT :pray:

I don’t think this would be a good idea. It would make it easier to re-use those classes throughout the back-end systems that stand between the ledger and the client, and I believe this would make it easier for users to fall into the trap of building one “ball of mud” whose dependencies connect clients directly to the ledger. I believe the role of the ledger is not that of being exposed directly to the client, but rather that of serving as a message bus to which several back-ends connect, each with its specific view of the ledger over which it has full control. It’s more work when you kick off and it means more code to maintain, but it simplifies the overall system and its evolution (which is the part to get right and arguably has a higher maintenance cost).

Instead, I would find it very valuable to expose a simplified library (and possibly a Ledger API endpoint) to allow introspecting the ledger more easily, instead of making this a concern of the codegen. This would allow users to build systems that interact generically with the ledger.

I think the comment here should be that the party management service should be exposed through the Java bindings, with which I’d agree. I do not agree that DamlLedgerClient should be open for arbitrary extension.

@stefanobaghino-da thanks for the detailed feedback.

The Java bindings are just a gRPC client. It is no different than using an OpenAPI spec to generate an HTTP client. You are suggesting that “we can’t trust developers” to use a gRPC client and adapt that client to the business needs. The issue with the final classes is a developer-level code conversation around creating productive developers in the community. Your description is more about product usage at a sales level of the ledger.

The bindings / codegen generate a gRPC client for the Ledger API. It is not an application or really any sort of opinionated interface. It is a gRPC client.

This builds on my comment above about the code-productivity conversation vs the “product” conversation: the example of the DamlLedgerClient not having party allocation is an example of developers’ need to enhance the generated code for productivity purposes. People have been asking about the party allocation service in the java bindings since at least April 2020 (Java Bindings do not have all Ledger API services). No one would ever expect that every feature, function, and capability be provided on day 1. Product takes time, limited resources, etc. But by applying final on classes (inconsistently in many cases: Template classes are final, but Choice classes are not, and Record classes are not), we are killing developer productivity to build business applications and re-use the provided classes / baseline.

I did not suggest that and I apologize if the message came across that way. The fact that a class is marked as final doesn’t prevent you from composing the class and re-using it alongside the LedgerClient interface to make your own implementation. The point of it being final is making clear that the class is not intended for extension. I think it’s good practice to be explicit about the extensibility of classes (you see more programming languages moving towards making final the default and requiring classes to be explicitly marked as open – among them Kotlin and, more recently, Scala).
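For example, composition works regardless of final. A simplified sketch with hypothetical names (this is not the real DamlLedgerClient API):

```java
// Simplified stand-in for a final generated client (hypothetical names).
final class FinalClient {
    String listKnownParties() {
        return "alice,bob";
    }
}

// Composition: wrap the final instance and add what is missing,
// re-using the wrapped client rather than extending it.
class ExtendedClient {
    private final FinalClient delegate;

    ExtendedClient(FinalClient delegate) {
        this.delegate = delegate;
    }

    // Delegated capability.
    String listKnownParties() {
        return delegate.listKnownParties();
    }

    // Capability the wrapped client does not offer yet.
    String allocateParty(String hint) {
        return hint + "::allocated";
    }
}
```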

Are there other specific additions you would make to the DamlLedgerClient apart from adding party management?

@stefanobaghino-da if you want to follow those languages’ programming paradigms, then why is “composition over inheritance” ignored?

  1. Tuple.class is open and does not have an interface
  2. Archive.class is open and does not have an interface
  3. generated Choice classes are open and do not have interfaces
  4. generated Template classes are final and have a Template interface, but the interface is ~empty
  5. CreatedEvent.class is final and implements interfaces, but the contract fields are missing from the interfaces
  6. ArchivedEvent.class is final and implements interfaces, but the contract fields are missing from the interfaces
  7. ExercisedEvent.class is open and implements interfaces, but the contract fields are missing from the interfaces
  8. DamlRecord.class is open
  9. TransactionTree.class is open
  10. User.class is open
  11. etc etc etc.

If you want to consistently apply final on everything, then please apply composition. At the moment it is a weird middle ground where you apply neither inheritance nor composition in a dev-productive manner.

The above means that there are systems that need to route ledger events into this “message bus”, so these systems need to transform the ledger event data into a format consumed by the “message bus”. So there is some sort of conversion process, and devs need to understand and define what fields and types get converted. Without inheritance and without composition, you leave the developer to manually track each field defined by the ledger. If a Daml template updates and new codegen is produced, a developer has to manually track the changes in the data, as there is no composition to use. If the Ledger API is updated and additional contract fields are added, there are no facilities to leverage composition, and the developer again needs to manually backtrack all data changes instead of having compile-time identification of data-contract changes.

The codegen generates a gRPC client that is the “contract” between the Ledger API and the application that consumes the events and transfers them to your “message bus”. But in the current state the developer must manage that contract manually, with neither composition nor inheritance, on common classes and on data classes such as templates, choices, records/daml-data-classes, etc.

Sending over a message bus still requires a dev to build the “glue” between what the ledger sends and what the bus consumes. This is no different from someone building a business HTTP interface around the Ledger API: it is an additional contract that abstracts the Ledger API into a business context, and the Ledger API response is converted into an HTTP-formatted response under the business contract/interface.

There may be some missing uniformity, which we should look into. In general, we try to be explicit about the possibility to inherit from a class. But I’m not sure about the point about having neither inheritance nor composition. An advantage of composition is that you can always use it, regardless of whether the class you are dealing with is final or not. Can you help me understand your point?

Let us imagine that you have a perfect codegen with final classes and a feature-complete DamlLedgerClient with all Ledger API endpoints supported.

Let us use my “Organization” example:

public final class Organization extends Template {
  public static final Identifier TEMPLATE_ID = new Identifier("SomePackage", "SomeModule", "Organization");

  public final String name;

  public final String root;

  public Organization(String name, String root) { = name;
    this.root = root;

  public static class Contract implements {
    public final ContractId id;

    public final Organization data;

    public final Optional<String> agreementText;

    public final Optional<String> key;

    public final Set<String> signatories;

    public final Set<String> observers;

    public Contract(ContractId id, Organization data, Optional<String> agreementText,
        Optional<String> key, Set<String> signatories, Set<String> observers) { = id; = data;
      this.agreementText = agreementText;
      this.key = key;
      this.signatories = signatories;
      this.observers = observers;

    public static Contract fromCreatedEvent(CreatedEvent event) {
      return fromIdAndRecord(event.getContractId(), event.getArguments(), event.getAgreementText(), event.getContractKey().map(e -> e.asParty().orElseThrow(() -> new IllegalArgumentException("Expected e to be of type")).getValue()), event.getSignatories(), event.getObservers());

    // ... remainder of the generated class elided

The Template abstract class provided by the java-bindings:


public abstract class Template {
    public Template() {
    }

    public abstract CreateCommand create();
}

the Contract Interface:


public interface Contract {
}

and the CreatedEvent.class:

public final class CreatedEvent implements Event, TreeEvent {
    private final @NonNull List<@NonNull String> witnessParties;
    private final String eventId;
    private final Identifier templateId;
    private final String contractId;
    private final DamlRecord arguments;
    private final Optional<String> agreementText;
    private final Optional<Value> contractKey;
    private final @NonNull Set<@NonNull String> signatories;
    private final @NonNull Set<@NonNull String> observers;

    // ... constructor and accessors elided
}

Now let us assume your message bus scenario:

You are a developer and need to convert your CreatedEvent into some sort of message-bus-compliant object. Let us assume that the object the message bus will receive is the business data from Organization.class, the contract ID, and the event ID.

What would this DTO class look like? How would you use composition (or inheritance, or nothing?) to build this DTO?

It heavily depends on what the downstream application needs. In your examples you assume that all of the main business logic of your entire application is stored directly on the ledger, which I would say is an anti-pattern. On the ledger you should have the minimum amount of logic necessary to create a synchronization layer that spans services and organizations.

If, for whatever reason, your specific application needs all the data in a contract, I would recommend recreating it in its entirety as a specific domain object. Re-using ledger-specific data in the application means that you are making your back-end brittle and subject to upstream breaking changes.

One important use case to consider is that in which you are not in control of the Daml model, but you are subscribing to one produced by a separate organization: if you were fully in control of the domain classes, all you have to do is change the side that reads from Daml (and possibly adapt what’s on your service, in case the changes affect your application). If you were re-using the same classes across your entire stack, you would have no choice but to propagate those changes.

@stefanobaghino-da I did not assume all business logic was on the ledger. I gave you a very realistic and common example where 4 data points were passed into a “message bus”: 2 data fields inside of a contract and 2 metadata fields for finding the contract at a later point. You are dancing around theory and I am talking about real-world practice.

I gave you an example of a single contract/template that had two fields, asking you to create the few lines of sample code that would create this “domain object”. The entirety of the issue being discussed is the lack of interfaces and/or extensibility (final classes) to manage these domain objects. I have not suggested that the codegen’ed classes should be re-used by other systems. I have suggested that the codegen’ed classes should have compile-time use in identifying to devs where data-contract changes have occurred, so the domain objects can be updated.

Everything being passed into another system is going to be a domain object. You have described your theory of the ledger-message-bus:

If this is your product vision, and the mediating applications connect to the ledger, sending and receiving information to and from the ledger, then your mediator applications will all require “domain objects” to convert between the formats being sent and received by the ledger. Kafka does this with Avro, for example, where the Avro format is used to generate typed classes for the “mediator application”.

Your teams have gone to the trouble of building a “typed” client. This gives a contract for communicating with the ledger, and you then receive that data, but all of the benefits of the typed client are lost as soon as you want to send the data somewhere. Why would you go to all that effort if you could have just used a plain protobuf format and JsonFormat.printer().print(theProtoMessage)?

Instead you have built a client that generates domain-specific classes based on the business data in the ledger and domain-specific classes for all metadata in the ledger/on a contract. When the template/business data changes on the ledger (the template is upgraded), the domain objects need to be updated to account for this change. The codegen would re-create the classes as it currently does, but you are missing the ability to have compile-time connections between the domain object and the typed classes you are generating.
Instead, the developer must manually track the connectivity between their domain object and the typed class being returned by the ledger, which is essentially the equivalent of getting raw JSON and having to re-build a mapping between the JSON/protobuf message and the domain object. The effort can be doubled when data originates from the mediator application and is pushed into the ledger: you now have to remap all of the fields instead of having interfaces to follow.

If you write the sample class I requested above, and then walk through the scenario of “okay, I add 3 more fields into the template, how do I update my domain classes?”, you will see that it is all manual: the developer must walk through the code to find where the classes changed. If you had “composition” (Java interfaces), then the developer could apply those interfaces to their domain objects and have compile-time knowledge of changes to what the “ledger-message-bus” is sending back. (Using the interface does not mean all data / methods are used, but it allows the developer to map the specific data points of interest for their domain objects.)
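To make the point concrete, here is a sketch using a hypothetical codegen-emitted OrganizationFields interface (it does not exist in today’s bindings; all names are illustrative):

```java
// Hypothetical: an interface the codegen could emit from the Organization
// template. It does not exist in the current java-bindings output.
interface OrganizationFields {
    String getName();
    String getRoot();
}

// The message-bus domain object: the 2 business fields plus the 2 metadata
// fields from my example, tied to the ledger contract via the interface.
final class OrganizationBusMessage implements OrganizationFields {
    final String name;
    final String root;
    final String contractId;
    final String eventId;

    OrganizationBusMessage(String name, String root, String contractId, String eventId) { = name;
        this.root = root;
        this.contractId = contractId;
        this.eventId = eventId;
    }

    @Override public String getName() { return name; }
    @Override public String getRoot() { return root; }
}

// If the template gains a field, the regenerated interface gains a getter,
// and OrganizationBusMessage stops compiling until the developer maps the
// new field: compile-time identification instead of manual back-tracking.
```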