Daml Script fails - INVALID_ARGUMENT: Command interpretation error in LF-DAMLe: Couldn't find package

Recently we upgraded our SDK to 1.13.1 and switched from `daml sandbox` to the daml-on-sql ledger. However, our initialization script now fails.

Here is the output:

```
daml script --dar .daml/dist/xxx.dar --script-name InitializeScript:initializeFixed --ledger-host localhost --ledger-port 6865
SDK 1.16.0 has been released!
See https://github.com/digital-asset/daml/releases/tag/v1.16.0 for details.

Exception in thread "main" com.daml.lf.engine.script.ScriptF$FailedCmd: Command submit failed: INVALID_ARGUMENT: Command interpretation error in LF-DAMLe: Couldn't find package 677c9cf68f35d95037060dde428263dcbe979e282ae16597e5a2d71780228462. Details: N/A.
Daml stacktrace:
submit at 677c9cf68f35d95037060dde428263dcbe979e282ae16597e5a2d71780228462:InitializeScript:98
        at com.daml.lf.engine.script.Runner.$anonfun$runWithClients$11(Runner.scala:431)
        at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
        at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
        at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
        at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:56)
        at akka.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:93)
        at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
        at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85)
        at akka.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:93)
        at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:53)
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:48)
        at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
        at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1016)
        at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1665)
        at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1598)
        at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:177)
Caused by: io.grpc.StatusRuntimeException: INVALID_ARGUMENT: Command interpretation error in LF-DAMLe: Couldn't find package 677c9cf68f35d95037060dde428263dcbe979e282ae16597e5a2d71780228462. Details: N/A.
        at io.grpc.Status.asRuntimeException(Status.java:534)
        at io.grpc.stub.ClientCalls$UnaryStreamToFuture.onClose(ClientCalls.java:533)
        at io.grpc.internal.ClientCallImpl.closeObserver(ClientCallImpl.java:553)
        at io.grpc.internal.ClientCallImpl.access$300(ClientCallImpl.java:68)
        at io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl$1StreamClosed.runInternal(ClientCallImpl.java:739)
        at io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl$1StreamClosed.runInContext(ClientCallImpl.java:718)
        at io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
        at io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
        at java.base/java.lang.Thread.run(Thread.java:830)
daml-helper: Received ExitFailure 1 when running
Raw command: java "-Dlogback.configurationFile=C:\\Users\\bartc\\AppData\\Roaming\\daml\\sdk\\1.13.1\\daml-sdk/script-logback.xml" -jar "C:\\Users\\bartc\\AppData\\Roaming\\daml\\sdk\\1.13.1\\daml-sdk/daml-sdk.jar" script --dar .daml/dist/xxx.dar --script-name InitializeScript:initializeFixed --ledger-host localhost --ledger-port 6865
```

Any thoughts on how to fix this?

The error means that your script (specifically `InitializeScript`, line 98) is trying to submit a command for a template from a package that is not known to the ledger.
There are a few reasons why this can happen:

  1. You never uploaded your templates to the ledger. In this case, upload them via `daml ledger upload-dar`.
  2. You uploaded your templates to the ledger but then changed them locally. “Changing locally” ranges from actual code changes (any change in the same package matters; it doesn’t have to touch the template directly) to changing the SDK version and recompiling. In both cases, compared to the ledger, you have two packages that contain a template with the same name but different package ids. If the change was intentional, upload the new DAR via `daml ledger upload-dar`. If it was unintentional, revert it and the error should go away.
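A minimal sketch of the fix for both cases, assuming the ledger from the original command is still running on localhost:6865 and the DAR path is unchanged:

```sh
# Upload the freshly built DAR so the ledger learns the current package id
daml ledger upload-dar .daml/dist/xxx.dar --host localhost --port 6865

# Then re-run the failing initialization script against the same ledger
daml script --dar .daml/dist/xxx.dar \
  --script-name InitializeScript:initializeFixed \
  --ledger-host localhost --ledger-port 6865
```

Run `daml build` first so the DAR on disk actually matches your current source.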

If you want to iterate on your scripts, it can make sense to put them in a separate package from your templates, one that depends on the templates package via `data-dependencies`. That way you can freely change your scripts without changing the package id of your templates.
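A sketch of that layout (package names and paths here are hypothetical): the scripts package gets its own `daml.yaml` that points at the already-built templates DAR instead of its source:

```yaml
# scripts/daml.yaml -- hypothetical scripts-only package
sdk-version: 1.13.1
name: xxx-scripts
version: 1.0.0
source: daml
dependencies:
  - daml-prim
  - daml-stdlib
  - daml-script
# Depend on the compiled templates DAR, not its source files,
# so editing scripts never changes the templates' package id.
data-dependencies:
  - ../.daml/dist/xxx.dar
```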

To figure out which packages the ledger knows about, you can query the JSON API’s `/v1/packages` endpoint or the gRPC package service.
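That lets you confirm the diagnosis directly: extract the package id from the error and check whether the ledger lists it. A minimal sketch (the HTTP call itself is elided; the `response` payload below is an illustrative example of the shape `GET /v1/packages` returns, not real ledger output):

```python
import json
import re

# Package id as reported in the interpretation error above
error_msg = (
    "Couldn't find package "
    "677c9cf68f35d95037060dde428263dcbe979e282ae16597e5a2d71780228462."
)
missing_pkg = re.search(r"Couldn't find package ([0-9a-f]{64})", error_msg).group(1)

# Illustrative /v1/packages response: "result" is the list of all
# package ids the ledger knows about (sample id is made up).
response = json.loads(
    '{"result": ["c1f1f00558799eec139fb4f4c76f95fb52fa1837'
    'a5dd29600baa1c8ed1bdccfd"], "status": 200}'
)
known = set(response["result"])

if missing_pkg not in known:
    print(f"package {missing_pkg[:8]}... is not on the ledger "
          "-> run `daml ledger upload-dar`")
```

If the id is absent from the list, uploading the DAR is the fix; if it is present, the problem lies elsewhere.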


Thanks @cocreature, `daml ledger upload-dar` solved the issue.