Problem calling daml script directly

Given that daml start does not accept the max message size parameter, I now start my prototype with a standalone sandbox and then execute a script. However, what works fine under daml start (apart from the size limit) fails when run independently. The error message is a bit cryptic and I would appreciate some help with it.

I am running the following:

  1. The sandbox
daml sandbox --maxInboundMessageSize=18584556
INFO: Slf4jLogger started
INFO: Listening on localhost:6865 over plain text.
   ____             ____
  / __/__ ____  ___/ / /  ___ __ __
 _\ \/ _ `/ _ \/ _  / _ \/ _ \\ \ /

INFO: Initialized sandbox version 1.7.0 with ledger-id = af42ab39-e53b-49c0-b972-bfece4b04c51, port = 6865, dar file = List(), time mode = wall-clock time, ledger = in-memory, auth-service = AuthServiceWildcard$, contract ids seeding = strong
  2. The script
daml script --dar .daml/dist/test-0.5.0.dar --script-name Setup:initialize --ledger-host localhost --ledger-port 6865
Error: User abort: Submit failed with code 3: Command interpretation error in LF-DAMLe: Couldn't find package 775ff7ee113954ccc1b46f41783243a86e912a06903bd3c1f0ac31d548cf7a03. Details: N/A.
Exception in thread "main" com.daml.lf.speedy.SError$DamlEUserError
	at com.daml.lf.speedy.SBuiltin$SBError$.executePure(SBuiltin.scala:1379)
	at com.daml.lf.speedy.SBuiltinPure.execute(SBuiltin.scala:55)
	at com.daml.lf.speedy.Speedy$Machine.enterApplication(Speedy.scala:483)
	at com.daml.lf.speedy.SExpr$SEAppAtomicGeneral.execute(SExpr.scala:139)
	at com.daml.lf.speedy.Speedy$
	at com.daml.lf.engine.script.Runner.stepToValue$1(Runner.scala:386)
	at com.daml.lf.engine.script.Runner$$anonfun$$nestedInanonfun$runWithClients$7$1.$anonfun$applyOrElse$13(Runner.scala:461)
	at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307)
	at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41)
	at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55)
	at akka.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:92)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85)
	at akka.dispatch.BatchingExecutor$
	at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:47)
	at java.base/java.util.concurrent.ForkJoinTask.doExec(
	at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(
	at java.base/java.util.concurrent.ForkJoinPool.scan(
	at java.base/java.util.concurrent.ForkJoinPool.runWorker(
	at java.base/
daml-helper: Received ExitFailure 1 when running
Raw command: java -Dlogback.configurationFile=/home/jean/.daml/sdk/1.7.0/daml-sdk/script-logback.xml -jar /home/jean/.daml/sdk/1.7.0/daml-sdk/daml-sdk.jar script --dar .daml/dist/test-0.5.0.dar --script-name Setup:initialize --ledger-host localhost --ledger-port 6865

Yet in my dar file I do have:


Hi @Jean_Safar,

I’m not quite sure which size parameter you mean, but I expect it’s the max inbound message size on either DAML Script or the ledger. You can control both of those via --sandbox-option and --script-option in daml start:

daml start --sandbox-option --max-inbound-message-size=10000000 --script-option --max-inbound-message-size=10000000
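If you prefer not to repeat these flags on every invocation, the same options can also be placed in the project's daml.yaml, which daml start reads on startup. A sketch of such a fragment is below; the project name/version are taken from the DAR name in this thread, and you should double-check the sandbox-options / script-options keys against the docs for your SDK version:

```
# daml.yaml (fragment) -- sketch; name/version assumed from the DAR in this thread
sdk-version: 1.7.0
name: test
version: 0.5.0
source: daml
dependencies:
  - daml-prim
  - daml-stdlib
# Extra flags passed to the sandbox and to DAML Script by daml start
sandbox-options:
  - --max-inbound-message-size=10000000
script-options:
  - --max-inbound-message-size=10000000
```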

As for the error you get if you start it separately, it looks like the DAR has not been uploaded to the ledger. The easiest way is to pass it on startup:

daml sandbox .daml/dist/test-0.5.0.dar

Alternatively you can start the ledger without the DAR and then upload it via daml ledger upload-dar.
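Putting the pieces together, the standalone flow from this thread would look roughly like the following sketch (host and port shown are the sandbox defaults; adjust to your setup, and note this requires the Daml SDK and a running ledger):

```
# Terminal 1: start the sandbox without a DAR
daml sandbox --maxInboundMessageSize=18584556

# Terminal 2: upload the DAR to the running ledger
daml ledger upload-dar .daml/dist/test-0.5.0.dar --host localhost --port 6865

# Now the script can find package Setup and run against the ledger
daml script --dar .daml/dist/test-0.5.0.dar --script-name Setup:initialize \
  --ledger-host localhost --ledger-port 6865
```

The "Couldn't find package" error in the original trace is exactly what you see when this upload step (or passing the DAR to daml sandbox directly) has been skipped.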


Thanks a lot for this @cocreature… I thought it was picking it up by default, and I can’t believe I did not think of passing the DAR directly… I will blame that lapse on the prolonged lockdown…