We are using DamlHub. We have upgraded our Daml code to SDK 2.10.0, specifying LF=1.17. We uploaded a DAR to a DamlHub scratchpad on Protocol 7. No problems there.
When we upload the next version of this DAR with changes that do NOT meet the SCU criteria, we see a DAR_NOT_VALID_UPGRADE message in the logs, which is fine.
But what I had expected was that, even though the new DAR is not a valid upgrade according to SCU, we could still reference its templates as package-name:module-name:template-name. Instead, via the JSON API, we get a JsonError: Cannot resolve template ID.
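For concreteness, here is a sketch of the kind of JSON API call we're making; the host, token, and the my-package / Main:Asset names are placeholders, not our actual project:

```sh
# Sketch only: host, JWT, and package/module/template names are placeholders.
curl -s -X POST "https://<json-api-host>/v1/query" \
  -H "Authorization: Bearer $HUB_JWT" \
  -H "Content-Type: application/json" \
  -d '{"templateIds": ["my-package:Main:Asset"]}'
# For the rejected DAR, this comes back with: JsonError: Cannot resolve template ID
```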
Not all our DAR changes will meet the SCU criteria, even after we activate SCU (by using Protocol 7, SDK 2.10.x, and LF 1.17).
Is there something special we have to do to get DamlHub to “recognize” an uploaded DAR that does not meet the SCU criteria?
To answer my own question: if I know that a set of package changes will NOT satisfy the SCU criteria, I can remove the LF=1.17 setting, which reverts the DAR to the default LF of 1.15, so that DAR does not come under the purview of the SCU system. DamlHub then recognizes the DAR, and I can access its templates using the package ID. If the next change after that DOES satisfy the SCU criteria (relative to the last SCU-recognized DAR), I set LF=1.17 again and the SCU-recognized series of DARs resumes (minus the intervening LF=1.15 one).
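For anyone else reading this, the toggle I'm describing is just the --target build option in daml.yaml; the package name and version below are placeholders:

```yaml
# daml.yaml (sketch; name/version are placeholders)
sdk-version: 2.10.0
name: my-package
version: 1.2.0
source: daml
dependencies:
  - daml-prim
  - daml-stdlib
build-options:
  - --target=1.17   # LF 1.17: the DAR is subject to SCU upgrade validation
# Drop the --target option (or set --target=1.15) to build an LF 1.15 DAR
# that stays outside the SCU checks.
```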
This is correct. SCU only applies to LF 1.17 DARs, so it's possible to opt out by remaining on LF 1.15.
In general, when LF 1.17 DARs are loaded onto an SCU-enabled ledger, the ledger will enforce the SCU upgrade constraints. There is no way to create an LF 1.17 DAR that is not subject to these checks on an SCU-enabled ledger.
@Ben_McAllister From a functional point of view it doesn’t really matter, but technically speaking these DARs are in fact being deployed to the ledger. The functional problem stems from the fact that the DARs are invalid from an SCU perspective and are therefore not usable on the ledger, even though they have been deployed.
The best practice for handling this case is to avoid it in a Hub context: use something like a local sandbox to test and verify DARs before deploying them to Hub.
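A minimal sketch of that local check, assuming a 2.10 sandbox configured with SCU enabled (Protocol 7) so that validation matches Hub; file names, versions, and the port are placeholders:

```sh
# Build the candidate version of the package.
daml build                      # e.g. produces .daml/dist/my-package-1.2.0.dar

# In a separate terminal, start a local sandbox ledger.
daml sandbox --port 6865

# Upload the currently deployed version, then the candidate upgrade.
daml ledger upload-dar --host localhost --port 6865 .daml/dist/my-package-1.1.0.dar
daml ledger upload-dar --host localhost --port 6865 .daml/dist/my-package-1.2.0.dar

# If the second upload is rejected with DAR_NOT_VALID_UPGRADE, fix the package
# (or fall back to LF 1.15) before touching the Hub scratchpad.
```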