Work with dependencies
The application from compose is a complete and secure model for atomic swaps of assets, but there is plenty of room for improvement. However, one can’t implement all features before going live with an application, so it’s important to understand how to change code that is already running. There are fundamentally two types of change one may want to make:
- Upgrades, which change existing logic. For example, one might want the Asset template to have multiple signatories.
- Extensions, which merely add new functionality through additional templates.
This chapter covers extensions, extending the compose model with a simple second workflow: a multi-leg trade. In doing so, you’ll learn about:
- The software architecture of the Daml stack
- Dependencies and data dependencies
- Identifiers
As we are extending compose, the setup for this chapter is slightly more complex:
- In a base directory, load the compose project using dpm new intro-compose --template daml-intro-compose. The directory intro7 here is important as it’ll be referenced by the other project we are creating.
- In the same directory, load this chapter’s project using dpm new intro-9 --template daml-intro-9.
This chapter’s project, intro-9, contains a new module Intro.Asset.MultiTrade and a corresponding test module Test.Intro.Asset.MultiTrade.
DAR files, Daml-LF, and the engine
In compose you already learnt a little about projects, Daml-LF, DAR files, and dependencies. In this chapter we will actually need dependencies from the current project to the compose project, so it’s time to learn a little more about all this.
Let’s have a look inside the DAR file of compose. DAR files, like Java JAR files, are just ZIP archives, but the SDK also has a utility to inspect DARs out of the box:
- Navigate into the intro7 directory.
- Build using dpm build -o assets.dar.
- Run dpm damlc inspect-dar assets.dar.
You’ll see that the DAR contains:
- *.dalf files for the project and all its dependencies
- The original Daml source code
- *.hi and *.hie files for each *.daml file
- Some meta-inf and config files

Among the *.dalf files is one like intro7-1.0.0-887056cbb313b94ab9a6caf34f7fe4fbfe19cb0c861e50d1594c665567ab7625.dalf, which is the actual compiled package for the project. *.dalf files contain Daml-LF, which is Daml’s intermediate language. The file contents are a binary encoded protobuf message from the daml-lf schema. Daml-LF is evaluated on the ledger by the Daml Engine, a JVM component that is part of tools like the IDE’s script runner, the Sandbox, or proper production ledgers. If Daml-LF is to Daml what Java bytecode is to Java, the Daml Engine is to Daml what the JVM is to Java.
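Because a DAR is an ordinary ZIP archive, you can also peek inside one programmatically. The following Python sketch is not part of the SDK; the helper name and the use of the standard zipfile module are my own, illustrating the ZIP structure described above:

```python
import zipfile

def list_dalfs(dar_path):
    """Return the names of the compiled *.dalf packages inside a DAR.

    A DAR file is a plain ZIP archive, so the standard-library
    zipfile module is enough to inspect it.
    """
    with zipfile.ZipFile(dar_path) as dar:
        return [name for name in dar.namelist() if name.endswith(".dalf")]

# Example usage (assumes you built assets.dar as described above):
# print(list_dalfs("assets.dar"))
```

The same approach works for any DAR, including ones downloaded from a ledger.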
Hashes and identifiers
Under the heading “DAR archive contains the following packages:” you get a similar looking list of package names, each paired with the long random string again. That hexadecimal string, 887056cbb313b94ab9a6caf34f7fe4fbfe19cb0c861e50d1594c665567ab7625 in this case, is the package hash: the primary and only identifier for a package that’s guaranteed to be available and preserved. Meta information like the name (“intro7”) and version (“1.0.0”) helps make it human readable but should not be relied upon. You may not always get DAR files from your compiler; you may instead load them from a running ledger, or get them from an artifact repository.
We can see this in action. When a DAR file gets deployed to a ledger, not all meta information is preserved.
- Note down your main package hash from running inspect-dar above.
- Start the project by running a ledger and uploading the assets.dar DAR:
- Open another terminal and use the gRPC Ledger API to download the DAR, making sure to replace the hash with the appropriate one.
- Run dpm damlc inspect-dar assets_ledger.dar.
Comparing the two outputs, you’ll see that this meta information is not preserved: someone could deploy an entirely different intro7-1.0.0 with completely different contents, so even when name and version are available, the package hash is the only safe identifier.
That’s why over the Ledger API, all types, like templates and records, are identified by the triple (entity name, module name, package hash). Your client application should know the package hashes it wants to interact with. To aid that, inspect-dar also provides a machine-readable format for the information it emits: dpm damlc inspect-dar --json assets_ledger.dar. The main_package_id field in the resulting JSON payload is the package hash of our project.
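A client application can read that field directly. Here is a minimal Python sketch; the helper function is my own, and only the main_package_id field name comes from the inspect-dar output described above:

```python
import json

def main_package_id(inspect_json: str) -> str:
    """Extract the package hash from `dpm damlc inspect-dar --json` output.

    The main_package_id field name is taken from the documentation above;
    this helper itself is not part of the SDK.
    """
    return json.loads(inspect_json)["main_package_id"]

# Example usage, assuming you saved the JSON output to a file:
# with open("inspect.json") as f:
#     print(main_package_id(f.read()))
```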
Secondly, you’ll notice that all the *.daml, *.hi and *.hie files are gone. This leads us to data dependencies.
Dependencies and data dependencies
Dependencies under the daml.yaml dependencies group rely on the *.hi files. The information in these files is crucial for dependencies like the Daml standard library, which provides functions, types, and typeclasses.
However, as you can see above, this information isn’t preserved. Furthermore, preserving this information may not even be desirable. Imagine we had built intro7 with SDK 1.100.0, and are building intro9 with SDK 1.101.0. All the typeclasses and instances on the inbuilt types may have changed and would then be present twice — once from the current SDK and once from the dependency. This gets messy fast, which is why the SDK does not support dependencies across SDK versions. For dependencies on contract models that were fetched from a ledger, or that come from an older SDK version, there is a simpler kind of dependency called data-dependencies. The syntax for data-dependencies is the same, but they rely only on the “binary” *.dalf files. The name tries to convey that the main purpose of such dependencies is to handle data: records, choices, and templates, the stuff one needs for contract composability across projects.
For an extension model like this one, data-dependencies are appropriate, so the current project includes compose that way:
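As a sketch, such a daml.yaml entry might look like the following; the relative path to the compose project’s DAR is an assumption based on the directory layout described in the setup above:

```yaml
# Hypothetical fragment of intro-9's daml.yaml; the path is an assumption
# based on the setup steps above.
data-dependencies:
  - ../intro7/assets.dar
```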
You’ll notice a module Test.Intro.Asset.TradeSetup, which is almost a carbon copy of the compose trade setup scripts. This is because data-dependencies are designed for using existing contracts and data types; Daml Script is not imported. In practice, we also shouldn’t expect the DAR file we download from the ledger using the Ledger API to contain test scripts. For larger projects it’s good practice to keep them separate and only deploy templates to the ledger.
About project structures
As you’ve seen here, identifiers depend on the package as a whole, and packages always bring all their dependencies with them. Changing anything in a complex dependency graph can therefore have significant repercussions. It is advisable to keep dependency graphs simple, and to separate concerns which are likely to change at different rates into separate packages. For example, in all our projects in this intro, including this chapter, our scripts are in the same project as our templates. In practice, that means changing a test changes all identifiers, which is not desirable. It’s better for maintainability to separate tests from main templates. If we had done that in Design Patterns, it would also have saved us from copying those modules. Similarly, we included Trade in the same project as Asset in Design Patterns, even though Trade is a pure extension to the core Asset model. If we expect Trade to need more frequent changes, it may be a good idea to split it out into a separate project from the start.
Next up
The MultiTrade model has more complex control flow and data handling than previous models. In Language Fundamentals you’ll learn how to write more advanced logic: control flow, folds, common typeclasses, custom functions, and the Daml standard library. We’ll be using the same projects, so don’t delete your folders just yet.
Building with dpm
When working on Canton Network projects, use the dpm tool for all build operations:
- dpm build — Compile your Daml project and produce a DAR file
- dpm build --all — Build all packages in a multi-package project
- dpm test — Run all Daml Script tests in the current package
- dpm codegen-java — Generate Java bindings from your Daml model
- dpm codegen-js — Generate TypeScript/JavaScript bindings from your Daml model
- dpm sandbox — Start a local Canton sandbox for integration testing
The dpm tool wraps the Daml compiler and build system with Canton Network defaults, handling SDK versioning and dependency resolution automatically.
How to build Daml Archive (.dar) files
This guide shows you how to organize the source code defining your Daml workflows and how to build and package that code as Daml Archive (.dar) files, which you can deploy to the ledger or use to develop applications against. The guide is organized into the following smaller how-tos:
- How to define and build one or more Daml packages
- How to manage dependencies on third-party Daml packages
- How to decide what Daml code to put into what package
How to define Daml packages
Single package
All Daml packages require a daml.yaml file. Create this file at the root of your package directory. You will need the following information to populate this file:
- SDK version: call dpm version to determine the installed SDK versions.
- Package name: a lower-skewer-case name that is unique to your package and company.
Populate your daml.yaml with this information, replacing the <place-holders> as appropriate.
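As a sketch, a minimal daml.yaml might look like this; the exact set of fields shown is illustrative, and the placeholder values must be replaced with your own:

```yaml
# Minimal daml.yaml sketch; replace the <place-holders>.
sdk-version: <sdk-version>   # from `dpm version`
name: <my-package-name>      # lower-skewer-case, unique to you
version: 1.0.0
source: daml                 # directory containing your .daml files
dependencies:
  - daml-prim
  - daml-stdlib
```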
The Daml source files (.daml files) for the package are placed in the directory specified by the source field of the daml.yaml. Create a daml folder at the root of your package and write your .daml files in this directory. The file name must match the module header in the file, treating dots as directories, as shown:
- MyModule.daml contains module MyModule where
- Path/To/My/Module.daml contains module Path.To.My.Module where
Multiple packages
Your Daml project will usually need at least two packages, one for workflows and one for testing. Daml provides support for building and developing these packages via the multi-package.yaml file. In a directory above your package directories, create a multi-package.yaml file. List the relative paths to your packages in this file using the following structure:
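For example, a multi-package.yaml for a workflow package and a test package might look like this sketch; the directory names are hypothetical:

```yaml
# multi-package.yaml sketch; package directory names are hypothetical.
packages:
  - ./my-workflows
  - ./my-workflows-tests
```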
These paths point to the directory containing each package’s daml.yaml, not to the daml.yaml file itself.
Environment variables in configuration files
When your project has more than one package, consider using environment variables to avoid duplicating information like the sdk-version. Replace the sdk-version field with sdk-version: $SDK_VERSION (or any other valid environment variable name), and ensure this variable is set before building:
SDK_VERSION=3.4.9 dpm build --all
Variables can also be placed inline, and are supported on all string fields in the daml.yaml, as the following example shows:
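For instance, an inline variable on a string field might look like the following sketch; the package name and variable names are hypothetical:

```yaml
# daml.yaml sketch with environment variables; names are hypothetical.
sdk-version: $SDK_VERSION
name: my-package-$VARIANT   # variable used inline within a string field
version: 1.0.0
source: daml
```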
How to build Daml packages
To build a single package, navigate to its root directory and rundpm build. To build all packages in a multi-package project, navigate to the directory containing the multi-package.yaml and run dpm build --all. By default these will create a Daml Archive (.dar) file for each package built in <package-directory>/.daml/dist/<package-name>-<package-version>.dar. .dar files are used both for uploading to the Canton Ledger, and for package dependencies. The location where the .dar is created can be overridden using the --output flag for dpm build, which can also be provided in the daml.yaml file under the build-options field:
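As a sketch, passing the output flag via the daml.yaml could look like this; the output path is hypothetical:

```yaml
# daml.yaml sketch: forward --output to the compiler via build-options.
build-options:
  - --output=./dars/my-package.dar
```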
The full list of build options is available via dpm build --help, which includes options for changing the LF version and configuring warnings. All of these options can also be provided via the build-options field described above. Consider reading Recommended Build Options for our recommended set of warning flags.
If you face issues when changing configuration options like the sdk-version or the LF version, cleaning the package(s) may help. To clean a single package, run daml clean from the package directory. To clean all packages in a project, run daml clean --all from the directory containing the multi-package.yaml.
How to depend on Daml packages
Dependencies in Daml are specified by their Daml Archive (.dar) file. To add a dependency to your package, add the paths to your dependency .dar files to your daml.yaml as follows:
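A sketch of such an entry, assuming two vendored third-party DARs; the file names and paths are hypothetical:

```yaml
# daml.yaml sketch: depending on vendored third-party DARs.
data-dependencies:
  - ../dars/vendored/asset-model-1.0.0.dar
  - ../dars/vendored/trade-model-2.1.0.dar
```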
Note that this uses the data-dependencies field instead of the previously covered dependencies field; the latter is reserved for daml-prim, daml-stdlib, and the optional testing library daml-script. Once added to the daml.yaml, modules from the dependency .dar can be imported from the modules of this package. In the event of a collision between module names, either with this package or with other dependencies, see module-prefixes.
When depending on .dar files from packages listed in the multi-package.yaml, calling dpm build and dpm build --all will build the relevant packages in the correct order for you.
How to manage dependencies on third-party Daml packages
To build composed transactions, you will need to depend on the .dar files of third-party applications. At the time of writing there is no dedicated package repository for Daml Archives. However, .dar files are reasonably small and change infrequently, so you are best off checking them into your repository, in a dars/vendored directory. If you instead retrieve the .dar files as part of a build step, check the hashes of these DARs as part of that step. If you intend to distribute your .dar files for others to build on, include the retrieval process in your documentation.
Depending on daml-script test libraries
The daml-script library is not cross-compatible between different Daml SDK versions. Therefore, when using Daml Script test code shared by third-party apps, we recommend you vendor in that code, for example by checking it into a daml/vendored/ directory in your repository. A good example is the Canton Network Token Standard test harness provided by splice here: https://github.com/DACH-NY/canton-network-node/tree/main/token-standard/splice-token-standard-test. Adding these packages to your multi-package.yaml will ensure they are built as needed.