Maximum number of record fields?

Hi

It seems to me the language server runs into an issue when a data type has more than 162 fields.

damlc is fine with this, but within DAML Studio an error pops up stating:

ScenarioBackendException {scenarioNote = "Failed to create scenario context",
scenarioBackendError = BErrorClient (ClientIOError (GRPCIOBadStatusCode
StatusInternal (StatusDetails {unStatusDetails = "com.google.protobuf.InvalidProtocolBufferException:
Protocol message had too many levels of nesting. May be malicious. Use
CodedInputStream.setRecursionLimit() to increase the depth limit."})))}

Removing one field from the data type makes this error go away.

There is a hint about using setRecursionLimit, but it is not clear to me where I would set this.

I am also wondering what this has to do with recursion, given that the large data type is flat.

Any suggestions on how to work with large data types?

Regards,
Alex


Hi @Alexander_Bernauer,

There is no theoretical limit on the number of fields a record/template can have; however, we have a hard limit on the depth of Daml-LF expressions. We are aware of this annoying limitation. Unfortunately, solving this shortcoming is currently not our highest priority.

In your case, I suspect the trouble comes from code the compiler automatically generates for a type class associated with your data structure, in particular the derived Show instance. That generated code is deeply nested even though your type itself is flat, which is why the protobuf decoder complains about nesting. Keep in mind that a template definition implicitly derives Eq and Show.

A workaround is to write the Show instance yourself. Below is an example of what I have in mind.

Hope that helps,

Cheers,
Remy

module Mod where

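-- A flat record with 200 Int fields; only Eq is derived, Show is written by hand below.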
data BigRecord = BigRecord with
    x000: Int, x001: Int, x002: Int, x003: Int, x004: Int, x005: Int, x006: Int, x007: Int, x008: Int, x009: Int
    x010: Int, x011: Int, x012: Int, x013: Int, x014: Int, x015: Int, x016: Int, x017: Int, x018: Int, x019: Int
    x020: Int, x021: Int, x022: Int, x023: Int, x024: Int, x025: Int, x026: Int, x027: Int, x028: Int, x029: Int
    x030: Int, x031: Int, x032: Int, x033: Int, x034: Int, x035: Int, x036: Int, x037: Int, x038: Int, x039: Int
    x040: Int, x041: Int, x042: Int, x043: Int, x044: Int, x045: Int, x046: Int, x047: Int, x048: Int, x049: Int
    x050: Int, x051: Int, x052: Int, x053: Int, x054: Int, x055: Int, x056: Int, x057: Int, x058: Int, x059: Int
    x060: Int, x061: Int, x062: Int, x063: Int, x064: Int, x065: Int, x066: Int, x067: Int, x068: Int, x069: Int
    x070: Int, x071: Int, x072: Int, x073: Int, x074: Int, x075: Int, x076: Int, x077: Int, x078: Int, x079: Int
    x080: Int, x081: Int, x082: Int, x083: Int, x084: Int, x085: Int, x086: Int, x087: Int, x088: Int, x089: Int
    x090: Int, x091: Int, x092: Int, x093: Int, x094: Int, x095: Int, x096: Int, x097: Int, x098: Int, x099: Int
    x100: Int, x101: Int, x102: Int, x103: Int, x104: Int, x105: Int, x106: Int, x107: Int, x108: Int, x109: Int
    x110: Int, x111: Int, x112: Int, x113: Int, x114: Int, x115: Int, x116: Int, x117: Int, x118: Int, x119: Int
    x120: Int, x121: Int, x122: Int, x123: Int, x124: Int, x125: Int, x126: Int, x127: Int, x128: Int, x129: Int
    x130: Int, x131: Int, x132: Int, x133: Int, x134: Int, x135: Int, x136: Int, x137: Int, x138: Int, x139: Int
    x140: Int, x141: Int, x142: Int, x143: Int, x144: Int, x145: Int, x146: Int, x147: Int, x148: Int, x149: Int
    x150: Int, x151: Int, x152: Int, x153: Int, x154: Int, x155: Int, x156: Int, x157: Int, x158: Int, x159: Int
    x160: Int, x161: Int, x162: Int, x163: Int, x164: Int, x165: Int, x166: Int, x167: Int, x168: Int, x169: Int
    x170: Int, x171: Int, x172: Int, x173: Int, x174: Int, x175: Int, x176: Int, x177: Int, x178: Int, x179: Int
    x180: Int, x181: Int, x182: Int, x183: Int, x184: Int, x185: Int, x186: Int, x187: Int, x188: Int, x189: Int
    x190: Int, x191: Int, x192: Int, x193: Int, x194: Int, x195: Int, x196: Int, x197: Int, x198: Int, x199: Int
  deriving Eq

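-- Trivial hand-written Show instance, replacing the large derived one.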
instance Show BigRecord where show _ = "<BigRecord>"

template Templ
  with
   p: Party
   r: BigRecord
  where
   signatory p

record = 
  BigRecord
    000 001 002 003 004 005 006 007 008 009
    010 011 012 013 014 015 016 017 018 019
    020 021 022 023 024 025 026 027 028 029
    030 031 032 033 034 035 036 037 038 039
    040 041 042 043 044 045 046 047 048 049
    050 051 052 053 054 055 056 057 058 059
    060 061 062 063 064 065 066 067 068 069
    070 071 072 073 074 075 076 077 078 079
    080 081 082 083 084 085 086 087 088 089
    090 091 092 093 094 095 096 097 098 099
    100 101 102 103 104 105 106 107 108 109
    110 111 112 113 114 115 116 117 118 119
    120 121 122 123 124 125 126 127 128 129
    130 131 132 133 134 135 136 137 138 139
    140 141 142 143 144 145 146 147 148 149
    150 151 152 153 154 155 156 157 158 159
    160 161 162 163 164 165 166 167 168 169
    170 171 172 173 174 175 176 177 178 179  
    180 181 182 183 184 185 186 187 188 189  
    190 191 192 193 194 195 196 197 198 199  

main = scenario do
  alice <- getParty "Alice"
  submit alice do create $ Templ alice record

From a design perspective, it is generally advisable to group fields into sub-record types. Fields can be grouped arbitrarily or logically. There is no information-theoretic difference between these two versions of Foo:

data Foo1 = Foo1 { a: Int, b: Int, c: Int }

data Foo2 = Foo2 { ab: AB, c: Int }
data AB = AB { a: Int, b: Int }

If a set of fields occurs in multiple record types or templates in a Daml program, I think it is generally a nice factoring to define a grouping record type for that set of fields, as sketched below.
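For instance, here is a sketch of that kind of factoring, using a hypothetical Address group shared by two templates (all names are illustrative, not from the original code):

-- Sketch only: a shared field group factored into its own record type.
data Address = Address with
    street : Text
    city : Text
    country : Text
  deriving (Eq, Show)

-- Both templates reuse the Address group instead of repeating its fields.
template Customer
  with
    operator : Party
    address : Address
  where
    signatory operator

template Supplier
  with
    operator : Party
    address : Address
  where
    signatory operator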

That is not the easiest solution if you are code-generating the set of >162 fields, in which case overriding Show or similar may be more expedient. But automatic field set factoring may actually be a nice code-size-reducing optimization for any Daml code generator that has a lot of similar output.
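If you go the Show-override route but still want some visibility into the values, the instance does not have to discard everything. A minimal sketch, replacing the instance Show BigRecord above and printing a couple of arbitrarily chosen fields:

-- Sketch only: a hand-written Show that keeps a few fields but stays shallow.
instance Show BigRecord where
  show r = "BigRecord {x000 = " <> show r.x000 <> ", ..., x199 = " <> show r.x199 <> "}"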
