Hi team,
How do you estimate the infrastructure sizing for a node that will only host the DAML JSON API? Let's say for 100, 300, 1,000, or 10,000 constant connections? Will scaling the machines up drastically increase processing capacity, or should the approach be to scale horizontally with multiple JSON API nodes pointing to the Ledger API?
Are there any best practices or recommended sizing ?
Cheers,
Jean-Paul
This depends heavily on the load you expect on those connections, what "constant" means, how you deploy the individual components, how you use the HTTP JSON API, and so on. There are so many factors at play that any sizing recommendation more specific than "somewhere between 1 MB and 1 TB" would be misleading at best. The best recommendation I can give is to start with a small deployment and use metrics from the HTTP JSON API, the JVM, and the operating system to measure and visualize how your system responds to typical access patterns. Then stress test the system, changing one variable at a time to see how each change affects resource usage, and use that data to decide which sizing and scaling strategies work best for your particular use case.
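To make "change one variable at a time" concrete, here is a minimal stdlib-only sketch of that kind of experiment: fix everything except the concurrency level, fire a batch of requests, and record latency percentiles. The `request_fn` callable is a placeholder assumption, not part of any DAML tooling; in a real run you would plug in an HTTP call against your own JSON API deployment.

```python
# Hypothetical load-test harness: vary one parameter (here, concurrency)
# while holding everything else fixed, and record latency percentiles.
# request_fn is a placeholder for a real call against your deployment.
import concurrent.futures
import statistics
import time

def percentile(samples, p):
    """Nearest-rank percentile of a list of latency samples (seconds)."""
    ordered = sorted(samples)
    k = max(0, min(len(ordered) - 1, round(p / 100 * len(ordered)) - 1))
    return ordered[k]

def run_load(request_fn, concurrency, requests_per_worker):
    """Run request_fn repeatedly at a fixed concurrency level."""
    def worker():
        latencies = []
        for _ in range(requests_per_worker):
            start = time.perf_counter()
            request_fn()  # e.g. an HTTP POST to the JSON API
            latencies.append(time.perf_counter() - start)
        return latencies

    with concurrent.futures.ThreadPoolExecutor(max_workers=concurrency) as pool:
        futures = [pool.submit(worker) for _ in range(concurrency)]
        samples = [s for f in futures for s in f.result()]

    return {
        "requests": len(samples),
        "mean": statistics.mean(samples),
        "p50": percentile(samples, 50),
        "p95": percentile(samples, 95),
    }
```

You would then rerun `run_load` with concurrency at 100, 300, 1,000, etc., while watching the JSON API, JVM, and OS metrics, so that any change in the percentiles can be attributed to that one variable.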
If you have concrete questions about how and why varying certain resource allocations changes the behavior in unexpected ways, this forum is definitely the right place to ask.