So with node metrics up and running, one of the first things you might notice is that memory usage is close to 100%. Most of you will know that running low on RAM is usually a bad thing, so I wanted to write this post to explain why I'm allowing the RAM to fill to near its capacity.
Cardano Node is a relatively lightweight application; its CPU and memory usage are typically quite low. However, as the chain transitions from one epoch to the next, the node has to process a large number of calculations. This leads to very large resource spikes at epoch transitions, causing some nodes to crash or fall behind the network.
So we can see how the node is affected here, especially in memory usage. I've seen reports in the community that simply adding more RAM isn't necessarily the answer, as the spikes may stem from a problem in the cardano-node software itself.
In the meantime, my nodes use a large swap file on high-speed storage to provide the overflow memory that is needed during the transition. So when a node's memory looks like it's nearing capacity, just note that there is a large swap file that isn't represented in these metrics; the kernel will page memory out to it once RAM fills. I'll monitor how this performs at the next transition and report back.
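For anyone wanting to try the same approach, here's a rough sketch of how a swap file can be set up on Linux. The path (`/swapfile`) and size (16G) are assumptions on my part, not a recommendation — size it for your own server and place it on your fastest disk.

```shell
# Hypothetical example: create a 16 GB swap file on fast storage.
# Adjust the path and size for your own setup.
sudo fallocate -l 16G /swapfile

# Restrict permissions so only root can read it
sudo chmod 600 /swapfile

# Format it as swap and enable it
sudo mkswap /swapfile
sudo swapon /swapfile

# Make it persistent across reboots
echo '/swapfile none swap sw 0 0' | sudo tee -a /etc/fstab

# Verify it's active
swapon --show
free -h
```

Note that swap usage won't show up in RAM graphs; check `free -h` or add a swap panel to your dashboard if you want visibility into how much is actually being used during the transition.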