Raise warnings when users try to compute extremely large kernel matrices
Subject of the issue
Warnings should be given when users (often unknowingly) try to compute similarity matrices whose sizes exceed what can be held in memory.
Current situation
The size of the kernel matrix grows quadratically with the number of graphs and with the number of nodes in each graph. For example, a full kernel matrix among 100,000 graphs would contain 10 billion elements and, at 4 bytes per element, take up 40 GB of memory! A nodal similarity matrix could grow even faster and thus break the computation.
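The arithmetic above can be checked directly; this is just the back-of-the-envelope calculation, not part of any library API:

```python
# Memory needed for a dense n-by-n kernel matrix of float32 entries
# (4 bytes per element).
n = 100_000
bytes_needed = n * n * 4
print(bytes_needed / 1e9)  # 40.0 (GB)
```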
Expected behaviour
The graph kernel should compare the size of the requested kernel matrix against the amount of RAM available on the system and raise a warning if the computation would consume a high percentage of it.
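A minimal sketch of the proposed check, assuming hypothetical names (`check_kernel_matrix_size`, a 4-byte element size, and a 50% threshold are all illustrative choices, not existing API). In practice the available-memory figure could come from something like `psutil.virtual_memory().available`; here it is passed in as a parameter to keep the sketch self-contained:

```python
import warnings


def check_kernel_matrix_size(n_graphs, available_bytes,
                             dtype_size=4, threshold=0.5):
    """Warn if the dense n-by-n kernel matrix would consume more than
    `threshold` of the available memory. Returns the required bytes."""
    required = n_graphs * n_graphs * dtype_size
    if required > threshold * available_bytes:
        warnings.warn(
            f"Kernel matrix would need {required / 1e9:.1f} GB, "
            f"more than {threshold:.0%} of available memory "
            f"({available_bytes / 1e9:.1f} GB).",
            ResourceWarning,
        )
    return required


# 100,000 graphs on a machine with 64 GB available: 40 GB > 50% of 64 GB,
# so a ResourceWarning is emitted before any allocation is attempted.
check_kernel_matrix_size(100_000, available_bytes=64e9)
```

Warning before allocating (rather than failing mid-computation) lets the user downcast the element type, subsample the graphs, or switch to a chunked computation.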
Edited by Yu-Hang "Maxin" Tang