Measuring the complexity of a graph: the Tsallis entropy of a graph

In 2008, Passerini and Severini introduced the notion of the von Neumann entropy of a graph.

Here, the normalized eigenvalues of the Laplacian of the graph masquerade as probabilities, and the Shannon entropy formula is sprinkled on them to get a complexity measure called the 'von Neumann entropy of the graph'.
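To make this concrete, the construction (as I understand the Passerini-Severini paper) is:

```latex
\rho = \frac{L}{\operatorname{Tr}(L)} = \frac{D - A}{\sum_i d_i},
\qquad
S_{\mathrm{vN}}(G) = -\operatorname{Tr}(\rho \ln \rho) = -\sum_i \lambda_i \ln \lambda_i,
```

where $L = D - A$ is the combinatorial Laplacian, $d_i$ are the vertex degrees, and $\lambda_i$ are the eigenvalues of the density matrix $\rho$ (with the convention $0 \ln 0 = 0$). Since $\operatorname{Tr}(L) = \sum_i d_i$, the $\lambda_i$ are non-negative and sum to 1, which is what lets them pose as probabilities.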

My idea: wouldn't it be better to use the Tsallis non-extensive entropy measure on these 'probabilities'? The benefits of doing this are twofold:

1: Since the Tsallis entropy involves only a sum of probabilities raised to some power, my hunch is that this sum is far easier to evaluate than the Shannon entropy, which needs precise knowledge of every eigenvalue in order to take its logarithm. In particular, for integer q the sum equals the trace of a matrix power, so no eigendecomposition is needed at all. For large graphs it will be mighty hard to compute the von Neumann entropy exactly, which is why I prefer the Tsallis variant.

2: This sum of probabilities raised to some power also appears in many graph-characterizing indices, such as mean-first-passage-time metrics (and, more broadly, the family of random-walk-derived graph coverage metrics) as well as the Estrada index family. It should be quite straightforward to map metrics from these families onto the Tsallis(q) family, plausibly leading to very nice interpretations of what it means for a graph to be too complex to be walked upon easily.
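Point 1 above can be sketched quickly. The identity behind it is $\sum_i \lambda_i^q = \operatorname{Tr}(R^q)$, so for integer $q \geq 2$ the Tsallis entropy $S_q = (1 - \operatorname{Tr}(R^q))/(q-1)$ needs only repeated matrix multiplication. Below is a minimal pure-Python sketch of this (the helper names `matmul`, `trace`, and `tsallis_entropy` are mine, not from the post, and this is deliberately naive, not an optimized implementation):

```python
def matmul(X, Y):
    """Plain dense matrix product of two square matrices given as lists of lists."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def trace(X):
    """Sum of diagonal entries."""
    return sum(X[i][i] for i in range(len(X)))

def tsallis_entropy(A, q):
    """Tsallis entropy S_q = (1 - Tr(R^q)) / (q - 1) for integer q >= 2,
    where R = (D - A) / sum(degrees) is the graph's density matrix.
    No eigendecomposition is performed -- only matrix powers."""
    n = len(A)
    deg = [sum(row) for row in A]
    vol = sum(deg)
    # Density matrix R = Laplacian / trace(Laplacian)
    R = [[((deg[i] if i == j else 0) - A[i][j]) / vol for j in range(n)]
         for i in range(n)]
    P = R
    for _ in range(q - 1):   # compute R^q by repeated multiplication
        P = matmul(P, R)
    return (1 - trace(P)) / (q - 1)

# Triangle graph K3: Laplacian spectrum {0, 3, 3}, degree sum 6, so the
# "probabilities" are {0, 1/2, 1/2} and S_2 = (1 - (1/4 + 1/4)) / 1 = 1/2.
A = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]
print(tsallis_entropy(A, 2))   # -> 0.5 (up to floating point)
```

The triangle makes the claim easy to check by hand, since its Laplacian spectrum is known in closed form.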

Now the big question: does the Tsallis entropy actually give nice, increasing complexity curves?

Answer: Yep!

Here is the MATLAB code for these entropies. Try it out yourself!

function [s_vn, s_tsallis] = graph_entropy(A, q)
% This function evaluates the Tsallis entropy, s_tsallis, and the
% von Neumann entropy, s_vn, of a given graph with adjacency matrix A.
% The input variables are the adjacency matrix A and the Tsallis
% non-extensive q-index.
n = length(A);
d = sum(A, 2);                    % degree vector
L = diag(d) - A;                  % combinatorial Laplacian
R = L / sum(d);                   % R is the density matrix (trace 1)
lambda = eig(R);                  % eigenvalues play the role of probabilities
lambda = lambda(lambda > 1e-12);  % drop (numerically) zero eigenvalues
s_vn = -sum(lambda .* log(lambda));
s_tsallis = (1 - sum(lambda.^q)) / (q - 1);
end % End of function
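As a quick sanity check on the formulas (a sketch of mine, not part of the post, assuming natural logarithms as in the MATLAB code above): for the triangle graph K3 the Laplacian spectrum is {0, 3, 3} and the degree sum is 6, so the eigenvalue "probabilities" are {0, 1/2, 1/2}.

```python
import math

# Triangle graph K3: probabilities {0, 1/2, 1/2}; the zero eigenvalue
# contributes nothing to either entropy (0 * log 0 is taken as 0).
p = [0.0, 0.5, 0.5]
s_vn = -sum(x * math.log(x) for x in p if x > 0)
s_tsallis = (1 - sum(x**2 for x in p)) / (2 - 1)   # Tsallis with q = 2
print(s_vn)       # -> 0.6931... (= ln 2)
print(s_tsallis)  # -> 0.5
```

Running `graph_entropy(A, 2)` on the 3x3 all-ones-off-diagonal adjacency matrix should reproduce these two values.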