Abstract
The desire to understand network systems in increasing detail has resulted in a diversity of generative models that describe large-scale structure in a variety of ways and allow its characterization in a principled and powerful manner. Current models include features such as degree correction, where nodes with arbitrary degrees can belong to the same group, and community overlap, where nodes are allowed to belong to more than one group. However, such elaborations invariably result in an increased number of parameters, which makes these model variants prone to overfitting. Without properly accounting for the increased model complexity, one should naturally expect these larger models to "better" fit empirical networks, regardless of the actual statistical evidence supporting them. Here we propose a principled method of model selection based on the minimum description length principle and posterior odds ratios that fully accounts for the increased degrees of freedom of the larger models, and selects the best model according to the statistical evidence available in the data. In contrast to alternatives such as likelihood ratios and parametric bootstrapping, this method scales very well, and combined with recently developed efficient inference methods, allows for the analysis of very large networks with an arbitrarily large number of groups. In applying this method to many empirical datasets from different fields, we observed that while degree correction tends to provide better fits for a majority of networks, community overlap does not, and is selected as the better model only for a minority of them.
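The comparison described above amounts to weighing two candidate models by their description lengths: if Σ_a and Σ_b are the description lengths (in nats) of two fits of the same network, the posterior odds ratio in favour of model a is Λ = exp(Σ_b − Σ_a). A minimal sketch of this comparison (the function name and the sample description-length values below are illustrative, not taken from the paper):

```python
import math

def posterior_odds(sigma_a, sigma_b):
    """Posterior odds ratio favouring model a over model b,
    given their description lengths in nats (Lambda > 1 favours a)."""
    return math.exp(sigma_b - sigma_a)

# Hypothetical description lengths for two fits of the same network:
dl_degree_corrected = 12304.7  # e.g. degree-corrected variant
dl_plain = 12391.2             # e.g. non-degree-corrected variant

odds = posterior_odds(dl_degree_corrected, dl_plain)
# odds >> 1 would indicate strong evidence for the degree-corrected fit
```

Because only a difference of description lengths enters, the comparison costs essentially nothing beyond fitting each model once, which is what makes the approach scale to very large networks.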
Original language | English |
---|---|
Article number | 011033 |
Number of pages | 20 |
Journal | Physical Review X |
Volume | 5 |
Issue number | 1 |
Early online date | 25 Mar 2015 |
DOIs | |
Publication status | Published - 31 Mar 2015 |
Keywords
- Computer science
- Social and information networks
- Condensed matter
- Disordered systems and neural networks
- Statistical mechanics
- Physics
- Data analysis
- Statistics
- Probability
- Physics and society
- Machine learning