Accuracy-driven recommender systems risk confining users to ``filter bubbles'' of familiar content. Recent work on coVariance Neural Networks (VNNs) provides a scalable alternative to Principal Component Analysis (PCA) for modelling high-order correlations, but their impact on beyond-accuracy metrics (BAMs), such as Novelty and Diversity, remains unexplored.
We use the user–user covariance matrix (or its inverse, the precision matrix) as the graph shift operator (GSO) and train SelectionGNN-based VNNs (SVNNs) on the MovieLens-100K dataset.
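For context, each SVNN layer applies a learnable polynomial graph filter to the user signal; a minimal sketch of one layer, with illustrative filter order \(K\) and taps \(h_k\), is
\[
\mathbf{z} = \sigma\!\left(\sum_{k=0}^{K} h_k\, \mathbf{S}^{k}\, \mathbf{x}\right), \qquad \mathbf{S} \in \{\hat{\mathbf{C}},\, \hat{\mathbf{C}}^{-1}\},
\]
where \(\hat{\mathbf{C}}\) is the sample user–user covariance and \(\sigma\) a pointwise nonlinearity.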
Two training regimes are evaluated: (i) an RMSE-only loss (No-BAM-SVNN) and (ii) a compound loss that adds novelty and diversity terms (BAM-SVNN).
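One plausible form of the compound objective, with illustrative trade-off weights \(\lambda_{\mathrm{nov}}\) and \(\lambda_{\mathrm{div}}\), is
\[
\mathcal{L} = \mathcal{L}_{\mathrm{RMSE}} - \lambda_{\mathrm{nov}}\,\mathrm{Novelty} - \lambda_{\mathrm{div}}\,\mathrm{Diversity},
\]
so that more novel and diverse recommendations lower the loss alongside rating accuracy.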
For each regime we sweep six graph configurations (covariance or precision GSO crossed with {dense, hard-threshold, soft-threshold} sparsification) under five random seeds, yielding 30 runs per regime.
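With an illustrative threshold \(\tau > 0\), the sparsification rules follow the standard estimators: hard thresholding retains \([\mathbf{S}_\tau]_{ij} = s_{ij}\,\mathbb{1}\{|s_{ij}| \ge \tau\}\), soft thresholding shrinks entries to \([\mathbf{S}_\tau]_{ij} = \operatorname{sign}(s_{ij})\,\max(|s_{ij}| - \tau, 0)\), and the dense configuration leaves \(\mathbf{S}\) unaltered.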
Baseline comparisons include PCA, a naive mean–std model, and a random predictor.
The best SVNN configuration increases recommendation Novelty by 2.8 percentage points and matches PCA’s Diversity while incurring only a 0.03 RMSE penalty.
Hard-thresholded precision graphs provide the lowest SVNN RMSE (0.952), whereas dense covariance graphs maximise Diversity (0.868).
Integrating novelty/diversity directly into the loss offers no additional benefit yet multiplies runtime by a factor of 33.
One-way ANOVA indicates that model family explains 97.6% of RMSE variance (\(\eta^2=0.976\)) and 77.8% of novelty variance.
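Here \(\eta^2 = SS_{\mathrm{between}} / SS_{\mathrm{total}}\), the between-group share of the total sum of squares, so a value of 0.976 means the choice of model family accounts for nearly all run-to-run RMSE variation.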
To our knowledge, this work is the first to benchmark (sparsified) VNNs on beyond-accuracy metrics, demonstrating a favourable accuracy–novelty trade-off and clarifying when sparsification and BAM-weighted training pay off.
All code, data splits and statistical notebooks are released for full reproducibility.