yog/centrality
Centrality measures for identifying important nodes in graphs.
Provides degree, closeness, harmonic, betweenness, and PageRank centrality. All functions return a dict.Dict(Int, Float) (aliased as Centrality) mapping node IDs to their scores.
Overview
| Measure | Function | Best For |
|---|---|---|
| Degree | degree/2 | Local connectivity |
| Closeness | closeness/5 | Distance to all others |
| Harmonic | harmonic_centrality/5 | Disconnected graphs |
| Betweenness | betweenness/5 | Bridge/gatekeeper detection |
| PageRank | pagerank/2 | Link-quality importance |
Types
A mapping of Node IDs to their calculated centrality scores.
pub type Centrality =
dict.Dict(Int, Float)
Specifies which edges to consider for directed graphs.
pub type DegreeMode {
InDegree
OutDegree
TotalDegree
}
Constructors
- InDegree: Consider only incoming edges (prestige).
- OutDegree: Consider only outgoing edges (gregariousness).
- TotalDegree: Consider both incoming and outgoing edges.
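As a sketch of how the modes differ (assuming `graph` is a directed `Graph(n, e)` value built elsewhere; the construction API is not part of this module):

```gleam
// InDegree ranks nodes by how many edges point at them (prestige),
// OutDegree by how many edges they originate (gregariousness),
// TotalDegree by the sum of both.
let prestige = centrality.degree(graph, centrality.InDegree)
let gregariousness = centrality.degree(graph, centrality.OutDegree)
let combined = centrality.degree(graph, centrality.TotalDegree)
```

For undirected graphs all three modes produce the same scores, so degree_total is the simpler entry point.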
Configuration options for the PageRank algorithm.
PageRank models a “random surfer” who follows links with probability
damping and jumps to a random page with probability 1 - damping.
Fields
- damping: Probability of continuing to follow links (typically 0.85). Higher values mean the surfer follows more links before jumping to a random node.
- max_iterations: Maximum iterations before returning the current scores.
- tolerance: Convergence threshold. The algorithm stops when the L1 norm of the score changes falls below this value.
Default Options
Use default_pagerank_options() for standard settings:
- damping: 0.85
- max_iterations: 100
- tolerance: 0.0001
pub type PageRankOptions {
PageRankOptions(
damping: Float,
max_iterations: Int,
tolerance: Float,
)
}
Constructors
- PageRankOptions(damping: Float, max_iterations: Int, tolerance: Float)
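Because PageRankOptions is a plain record, Gleam's record update syntax can override a single field while keeping the remaining defaults (a sketch using the fields defined above):

```gleam
// Start from the defaults and only raise the damping factor.
let options =
  centrality.PageRankOptions(
    ..centrality.default_pagerank_options(),
    damping: 0.9,
  )
```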
Values
pub fn alpha_centrality(
graph: model.Graph(n, e),
alpha: Float,
initial: Float,
max_iterations: Int,
tolerance: Float,
) -> dict.Dict(Int, Float)
Calculates Alpha Centrality for all nodes.
Alpha centrality is a generalization of Katz centrality for directed graphs. It measures the total number of paths from a node, weighted by path length with attenuation factor alpha.
Unlike Katz, alpha centrality does not include a constant beta term and is particularly useful for analyzing influence in directed networks.
Formula: C(v) = α * Σ C(u) for all predecessors u (or neighbors for undirected)
Time Complexity: O(max_iterations * (V + E))
Parameters
- alpha: Attenuation factor (typically 0.1-0.5)
- initial: Initial centrality value for all nodes
- max_iterations: Maximum number of iterations
- tolerance: Convergence threshold
Example
centrality.alpha_centrality(graph, alpha: 0.3, initial: 1.0, max_iterations: 100, tolerance: 0.0001)
// => dict.from_list([#(1, 2.0), #(2, 3.0), #(3, 2.0)])
pub fn betweenness(
graph: model.Graph(n, e),
with_zero zero: e,
with_add add: fn(e, e) -> e,
with_compare compare: fn(e, e) -> order.Order,
with_to_float to_float: fn(e) -> Float,
) -> dict.Dict(Int, Float)
Calculates Betweenness Centrality for all nodes.
Betweenness centrality of a node v is the sum of the fraction of all-pairs shortest paths that pass through v.
Time Complexity: O(VE) for unweighted, O(VE + V²logV) for weighted.
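No example is shown above; by analogy with the closeness example later in this page, a call on an Int-weighted graph might look like the following sketch (`graph` is assumed to be built elsewhere, and the output is not verified):

```gleam
import gleam/int

// Unweighted graphs can be modeled with every edge as cost 1,
// or via the betweenness_int convenience wrapper below.
centrality.betweenness(
  graph,
  with_zero: 0,
  with_add: int.add,
  with_compare: int.compare,
  with_to_float: int.to_float,
)
```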
pub fn betweenness_float(
graph: model.Graph(n, Float),
) -> dict.Dict(Int, Float)
Betweenness centrality with Float weights.
pub fn betweenness_int(
graph: model.Graph(n, Int),
) -> dict.Dict(Int, Float)
Betweenness centrality with Int weights.
pub fn closeness(
graph: model.Graph(n, e),
with_zero zero: e,
with_add add: fn(e, e) -> e,
with_compare compare: fn(e, e) -> order.Order,
with_to_float to_float: fn(e) -> Float,
) -> dict.Dict(Int, Float)
Calculates Closeness Centrality for all nodes.
Closeness centrality measures how close a node is to all other nodes in the graph. It is calculated as the reciprocal of the sum of the shortest path distances from the node to all other nodes.
Formula: C(v) = (n - 1) / Σ d(v, u) for all u ≠ v
Note: In disconnected graphs, nodes that cannot reach all other nodes will have a centrality of 0.0. Consider harmonic_centrality for disconnected graphs.
Time Complexity: O(V * (V + E) log V) using Dijkstra from each node
Parameters
- zero: The identity element for distances (e.g., 0 for integers)
- add: Function to add two distances
- compare: Function to compare two distances
- to_float: Function to convert the distance type to Float for the final score
Example
centrality.closeness(
graph,
with_zero: 0,
with_add: int.add,
with_compare: int.compare,
with_to_float: int.to_float,
)
// => dict.from_list([#(1, 0.666), #(2, 1.0), #(3, 0.666)])
pub fn closeness_float(
graph: model.Graph(n, Float),
) -> dict.Dict(Int, Float)
Closeness centrality with Float weights. Uses 0.0 as zero, float.add, float.compare, and the identity function for conversion.
pub fn closeness_int(
graph: model.Graph(n, Int),
) -> dict.Dict(Int, Float)
Closeness centrality with Int weights (e.g., unweighted graphs). Uses 0 as zero, int.add, int.compare, and int.to_float.
pub fn default_pagerank_options() -> PageRankOptions
Default PageRank options (damping=0.85, max_iterations=100, tolerance=0.0001).
pub fn degree(
graph: model.Graph(n, e),
mode: DegreeMode,
) -> dict.Dict(Int, Float)
Calculates the Degree Centrality for all nodes in the graph.
For directed graphs, use mode to specify which edges to count.
For undirected graphs, the mode is ignored.
pub fn degree_total(
graph: model.Graph(n, e),
) -> dict.Dict(Int, Float)
Degree centrality with default options for undirected graphs. Uses TotalDegree mode.
pub fn eigenvector(
graph: model.Graph(n, e),
max_iterations: Int,
tolerance: Float,
) -> dict.Dict(Int, Float)
Calculates Eigenvector Centrality for all nodes.
Eigenvector centrality measures a node’s influence based on the centrality of its neighbors. A node is important if it is connected to other important nodes. Uses power iteration to converge on the principal eigenvector.
Time Complexity: O(max_iterations * (V + E))
Parameters
- max_iterations: Maximum number of power iterations
- tolerance: Convergence threshold for the L2 norm
Example
centrality.eigenvector(graph, max_iterations: 100, tolerance: 0.0001)
// => dict.from_list([#(1, 0.707), #(2, 1.0), #(3, 0.707)])
pub fn harmonic_centrality(
graph: model.Graph(n, e),
with_zero zero: e,
with_add add: fn(e, e) -> e,
with_compare compare: fn(e, e) -> order.Order,
with_to_float to_float: fn(e) -> Float,
) -> dict.Dict(Int, Float)
Calculates Harmonic Centrality for all nodes.
Harmonic centrality is a variation of closeness centrality that handles disconnected graphs gracefully. It sums the reciprocals of the shortest path distances from a node to all other reachable nodes.
Formula: H(v) = Σ (1 / d(v, u)) / (n - 1) for all u ≠ v
Time Complexity: O(V * (V + E) log V)
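A usage sketch mirroring the closeness example below (`graph` is assumed to be built elsewhere; unreachable pairs simply contribute nothing rather than zeroing the whole score):

```gleam
import gleam/int

// Same comparator bundle as closeness, but safe on disconnected graphs.
centrality.harmonic_centrality(
  graph,
  with_zero: 0,
  with_add: int.add,
  with_compare: int.compare,
  with_to_float: int.to_float,
)
```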
pub fn harmonic_centrality_float(
graph: model.Graph(n, Float),
) -> dict.Dict(Int, Float)
Harmonic centrality with Float weights.
pub fn harmonic_centrality_int(
graph: model.Graph(n, Int),
) -> dict.Dict(Int, Float)
Harmonic centrality with Int weights.
pub fn katz(
graph: model.Graph(n, e),
alpha: Float,
beta: Float,
max_iterations: Int,
tolerance: Float,
) -> dict.Dict(Int, Float)
Calculates Katz Centrality for all nodes.
Katz centrality is a variant of eigenvector centrality that adds an attenuation factor (alpha) to prevent the infinite accumulation of centrality in cycles. It also includes a constant term (beta) to give every node some base centrality.
Formula: C(v) = α * Σ C(u) + β for all neighbors u
Time Complexity: O(max_iterations * (V + E))
Parameters
- alpha: Attenuation factor (must be less than 1 / largest eigenvalue; typically 0.1-0.3)
- beta: Base centrality (typically 1.0)
- max_iterations: Maximum number of iterations
- tolerance: Convergence threshold
Example
centrality.katz(graph, alpha: 0.1, beta: 1.0, max_iterations: 100, tolerance: 0.0001)
// => dict.from_list([#(1, 2.5), #(2, 3.0), #(3, 2.5)])
pub fn pagerank(
graph: model.Graph(n, e),
options: PageRankOptions,
) -> dict.Dict(Int, Float)
Calculates PageRank centrality for all nodes.
PageRank measures node importance based on the quality and quantity of incoming links. A node is important if it is linked to by other important nodes. Originally developed for ranking web pages, it’s useful for:
- Ranking nodes in directed networks
- Identifying influential nodes in citation networks
- Finding important entities in knowledge graphs
- Recommendation systems
The algorithm uses a “random surfer” model: with probability damping,
the surfer follows a random outgoing link; otherwise, they jump to any
random node. This models both link-following behavior and the possibility
of starting a new browsing session.
Time Complexity: O(max_iterations × (V + E))
When to Use PageRank
- Directed graphs where link direction matters
- When you care about link quality (links from important nodes count more)
- Citation networks, web graphs, recommendation systems
For undirected graphs, consider eigenvector/3 instead.
Example
// Use default options (recommended for most cases)
let options = centrality.default_pagerank_options()
let scores = centrality.pagerank(graph, options)
// => dict.from_list([#(1, 0.256), #(2, 0.488), #(3, 0.256)])
// Custom options for faster convergence or different damping
let custom = centrality.PageRankOptions(
damping: 0.9, // Follow more links before jumping
max_iterations: 50, // Faster but less precise
tolerance: 0.001, // Less strict convergence
)
let scores = centrality.pagerank(graph, custom)