Introduction to Complex Networks
V.A. Traag
KITLV, Leiden, the Netherlands
e-Humanities, KNAW, Amsterdam, the Netherlands
March 30, 2014
Overview
1 What are networks?
2 Classics: scale free & small worlds.
3 Percolation: giant components, failure & attack and epidemics.
Probability generating functions.
Examples
• Neural networks
• Power grids
• Gas networks
• Internet router network
• World Wide Web
• Road networks
• Airline networks
• Call networks
• Social networks
• Social media networks
Basics
Network
• Graph or network G = (V, E).
• Nodes V = {1, . . . , n} (vertices):
power stations, webpages, intersections, people.
• Edges E ⊆ V × V (links, ties):
power cables, hyperlinks, roads, friendships.
Can be directed and possibly weighted.
Essentials
• Degree k_i is number of links at node i.
• If |E| = m is the number of edges, then ∑_i k_i = 2m.
• Average degree ⟨k⟩ = 2m/n.
• Density p = m / C(n, 2) = ⟨k⟩/(n − 1) ≈ ⟨k⟩/n.
• Most networks sparse: ⟨k⟩, p low.
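To make these quantities concrete, here is a minimal Python sketch (assuming the networkx library is available; the toy edge list is made up for illustration) that computes the degrees, the average degree ⟨k⟩ = 2m/n and the density:

```python
import networkx as nx

# Toy undirected network (hypothetical edge list, for illustration only).
edges = [(1, 2), (1, 3), (2, 3), (3, 4), (4, 5)]
G = nx.Graph(edges)

n = G.number_of_nodes()          # |V|
m = G.number_of_edges()          # |E|
degrees = dict(G.degree())       # k_i for every node i

avg_degree = 2 * m / n           # <k> = 2m / n, since sum_i k_i = 2m
density = m / (n * (n - 1) / 2)  # p = m / C(n, 2) = <k> / (n - 1)

print(degrees, avg_degree, density)
```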
Picture worth . . . words
Visualisations essentially wrong, but sometimes insightful.
Need to assess statistics to understand networks.
Analysed properties
Analysis strategy
• Focus on some key (statistical) ingredients.
• Only overall general properties, no particulars.
• Compare to random graph: what can we expect?
• Modelling ⇒ replicate key properties.
Some key properties
• Degree distribution
• Degree correlations
• Path lengths
• Clustering
• Modularity
• Dynamics: inter-event times
Small world
Milgram’s experiment (1960s)
• Ask people to reach a specific person:
John Doe, Journalist, Kansas
• Send letter to acquaintance, who forwards, and so on
• Result: about 5 intermediaries to reach destination.
• Six degrees of separation.
Key question: is this different from a random graph?
Erdős–Rényi (ER) graphs
Random graph
• Create empty graph G with n nodes.
• Every edge probability p of appearing.
• On average p · C(n, 2) = m edges.
• Average degree ⟨k⟩ = pn.
• Random graph essentially a (very simple) model.
• Was (and still is) used frequently.
Biology, epidemiology: well mixed population.
• Many interesting questions still.
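A minimal sketch of drawing an ER graph (assuming networkx; n and p are arbitrary illustrative values) and checking that the realised average degree is close to pn:

```python
import networkx as nx

n, p = 1000, 0.005                      # illustrative values
G = nx.gnp_random_graph(n, p, seed=42)  # G(n, p): each edge present with probability p

m = G.number_of_edges()
avg_degree = 2 * m / n
print(f"expected <k> ~ pn = {p * n:.2f}, observed <k> = {avg_degree:.2f}")
```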
Small world?
Path length
• Every node k_i ≈ ⟨k⟩, so in ℓ steps reach about ⟨k⟩^ℓ nodes.
• When ⟨k⟩^ℓ = n, the whole network is reached.
• Hence ℓ ≈ log n / log⟨k⟩: grows slowly!
Random edges create short paths.
Clustering
• Clustering C_i = e_i / C(k_i, 2).
• In ER graph C_i = p³n(n − 1) / (p²n(n − 1)) = p.
• Networks are sparse, low p, so low C_i.
Real world: both short paths & clustering. How to get that?
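A quick numerical check of both statistics on an ER graph (a sketch assuming networkx; parameters are illustrative): the average path length is of the order log n / log⟨k⟩ while the clustering coefficient stays near p:

```python
import math
import networkx as nx

n, k_avg = 1000, 6
p = k_avg / (n - 1)
G = nx.gnp_random_graph(n, p, seed=1)

# Work on the largest connected component, since path lengths
# are only defined within a component.
giant = G.subgraph(max(nx.connected_components(G), key=len))

path_len = nx.average_shortest_path_length(giant)
clustering = nx.average_clustering(G)

print(f"path length: {path_len:.2f}  (log n / log<k> = {math.log(n)/math.log(k_avg):.2f})")
print(f"clustering : {clustering:.4f} (p = {p:.4f})")
```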
Watts & Strogatz
Small world model
• Create lattice (connect to nearest neighbours).
• Rewire edge (or add) with probability p.
Watts & Strogatz
Watts & Strogatz
Small world model
• Create lattice (connect to nearest neighbours).
• Rewire edge (or add) with probability p.
Few shortcuts enough to create short paths
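A sketch of the Watts–Strogatz construction (assuming networkx; n, k and the rewiring probabilities are illustrative): already a small rewiring probability collapses the path length while clustering remains high:

```python
import networkx as nx

n, k = 1000, 6                     # ring lattice: each node linked to k nearest neighbours
for p_rewire in (0.0, 0.01, 0.1):  # illustrative rewiring probabilities
    G = nx.watts_strogatz_graph(n, k, p_rewire, seed=7)
    giant = G.subgraph(max(nx.connected_components(G), key=len))
    L = nx.average_shortest_path_length(giant)
    C = nx.average_clustering(G)
    print(f"p = {p_rewire:<5} path length = {L:6.2f}  clustering = {C:.3f}")
```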
Degree distribution
[Figures: degree distribution on linear and log–log axes; ER degrees concentrate around k_i ≈ ⟨k⟩, real networks show hubs in the tail.]
• In real networks, power law p(k) ∼ k^(−α), usually 2 < α < 3.
• In ER graphs, Poisson p(k) ∼ ⟨k⟩^k / k!.
Barab´asi & Albert
How to get power-law degree distribution?
Preferential attachment, cumulative advantage
Start with graph with q nodes
1 Add node
2 Add q links to previous nodes with probability pi ∼ ki
3 Repeat (1)-(2).
Results
• Analysis by master rate equation p(k) = k−1
2 p(k − 1) − k
2 p(k).
• Leads to p(k) = m(m+1)(m+2)
k(k+1)(k+2) ∼ k−3.
• Preferential attachment ⇒ scale free network.
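A minimal sketch of preferential attachment in plain Python (the function name and parameters are illustrative, not from the slides). Targets are drawn uniformly from a list of edge endpoints, which selects nodes proportionally to their degree:

```python
import random
from collections import Counter

def preferential_attachment(n_nodes, m, seed=0):
    """Grow a graph by adding nodes with m links each, targets chosen ~ k_i."""
    rng = random.Random(seed)
    # Start from a small complete core of m + 1 nodes.
    edges = [(i, j) for i in range(m + 1) for j in range(i)]
    # 'stubs' holds every edge endpoint; drawing uniformly from it
    # selects a node with probability proportional to its degree.
    stubs = [v for e in edges for v in e]
    for new in range(m + 1, n_nodes):
        targets = set()
        while len(targets) < m:
            targets.add(rng.choice(stubs))
        for t in targets:
            edges.append((new, t))
            stubs.extend((new, t))
    return edges

edges = preferential_attachment(20000, 3)
deg = Counter(v for e in edges for v in e)
dist = Counter(deg.values())
for k in sorted(dist)[:10]:
    print(k, dist[k])   # heavy tail: p(k) ~ k^-3 for large k
```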
Scale free
Scale free: so what?
Why does it matter?
• Scale free networks robust against random node failure.
• Vulnerable to targeted attacks (take out the hubs).
• No threshold for epidemic spreading.
Approach: percolation & generating functions.
Generating functions
Definition (Generating function)
Let Pr(S = k) = p_k. Then
g(x) = E(x^S) = ∑_k p_k x^k
is the probability generating function (pgf).
Properties
• Normalized: g(1) = E(1^S) = ∑_k p_k = 1.
• Mean: ⟨k⟩ = ∑_k k p_k = ∑_k k p_k 1^(k−1) = g′(1).
• Moments: ⟨k^m⟩ = [(x ∂/∂x)^m g(x)] evaluated at x = 1.
• Sum of m independent copies S = ∑_i S_i: pgf f(x) = E(x^S) = E(x^(∑_i S_i)) = ∏_i E(x^(S_i)) = g(x)^m.
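A small numerical sanity check of these properties in plain Python (the Poisson distribution with mean 5 and the truncation are illustrative choices): the pgf is 1 at x = 1, its derivative at 1 gives the mean, and the moment formula recovers ⟨k²⟩:

```python
import math

mean = 5.0
kmax = 60  # truncate the Poisson support; the tail beyond k = 60 is negligible here
p = [math.exp(-mean) * mean**k / math.factorial(k) for k in range(kmax)]

def g(x):                              # pgf g(x) = sum_k p_k x^k
    return sum(pk * x**k for k, pk in enumerate(p))

def g_prime(x, h=1e-6):                # numerical derivative g'(x)
    return (g(x + h) - g(x - h)) / (2 * h)

print(g(1.0))        # ~ 1 : normalization, sum_k p_k = 1
print(g_prime(1.0))  # ~ 5 : mean, <k> = g'(1)
# <k^2> = [(x d/dx)^2 g](1) = g''(1) + g'(1); for this Poisson: mean^2 + mean = 30
g_second = (g(1 + 1e-4) - 2 * g(1) + g(1 - 1e-4)) / 1e-8
print(g_second + g_prime(1.0))
```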
Degree generating function
Example: ER degree distribution
• Let p_k be probability node has degree k.
• Take p_k = C(n, k) p^k (1 − p)^(n−k) (Erdős–Rényi).
• Then pgf g(x) = ∑_k C(n, k) p^k (1 − p)^(n−k) x^k
= ∑_k C(n, k) (xp)^k (1 − p)^(n−k) (binomial theorem)
= (px + (1 − p))^n
= (1 + p(x − 1))^n (remember ⟨k⟩ = pn)
= (1 + ⟨k⟩(x − 1)/n)^n → e^(⟨k⟩(x−1)) (lim n → ∞, def. exp).
• Normalized: g(1) = e^(⟨k⟩(1−1)) = 1.
• Mean: g′(x) = ⟨k⟩ e^(⟨k⟩(x−1)), so g′(1) = ⟨k⟩.
• Number of neighbours of m nodes: g(x)^m = e^(m⟨k⟩(x−1)), with mean (g(x)^m)′ at x = 1 equal to m⟨k⟩.
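A quick check of the limit step above in plain Python (the values of ⟨k⟩, x and n are illustrative): the exact binomial pgf (1 + p(x − 1))^n with p = ⟨k⟩/n approaches e^(⟨k⟩(x−1)) as n grows:

```python
import math

k_avg, x = 4.0, 0.5
limit = math.exp(k_avg * (x - 1))          # e^{<k>(x-1)}
for n in (10, 100, 1000, 10000):
    p = k_avg / n
    exact = (1 + p * (x - 1)) ** n         # binomial pgf (px + (1-p))^n
    print(n, exact, limit)
```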
Excess degree distribution
Earlier: random node
Now: follow random edge to node
• Probability that node has k links is k p_k / ∑_k k p_k = k p_k / ⟨k⟩.
• Average neighbour degree ⟨k²⟩/⟨k⟩ > ⟨k⟩, i.e. (⟨k²⟩ − ⟨k⟩²)/⟨k⟩ > 0 (friendship paradox).
• Probability that node has k other links is q_k = (k + 1) p_(k+1) / ⟨k⟩.
• pgf g1(x) = ∑_k q_k x^k = (1/⟨k⟩) ∑_k (k + 1) p_(k+1) x^k = (1/⟨k⟩) ∑_k k p_k x^(k−1) = g′(x)/⟨k⟩.
• Second neighbours of a node of degree m: g1(x)^m = ∑_k p₂(k|m) x^k.
• Distribution of second neighbours: ∑_m p_m ∑_k p₂(k|m) x^k = ∑_m p_m g1(x)^m = g(g1(x)).
• Distribution of third neighbours: g(g1(g1(x))); of d-th neighbours: g(g1(· · · g1(x) · · · )).
• Average number of second neighbours: ⟨k²⟩ − ⟨k⟩.
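A numerical illustration of the friendship paradox and of ⟨k²⟩ − ⟨k⟩ (a sketch assuming networkx; the Barabási–Albert test graph and sample sizes are arbitrary choices): the degree found at the end of a random edge averages ⟨k²⟩/⟨k⟩, above ⟨k⟩:

```python
import random
import networkx as nx

G = nx.barabasi_albert_graph(10000, 3, seed=0)   # illustrative test network
degrees = [d for _, d in G.degree()]
k1 = sum(degrees) / len(degrees)                 # <k>
k2 = sum(d * d for d in degrees) / len(degrees)  # <k^2>

# Follow random edges: pick a random edge, then one of its endpoints, record its degree.
rng = random.Random(1)
edges = list(G.edges())
sampled = [G.degree(rng.choice(rng.choice(edges))) for _ in range(50000)]

print(f"<k>                = {k1:.2f}")
print(f"<k^2>/<k>          = {k2 / k1:.2f}  (average neighbour degree)")
print(f"edge-end degree    = {sum(sampled) / len(sampled):.2f}")
print(f"<k^2> - <k>        = {k2 - k1:.2f}  (avg. number of second neighbours)")
```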
Giant component
Giant component (GC)
• Always only one GC (and lots of small ones).
• Probability a link does not connect a node to the GC: u.
• Probability a node of degree k is not in the GC: u^k.
• Probability a node is not in the giant component: ∑_k p_k u^k = g(u).
• Size of giant component: S = 1 − g(u).
But what is u?
Self-consistency
• Probability a link does not connect to the GC is u.
• It connects to a node with k other neighbours: excess degree.
• Average probability: ∑_k q_k u^k = g1(u) = u.
Giant component
How to solve g1(u) = u?
[Figure: fixed point of u = g1(u) on the unit interval.]
Giant component
If the derivative g1′(1) > 1, a giant component appears.
[Figure: slope of g1 at u = 1 determines whether a non-trivial fixed point u < 1 exists.]
Giant component
• GC appears when 1 < g1′(1) = ∑_k k q_k = (1/⟨k⟩) ∑_k k(k + 1) p_(k+1) = (1/⟨k⟩) ∑_k (k − 1)k p_k = (1/⟨k⟩) ∑_k (k² p_k − k p_k) = (⟨k²⟩ − ⟨k⟩)/⟨k⟩.
• For ER graphs ⟨k²⟩ − ⟨k⟩ = ⟨k⟩², so for ⟨k⟩ > 1 the GC appears.
[Figure: GC size S versus ⟨k⟩ for ER graphs; S becomes non-zero at ⟨k⟩ = 1.]
• For scale free graphs ⟨k²⟩ → ∞, so always a GC (if 2 < α < 3).
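For ER graphs g(x) = g1(x) = e^(⟨k⟩(x−1)), so u can be found by simple fixed-point iteration. A minimal sketch in plain Python (the range of ⟨k⟩ values is illustrative) reproduces the appearance of the GC at ⟨k⟩ = 1:

```python
import math

def giant_component_size(k_avg, iters=1000):
    """Solve u = g1(u) = exp(<k>(u-1)) by fixed-point iteration; S = 1 - g(u)."""
    u = 0.0                        # start below 1 to find the non-trivial fixed point
    for _ in range(iters):
        u = math.exp(k_avg * (u - 1.0))
    return 1.0 - math.exp(k_avg * (u - 1.0))   # for ER, g(u) = g1(u)

for k_avg in (0.5, 1.0, 1.5, 2.0, 3.0, 5.0):
    print(f"<k> = {k_avg:>3}: S = {giant_component_size(k_avg):.3f}")
```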
Node failure
How fast is the giant component destroyed if nodes are removed?
Same approach
• Probability φ that a node does not “fail”, i.e. is not removed from the network.
• Again u: probability a link does not connect to the GC.
Self-consistency
• I: Neighbour is removed (1 − φ).
• II: Neighbour is not removed (φ), but not in the GC (u^k).
• So, the probability is 1 − φ + φu^k.
• On average ∑_k q_k (1 − φ + φu^k) = 1 − φ + φ g1(u) = u.
• Solving for u gives the solution.
Node failure
• Again, solving u = 1 − φ + φ g1(u) is not easy.
• But if ∂/∂u [1 − φ + φ g1(u)] > 1 at u = 1, i.e. φ > 1/g1′(1) = φ_c, a GC exists.
• For ER φ_c = 1/⟨k⟩, for scale free φ_c = 0.
[Figure: GC size S versus occupation probability φ for ER and scale-free networks.]
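The same fixed-point iteration handles random node removal for ER graphs. A sketch in plain Python (⟨k⟩ = 4 is an illustrative value; the expression S = φ(1 − g(u)) for the surviving GC fraction is the standard percolation result, not spelled out on the slide):

```python
import math

def gc_size_after_failure(k_avg, phi, iters=2000):
    """ER graph: g(x) = g1(x) = exp(<k>(x-1)).
    Solve u = 1 - phi + phi*g1(u); GC fraction S = phi*(1 - g(u))."""
    g = lambda x: math.exp(k_avg * (x - 1.0))
    u = 0.0
    for _ in range(iters):
        u = 1.0 - phi + phi * g(u)
    return phi * (1.0 - g(u))

k_avg = 4.0
print(f"phi_c = 1/<k> = {1.0 / k_avg}")
for phi in (0.1, 0.25, 0.3, 0.5, 0.75, 1.0):
    print(f"phi = {phi:4}: S = {gc_size_after_failure(k_avg, phi):.3f}")
```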
Node attack
What if we attack specific nodes?
Same approach
• Probability φ_k that a node of degree k does not “fail”.
• On average φ = ∑_k φ_k p_k.
• Again u: probability a link does not connect to the GC.
Self-consistency
• I: Neighbour is removed (1 − φ_k).
• II: Neighbour is not removed (φ_k), but not in the GC (u^(k−1)).
• So on average u = ∑_k q_(k−1) (1 − φ_k + φ_k u^(k−1)).
• Define f(u) = ∑_k φ_k q_(k−1) u^(k−1).
• Then u = 1 − f(1) + f(u); solving for u gives the solution.
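A small simulation contrasting random failure with a targeted attack on hubs (a sketch assuming networkx; the Barabási–Albert test graph and removal fractions are illustrative), measuring the fraction of remaining nodes in the largest component:

```python
import random
import networkx as nx

def giant_fraction(G):
    """Fraction of the (remaining) nodes that sit in the largest connected component."""
    if G.number_of_nodes() == 0:
        return 0.0
    return len(max(nx.connected_components(G), key=len)) / G.number_of_nodes()

def remove_nodes(G, fraction, targeted, seed=0):
    H = G.copy()
    n_remove = int(fraction * H.number_of_nodes())
    if targeted:   # attack: remove the highest-degree nodes first
        victims = sorted(H.nodes(), key=lambda v: H.degree(v), reverse=True)[:n_remove]
    else:          # random failure
        victims = random.Random(seed).sample(list(H.nodes()), n_remove)
    H.remove_nodes_from(victims)
    return H

G = nx.barabasi_albert_graph(5000, 3, seed=0)   # illustrative scale-free test graph
for frac in (0.0, 0.05, 0.1, 0.2, 0.4):
    rand = giant_fraction(remove_nodes(G, frac, targeted=False))
    attk = giant_fraction(remove_nodes(G, frac, targeted=True))
    print(f"removed {frac:4.0%}: GC after failure = {rand:.2f}, after attack = {attk:.2f}")
```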
Failure and Attack
Epidemics
Disease spreading
• Standard models: Susceptible, Infected, Recovered.
• SIR: transmission rate β, infectious time τ = 1/ν (recovery rate ν).
• Infect neighbour with probability φ = 1 − e^(−βτ).
• How far will it spread: giant component.
Percolation
• I: Disease not transmitted (1 − φ).
• II: Disease transmitted (φ), but not to the GC (u^k).
• Already solved: critical φ_c = 1/g1′(1) = ⟨k⟩/(⟨k²⟩ − ⟨k⟩).
• Epidemiological threshold βτ = log[(⟨k²⟩ − ⟨k⟩)/(⟨k²⟩ − 2⟨k⟩)].
Epidemics
Epidemic threshold
• For ER, threshold βτ = log[⟨k⟩/(⟨k⟩ − 1)].
• For scale free, ⟨k²⟩ diverges: always an epidemic outbreak.
[Figure: outbreak size S versus φ for ER and scale-free networks.]
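A sketch computing the epidemic threshold βτ = log[(⟨k²⟩ − ⟨k⟩)/(⟨k²⟩ − 2⟨k⟩)] directly from the degree moments (plain Python; the Poisson and truncated power-law distributions are illustrative choices):

```python
import math

def epidemic_threshold(pk):
    """pk: dict {degree: probability}. Returns critical beta*tau, or 0 if none."""
    k1 = sum(k * p for k, p in pk.items())       # <k>
    k2 = sum(k * k * p for k, p in pk.items())   # <k^2>
    denom = k2 - 2 * k1
    if denom <= 0:                               # <k^2> too large: no finite threshold
        return 0.0
    return math.log((k2 - k1) / denom)

# Poisson (ER-like) with <k> = 4: threshold should equal log(<k>/(<k>-1)).
mean = 4.0
poisson = {k: math.exp(-mean) * mean**k / math.factorial(k) for k in range(60)}
print(epidemic_threshold(poisson), math.log(mean / (mean - 1)))

# Truncated power law p(k) ~ k^-2.5 for k = 1..1000: threshold is already tiny,
# and it goes to zero as the cutoff grows (diverging <k^2>).
raw = {k: k**-2.5 for k in range(1, 1001)}
Z = sum(raw.values())
powerlaw = {k: v / Z for k, v in raw.items()}
print(epidemic_threshold(powerlaw))
```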
Conclusions
Models
• Short paths & clustering: small world model.
• Scale free: preferential attachment.
• Many other mechanisms: e.g. triadic closure, homophily, etc.
• Focus on stylized features.
Analysis
• Scale-free networks robust, spread fast, but vulnerable to attack.
• Generating functions greatly help analysis.
• Compare observed network to random/model. How does it deviate?
Questions?
Public thesis defence: groups and reputation in social networksPublic thesis defence: groups and reputation in social networks
Public thesis defence: groups and reputation in social networks
Vincent Traag
 
Structure of media attention
Structure of media attentionStructure of media attention
Structure of media attention
Vincent Traag
 
Dynamics of Media Attention
Dynamics of Media AttentionDynamics of Media Attention
Dynamics of Media Attention
Vincent Traag
 
Dynamical Models Explaining Social Balance
Dynamical Models Explaining Social BalanceDynamical Models Explaining Social Balance
Dynamical Models Explaining Social Balance
Vincent Traag
 
Significant scales in community structure
Significant scales in community structureSignificant scales in community structure
Significant scales in community structure
Vincent Traag
 
Reconstructing Third World Elite Rotation Events from Newspapers
Reconstructing Third World Elite Rotation Events from NewspapersReconstructing Third World Elite Rotation Events from Newspapers
Reconstructing Third World Elite Rotation Events from Newspapers
Vincent Traag
 
Reputation Dynamics Through Gossiping
Reputation Dynamics Through GossipingReputation Dynamics Through Gossiping
Reputation Dynamics Through Gossiping
Vincent Traag
 
Limits of community detection
Limits of community detectionLimits of community detection
Limits of community detection
Vincent Traag
 
Cooperation, Reputation & Gossiping
Cooperation, Reputation & GossipingCooperation, Reputation & Gossiping
Cooperation, Reputation & Gossiping
Vincent Traag
 
Resolution-free community detection
Resolution-free community detectionResolution-free community detection
Resolution-free community detection
Vincent Traag
 
Cooperation, Reputation & Gossiping
Cooperation, Reputation & GossipingCooperation, Reputation & Gossiping
Cooperation, Reputation & Gossiping
Vincent Traag
 
Exponential Ranking: Taking into account negative links.
Exponential Ranking: Taking into account negative links.Exponential Ranking: Taking into account negative links.
Exponential Ranking: Taking into account negative links.
Vincent Traag
 
Social Event Detection
Social Event DetectionSocial Event Detection
Social Event Detection
Vincent Traag
 
Cooperation, Reputation & Gossiping
Cooperation, Reputation & GossipingCooperation, Reputation & Gossiping
Cooperation, Reputation & Gossiping
Vincent Traag
 

Recently uploaded (20)

Pharmacologically active constituents.pdf
Pharmacologically active constituents.pdfPharmacologically active constituents.pdf
Pharmacologically active constituents.pdf
Nistarini College, Purulia (W.B) India
 
Animal Models for Biological and Clinical Research ppt 2.pptx
Animal Models for Biological and Clinical Research ppt 2.pptxAnimal Models for Biological and Clinical Research ppt 2.pptx
Animal Models for Biological and Clinical Research ppt 2.pptx
MahitaLaveti
 
Reticular formation_groups_organization_
Reticular formation_groups_organization_Reticular formation_groups_organization_
Reticular formation_groups_organization_
klynct
 
CORONARY ARTERY BYPASS GRAFTING (1).pptx
CORONARY ARTERY BYPASS GRAFTING (1).pptxCORONARY ARTERY BYPASS GRAFTING (1).pptx
CORONARY ARTERY BYPASS GRAFTING (1).pptx
DharaniJajula
 
Transgenic Mice in Cancer Research - Creative Biolabs
Transgenic Mice in Cancer Research - Creative BiolabsTransgenic Mice in Cancer Research - Creative Biolabs
Transgenic Mice in Cancer Research - Creative Biolabs
Creative-Biolabs
 
Introduction to Black Hole and how its formed
Introduction to Black Hole and how its formedIntroduction to Black Hole and how its formed
Introduction to Black Hole and how its formed
MSafiullahALawi
 
Top 10 Biotech Startups for Beginners.pptx
Top 10 Biotech Startups for Beginners.pptxTop 10 Biotech Startups for Beginners.pptx
Top 10 Biotech Startups for Beginners.pptx
alexbagheriam
 
Siver Nanoparticles syntheisis, mechanism, Antibacterial activity.pptx
Siver Nanoparticles syntheisis, mechanism, Antibacterial activity.pptxSiver Nanoparticles syntheisis, mechanism, Antibacterial activity.pptx
Siver Nanoparticles syntheisis, mechanism, Antibacterial activity.pptx
PriyaAntil3
 
Mycology:Characteristics of Ascomycetes Fungi
Mycology:Characteristics of Ascomycetes FungiMycology:Characteristics of Ascomycetes Fungi
Mycology:Characteristics of Ascomycetes Fungi
SAYANTANMALLICK5
 
Antimalarial drug Medicinal Chemistry III
Antimalarial drug Medicinal Chemistry IIIAntimalarial drug Medicinal Chemistry III
Antimalarial drug Medicinal Chemistry III
HRUTUJA WAGH
 
SULPHONAMIDES AND SULFONES Medicinal Chemistry III.ppt
SULPHONAMIDES AND SULFONES Medicinal Chemistry III.pptSULPHONAMIDES AND SULFONES Medicinal Chemistry III.ppt
SULPHONAMIDES AND SULFONES Medicinal Chemistry III.ppt
HRUTUJA WAGH
 
Seismic evidence of liquid water at the base of Mars' upper crust
Seismic evidence of liquid water at the base of Mars' upper crustSeismic evidence of liquid water at the base of Mars' upper crust
Seismic evidence of liquid water at the base of Mars' upper crust
Sérgio Sacani
 
Study in Pink (forensic case study of Death)
Study in Pink (forensic case study of Death)Study in Pink (forensic case study of Death)
Study in Pink (forensic case study of Death)
memesologiesxd
 
Batteries and fuel cells for btech first year
Batteries and fuel cells for btech first yearBatteries and fuel cells for btech first year
Batteries and fuel cells for btech first year
MithilPillai1
 
Secondary metabolite ,Plants and Health Care
Secondary metabolite ,Plants and Health CareSecondary metabolite ,Plants and Health Care
Secondary metabolite ,Plants and Health Care
Nistarini College, Purulia (W.B) India
 
Components of the Human Circulatory System.pptx
Components of the Human  Circulatory System.pptxComponents of the Human  Circulatory System.pptx
Components of the Human Circulatory System.pptx
autumnstreaks
 
Fatigue and its management in aviation medicine
Fatigue and its management in aviation medicineFatigue and its management in aviation medicine
Fatigue and its management in aviation medicine
ImranJewel2
 
Cleaned_Expanded_Metal_Nanoparticles_Presentation.pptx
Cleaned_Expanded_Metal_Nanoparticles_Presentation.pptxCleaned_Expanded_Metal_Nanoparticles_Presentation.pptx
Cleaned_Expanded_Metal_Nanoparticles_Presentation.pptx
zainab98aug
 
A CASE OF MULTINODULAR GOITRE,clinical presentation and management.pptx
A CASE OF MULTINODULAR GOITRE,clinical presentation and management.pptxA CASE OF MULTINODULAR GOITRE,clinical presentation and management.pptx
A CASE OF MULTINODULAR GOITRE,clinical presentation and management.pptx
ANJALICHANDRASEKARAN
 
An upper limit to the lifetime of stellar remnants from gravitational pair pr...
An upper limit to the lifetime of stellar remnants from gravitational pair pr...An upper limit to the lifetime of stellar remnants from gravitational pair pr...
An upper limit to the lifetime of stellar remnants from gravitational pair pr...
Sérgio Sacani
 
Animal Models for Biological and Clinical Research ppt 2.pptx
Animal Models for Biological and Clinical Research ppt 2.pptxAnimal Models for Biological and Clinical Research ppt 2.pptx
Animal Models for Biological and Clinical Research ppt 2.pptx
MahitaLaveti
 
Reticular formation_groups_organization_
Reticular formation_groups_organization_Reticular formation_groups_organization_
Reticular formation_groups_organization_
klynct
 
CORONARY ARTERY BYPASS GRAFTING (1).pptx
CORONARY ARTERY BYPASS GRAFTING (1).pptxCORONARY ARTERY BYPASS GRAFTING (1).pptx
CORONARY ARTERY BYPASS GRAFTING (1).pptx
DharaniJajula
 
Transgenic Mice in Cancer Research - Creative Biolabs
Transgenic Mice in Cancer Research - Creative BiolabsTransgenic Mice in Cancer Research - Creative Biolabs
Transgenic Mice in Cancer Research - Creative Biolabs
Creative-Biolabs
 
Introduction to Black Hole and how its formed
Introduction to Black Hole and how its formedIntroduction to Black Hole and how its formed
Introduction to Black Hole and how its formed
MSafiullahALawi
 
Top 10 Biotech Startups for Beginners.pptx
Top 10 Biotech Startups for Beginners.pptxTop 10 Biotech Startups for Beginners.pptx
Top 10 Biotech Startups for Beginners.pptx
alexbagheriam
 
Siver Nanoparticles syntheisis, mechanism, Antibacterial activity.pptx
Siver Nanoparticles syntheisis, mechanism, Antibacterial activity.pptxSiver Nanoparticles syntheisis, mechanism, Antibacterial activity.pptx
Siver Nanoparticles syntheisis, mechanism, Antibacterial activity.pptx
PriyaAntil3
 
Mycology:Characteristics of Ascomycetes Fungi
Mycology:Characteristics of Ascomycetes FungiMycology:Characteristics of Ascomycetes Fungi
Mycology:Characteristics of Ascomycetes Fungi
SAYANTANMALLICK5
 
Antimalarial drug Medicinal Chemistry III
Antimalarial drug Medicinal Chemistry IIIAntimalarial drug Medicinal Chemistry III
Antimalarial drug Medicinal Chemistry III
HRUTUJA WAGH
 
SULPHONAMIDES AND SULFONES Medicinal Chemistry III.ppt
SULPHONAMIDES AND SULFONES Medicinal Chemistry III.pptSULPHONAMIDES AND SULFONES Medicinal Chemistry III.ppt
SULPHONAMIDES AND SULFONES Medicinal Chemistry III.ppt
HRUTUJA WAGH
 
Seismic evidence of liquid water at the base of Mars' upper crust
Seismic evidence of liquid water at the base of Mars' upper crustSeismic evidence of liquid water at the base of Mars' upper crust
Seismic evidence of liquid water at the base of Mars' upper crust
Sérgio Sacani
 
Study in Pink (forensic case study of Death)
Study in Pink (forensic case study of Death)Study in Pink (forensic case study of Death)
Study in Pink (forensic case study of Death)
memesologiesxd
 
Batteries and fuel cells for btech first year
Batteries and fuel cells for btech first yearBatteries and fuel cells for btech first year
Batteries and fuel cells for btech first year
MithilPillai1
 
Components of the Human Circulatory System.pptx
Components of the Human  Circulatory System.pptxComponents of the Human  Circulatory System.pptx
Components of the Human Circulatory System.pptx
autumnstreaks
 
Fatigue and its management in aviation medicine
Fatigue and its management in aviation medicineFatigue and its management in aviation medicine
Fatigue and its management in aviation medicine
ImranJewel2
 
Cleaned_Expanded_Metal_Nanoparticles_Presentation.pptx
Cleaned_Expanded_Metal_Nanoparticles_Presentation.pptxCleaned_Expanded_Metal_Nanoparticles_Presentation.pptx
Cleaned_Expanded_Metal_Nanoparticles_Presentation.pptx
zainab98aug
 
A CASE OF MULTINODULAR GOITRE,clinical presentation and management.pptx
A CASE OF MULTINODULAR GOITRE,clinical presentation and management.pptxA CASE OF MULTINODULAR GOITRE,clinical presentation and management.pptx
A CASE OF MULTINODULAR GOITRE,clinical presentation and management.pptx
ANJALICHANDRASEKARAN
 
An upper limit to the lifetime of stellar remnants from gravitational pair pr...
An upper limit to the lifetime of stellar remnants from gravitational pair pr...An upper limit to the lifetime of stellar remnants from gravitational pair pr...
An upper limit to the lifetime of stellar remnants from gravitational pair pr...
Sérgio Sacani
 

Introduction to complex networks

  • 16-18. Watts & Strogatz. Small world model: • Create a lattice (connect each node to its nearest neighbours). • Rewire an edge (or add one) with probability p. A few shortcuts are enough to create short paths; see the sketch below.
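A minimal sketch of the model, assuming the networkx library and illustrative parameters of my own choosing (n = 1000 nodes, k = 10 neighbours): build the ring lattice, rewire with increasing p, and watch clustering stay high while the average path length collapses.

```python
import networkx as nx

# Watts-Strogatz: ring lattice of n nodes, each linked to its k nearest
# neighbours; every edge is rewired with probability p.
n, k = 1000, 10
for p in [0.0, 0.01, 0.1]:
    G = nx.connected_watts_strogatz_graph(n, k, p, seed=42)
    C = nx.average_clustering(G)
    L = nx.average_shortest_path_length(G)
    print(f"p={p:<4}  clustering={C:.3f}  average path length={L:.2f}")
```

Even p = 0.01 (a handful of rewired edges) already brings the path length close to the random-graph value while the clustering barely changes.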
  • 19-20. Degree distribution. [figure: degree distribution, probability versus degree, on linear and log-log scales; the hubs sit in the heavy tail] • In real networks, roughly a power law: p(k) ∼ k^{−α}, usually 2 < α < 3. • In ER graphs, Poisson: p(k) = e^{−⟨k⟩} ⟨k⟩^k / k!.
  • 21. Barabási & Albert. How to get a power-law degree distribution? Preferential attachment (cumulative advantage): start with a graph of q nodes, then 1 add a node, 2 add q links to previous nodes with probability p_i ∼ k_i, 3 repeat (1)-(2). Results: • Analysis by master rate equation p(k) = ((k − 1)/2) p(k − 1) − (k/2) p(k). • Leads to p(k) = m(m + 1)(m + 2) / (k(k + 1)(k + 2)) ∼ k^{−3}. • Preferential attachment ⇒ scale-free network. A simulation sketch follows below.
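A small growth simulation in plain Python (sizes and the name m for the number of links per new node are my own illustrative choices; the slides call it q) makes the heavy tail visible without any rate-equation analysis.

```python
import collections
import random

def preferential_attachment(n, m, seed=1):
    """Grow a graph: each new node attaches m links to existing nodes,
    chosen with probability proportional to their current degree."""
    rng = random.Random(seed)
    # start from a small clique so every node has degree >= m
    edges = [(i, j) for i in range(m + 1) for j in range(i)]
    # a node appears in `targets` once per link it has, so a uniform pick
    # from `targets` is a degree-proportional pick
    targets = [v for e in edges for v in e]
    for new in range(m + 1, n):
        chosen = set()
        while len(chosen) < m:
            chosen.add(rng.choice(targets))
        for t in chosen:
            edges.append((new, t))
            targets += [new, t]
    return edges

edges = preferential_attachment(10_000, 3)
deg = collections.Counter(v for e in edges for v in e)
print("max degree:", max(deg.values()))          # a few large hubs
counts = collections.Counter(deg.values())
for k in sorted(counts)[:8]:
    print(k, counts[k])                          # counts fall off roughly like k^-3
```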
  • 22. Scale free: so what? Why does it matter? • Scale-free networks are robust against random node failure. • But vulnerable to targeted attacks (take out the hubs). • No threshold for epidemic spreading. Approach: percolation & generating functions.
  • 23-33. Generating functions. Definition (generating function): let Pr(S = k) = p_k; then g(x) = E(x^S) = Σ_k p_k x^k is the probability generating function (pgf). Properties: • Normalised: g(1) = E(1^S) = Σ_k p_k = 1. • Mean: ⟨k⟩ = Σ_k k p_k = Σ_k k p_k · 1^{k−1} = g′(1); more generally, the moments are ⟨k^m⟩ = [(x ∂/∂x)^m g(x)]_{x=1}. • Sum of m independent copies, S = Σ_i S_i: pgf f(x) = E(x^S) = E(x^{Σ_i S_i}) = Π_i E(x^{S_i}) = g(x)^m. A numeric check follows below.
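A quick sanity check of these properties (an illustrative example, not from the slides), using numpy and a toy degree sample: build the empirical pgf and verify normalisation, the mean, and the pgf of a sum.

```python
import numpy as np

# Toy degree sample (Poisson with mean 2.5) and its empirical pgf.
rng = np.random.default_rng(0)
degrees = rng.poisson(2.5, size=100_000)
p = np.bincount(degrees) / degrees.size        # empirical p_k, k = 0..k_max
ks = np.arange(p.size)

def g(x):
    """Empirical pgf g(x) = sum_k p_k x^k."""
    return np.polyval(p[::-1], x)

print(g(1.0))                                  # normalisation: ~1
print(np.sum(ks * p), degrees.mean())          # <k> = g'(1), computed two ways
S3 = rng.poisson(2.5, size=(3, 100_000)).sum(axis=0)
print(np.mean(0.7 ** S3), g(0.7) ** 3)         # sum of 3 iid copies has pgf g(x)^3
```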
  • 34-46. Degree generating function. Example: the ER degree distribution. • Let p_k be the probability that a node has degree k; for Erdős-Rényi, p_k = C(n, k) p^k (1 − p)^{n−k}. • Then the pgf is g(x) = Σ_k C(n, k) p^k (1 − p)^{n−k} x^k = Σ_k C(n, k) (xp)^k (1 − p)^{n−k} = (px + (1 − p))^n (binomial theorem) = (1 + p(x − 1))^n = (1 + ⟨k⟩(x − 1)/n)^n (remember ⟨k⟩ = pn) → e^{⟨k⟩(x−1)} for n → ∞ (definition of exp). • Normalised: g(1) = e^{⟨k⟩(1−1)} = 1. • Mean: g′(1) = ⟨k⟩. • Number of neighbours of m nodes: pgf g(x)^m = e^{m⟨k⟩(x−1)}, so (g(x)^m)′|_{x=1} = m⟨k⟩. The limit is checked numerically below.
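The limit can be checked numerically (the values ⟨k⟩ = 3 and x = 0.6 are my own illustrative choices): the binomial pgf converges to e^{⟨k⟩(x−1)} as n grows, and the limiting pgf has mean ⟨k⟩.

```python
import numpy as np

# ER degree pgf g(x) = (1 - p + p*x)^n versus its large-n limit exp(<k>(x-1)).
k_mean, x = 3.0, 0.6
for n in [10, 100, 1000, 100_000]:
    p = k_mean / n
    print(n, (1 - p + p * x) ** n, np.exp(k_mean * (x - 1)))

# mean of the limiting pgf: g'(1) = <k>, checked with a central difference
g = lambda x: np.exp(k_mean * (x - 1))
h = 1e-6
print((g(1 + h) - g(1 - h)) / (2 * h))         # ~3.0 = <k>
```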
  • 47-69. Excess degree distribution. Earlier: pick a node at random; now: follow a random edge to a node. • Probability that this node has degree k: k p_k / Σ_k k p_k = k p_k / ⟨k⟩. • Average neighbour degree: Σ_k k · k p_k / ⟨k⟩ = ⟨k²⟩/⟨k⟩ > ⟨k⟩, i.e. (⟨k²⟩ − ⟨k⟩²)/⟨k⟩ > 0 (friendship paradox). • Probability that this node has k other links (excess degree): q_k = (k + 1) p_{k+1} / ⟨k⟩. • pgf: g1(x) = Σ_k q_k x^k = (1/⟨k⟩) Σ_k (k + 1) p_{k+1} x^k = (1/⟨k⟩) Σ_k k p_k x^{k−1} = g′(x)/⟨k⟩. • Second neighbours of a node of degree m: pgf g1(x)^m = Σ_k p2(k|m) x^k. • Distribution of second neighbours: Σ_m p_m Σ_k p2(k|m) x^k = Σ_m p_m g1(x)^m = g(g1(x)); third neighbours: g(g1(g1(x))); at distance d: g(g1(· · · g1(x) · · · )). • Average number of second neighbours: ⟨k²⟩ − ⟨k⟩. See the numeric check below.
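These identities are easy to check on a simulated graph; a sketch assuming networkx and numpy, with illustrative sizes of my own (n = 20 000, ⟨k⟩ ≈ 3):

```python
import numpy as np
import networkx as nx

G = nx.gnp_random_graph(20_000, 3 / 20_000, seed=1)     # ER graph, <k> ~ 3
deg = np.array([d for _, d in G.degree()])
k1, k2 = deg.mean(), np.mean(deg ** 2.0)

# degree of the node at the end of a random edge ~ k p_k / <k>
edge_ends = np.array([G.degree(v) for e in G.edges() for v in e])
print(edge_ends.mean(), k2 / k1, ">", k1)                # friendship paradox

# average number of second neighbours ~ <k^2> - <k>
second = []
for v in list(G)[:2000]:
    dists = nx.single_source_shortest_path_length(G, v, cutoff=2)
    second.append(sum(1 for d in dists.values() if d == 2))
print(np.mean(second), k2 - k1)
```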
  • 70-77. Giant component (GC). • There is always only one GC (plus lots of small components). • Let u be the probability that a link does not connect the node to the GC. • A node of degree k is then not in the GC with probability u^k. • Probability that a node is not in the giant component: Σ_k p_k u^k = g(u). • Size of the giant component: S = 1 − g(u). But what is u? Self-consistency: • The probability that a link does not connect to the GC is u. • It connects to a node with k other neighbours: the excess degree. • Averaging: Σ_k q_k u^k = g1(u) = u.
  • 78. Giant component. How to solve g1(u) = u? [figure: the curve g1(u) and the line u = g1(u) intersecting on the unit square]
  • 79. Giant component. If the derivative g1′(1) > 1, a giant component appears. [figure: slope g1′(1) of g1 at u = 1]
  • 80-88. Giant component condition. • The GC appears when 1 < g1′(1) = Σ_k k q_k = (1/⟨k⟩) Σ_k k(k + 1) p_{k+1} = (1/⟨k⟩) Σ_k (k − 1) k p_k = (1/⟨k⟩) (Σ_k k² p_k − Σ_k k p_k) = (⟨k²⟩ − ⟨k⟩) / ⟨k⟩. • For ER graphs ⟨k²⟩ − ⟨k⟩ = ⟨k⟩², so the GC appears when ⟨k⟩ > 1. [figure: GC size S versus ⟨k⟩] • For scale-free graphs (2 < α < 3), ⟨k²⟩ → ∞, so there is always a GC. A fixed-point sketch follows below.
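A minimal fixed-point sketch for the ER case (my own illustrative code): for a Poisson degree distribution g(x) = e^{⟨k⟩(x−1)} and g1(x) = g(x), so u can be found by iterating u ← g1(u), and then S = 1 − g(u).

```python
import numpy as np

def gc_size(k_mean, iters=2000):
    """Giant-component size of an ER graph from the pgf fixed point."""
    g = lambda x: np.exp(k_mean * (x - 1.0))   # g(x) = g1(x) for Poisson degrees
    u = 0.5                                    # prob. an edge does not lead to the GC
    for _ in range(iters):
        u = g(u)                               # fixed-point iteration u <- g1(u)
    return 1.0 - g(u)                          # S = 1 - g(u)

for k in [0.5, 1.0, 1.5, 2.0, 3.0]:
    print(f"<k> = {k}:  S = {gc_size(k):.3f}")
# S stays ~0 up to <k> = 1 and then grows, exactly as the condition predicts
```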
  • 89-99. Node failure. How fast is the giant component destroyed if nodes are removed? Same approach: • φ is the probability that a node is not removed from the network. • Again, u is the probability that a link does not connect to the GC. Self-consistency: • I: the neighbour is removed (1 − φ). • II: the neighbour is not removed (φ), but is not in the GC (u^k). • So the probability is 1 − φ + φu^k; on average Σ_k q_k (1 − φ + φu^k) = 1 − φ + φ g1(u) = u. • Solving for u gives the solution.
  • 100-106. Node failure threshold. • Again, solving u = 1 − φ + φ g1(u) is not easy. • But a GC exists if ∂/∂u [1 − φ + φ g1(u)] > 1 at u = 1, i.e. φ g1′(1) > 1, i.e. φ > 1/g1′(1) = φc. • For ER, φc = 1/⟨k⟩; for scale-free graphs, φc = 0. [figure: GC size S versus occupation probability φ, for ER and scale-free graphs] A simulation sketch follows below.
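A direct simulation of random failure on an ER graph (illustrative sizes; networkx assumed) shows the giant component disappearing around φc = 1/⟨k⟩:

```python
import numpy as np
import networkx as nx

n, k_mean = 50_000, 3.0
G = nx.gnp_random_graph(n, k_mean / n, seed=2)           # phi_c = 1/<k> ~ 0.33
rng = np.random.default_rng(2)

for phi in [0.1, 0.2, 1 / k_mean, 0.5, 0.8, 1.0]:
    keep = [v for v in G if rng.random() < phi]          # each node survives w.p. phi
    H = G.subgraph(keep)
    giant = max(nx.connected_components(H), key=len, default=set())
    print(f"phi={phi:.2f}  S={len(giant) / n:.3f}")       # largest-component fraction
```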
  • 107-113. Node attack. What if we attack specific nodes? Same approach: • φ_k is the probability that a node of degree k does not "fail"; on average φ = Σ_k φ_k p_k. • Again, u is the probability that a link does not connect to the GC. Self-consistency: • I: the neighbour is removed (1 − φ_k). • II: the neighbour is not removed (φ_k), but is not in the GC (u^{k−1}). • So on average u = Σ_k q_{k−1} (1 − φ_k + φ_k u^{k−1}). • Define f(u) = Σ_k φ_k q_{k−1} u^{k−1}; then u = 1 − f(1) + f(u), and solving for u gives the solution. A comparison of targeted attack and random failure is sketched below.
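For comparison, a sketch of targeted attack versus random failure on a scale-free (Barabási-Albert) graph; the sizes and the m = 2 parameter are illustrative, and networkx is assumed.

```python
import random
import networkx as nx

n = 20_000
G = nx.barabasi_albert_graph(n, 2, seed=3)               # scale-free test graph

def gc_fraction(G, removed):
    """Fraction of all n nodes left in the largest component after removal."""
    H = G.copy()
    H.remove_nodes_from(removed)
    return len(max(nx.connected_components(H), key=len)) / n

hubs_first = sorted(G, key=G.degree, reverse=True)       # attack: hubs first
random.seed(3)
random_order = random.sample(list(G), n)                 # failure: random order

for frac in [0.01, 0.05, 0.10]:
    cut = int(frac * n)
    print(f"remove {frac:.0%}:  attack S={gc_fraction(G, hubs_first[:cut]):.3f}"
          f"  random S={gc_fraction(G, random_order[:cut]):.3f}")
# the giant component shrinks much faster when the hubs are taken out
```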
  • 115-121. Epidemics. Disease spreading: • Standard compartments: Susceptible, Infected, Recovered. • SIR: transmission rate β, infectious time τ = 1/ν. • Infect a neighbour with probability φ = 1 − e^{−βτ}. • How far will it spread? A giant-component question. Percolation: • I: the disease is not transmitted (1 − φ). • II: it is transmitted (φ), but not to the GC (u^k). • Already solved: critical φc = 1/g1′(1) = ⟨k⟩ / (⟨k²⟩ − ⟨k⟩). • Epidemiological threshold: βτ = log[(⟨k²⟩ − ⟨k⟩) / (⟨k²⟩ − 2⟨k⟩)]. • For ER, the threshold is βτ = log[⟨k⟩ / (⟨k⟩ − 1)]; for scale-free graphs ⟨k²⟩ diverges: always an epidemic outbreak. [figure: outbreak size S versus transmission probability φ, for ER and scale-free graphs] See the percolation sketch below.
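The epidemic threshold can be illustrated by treating SIR spread as bond percolation (a sketch with my own parameter choices; networkx and numpy assumed): keep each edge with probability φ and measure the component a seed would land in.

```python
import numpy as np
import networkx as nx

n, k_mean = 30_000, 4.0
G = nx.gnp_random_graph(n, k_mean / n, seed=4)           # ER contact network
deg = np.array([d for _, d in G.degree()])
phi_c = deg.mean() / (np.mean(deg ** 2) - deg.mean())    # <k> / (<k^2> - <k>)
print("predicted threshold phi_c ~", round(phi_c, 3))    # ~1/<k> = 0.25 here

rng = np.random.default_rng(4)
for phi in [0.1, 0.2, 0.3, 0.5]:
    kept = [e for e in G.edges() if rng.random() < phi]  # edges that transmit
    H = G.edge_subgraph(kept)
    giant = max(nx.connected_components(H), key=len, default=set())
    print(f"phi={phi:.1f}  outbreak fraction ~ {len(giant) / n:.3f}")
```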
  • 122. Conclusions. Models: • Short paths & clustering: small world model. • Scale free: preferential attachment. • Many other mechanisms, e.g. triadic closure, homophily, etc. • Focus on stylised features. Analysis: • Scale-free networks are robust and spread fast, but are vulnerable to attack. • Generating functions greatly help the analysis. • Compare the observed network to a random graph/model: how does it deviate? Questions?