Trust in Networks and Networked Systems
John S. Baras The Institute for Systems Research, Electrical and Computer Engineering Department, Applied Mathematics, Statistics and Scientific Computation Program University of Maryland College Park USA
December 7, 2017 · ANU Workshop on Systems and Control, ANU, Canberra, Australia

Acknowledgments
• Joint work with: Peixin Gao, Tao Jiang, Ion Matei, Kiran Somasundaram, Xiangyang Liu, George Theodorakopoulos • Sponsors: NSF, ARO, ARL, AFOSR, NIST, DARPA, Lockheed Martin, Northrop Grumman, Telcordia (ACS)
Networked Systems

[Diagram: three classes of networked systems]
• Infrastructure / Communication Networks: Internet / WWW, MANET, Sensor Nets, Robotic Nets, Hybrid Nets (Comm, Sensor, Robotic)
• Social / Economic Networks: Social Interactions, Community, Collaboration, Social Filtering, Economic Alliances, Web-based social systems
• Biological Networks: Epidemic, Cellular and Sub-cellular, Neural, Insects, Animal Flocks, Human Nets
Networks and Trust
• Trust and reputation are critical for collaboration
• Characteristics of trust relations:
 – Integrative (Parsons 1937) – main source of social order
 – Reduction of complexity – without it, bureaucracy and transaction complexity increase (Luhmann 1988)
 – Trust as a lubricant for cooperation (Arrow 1974) – rational choice theory
• Social Webs, Economic Webs
 – MySpace, Facebook, Windows Live Spaces, Flickr, Classmates Online, Orkut, Yahoo! Groups, MSN Groups
 – e-commerce, e-XYZ, services and service composition
 – Reputation and recommender systems

Indirect Network Trust
User 8 asks for access to User 1's files. User 1 and User 8 have no previous interaction. What should User 1 do? Use transitivity of trust (i.e., use references to compute indirect trust). [Diagram: trust graph over users 1–8]
Indirect Trust: System Model
• System mapped to a weighted, directed graph
 – Vertices: entities/users
 – Edges: direct trust relations
 – Weights: w(i, j) = how much i trusts j
• Establish an indirect trust relation between users that have not had direct interactions
 – We assume that trust is transitive (at least partially)
• Trust computation: path problem on a graph
 – Information about j that is useful to i ⇒ directed paths from i to j
 – Combine information along each path, then aggregate across paths
Semirings – Examples
• Shortest Path Problem
 – Semiring: (ℝ₊ ∪ {∞}, min, +)
 – ⊗ is + and computes total path delay
 – ⊕ is min and picks the shortest path
• Bottleneck Problem
 – Semiring: (ℝ₊ ∪ {∞}, max, min)
 – ⊗ is min and computes path bandwidth
 – ⊕ is max and picks the highest bandwidth
Trust Semiring Properties: Partial Order
• Combined along-a-path weight should not increase: a ⊗ b ⪯ a and a ⊗ b ⪯ b
• Combined across-paths weight should not decrease: a ⊕ b ⪰ a and a ⊕ b ⪰ b
Trust Path Semiring
• 0 ≤ trust, confidence ≤ 1
• ⊗ combines opinions along a path componentwise: (t₁, c₁) ⊗ (t₂, c₂) = (t₁t₂, c₁c₂)
• ⊕ aggregates across paths by keeping the opinion with the highest confidence (ties broken by higher trust)
Trust Distance Semiring
• Motivated by Eisner's Expectation Semiring (2002) (speech/language processing)
• 0 ≤ trust, confidence ≤ 1; an opinion (t, c) is mapped to (c/t, c) ∈ S = [0, ∞] × [0, 1]
• ⊗ is (a₁, b₁) ⊗ (a₂, b₂) = (a₁b₂ + a₂b₁, b₁b₂)
• ⊕ is (a₁, b₁) ⊕ (a₂, b₂) = (a₁ + a₂, b₁ + b₂)
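A minimal sketch of these two operators and the (t, c) → (c/t, c) mapping; the function names are ours:

```python
def times(x, y):
    """Along-a-path combination: (a1, b1) ⊗ (a2, b2) = (a1*b2 + a2*b1, b1*b2)."""
    (a1, b1), (a2, b2) = x, y
    return (a1 * b2 + a2 * b1, b1 * b2)

def plus(x, y):
    """Across-paths aggregation: (a1, b1) ⊕ (a2, b2) = (a1 + a2, b1 + b2)."""
    return (x[0] + y[0], x[1] + y[1])

def to_semiring(t, c):
    """Map an opinion (trust t, confidence c) into S = [0, inf] x [0, 1]
    via (t, c) -> (c/t, c); requires t > 0."""
    return (c / t, c)

# combine two hops: opinions (t=0.9, c=0.8) and then (t=0.6, c=0.5)
p = times(to_semiring(0.9, 0.8), to_semiring(0.6, 0.5))
```

The second component of a combined pair is the product of confidences, mirroring how confidence dilutes along a path.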
Computing Indirect Trust
• Path interpretation: t_ij = ⊕_k ( t_ik ⊗ w_kj ), taking the best opinion reachable through any intermediate user k
• Linear system interpretation: t_{n+1} = ( W ⊗ t_n ) ⊕ b, where b is the indicator vector of pre-trusted nodes
• Treat as a linear system over the semiring – we are looking for its steady state
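The iteration above can be sketched by instantiating ⊗/⊕ as max-product, one common trust-semiring choice (an assumption for illustration; the talk's algebra may differ):

```python
import numpy as np

def indirect_trust(W, b, iters=50):
    """Iterate t <- max( max-product(W, t), b ) to a fixed point.
    W[k, i] is how much k trusts i; b flags pre-trusted nodes."""
    t = b.astype(float).copy()
    for _ in range(iters):
        # for each target i, best opinion obtainable via any referrer k
        combined = np.max(t[:, None] * W, axis=0)
        t_new = np.maximum(combined, b)
        if np.allclose(t_new, t):   # steady state reached
            break
        t = t_new
    return t

W = np.array([[0.0, 0.9, 0.0],    # node 0 trusts node 1 at 0.9
              [0.0, 0.0, 0.8],    # node 1 trusts node 2 at 0.8
              [0.0, 0.0, 0.0]])
b = np.array([1.0, 0.0, 0.0])     # node 0 is pre-trusted
print(indirect_trust(W, b))       # node 2 ends at 0.9 * 0.8 = 0.72
```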
Attacks on Indirect Trust
• ATTACK the trust computation!
 – Aim: increase t_{1→8} to a level that would grant access
• How?
 – Edge attack: change the opinion on an edge (trick a node into forming a false opinion)
 – Node attack: change any opinion emanating from a node (gain complete control of a node)
Attacks on Indirect Trust
[Diagram: edge attack – the adversary alters the opinion on a single edge of the trust graph of users 1–8]

Attacks on Indirect Trust
[Diagram: node attack – the adversary alters all opinions emanating from a single node]
Edge Tolerances
• Upper (Lower) edge tolerance of an edge e, w.r.t. an optimal path p*, is the highest (lowest) weight of e that would preserve the optimality of p*.
• In a shortest path problem (min, +), the most vital edge is the path edge whose weight has the largest difference with the upper tolerance.
• In a maximum capacity problem (max, min), the most vital edge is the path edge whose weight has the largest difference with the lower tolerance.
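For the (min, +) case, the upper tolerance of a path edge e works out to w(e) + (d₋ₑ − d*), where d₋ₑ is the shortest distance avoiding e; the most vital edge therefore maximizes d₋ₑ − d*. A sketch (the example graph is ours):

```python
import heapq

def dijkstra(graph, s, t, banned=None):
    """Shortest s-t distance; optionally skip one `banned` directed edge."""
    dist = {s: 0.0}
    pq = [(0.0, s)]
    done = set()
    while pq:
        d, u = heapq.heappop(pq)
        if u in done:
            continue
        done.add(u)
        if u == t:
            return d
        for v, w in graph.get(u, {}).items():
            if (u, v) == banned:
                continue
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                heapq.heappush(pq, (d + w, v))
    return float("inf")

def most_vital_edge(graph, path_edges, s, t):
    """Most vital edge of the shortest path: the path edge whose upper
    tolerance w(e) + (d_without_e - d*) exceeds its own weight the most."""
    d_star = dijkstra(graph, s, t)
    best, best_gap = None, -1.0
    for e in path_edges:
        gap = dijkstra(graph, s, t, banned=e) - d_star  # = upper tol - w(e)
        if gap > best_gap:
            best, best_gap = e, gap
    return best

g = {"s": {"a": 1, "b": 4}, "a": {"b": 1, "t": 5}, "b": {"t": 1}}
# shortest path s-a-b-t has delay 3
print(most_vital_edge(g, [("s", "a"), ("a", "b"), ("b", "t")], "s", "t"))  # ('b', 't')
```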
Upper Tolerance Example
• Upper tolerances for the shortest path problem
[Diagram: 5-node example; the shortest path's edges are annotated with their upper tolerances, and the edge whose weight differs most from its upper tolerance is the most vital edge]
Lower Tolerance Example
• Lower tolerances for the shortest path problem
[Diagram: the same 5-node example annotated with lower tolerances, i.e., the smallest required weight changes; off-path edges can have tolerance −∞]
Attacked Edge on the Path
[Diagram: trust edge attack on an edge of the optimal path p* (trust value t*); RESULT: decreased trust]
Attacked Edge not on the Path
[Diagram: trust edge attack off the optimal path p*; the optimum changes to a new path p′ with trust value t′; RESULT: increased trust]
Tolerances for any Optimization Semiring
• Optimization semirings: ⊕ is min or max
• We speak of the ⊕-minimal (maximal) tolerance α_e (β_e) of edge e instead of the lower (upper) tolerance
• ⊘ is the inverse of ⊗, defined by: a ⊗ x = b ⇔ x = b ⊘ a
• w(e) is the weight of edge e; w(p) is the weight of path p. The tolerance formulas distinguish the cases e ∈ p* and e ∉ p*
Tolerances for the Trust Semiring
• Assume the (max, ·) semiring; essentially equivalent to our trust semiring
• Tolerances: again computed separately for the cases e ∈ p* and e ∉ p*
Distributed Kalman Filtering and Tracking: Performance Improvements from a Trusted Core
• Realistic sensor networks: Normal nodes, faulty or corrupted nodes, malicious nodes • Hierarchical scheme – provide global trust on a particular context without requiring direct trust on the same context between all agents • Combine techniques from fusion centric, collaborative filtering, estimation propagation • Trusted Core – Trust Particles, higher security, additional sensing capabilities, broader observation of the system, confidentiality and integrity, multipath comms – Every sensor can communicate with one or more trust particles at a cost
22 Trust and Induced Graphs
Trust relation
Induced Graph G (V, A)
VVtc wij(, ) ( cij (, ), tij (, )[ n ])
Weighted Directed Dynamic Trust
Graph Gt (V, At )
Goals of the Trusted System
1. All sensors that abide by the sensing and message-passing protocols should be able to track the trajectories.
2. This implies that nodes with poor sensing capabilities, or with corrupted sensors, should be aided by their neighbors in tracking.
3. Nodes that are malicious and pass false estimates should be quickly detected by the trust mechanism, and their estimates should be discarded.

• Target dynamics: x[n+1] = A x[n] + B w[n]
• Sensor measurements: z_i[n] = H_i[n] x[n] + v_i[n]
• Trusted-core measurements: z_tc[n] = H_tc[n] x[n] + v_tc[n]
Trusted DKF and Particles
• Can use any valid trust system as trust update component • Can replace DKF with any Distributed Sequential MMSE or other filter • Trust update mechanism: Linear credit and exponential penalty
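The "linear credit and exponential penalty" trust update can be sketched as below; the threshold and rates are hypothetical parameters, not values from the talk:

```python
def update_trust(trust, residual, thresh=1.0, credit=0.05, penalty=0.5):
    """Linear credit / exponential penalty (illustrative rates):
    a report consistent with our own prediction earns a small additive
    credit (capped at 1); an inconsistent one is cut multiplicatively."""
    if residual <= thresh:
        return min(1.0, trust + credit)   # linear credit
    return trust * penalty                # exponential penalty

# a neighbor turns malicious: residuals against our prediction grow
trust = 1.0
for r in [0.2, 0.3, 2.5, 3.1, 4.0]:
    trust = update_trust(trust, r)
print(trust)  # 1.0 * 0.5**3 = 0.125 after three bad reports
```

The asymmetry is the point: trust is slow to earn and fast to lose, so a compromised node's estimates are discarded within a few bad reports.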
Trusted DKF Performance
[Plots: closed-loop and open-loop tracking performance of the trust system]

Power Grid Cyber-security
• Inter‐area oscillations (modes) – Associated with large inter‐connected power networks between clusters of generators – Critical in system stability – Requiring on‐line observation and control
• Automatic estimation of modes
 – Using currents, voltages and angle differences measured by PMUs (Phasor Measurement Units) that are distributed throughout the power system
System Model
• Linearization around the nominal operating points
 – The initial steady-state value is eliminated
 – Disturbance inputs consist of M frequency modes:
   f(t) = a₁ e^{λ₁ t} cos(ω₁ t) + Σ_{i=2}^{M} a_i e^{λ_i t} cos(ω_i t + φ_i)
 – a_i: oscillation amplitudes; λ_i: damping constants; ω_i: oscillation frequencies; φ_i: phase angles of the oscillations
 – Consider two modes, take the first two terms of the Taylor series of the exponential, and expand the trigonometric functions:
   f(t) ≈ a₁ (1 + λ₁ t) cos(ω₁ t) + a₂ (1 + λ₂ t) [cos φ₂ cos(ω₂ t) − sin φ₂ sin(ω₂ t)]
Distributed Estimation
[Diagram: GPS satellite synchronizing N recording sites (PMUs) that measure the output signals]
• Goal: compute an accurate estimate of the state x(k), using:
 – local measurements y_j(k);
 – information received from the PMUs in the communication neighborhood;
 – confidence in the information received from other PMUs, provided by the trust model
Trust Model
• To each information flow (link) j → i we attach a positive value T_ij, which represents the trust PMU i has in the information received from PMU j
• Trust interpretation: accuracy, reliability
• Goal: each PMU has to compute accurate estimates of the state by intelligently combining its own measurements with the information from neighboring PMUs
Trust-based Multi-agent State Estimation
• Does not require global information about the power grid topology
• Ensures greater robustness in computing the state estimate
• Main idea: pick the weights wij to be trust dependent
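One way to make the weights trust dependent is a trust-proportional convex combination; the normalization below is our assumption (the slides only require w_ij to depend on trust):

```python
import numpy as np

def trust_weighted_fusion(x_local, x_neighbors, trusts):
    """Convex combination with trust-proportional weights:
    w_self = 1 / (1 + sum_j T_ij), w_ij = T_ij / (1 + sum_j T_ij).
    A low-trust neighbor contributes almost nothing to the estimate."""
    trusts = np.asarray(trusts, dtype=float)
    total = 1.0 + trusts.sum()
    est = np.asarray(x_local, dtype=float) / total
    for T, xj in zip(trusts, x_neighbors):
        est = est + (T / total) * np.asarray(xj, dtype=float)
    return est

# neighbor 2 is compromised and reports garbage, but carries low trust
x = trust_weighted_fusion([1.0], [[1.1], [9.0]], trusts=[1.0, 0.01])
print(x)  # stays close to the honest estimates near 1.0
```

Because the weights sum to one, the fusion is always a convex combination, so a bounded adversary with near-zero trust can only perturb the estimate slightly.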
Numerical Example
• 3-generator, 9-bus system [one-line diagram]
Numerical Example (cont.)
• We assume that a PMU is placed at each bus; it measures the complex voltage at the bus and the currents to adjacent buses.
• The state vector is formed by the voltages measured at the buses, i.e., X = (U_i), where U_i is the complex voltage at bus i.
• Measurement model: when buses (i, j) are adjacent, the measurement vector is Z_i(k)′ = (U_i(k), I_ij(k)), where Y_ij is the admittance of line (i, j) and V_i(k) is the complex measurement noise.

Numerical Example (cont.)
• PMU network [diagram with a compromised node]
Numerical Example (cont.)
• Estimates of the voltage at bus 1 using Algorithm 1, with agent 8 injecting false data [plot]

Numerical Example (cont.)
• Estimates of the voltage at bus 1 using Algorithm 3, with agent 8 injecting false data [plot]

Numerical Example (cont.)
• The evolution of agent 4's weights [plot]
MANET Trust-Aware Routing – Trust/Reputation Systems
• Components – Monitoring and Detection Modules – Reputation Control System – Response Modules
• Our approach is different: build and use a Trusted Sentinel Sub-Network (SSN), which is responsible for monitoring and flooding reputation measures
• Logically decouple the Trust/Reputation System from other network functionalities • Demonstrate that logical constraints on the SSN translate to constraints on the communication graph topology of the network • Trade-off analysis between security and performance
Path Problems on Graphs – Delay and Trust Semirings
• Each edge (i, j) carries the pair (d(i, j), t(i, j))
• f: P_SD → ℝ², f(p) = (d(p), t(p)) for p ∈ P_SD
• min_{p ∈ P_SD} d(p) = min_{p ∈ P_SD} Σ_{(i,j) ∈ p} d(i, j)
• max_{p ∈ P_SD} t(p) = max_{p ∈ P_SD} min_{(i,j) ∈ p} t(i, j)
• Delay semiring: (ℝ₊ ∪ {0, ∞}, min, +); trust semiring: (ℝ₊ ∪ {0}, max, min)
• Notions of optimality: Pareto, lexicographic, max-ordering, approximation; semirings
Trust-Aware Routing – Multi-Criteria Optimization Problem
• Delay of a path p: d(p) = Σ_{(i,j) ∈ p} d(i, j)
• Trust of a path p (bottleneck trust): t(p) = min_{(i,j) ∈ p} t(i, j)
[Diagram: bi-metric network with nodes i, j₁–j₇]
Haimes Method – Two-Stage Recipe
[Diagram: network G(V, E) with source node]
1. G′ – reduced graph: O(|E|)
2. G′_SP – shortest path on the reduced graph: O(|V|·|E|)
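The two-stage recipe can be sketched directly: prune edges whose trust is below the required bottleneck, then run a shortest-delay search on the reduced graph (a sketch; the slides give only the complexity bounds):

```python
import heapq

def trust_constrained_shortest_path(graph, s, t, t_min):
    """Stage 1: drop edges below the bottleneck-trust threshold t_min (O(|E|)).
    Stage 2: Dijkstra on the reduced graph. Edges carry (delay, trust) pairs."""
    # Stage 1: reduced graph G'
    reduced = {u: {v: d for v, (d, tr) in nbrs.items() if tr >= t_min}
               for u, nbrs in graph.items()}
    # Stage 2: shortest delay on G'
    dist, prev = {s: 0.0}, {}
    pq = [(0.0, s)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == t:                       # reconstruct the path
            path = [t]
            while path[-1] != s:
                path.append(prev[path[-1]])
            return d, path[::-1]
        if d > dist.get(u, float("inf")):
            continue
        for v, w in reduced.get(u, {}).items():
            if d + w < dist.get(v, float("inf")):
                dist[v], prev[v] = d + w, u
                heapq.heappush(pq, (d + w, v))
    return float("inf"), []

g = {"S": {"A": (1, 0.9), "B": (1, 0.4)},
     "A": {"D": (2, 0.8)},
     "B": {"D": (1, 0.9)},
     "D": {}}
print(trust_constrained_shortest_path(g, "S", "D", t_min=0.5))
# the faster S-B-D route fails the trust constraint: (3.0, ['S', 'A', 'D'])
```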
What is a Network …?
• In several fields or contexts:
– social – economic – communication – control – sensor – biological – physics and materials
A Network is …
• A collection of nodes, agents, … that collaborate to accomplish actions, gains, … that cannot be accomplished without such collaboration
• The most significant concept for autonomic networks
The Fundamental Trade-off
• The nodes gain from collaborating
• But collaboration has costs (e.g., communications)
• Trade-off: gain from collaboration vs cost of collaboration; vector metrics are typically involved ⇒ constrained coalitional games
• Example 1: Network formation – effects on topology
• Example 2: Collaborative robotics, communications
• Example 3: Web-based social networks and services
• Example 4: Groups of cancer tumor or virus cells

Gain
• Each node potentially offers benefits V per time unit to the other nodes: e.g., V is the number of bits per time unit
• The potential benefit V is reduced during transmissions due to transmission failures and delay
• Jackson-Wolinsky connections model; gain of node i:
  w_i(G) = Σ_{j ≠ i} δ^{r_ij} V
• r_ij is the number of hops in the shortest path between i and j; r_ij = ∞ if there is no path between i and j
• 0 < δ < 1 is the communication depreciation rate
Cost
• Activating links is costly
 – Example: the cost is the energy consumption for sending data
 – As in wireless propagation models, the cost c_ij of link ij is a function of the link length d_ij: c_ij = P d_ij^α
• P is a parameter depending on the transmitter/receiver antenna gains and the system loss not related to propagation
• α is the path loss exponent – it depends on the specific propagation environment
Pairwise Game and Convergence
• The payoff of node i from the network G is defined as
  v_i(G) = gain − cost = w_i(G) − c_i(G)
• Iterated process
 – A node pair ij is selected with probability p_ij
 – If link ij is already in the network, the decision is whether to sever it; otherwise the decision is whether to activate the link
 – The nodes act myopically: activating the link if it makes each at least as well off and one strictly better off, and deleting the link if it makes either player better off
 – End: if after some time no additional links are formed or severed
 – With random mutations, the game converges to a unique Pareto equilibrium (over the states of the underlying Markov chain)
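The iterated pairwise process can be simulated under the Jackson-Wolinsky gain and the c_ij = P·d_ij^α cost; all parameter values and node positions below are illustrative, not taken from the talk:

```python
import random
import numpy as np

def payoffs(links, pos, V=1.0, delta=0.7, P=0.5, alpha=2.0):
    """v_i(G) = w_i(G) - c_i(G): gains decay as delta**hops; each
    endpoint of an active link pays P * d_ij**alpha."""
    n = len(pos)
    hops = np.full((n, n), np.inf)
    for s in range(n):                      # BFS hop counts
        hops[s, s] = 0
        frontier, d = {s}, 0
        while frontier:
            d += 1
            frontier = {v for u in frontier for v in range(n)
                        if (min(u, v), max(u, v)) in links
                        and hops[s, v] == np.inf}
            for v in frontier:
                hops[s, v] = d
    gain = np.array([sum(delta ** hops[i, j] * V
                         for j in range(n) if j != i and np.isfinite(hops[i, j]))
                     for i in range(n)])
    cost = np.zeros(n)
    for (i, j) in links:
        c = P * np.linalg.norm(np.subtract(pos[i], pos[j])) ** alpha
        cost[i] += c
        cost[j] += c
    return gain - cost

def myopic_dynamics(pos, steps=500, seed=0):
    """Each step a random pair decides: add the link if mutually
    beneficial (one strictly), sever it if either endpoint gains."""
    rng = random.Random(seed)
    n, links = len(pos), set()
    for _ in range(steps):
        i, j = rng.sample(range(n), 2)
        e = (min(i, j), max(i, j))
        trial = (links - {e}) if e in links else (links | {e})
        base, new = payoffs(links, pos), payoffs(trial, pos)
        if e in links:
            if new[i] > base[i] or new[j] > base[j]:
                links = trial
        else:
            if new[i] >= base[i] and new[j] >= base[j] and (
                    new[i] > base[i] or new[j] > base[j]):
                links = trial
    return links

# two spatial clusters of two nodes each on the unit square
pos = [(0.1, 0.1), (0.2, 0.15), (0.8, 0.9), (0.75, 0.85)]
stable = myopic_dynamics(pos)
```

Cheap short links inside each cluster always form; whether a long inter-cluster link is worth its cost is exactly the V-vs-P·d^α trade-off of the theorem that follows.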
Coalition Formation at the Stable State
• The cost depends on the physical locations of the nodes
 – Random network where nodes are placed according to a uniform Poisson point process on the [0,1] × [0,1] square
• Theorem (coalition formation at the stable state, n → ∞):
 – As δ → 0, V = P (ln n / n)^{α/2} is a sharp threshold for establishing the grand coalition (number of coalitions = 1)
 – For 0 < δ < 1, the threshold is less than P (ln n / n)^{α/2}
[Simulation: n = 20]
Topologies Formed
[Simulation snapshots of the resulting network topologies]
Trust and Collaborative Control/Operation
• An example of two linked dynamics as constrained coalitional games: trust/reputation propagation and evolution, and collaborative control
• Integrating network utility maximization (NUM) with constraint-based reasoning and coalitional games
• Beyond linear algebra and weights: semirings of constraints, constraint programming, soft-constraint semirings, policies, agents
• Learning on graphs and network dynamic games: behavior, adversaries
• Adversarial models, attacks, constrained shortest paths, …
Trust Dynamics and 'Local' Voting Rules
• Trust and mistrust spreading: initial "islands" of trust → trust spreads → trust-connected network
• Local voting rule: s_i(k+1) = f( Σ_{j ∈ N_i} J_ij s_j(k) )
• 'Generalized consensus' problems
• Spin glasses (from statistical physics), phase transitions
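A minimal instance of the voting rule s_i(k+1) = f( Σ_{j∈N_i} J_ij s_j(k) ), with f = sign and a ring topology (both choices ours), shows isolated mistrust being absorbed:

```python
import numpy as np

def local_vote_step(s, J):
    """One synchronous update of s_i(k+1) = sign( sum_j J_ij s_j(k) ):
    each node adopts the weighted majority opinion of its neighborhood."""
    return np.where(J @ s >= 0, 1, -1)

n = 8
J = np.zeros((n, n))
for i in range(n):                 # ring topology with a self-loop
    J[i, (i - 1) % n] = J[i, (i + 1) % n] = 1.0
    J[i, i] = 1.0                  # self-confidence term
s = np.array([1, 1, 1, -1, 1, 1, 1, -1])   # two lone mistrusting nodes
for _ in range(10):
    s = local_vote_step(s, J)
print(s.tolist())  # trust spreads: all nodes end at +1
```

With balanced, equal-sized islands the same rule freezes instead of mixing, which is the spin-glass / phase-transition flavor the slide alludes to.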
Results of Game Evolution
• Theorem: for all i ∈ N and all j ∈ N_i, there exists τ₀ such that for a re-establishing period τ > τ₀:
 – the iterated game converges to a Nash equilibrium;
 – in the Nash equilibrium, all nodes cooperate with all their neighbors.
• Compare games with and without the trust mechanism (strategy update):
[Plots: percentage of cooperating pairs vs negative links; average payoffs vs negative links]
Future Directions
• Constrained coalitional games, constrained satisfaction problems and spin glasses – better algorithms? • Dynamic hypergraphs over semirings – replace weights with rules (hard and soft constraints, logic) • Time varying hypergraphs – mixing – statistical physics • Generalized constrained shortest path problems – multiple semirings • Design of high integrity reputation systems, resilient trust • Network self-organization to improve resiliency/robustness • Role of trust in establishing composable security • Networks of agents with dynamic positive and negative trust (recommendations)?
Thank you!
[email protected] 301‐405‐6606 http://dev‐baras.pantheonsite.io/
Questions?