Trust in Networks and Networked Systems

John S. Baras
The Institute for Systems Research, Electrical and Computer Engineering Department, Applied Mathematics, Statistics and Scientific Computation Program, University of Maryland College Park, USA

December 7, 2017
ANU Workshop on Systems and Control, ANU, Canberra, Australia

Acknowledgments
• Joint work with: Peixin Gao, Tao Jiang, Ion Matei, Kiran Somasundaram, Xiangyang Liu, George Theodorakopoulos
• Sponsors: NSF, ARO, ARL, AFOSR, NIST, DARPA, Lockheed Martin, Northrop Grumman, Telcordia (ACS)

Networked Systems
• Infrastructure / Communication Networks: Internet / WWW, MANET, Sensor Nets, Cellular and Hybrid Nets (Comm, Sensor, Robotic and Human Nets)
• Social / Economic Networks: Social Interactions, Community, Collaboration, Social Filtering, Economic Alliances, Web-based social systems
• Biological Networks: Epidemic, Robotic Nets, Sub-cellular, Neural, Insects, Animal Flocks

Networks and Trust
• Trust and reputation are critical for collaboration
• Characteristics of trust relations:
– Integrative (Parsons 1937) – main source of social order
– Reduction of complexity – without it, bureaucracy and transaction complexity increase (Luhmann 1988)
– Trust as a lubricant for cooperation (Arrow 1974) – rational choice theory
• Social Webs, Economic Webs
– MySpace, Facebook, Windows Live Spaces, Flickr, Classmates Online, Orkut, Yahoo! Groups, MSN Groups
– e-commerce, e-XYZ, services and service composition
– Reputation and recommender systems

Indirect Network Trust
• User 8 asks for access to User 1's files. User 1 and User 8 have no previous interaction. What should User 1 do?
• Use transitivity of trust (i.e., use references to compute indirect trust)
[Figure: directed trust graph over users 1–8]

Indirect Trust: System Model
• System mapped to a weighted, directed graph
– Vertices: entities/users
– Edges: direct trust relations
– Weights: w(i, j) = how much i trusts j
• Establish an indirect trust relation between users that have not had direct interactions
– We assume that trust is transitive (at least partially)
• Trust computation: path problem on a graph
– Information about j that is useful to i: directed paths from i to j
– Combine information along each path, and then aggregate across paths

Semirings – Examples
• Shortest Path Problem
– Semiring: (min, +)
– ⊗ is + and computes total path delay
– ⊕ is min and picks the shortest path
• Bottleneck Problem
– Semiring: (max, min)
– ⊗ is min and computes path bandwidth
– ⊕ is max and picks the highest bandwidth

Trust Semiring Properties: Partial Order
• The combined along-a-path weight should not increase: a ⊗ b ≤ a, b
• The combined across-paths weight should not decrease: a ⊕ b ≥ a, b

Trust Path Semiring
• 0 ≤ trust, confidence ≤ 1
• ⊗ combines the (trust, confidence) opinions along a path
• ⊕ aggregates opinions across paths

Trust Distance Semiring
• Motivated by Eisner's Expectation Semiring (2002) (speech/language processing):
(a1, b1) ⊗ (a2, b2) = (a1 b2 + a2 b1, b1 b2)
(a1, b1) ⊕ (a2, b2) = (a1 + a2, b1 + b2)
• 0 ≤ trust, confidence ≤ 1; each opinion is mapped (t, c) → (c/t, c), with S = [0, ∞] × [0, 1]
• ⊗ and ⊕ are the expectation-semiring operations above, applied to the mapped pairs

Computing Indirect Trust
• Path interpretation: t_ij = ⊕_k (t_ik ⊗ w_kj), over intermediate users k
• Linear system interpretation: t(n+1) = (W ⊗ t(n)) ⊕ b, where b is the indicator vector of pre-trusted nodes
• Treat as a linear system – we are looking for its steady state.

Attacks to Indirect Trust
• ATTACK the trust computation!
– Aim: increase t(1→8) to a level that would grant access.
• How?
– Edge attack: change the opinion on an edge (trick a node into forming a false opinion)
– Node attack: change any opinion emanating from a node (gain complete control of a node)
[Figure: edge attack on the trust graph of users 1–8]
[Figure: node attack on the trust graph of users 1–8]

Edge Tolerances
• The upper (lower) edge tolerance of an edge e, w.r.t.
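The path-problem view of trust computation (combine opinions along each path, then aggregate across paths) can be sketched in code. This is a minimal brute-force sketch, assuming that along-a-path combination is a componentwise product of (trust, confidence) pairs and that across-paths aggregation keeps the opinion with the highest confidence; the exact operator definitions and the edge weights below are illustrative assumptions, not the authors' published operators.

```python
# Sketch of a path-semiring trust computation on a small directed graph.
# Opinions are (trust, confidence) pairs in [0, 1] x [0, 1].

def otimes(a, b):
    # Along-a-path combination: componentwise product, so the combined
    # weight can only decrease (the partial-order property on the slides).
    return (a[0] * b[0], a[1] * b[1])

def oplus(a, b):
    # Across-paths aggregation: keep the opinion with the higher
    # confidence (ties broken by trust), so the result never decreases.
    return max(a, b, key=lambda o: (o[1], o[0]))

def simple_paths(graph, src, dst, seen=()):
    # Enumerate all simple directed paths from src to dst.
    if src == dst:
        yield (src,)
        return
    for nxt in graph.get(src, {}):
        if nxt not in seen:
            for tail in simple_paths(graph, nxt, dst, seen + (src,)):
                yield (src,) + tail

def indirect_trust(graph, src, dst):
    best = (0.0, 0.0)  # semiring zero: no usable path found yet
    for path in simple_paths(graph, src, dst):
        opinion = (1.0, 1.0)  # semiring one: full trust in oneself
        for u, v in zip(path, path[1:]):
            opinion = otimes(opinion, graph[u][v])
        best = oplus(best, opinion)
    return best

# Toy version of the 8-user example; weights are (trust, confidence).
graph = {
    1: {2: (0.9, 0.8), 3: (0.6, 0.9)},
    2: {4: (0.8, 0.7)},
    3: {5: (0.9, 0.9)},
    4: {8: (0.7, 0.9)},
    5: {8: (0.5, 0.8)},
}
print(indirect_trust(graph, 1, 8))
```

Here User 1 accepts the opinion reaching User 8 through the reference chain with the highest accumulated confidence, even though another chain reports higher trust with lower confidence.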
an optimal path p*, is the highest (lowest) weight of e that would preserve the optimality of p*.
• In a shortest path problem (min, +), the most vital edge is the path edge whose weight has the largest difference from its upper tolerance.
• In a maximum capacity problem (max, min), the most vital edge is the path edge whose weight has the largest difference from its lower tolerance.

Upper Tolerance Example
• Upper tolerances for the shortest path problem
[Figure: five-node example; each edge is annotated with its upper tolerance, and the shortest path and most vital edge are marked]

Lower Tolerance Example
• Lower tolerances for the shortest path problem
[Figure: the same example annotated with lower tolerances and the smallest required changes]

Attacked Edge on the Path
• Trust edge attack on an edge of the optimal path p* (trust value t*)
• RESULT: decreased trust!
[Figure: attacked edge lies on p*]

Attacked Edge not on the Path
• Trust edge attack on an edge off the optimal path p*: the optimum moves to a new path p′ with trust value t′
• RESULT: increased trust! The optimal path changes.
[Figure: attacked edge lies off p*; the optimal path changes]

Tolerances for any Optimization Semiring
• Optimization semirings: ⊕ is min or max
• ⊕-minimal (maximal) tolerance α_e (β_e) of edge e, instead of the lower (upper) tolerance
• ⊘ is the inverse of ⊗, defined by: a ⊗ x = b ⇔ x = b ⊘ a
• w(e) is the weight of edge e; w(p) is the weight of path p
• The tolerance formulas distinguish the cases e ∈ p* and e ∉ p*

Tolerances for the Trust Semiring
• Assume a (max, ·) semiring; essentially equivalent to our trust semiring.
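The upper-tolerance definition for the shortest-path (min, +) case can be checked with a brute-force sketch: for a path edge e, the upper tolerance equals the cost of the best path avoiding e minus the cost of the rest of p*, and the most vital edge is the path edge with the largest slack between tolerance and weight. The example graph below is illustrative, not the one drawn on the slide.

```python
# Brute-force upper tolerances and most vital edge for shortest paths.
# Suitable only for small graphs; real uses would need Dijkstra-style search.

def paths(graph, src, dst, seen=()):
    # Enumerate all simple directed paths from src to dst.
    if src == dst:
        yield (src,)
        return
    for nxt in graph.get(src, {}):
        if nxt not in seen:
            for tail in paths(graph, nxt, dst, seen + (src,)):
                yield (src,) + tail

def cost(graph, p):
    return sum(graph[u][v] for u, v in zip(p, p[1:]))

def shortest(graph, src, dst, banned=None):
    # Shortest path, optionally forbidding one edge (banned).
    cands = [p for p in paths(graph, src, dst)
             if banned is None or banned not in zip(p, p[1:])]
    return min(cands, key=lambda p: cost(graph, p), default=None)

def most_vital_edge(graph, src, dst):
    p_star = shortest(graph, src, dst)
    best, best_slack = None, -1.0
    for e in zip(p_star, p_star[1:]):
        alt = shortest(graph, src, dst, banned=e)
        # Upper tolerance: largest weight of e that keeps p* optimal.
        tol = float('inf') if alt is None else (
            cost(graph, alt) - (cost(graph, p_star) - graph[e[0]][e[1]]))
        slack = tol - graph[e[0]][e[1]]  # how far e can grow before p* changes
        if slack > best_slack:
            best, best_slack = e, slack
    return p_star, best

graph = {1: {2: 5, 3: 1}, 2: {4: 6}, 3: {5: 10}, 4: {5: 12}, 5: {}}
p_star, vital = most_vital_edge(graph, 1, 5)
```

The slack tol − w(e) equals the increase in shortest-path cost when e is removed, which is why the edge maximizing it is the most rewarding target for an attacker.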
• Tolerances: the formulas again distinguish the cases e ∈ p* and e ∉ p*

Distributed Kalman Filtering and Tracking: Performance Improvements from a Trusted Core
• Realistic sensor networks: normal nodes, faulty or corrupted nodes, malicious nodes
• Hierarchical scheme – provides global trust on a particular context without requiring direct trust on the same context between all agents
• Combines techniques from fusion-centric, collaborative-filtering, and estimation-propagation approaches
• Trusted Core
– Trust particles: higher security, additional sensing capabilities, broader observation of the system, confidentiality and integrity, multipath comms
– Every sensor can communicate with one or more trust particles, at a cost

Trust and Induced Graphs
• The trust relation induces a graph G(V ∪ V_tc, A) with weights w(i, j) = (c(i, j), t(i, j)[n])
• This yields a weighted, directed, dynamic trust graph G_t(V, A_t)

Goals of the Trusted System
1. All sensors that abide by the protocols of sensing and message passing should be able to track the trajectories.
2. This implies that nodes with poor sensing capabilities, or with corrupted sensors, should be aided by their neighbors in tracking.
3. Nodes that are malicious and pass false estimates should be quickly detected by the trust mechanism, and their estimates should be discarded.
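The trust update mechanism named above, "linear credit and exponential penalty", can be sketched as follows. The functional form, constants, and the notion of a "consistent" report below are assumptions for illustration, not the authors' exact rule: trust grows by a fixed additive credit for consistent reports and shrinks multiplicatively for inconsistent ones.

```python
# Sketch of a linear-credit / exponential-penalty trust update.
# credit and penalty values are illustrative assumptions.

def update_trust(trust, consistent, credit=0.05, penalty=0.5):
    if consistent:
        # Linear credit: trust climbs by a fixed step, capped at 1.
        return min(1.0, trust + credit)
    # Exponential (multiplicative) penalty: repeated misbehavior
    # drives trust toward zero geometrically fast.
    return trust * penalty

trust = 0.5
for report_ok in [True, True, False, True]:
    trust = update_trust(trust, report_ok)
```

The asymmetry is the point: a malicious node loses trust in a few bad reports but needs many consistent ones to earn it back, which is what lets the trusted DKF discard false estimates quickly.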
• Linear system and measurement model (sensors i and trusted core tc):
x[n+1] = A x[n] + B w[n]
z_i[n] = H_i[n] x[n] + v_i[n]
z_tc[n] = H_tc[n] x[n] + v_tc[n]

Trusted DKF and Particles
• Any valid trust system can be used as the trust-update component
• The DKF can be replaced with any distributed sequential MMSE or other filter
• Trust update mechanism: linear credit and exponential penalty

Trusted DKF Performance
[Figures: closed-loop performance, open-loop performance, and trust-system performance]

Power Grid Cyber-security
• Inter-area oscillations (modes)
– Associated with large interconnected power networks between clusters of generators
– Critical to system stability
– Require on-line observation and control
• Automatic estimation of modes
– Using currents, voltages and angle differences measured by PMUs (Phasor Measurement Units) distributed throughout the power system

System Model
• Linearization around the nominal operating points
– The initial steady-state value is eliminated
– Disturbance inputs consist of M frequency modes, defined as
f(t) = a_1 exp(σ_1 t) cos(ω_1 t + φ_1) + Σ_{i=2..M} a_i exp(σ_i t) cos(ω_i t + φ_i)
where a_i are the oscillation amplitudes, σ_i the damping constants, ω_i the oscillation frequencies, and φ_i the phase angles of the oscillations
– Consider two modes, use the first two terms of the Taylor series of the exponential, and expand the trigonometric functions:
f(t) ≈ a_1 (1 + σ_1 t) cos(ω_1 t + φ_1) + a_2 (1 + σ_2 t) [cos φ_2 cos(ω_2 t) − sin φ_2 sin(ω_2 t)]
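The two-mode disturbance model and its small-t linearization can be checked numerically: replacing exp(σt) by its first two Taylor terms leaves the trigonometric factors exact, so the error is second order in σt. The mode parameters below are illustrative assumptions, not values from the power-system example.

```python
import math

# Numeric check of the two-mode disturbance model f(t) and its
# linearization exp(sigma*t) ~ 1 + sigma*t for small t.

a = [1.0, 0.5]            # oscillation amplitudes a_i (assumed)
sigma = [-0.2, -0.4]      # damping constants sigma_i (assumed)
omega = [2 * math.pi * 0.5, 2 * math.pi * 0.8]  # mode frequencies (assumed)
phi = [0.3, -0.1]         # phase angles phi_i (assumed)

def f_exact(t):
    return sum(a[i] * math.exp(sigma[i] * t) * math.cos(omega[i] * t + phi[i])
               for i in range(2))

def f_linearized(t):
    # First two Taylor terms of the exponential; the second mode's cosine
    # is expanded as cos(phi)cos(wt) - sin(phi)sin(wt), as on the slide.
    return (a[0] * (1 + sigma[0] * t) * math.cos(omega[0] * t + phi[0])
            + a[1] * (1 + sigma[1] * t)
              * (math.cos(phi[1]) * math.cos(omega[1] * t)
                 - math.sin(phi[1]) * math.sin(omega[1] * t)))

# Over a short window the truncation error stays below the exp remainder.
err = max(abs(f_exact(t) - f_linearized(t)) for t in [0.0, 0.05, 0.1])
```

The linearized form is what makes the mode parameters enter the estimation problem (approximately) linearly, which is the prerequisite for the distributed estimator on the following slides.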
Distributed Estimation
• N recording sites (PMUs), GPS-synchronized, measure the output signals
[Figure: GPS satellite synchronizing multiple PMUs]
• To compute an accurate estimate of the state x(k), each PMU uses:
– its local measurements y_j(k);
– information received from the PMUs in its communication neighborhood;
– the confidence in the information received from other PMUs, provided by the trust model

Trust Model
• To each information flow (link) j → i we attach a positive value T_ij, which represents the trust PMU i has in the information received from PMU j
• Trust interpretation:
– Accuracy
– Reliability
• Goal: each PMU computes accurate estimates of the state by intelligently combining its measurements with the information from neighboring PMUs

Trust-based Multi-agent State Estimation
• Does not require global information about the power grid topology
• Ensures greater robustness in computing the state estimate
• Main idea: pick the weights w_ij to be trust dependent

Numerical Example
• 3-generator, 9-bus system
[Figure: one-line diagram of the 3-generator, 9-bus system]

Numerical Example (cont.)
• We assume that a PMU placed at each bus measures the complex voltages and, for adjacent buses, the currents.
• The state vector is formed by the voltages measured at the buses, i.e., X = (U_i), where U_i is the complex voltage at bus i.
• Measurement model: for adjacent buses (i, j), the measurement vector is Z_i(k)′ = (U_i(k), I_ij(k)), where Y_ij is the admittance of line (i, j) and V_i(k) is the complex measurement noise.

Numerical Example (cont.)
• PMU network with a compromised node
[Figure: PMU communication graph; the compromised node is marked]

Numerical Example (cont.)
• Estimates of the voltage at bus 1 using Algorithm 1, with agent 8 injecting false data
[Figure: voltage estimates at bus 1 under Algorithm 1]
• Estimates of the voltage at bus 1 using Algorithm 3, with agent 8 injecting false data
[Figure: voltage estimates at bus 1 under Algorithm 3]
• The evolution of agent 4's weights
[Figure: weight trajectories of agent 4]

MANET Trust-Aware Routing – Trust/Reputation
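The "main idea" of the trust-based state estimation above, picking the combination weights w_ij to be trust dependent, can be sketched as follows. The normalization rule, the trust values, and the scalar-state simplification are assumptions for illustration; they are not Algorithm 1 or Algorithm 3 from the numerical example.

```python
# Sketch of trust-dependent combination weights: PMU i averages its own
# estimate with neighbor estimates, weighted by the trust values T_ij.

def combine(own_estimate, neighbor_estimates, trust, self_trust=1.0):
    """neighbor_estimates: {j: x_j}; trust: {j: T_ij >= 0}.
    Weights w_ij = T_ij / (self_trust + sum of neighbor trusts)."""
    total = self_trust + sum(trust[j] for j in neighbor_estimates)
    est = (self_trust / total) * own_estimate
    for j, x_j in neighbor_estimates.items():
        est += (trust[j] / total) * x_j
    return est

# A compromised agent (here labeled 8, echoing the slides' example)
# injects false data; its low trust value limits the damage.
honest = {2: 1.02, 3: 0.98}
trust = {2: 0.9, 3: 0.9, 8: 0.05}
clean = combine(1.0, honest, trust)
attacked = combine(1.0, {**honest, 8: 10.0}, trust)
```

With uniform (trust-independent) weights the injected value 10.0 would pull the average to roughly 3.25; with the trust-dependent weights above the compromised agent's influence is bounded by its small weight, which is the robustness claim of the slide.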