The Minimal Communication Cost of Gathering Correlated Data over Sensor Networks

Uploader: roleaisle130 · Document ID: 373301 · Uploaded: 2018-10-05 · Format: PPT · Pages: 16 · Size: 280 KB

EL 736 Final Project, Bo Zhang

Motivation: Correlated Data Gathering
- Correlated data gathering is a core component of many applications and real-life information processes.
- Large-scale sensor applications; scientific data collection (e.g., habitat monitoring).
- Highly redundant data: temperature, humidity, vibration, rain, etc.
- Surveillance videos.

Resource Constraint
- Data collection at one or more sinks; the network has limited resources.
- Wireless sensor networks: energy constraint (limited battery); communication cost >> computation cost.
- Internet: cost metrics such as bandwidth, delay, etc.

Problem
- What is the minimum total cost (e.g., communication) to collect correlated data at a single sink?

Model Formalization
- Source graph G_X: an undirected graph G(V, E) with source nodes {1, 2, ..., N} and sink t; each e = (i, j) ∈ E is a communication link with weight w_e.
- Discrete sources X = {X1, X2, ..., XN}.
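The source model just described (N discrete sources with an arbitrary joint distribution, emitting i.i.d. samples) can be sketched in a few lines. The names `joint_pmf` and `sample_sources` are hypothetical, introduced here only for illustration:

```python
import random

def sample_sources(joint_pmf, n_samples, seed=0):
    """Draw i.i.d. joint samples (x1, ..., xN) from an arbitrary joint
    pmf p(X1 = x1, ..., XN = xN), given as {outcome tuple: probability}."""
    rng = random.Random(seed)
    outcomes = list(joint_pmf)
    weights = [joint_pmf[o] for o in outcomes]
    return [rng.choices(outcomes, weights=weights)[0] for _ in range(n_samples)]

# Two correlated binary sources: equal with probability 0.9 (made-up numbers).
pmf = {(0, 0): 0.45, (1, 1): 0.45, (0, 1): 0.05, (1, 0): 0.05}
samples = sample_sources(pmf, 1000)
```

Each sample is a full joint outcome, so the redundancy between sources (here, X1 and X2 usually agreeing) is exactly what joint compression can exploit later.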

- Arbitrary joint distribution p(X1 = x1, X2 = x2, ..., XN = xN).
- The sources generate i.i.d. samples at an arbitrary sample rate.
- Task: collect the source data at t with negligible loss.

Model Formalization (continued)
- Linear costs: g(R_e, w_e) = R_e w_e for e ∈ E, where R_e is the data rate on edge e in bits/sample, and w_e is a weight that depends on the application.
- For the communication cost of wireless links, w_e ∝ l^α with 2 ≤ α ≤ 4, where l is the Euclidean distance.
- Goal: minimize the total cost Σ_e R_e w_e.

Minimal Communication Cost: Uncapacitated, Data Correlation Ignored
[Figure: example network with source nodes 1-12 and sink t.]
Link-path formulation (ECMP shortest-path routing, uncapacitated minimum cost):
- Indices: d = 1, 2, ..., D demands; p = 1, 2, ..., P_d paths for demand d; e = 1, 2, ..., E links.
- Constants: h_d, the volume of demand d; δ_edp = 1 if link e belongs to path p realizing demand d.
- Variables: w_e, the metric of link e, w = (w1, w2, ..., wE); x_dp(w), the non-negative flow induced by link metric system w for demand d on path p.
- Minimize F = Σ_e w_e Σ_d Σ_p δ_edp x_dp(w)
- Constraints: Σ_p x_dp(w) = h_d for d = 1, 2, ..., D.

Data Correlation: Tradeoffs Between Path Length and Data Rate
- Routing vs. coding (compression): shorter paths or fewer bits?
- Example: two sources X1 and X2, three relaying nodes 1, 2, 3; R is the data rate in bits/sample.
- Joint compression reduces redundancy.
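When correlation is ignored, the uncapacitated minimum-cost solution routes each source's full rate along a shortest path to the sink, so the optimum reduces to Σ_i R_i · dist(i, t). A minimal sketch (function names are mine, not the slides'):

```python
import heapq

def dijkstra(adj, src):
    """Shortest-path distances from src; adj: {u: [(v, w_e), ...]}, undirected."""
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in adj[u]:
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                heapq.heappush(pq, (d + w, v))
    return dist

def min_cost_no_correlation(adj, sink, rates):
    """Total cost sum_i R_i * dist(i, sink) when every source sends its
    full rate independently along a shortest path (no joint compression)."""
    dist = dijkstra(adj, sink)  # undirected: distance to sink = distance from sink
    return sum(R * dist[i] for i, R in rates.items())

# Tiny example: sources a and b with unit rates.
adj = {
    "t": [("a", 1.0), ("b", 2.0)],
    "a": [("t", 1.0), ("b", 1.0)],
    "b": [("t", 2.0), ("a", 1.0)],
}
print(min_cost_no_correlation(adj, "t", {"a": 1.0, "b": 1.0}))  # 3.0
```

This baseline is what the correlation-aware schemes on the next slides try to beat by spending longer paths to earn joint-compression savings.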

Data Correlation: Previous Work
- Explicit Entropy Coding (EEC): joint encoding is possible only with side information.
- H(X1, X2, X3) = H(X1) + H(X2|X1) + H(X3|X1, X2).
- The coding depends on the routing structure; routing uses a spanning tree (ST), and finding the optimal ST is NP-hard.
[Figure: example tree toward sink t with sources X1, X2, X3 and rates H(X1), H(X2), and H(X1, X2, X3) marked.]

Data Correlation: Previous Work (continued)
- Slepian-Wolf Coding (SWC). For the optimal SWC scheme, which routes? Shortest-path routing. Which rates? An LP formulation (Cristescu et al., INFOCOM 2004).
[Figure: example network with sink t.]

Correlation Factor
- For each node in the graph G(V, E), find correlation factors with its neighbors.
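The EEC decomposition above is the entropy chain rule; it can be checked numerically on a toy joint distribution (the pmf values below are made up for illustration), computing each conditional entropy as H(A|B) = H(A, B) − H(B):

```python
from collections import defaultdict
from math import log2

def entropy(pmf):
    """Shannon entropy in bits of a pmf given as {outcome: probability}."""
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

def marginal(joint, idx):
    """Marginalize a joint pmf onto the coordinates listed in idx."""
    m = defaultdict(float)
    for x, p in joint.items():
        m[tuple(x[i] for i in idx)] += p
    return dict(m)

# Made-up joint pmf p(x1, x2, x3) over three binary sources.
joint = {(0, 0, 0): 0.4, (0, 1, 1): 0.1, (1, 0, 1): 0.2, (1, 1, 1): 0.3}

H123 = entropy(joint)
H12 = entropy(marginal(joint, (0, 1)))
H1 = entropy(marginal(joint, (0,)))
# Chain rule: H(X1,X2,X3) = H(X1) + H(X2|X1) + H(X3|X1,X2), with
# H(X2|X1) = H(X1,X2) - H(X1) and H(X3|X1,X2) = H(X1,X2,X3) - H(X1,X2).
assert abs(H1 + (H12 - H1) + (H123 - H12) - H123) < 1e-12
```

The practical consequence for EEC is the one the slide draws: each node's rate is a conditional entropy given what is already on the path, so the code depends on the routing tree.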

- The correlation factor ρ_uv represents the correlation between nodes u and v: ρ_uv = 1 − r/R, where R is the data rate before joint compression and r is the data rate after joint compression.

Correlation Factor (continued)
- Shortest path tree (SPT): total cost 4R + r.
- Joint compression: total cost 3R + 3r.
- As long as ρ = 1 − r/R > 1/2, the SPT is no longer optimal. (All edge weights are 1.)

Minimal Communication Cost with Local Data Correlation: Add Heuristic Algorithm
- Step 0: Initially collect data at sink t via shortest paths. Compute the cost F_i(0) = Σ_e R_i w_e, where w_e is the weight of link e realizing demand R_i. Set S_i(0) = j, where j is the next hop of node i (i, j = 1, 2, ..., N, i ≠ j). Set the iteration count k = 0. Let M_i denote the neighbors of node i.
- Step 1: For each j ∈ M_i \ {S_i(k)}, compute F_ij(k+1) = F_i(k) − R_i w_{i,S_i(k)} + R_i w_ij − Σ_e (R_i ρ_ij) w_e, the sum taken over the edges e on j's path to the sink.
- Step 2: Determine a new j such that F_ij(k+1) = min_j F_ij(k+1) < F_i(k). If there is no such j, go to Step 4.
- Step 3: Update S_i(k+1) = j, set F_i(k+1) = F_ij(k+1) and k := k + 1, and go to Step 1.
- Step 4: No more improvement is possible; stop.

Add Heuristic: Example
- First step: shortest-path routing.
[Figure: example network with source nodes 1-12 and sink t.]
- After the heuristic: when ρ_ij > 1/2, j becomes the next hop of i.

Local Data Correlation: Analysis
- Information from the neighbors is needed.
- Is it optimal? An approximation algorithm?
- Other factors to take into account: energy, capacity.

Thanks!
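Steps 0-4 can be sketched for a single node i, under the reading that rerouting via neighbor j swaps the first-hop cost and saves ρ_ij R_i on every edge of j's shortest path to the sink (so the saving totals ρ_ij R_i dist(j, t)). All names below are mine, and the update rule is a hedged reconstruction of the slide's formula:

```python
def add_heuristic_next_hop(i, neighbors, w, dist, R, rho, s0):
    """Improvement loop of the Add Heuristic for one node i.

    w[(i, j)]: weight of link (i, j); dist[j]: shortest-path distance
    from j to the sink; R: node i's rate; rho[(i, j)]: correlation
    factor between i and j; s0: initial next hop on i's shortest path.
    """
    s = s0
    F = R * w[(i, s)] + R * dist[s]           # Step 0: F_i(0) along the shortest path
    while True:
        best_j, best_F = None, F
        for j in neighbors:                   # Step 1: cost if i reroutes via j
            if j == s:
                continue
            F_j = F - R * w[(i, s)] + R * w[(i, j)] - rho[(i, j)] * R * dist[j]
            if F_j < best_F:                  # Step 2: keep the best strict improvement
                best_j, best_F = j, F_j
        if best_j is None:                    # Step 4: no improvement; stop
            return s, F
        s, F = best_j, best_F                 # Step 3: update S_i(k+1) and iterate

# Toy instance: i can send straight to sink t (weight 1) or via a highly
# correlated neighbor j (rho = 0.8) sitting one hop from t.
w = {("i", "t"): 1.0, ("i", "j"): 1.0}
dist = {"t": 0.0, "j": 1.0}
rho = {("i", "t"): 0.0, ("i", "j"): 0.8}
hop, cost = add_heuristic_next_hop("i", ["t", "j"], w, dist, 1.0, rho, "t")
print(hop)  # "j": rho > 1/2 makes the correlated neighbor the better next hop
```

With ρ_ij = 0.8 > 1/2 the detour through j pays off, matching the example slide's conclusion; with ρ_ij ≤ 1/2 the loop would stop at Step 4 and keep the shortest-path next hop.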
