10 Latency Minimization Through Optimal Data Placement in Fog Networks

Ning Wang and Jie Wu

Department of Computer Science, Rowan University, Glassboro, NJ, USA

Center for Networked Computing, Temple University, Philadelphia, PA, USA

10.1 Introduction

With the wide availability of smart devices, people can access social websites, videos, and software from the Internet anywhere at any time. According to [1], in 2016, Netflix had more than 65 million users, who spent a total of 100 million hours streaming movies and TV shows daily. This accounted for 32.25% of the total downstream traffic during peak periods in North America. Facebook had 2.19 billion monthly active users as of the first quarter of 2018 [2]. The traditional mechanism is to store content on a central server, which leads to relatively long data retrieval latency. To alleviate this issue, Fog Networks (FNs) have been proposed and applied by commercial companies. According to Cisco's white paper [3], global streaming traffic is expected to account for 32% of all consumer traffic on the Internet by 2021, and more than 70% of this traffic will cross FNs, up from 56% in 2016.

To solve the problem of long content retrieval latency when data is stored at the center, FNs can geographically store partial or entire data at fog servers deployed close to clients. This reduces network and central server load and provides a better quality of service, i.e. low latency, to end-clients [4].
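To make the placement idea above concrete, the following is a minimal sketch, not from the chapter: all server names, latency values, and the `best_retrieval_latency` helper are hypothetical. Given a placement of content items across a central server and nearby fog servers, a client's retrieval latency for an item is simply the minimum latency over the servers that store it.

```python
# Hypothetical illustration of latency-aware data placement in a fog network.
# All server names, latency values, and content items below are assumed
# for illustration; they do not come from the chapter.

def best_retrieval_latency(item, placement, latency):
    """Return the minimum latency (ms) over all servers that store `item`."""
    return min(latency[s] for s, stored in placement.items() if item in stored)

# One distant central server and two nearby fog servers (assumed latencies, ms).
latency = {"central": 120, "fog_a": 10, "fog_b": 15}

# The central server stores everything; each fog server caches a popular item.
placement = {
    "central": {"video1", "video2", "video3"},
    "fog_a": {"video1"},
    "fog_b": {"video2"},
}

print(best_retrieval_latency("video1", placement, latency))  # 10 (served by fog_a)
print(best_retrieval_latency("video3", placement, latency))  # 120 (central only)
```

The example shows why placement matters: items cached at a fog server are retrieved at fog latency, while items available only at the center incur the full central-server latency, which is the gap that optimal data placement aims to close.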
