In 5G and next-generation wireless networks, small cells and the use of wide-band mmWave spectrum (which is coverage limited) are required to support next-generation applications that demand high throughput and/or low latency. The resulting dense placement of base stations requires a correspondingly dense wired backhaul network, which can significantly increase fiber deployment cost. One solution is to replace fibers with an Integrated Access and Backhaul (IAB) network, in which part of the wireless spectrum is used to connect base stations, thereby providing a multi-hop network to base stations with fiber access. Realizing the benefits of an IAB network requires addressing a number of challenging, inter-related problems in routing, scheduling, flow control, and link adaptation in multi-hop wireless networks.
In this research we will use Machine Learning (ML) techniques to address three inter-related problems. First, we will design a centralized routing scheme based on Graph Neural Networks (GNNs) that minimizes the number of hops (and hence delay) while achieving high system throughput. Second, we will design and implement a two-level network controller: node-level schedulers will run backpressure-based scheduling algorithms, and a higher-level controller operating at the network level will coordinate them. The parameters of the node-level algorithms will be set by the network controller, a Reinforcement Learning (RL) agent that attempts to co-optimize overall network performance in terms of system throughput while achieving low end-to-end delay and high fairness. Third, we will investigate the applicability of Hierarchical Temporal Memory (HTM) to predicting wireless channel quality. HTM, a model of how the human neocortex processes sensory information and learns higher-order sequences for both prediction and anomaly detection, will be used to predict link quality for improved link adaptation.
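To make the scheduling piece concrete, the following is a minimal Python sketch of a backpressure (max-weight) scheduler of the kind a node-level controller would run; the data structures, the node-exclusive interference model, and all names are illustrative assumptions rather than this project's design. The RL-based network controller described above could, for instance, tune weights or bias terms in such a rule.

    # Illustrative backpressure (max-weight) scheduling sketch for a multi-hop IAB network.
    # Data structures and the interference model are assumptions for illustration only.

    def backpressure_schedule(links, queues, rates):
        """links: iterable of directed links (i, j).
        queues: queues[node][commodity] -> backlog (packets).
        rates: rates[(i, j)] -> achievable link rate this slot.
        Returns a list of ((i, j), commodity) activations, chosen greedily so that
        each node takes part in at most one transmission (node-exclusive model)."""
        weighted = []
        for (i, j) in links:
            # Pick the commodity with the largest backlog differential on this link.
            best_c, best_diff = None, 0.0
            for c, q_i in queues[i].items():
                diff = q_i - queues[j].get(c, 0.0)
                if diff > best_diff:
                    best_c, best_diff = c, diff
            if best_c is not None:
                weighted.append((best_diff * rates[(i, j)], (i, j), best_c))
        # Greedy max-weight activation under the node-exclusive constraint.
        weighted.sort(key=lambda t: t[0], reverse=True)
        busy, schedule = set(), []
        for w, (i, j), c in weighted:
            if i not in busy and j not in busy:
                schedule.append(((i, j), c))
                busy.update((i, j))
        return schedule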
Video streaming is now the major component of peak-hour real-time entertainment traffic in the Internet for both wired and mobile access. Consequently, the user-perceived quality-of-experience (QoE) of video streaming applications is very important to content providers for generating revenue. The growth of video streaming also matters to the network provider for capacity planning, provisioning, and resource allocation in both the access and the backhaul networks. QoE is determined by many factors such as startup delay, video freezes due to re-buffering, and the playback video bitrate. These depend on dynamic network conditions, including congestion in the network and the time-varying quality of the wireless channel. This research focuses on video streaming over LTE. One of the key problems in video streaming is client-side bitrate adaptation (generically referred to as Adaptive Bit Rate, or ABR). These algorithms operate at the application layer and adjust the video bitrate based on feedback at the client. In rate-based approaches, the highest video bitrate that fits the estimated available bandwidth is selected. In buffer-based approaches, on the other hand, the buffer occupancy is used to select the video bitrate. While preliminary work has investigated the applicability of these approaches, many open questions remain, including 1) how existing approaches perform over broadband multi-access wireless networks such as LTE and how they can be optimized, and 2) what an appropriate content-aware cross-layer design of ABR for LTE looks like.
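As a rough illustration of the two ABR families mentioned above, the sketch below shows a rate-based and a buffer-based bitrate selection rule in Python; the bitrate ladder, the safety factor, and the buffer thresholds are hypothetical values chosen only for illustration, not values used in this research.

    # Minimal sketch of the two ABR families described above; thresholds and the
    # bitrate ladder are illustrative assumptions.

    BITRATES_KBPS = [350, 750, 1500, 3000, 6000]  # example DASH bitrate ladder

    def rate_based_abr(estimated_bw_kbps, safety=0.8):
        """Rate-based: pick the highest bitrate below a fraction of the estimated bandwidth."""
        feasible = [b for b in BITRATES_KBPS if b <= safety * estimated_bw_kbps]
        return max(feasible) if feasible else BITRATES_KBPS[0]

    def buffer_based_abr(buffer_s, reservoir_s=5.0, cushion_s=20.0):
        """Buffer-based: map buffer occupancy linearly onto the bitrate ladder."""
        if buffer_s <= reservoir_s:
            return BITRATES_KBPS[0]
        if buffer_s >= reservoir_s + cushion_s:
            return BITRATES_KBPS[-1]
        frac = (buffer_s - reservoir_s) / cushion_s
        return BITRATES_KBPS[int(frac * (len(BITRATES_KBPS) - 1))]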
As a first step we will carry out a detailed performance analysis of Dynamic Adaptive Streaming over HTTP (DASH) over LTE. In addition to the ABR and TCP control loops in DASH, there are further control loops in the LTE radio access network. These include RLC Acknowledged Mode (RLC-AM), which attempts to provide near-reliable delivery of data across the radio link, and the MAC scheduler, which allocates radio resources at a very fine time scale to achieve a network-wide objective such as proportional fairness or maximum system throughput. Using an enhanced ns-3 based LTE simulation tool, we will study how the various control loops interact, how DASH performs under different scenarios, and the impact of the scheduler. We will also study how the parameters of the growth and steady-state phases in DASH should be set to maximize video quality. An alternative to application-layer ABR is a cross-layer design in which the ABR control is tightly integrated with the MAC scheduler. We will build upon our preliminary work to design and implement a content-aware cross-layer DASH for LTE. The intuition is that since the network itself would be the first to know about congestion and variations in channel quality, integrating the ABR control loop with the MAC scheduler will enable servers and clients to adapt faster and more accurately, thereby providing better quality and maximizing network resource usage. This will require a two-way information exchange: the network provider passing information to the content provider's ABR control loop, and the content provider providing video content information to the network provider. While such sharing of information is easily conceivable in an IPTV "walled garden" environment, that is not the case for the majority of video streaming, which is delivered "over the top". We will consider a multi-agent, multiple-control-loop formulation of the problem to determine under what scenarios such information sharing is feasible and what its benefits are.
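For concreteness, the following is a small sketch of the proportional-fair metric that an LTE MAC scheduler of the kind discussed above commonly uses; the function name, the exponential-averaging constant, and the per-TTI rate inputs are assumptions for illustration, not the actual interface of the ns-3 scheduler.

    # Illustrative proportional-fair MAC scheduling metric; names and the EWMA
    # constant are assumptions for illustration.

    def pf_schedule(instantaneous_rate, avg_throughput, beta=0.05):
        """instantaneous_rate: {ue: achievable rate this TTI (e.g., from CQI feedback)}.
        avg_throughput: {ue: exponentially averaged past throughput}.
        Returns the UE to serve this TTI and the updated averages."""
        # Serve the UE with the largest ratio of current rate to average throughput.
        chosen = max(instantaneous_rate,
                     key=lambda ue: instantaneous_rate[ue] / max(avg_throughput[ue], 1e-9))
        for ue in avg_throughput:
            served = instantaneous_rate[ue] if ue == chosen else 0.0
            avg_throughput[ue] = (1 - beta) * avg_throughput[ue] + beta * served
        return chosen, avg_throughput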
Funded by a grant from Orange Labs.
*Current research participants are shown in bold
Yingjie Tracy Zhang (PhD Student)
Zhiyi Huang (PhD Student)
Yu Liu (PhD, 2016)
Ahmed Ahmedin (PhD, 2014)
Amitabha Ghosh (Visitor)
Mung Chiang (Professor, Purdue University)
Kartik Pandit (PhD, 2013)
Haiping Liu (PhD, 2010)
Xiaoling Qiu (PhD, 2010)
Vijoy Pandey (PhD, 2007)
Abu Sayeem Reaz (PhD, 2005)
Vishwanath Ramamurthi (PhD, 2009; Verizon)
Biswanath Mukherjee (Professor, UC Davis)