The rapid rollout of 4G LTE has made mobile Internet usage pervasive. In particular, the availability of cheap 4G data plans and low-cost smartphones has stimulated a surge in online video streaming. This growth has been further propelled by the availability of regional content on video streaming platforms such as Netflix and YouTube. However, the experience of watching videos on smartphones is often limited by fast battery drain and poor network conditions. The problem is aggravated when the user is travelling, primarily because under mobility the user equipment must continuously scan the network to remain attached to a base station. Moreover, in developing countries the network infrastructure is often compromised to keep Internet access affordable; consequently, the user equipment frequently falls back to legacy 2G and 3G networks, which degrades the data rate perceived by the user. To provide consistently good Quality of Experience (QoE) amidst fluctuating network conditions, existing client video players use adaptive bitrate (ABR) streaming via Dynamic Adaptive Streaming over HTTP (DASH). However, these players largely ignore energy consumption. Our work focuses on devising algorithms that improve the energy efficiency of DASH-based streaming, even under mobility.
Devising these algorithms entails an in-depth understanding of the complex relationships among perceived network throughput, energy consumption, handovers, the associated network technology, and related factors. To this end, we have undertaken extensive measurement-based studies involving different service providers and commercially available smartphones in India. Since these relationships between perceived throughput, energy consumption, and the various radio-related parameters are highly complex, we use deep reinforcement learning, combined with recurrent neural networks, to design an energy-saving algorithm that works as a wrapper over DASH and significantly reduces energy consumption.
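To make the idea concrete, the following is a minimal sketch of how such a recurrent policy might look; the observation features, dimensions, and reward weighting are illustrative assumptions, not our actual implementation.

```python
# A minimal sketch (not the actual system) of a recurrent policy that maps
# recent network/playback observations to an ABR action while trading off
# QoE against energy. All names, dimensions, and the reward shape below
# are illustrative assumptions.
import torch
import torch.nn as nn
from torch.distributions import Categorical

class RecurrentABRPolicy(nn.Module):
    def __init__(self, obs_dim=5, hidden_dim=64, num_bitrates=6):
        super().__init__()
        # A GRU captures temporal patterns (throughput trends, handovers).
        self.rnn = nn.GRU(obs_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, num_bitrates)

    def forward(self, obs_seq, h=None):
        # obs_seq: (batch, time, obs_dim) -- e.g. recent throughput,
        # buffer level, signal strength, network type (2G/3G/4G), battery.
        out, h = self.rnn(obs_seq, h)
        logits = self.head(out[:, -1])   # decision for the next chunk
        return Categorical(logits=logits), h

# Hypothetical reward: a QoE term minus a weighted energy penalty.
def reward(qoe, energy_mj, lam=0.01):
    return qoe - lam * energy_mj

policy = RecurrentABRPolicy()
obs = torch.randn(1, 8, 5)               # last 8 chunk-level observations
dist, _ = policy(obs)
bitrate_idx = dist.sample()              # index into the DASH bitrate ladder
```

A recurrent layer is a natural fit here because throughput and energy behaviour under mobility depend on recent history (e.g., handover events) rather than on the instantaneous state alone.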
We have also modified the system architecture so that the algorithm can be controlled from the server side.
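As an illustrative sketch of such server-side control (the route, payload fields, and decision logic below are assumptions, not our actual architecture), the server could expose the policy's decisions over HTTP, with the client player consulting it before fetching each segment.

```python
# A minimal sketch, assuming a REST-style control channel between the
# DASH client and the server-side controller; not the actual implementation.
from flask import Flask, jsonify, request

app = Flask(__name__)

def pick_bitrate(state):
    # Placeholder rule standing in for the learned policy.
    return 1400 if state.get("throughput_kbps", 0) > 2000 else 400

@app.route("/abr/decision", methods=["POST"])
def abr_decision():
    # Client reports its current playback/radio state before each segment.
    state = request.get_json()  # e.g. {"throughput_kbps": ..., "buffer_s": ...}
    # The server-side policy (e.g. the trained model) picks the next bitrate.
    return jsonify({"next_bitrate_kbps": pick_bitrate(state)})

if __name__ == "__main__":
    app.run(port=8080)
```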