
Verizon said its 5G lab recently built and tested an independent GPU-based orchestration system and developed enterprise mobility functionalities meant to revolutionise mobility for virtual reality (VR), mixed reality (MR), augmented reality (AR), and cinematic reality (CR). Together, these functionalities may pave the way for a new class of mobile cloud services, provide a platform for developing ultra-low-latency cloud gaming, and enable the development of scalable GPU cloud-based services, the company said in a press release.
GPU-based orchestration system
Verizon's team developed a prototype that uses GPU slicing and virtualisation management to support any GPU-based service and allow multiple user loads and tenants to share the same hardware. Verizon engineers successfully tested the technology in proof-of-concept trials on a live network in Houston, combining the newly developed GPU orchestration system with edge services.
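The press release does not describe how the slicing is implemented, but the general idea behind GPU slicing for multi-tenant orchestration can be sketched roughly as follows. This is a minimal, hypothetical illustration in Python; the class names, slice sizes, and first-free allocation policy are assumptions made for the example, not Verizon's design.

```python
# Hypothetical sketch of GPU slicing for multi-tenant orchestration.
# All names and sizes are illustrative only, not Verizon's published design.
from dataclasses import dataclass, field


@dataclass
class GpuSlice:
    """A fixed share of one physical GPU handed to a single tenant."""
    slice_id: str
    memory_gb: int
    tenant: str | None = None


@dataclass
class GpuNode:
    """One physical GPU split into equally sized slices."""
    node_id: str
    total_memory_gb: int
    slice_memory_gb: int
    slices: list[GpuSlice] = field(default_factory=list)

    def __post_init__(self) -> None:
        count = self.total_memory_gb // self.slice_memory_gb
        self.slices = [
            GpuSlice(f"{self.node_id}-s{i}", self.slice_memory_gb)
            for i in range(count)
        ]

    def allocate(self, tenant: str) -> GpuSlice:
        """Hand the first free slice to a tenant; raise if the GPU is full."""
        for s in self.slices:
            if s.tenant is None:
                s.tenant = tenant
                return s
        raise RuntimeError(f"no free slices on {self.node_id}")

    def release(self, slice_id: str) -> None:
        """Return a slice to the free pool so another tenant can use it."""
        for s in self.slices:
            if s.slice_id == slice_id:
                s.tenant = None
                return


if __name__ == "__main__":
    # One 40 GB GPU carved into four 10 GB slices, shared by three workloads.
    node = GpuNode("edge-gpu-0", total_memory_gb=40, slice_memory_gb=10)
    for tenant in ("vr-app", "cloud-gaming", "transcoder"):
        granted = node.allocate(tenant)
        print(f"{tenant} -> {granted.slice_id} ({granted.memory_gb} GB)")
```

The point of the sketch is simply that slicing lets several tenants draw on one physical GPU without interfering with each other's allocations, which is what makes the orchestration layer multi-tenant.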
Edge application functionalities
To assist developers in creating these new applications and products, Verizon's team developed a suite of edge functionalities. These functionalities, similar in nature to APIs (application programming interfaces), describe processes that developers can use to build an application without writing additional code. This eases the burden on developers and also creates more consistency across apps. Building on this technology, the team created eight services for developers to use when creating applications and products for 5G Edge technology: 2D computer vision, XR lighting, split rendering, real-time ray tracing, spatial audio, 3D computer vision, real-time transcoding, and asset caching.
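For a sense of what consuming one of these services might look like from a developer's side, the sketch below shows a client calling a split-rendering service over an API-style interface. It is purely illustrative: the endpoint URL, request fields, and response are invented for the example and are not Verizon's published interface.

```python
# Purely illustrative sketch of calling an edge service exposed as an API.
# The endpoint, payload, and response handling are hypothetical assumptions.
import requests

EDGE_BASE_URL = "https://edge.example.com/v1"  # placeholder, not a real endpoint


def request_split_render(scene_id: str, device_pose: dict,
                         timeout_s: float = 0.05) -> bytes:
    """Ask a (hypothetical) split-rendering service to render the heavy part of
    a frame on an edge GPU and return the encoded result for the device to
    composite locally."""
    resp = requests.post(
        f"{EDGE_BASE_URL}/split-rendering/frames",
        json={"scene_id": scene_id, "pose": device_pose},
        timeout=timeout_s,
    )
    resp.raise_for_status()
    return resp.content


if __name__ == "__main__":
    # Shown for structure only; this call needs a real endpoint to succeed.
    frame = request_split_render(
        scene_id="demo-scene",
        device_pose={"position": [0.0, 1.6, 0.0],
                     "rotation": [0.0, 0.0, 0.0, 1.0]},
    )
    print(f"received {len(frame)} bytes of encoded frame data")
```

The appeal of packaging edge capabilities this way is that the application developer only supplies inputs such as a scene and a device pose, while the GPU-heavy work runs on the network edge behind the service boundary.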