Antonio Arcidiacono, EBU Director of Technology & Innovation
A recent article on the Financial Times website noted that, on an evening in early December, UK internet traffic had spiked at 25.5 terabits of data per second. The spike, the result of Amazon Prime live-streaming six football matches simultaneously, stretched the national infrastructure to the limit. While the networks coped with the spike, the writer went on to put this milestone in the context of the growing hype around the metaverse.
“A fully working virtual world, or even just a real-time, high-definition immersive experience, will require far, far more capacity to transmit data between the consumer and network than is currently available in homes around the world.”
This raises the question of what it will take to enable the metaverse in reality and – most importantly – who will pay for the infrastructure. Providing a truly immersive VR experience would require images going from 8K to the equivalent of 24K resolution, corresponding to a guaranteed end-user bandwidth – assuming a unicast fibre connection – of the order of 1 Gbit/s, with the access network enhanced to 25-50 Gbit/s.
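To see where figures of this order come from, consider a back-of-envelope estimate; the frame rate, bit depth, compression ratio and "24K" dimensions below are illustrative assumptions, not figures from the article:

```python
# Back-of-envelope bitrate estimate for a compressed video stream.
# All parameters are illustrative assumptions.

def video_bandwidth_gbps(width, height, fps=60, bits_per_pixel=24,
                         compression_ratio=200):
    """Approximate delivered bitrate in Gbit/s."""
    raw_bits_per_second = width * height * fps * bits_per_pixel
    return raw_bits_per_second / compression_ratio / 1e9

# 8K (7680x4320) versus a hypothetical "24K" canvas,
# taken here as 3x the linear resolution of 8K.
for label, (w, h) in {"8K": (7680, 4320), "24K": (23040, 12960)}.items():
    print(f"{label}: ~{video_bandwidth_gbps(w, h):.2f} Gbit/s")
```

Under these assumptions an 8K stream lands at roughly 0.24 Gbit/s, while the 24K canvas comes in at just over 2 Gbit/s – the order of magnitude cited above.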
It is often surprising to learn how little consideration is given to the question of how such services and experiences will be delivered to the general public. A few years ago we visited the 3D production facilities at Intel Studios in Los Angeles. They were very proud of what they could produce in terms of an immersive experience, but were also worried that they would need gigabits per second for the connection to each individual user – something that could not scale to tens or hundreds of millions of users. I immediately commented that they were ignoring the capabilities of broadcast/multicast delivery, if and when combined with low-latency unicast connectivity. They smiled and nodded politely, but were probably asking themselves: why is he talking about broadcasting?!
We should recall that users seek an experience that is both personalized and shared. While content must be tailored to individual users, the shared experience relies on simultaneous use of the same media elements by many users.
Broadcast the background
An optimal user experience can only be ensured if delivery requirements are taken into account at the design stage of new applications, and if production strategies can combine and synchronize data delivered over different IP infrastructures. The metaverse would thus operate on the basis of background content transmitted via broadcast/multicast, with unicast connectivity used to deliver the interactive, personalized content demanded by each individual user's current actions.
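As a minimal sketch of how this split could look on the client side – the multicast group, port and URL below are hypothetical placeholders, not part of any EBU specification – a receiver might join a multicast group for the shared background while fetching its personalized layer over unicast:

```python
import socket
import struct
import urllib.request

MCAST_GRP = "239.1.2.3"    # hypothetical multicast group for the shared background
MCAST_PORT = 5004          # hypothetical port
PERSONAL_URL = "https://example.org/personal-layer"  # hypothetical per-user endpoint

# Join the multicast group carrying the shared background content.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", MCAST_PORT))
mreq = struct.pack("4sl", socket.inet_aton(MCAST_GRP), socket.INADDR_ANY)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

# One and the same datagram serves every user joined to the group
# (this call blocks until the broadcast feed delivers a packet).
background_packet, _ = sock.recvfrom(65536)

# Only the thin, personalized layer travels over a per-user unicast connection.
with urllib.request.urlopen(PERSONAL_URL) as resp:
    personal_layer = resp.read()

# A real client would now synchronize and composite the two layers for rendering.
```

The point of the split is that the heavy background consumes network capacity once, however many users are watching, while unicast capacity is spent only on what is unique to each user.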
In reality, the combination of broadcast and unicast is entirely feasible, even though they are based on different network topologies and infrastructures. A further key element is the use of intelligence at the smart edges of the network. If properly conceived, the combination of different types of networks and network elements into a multilayer infrastructure – combining unicast, broadcast/multicast and local smart edges with AI functionalities – will provide a unique set of tools to drastically improve the sustainability of the future metaverse.
All of this is possible using infrastructures and technologies that are already available, from fibre connectivity and cellular networks to terrestrial and satellite broadcast networks. The latter, for example, are today capable of delivering several gigabits per second of content, in native UDP format, direct to the edge – reaching millions of people with a single transmission. This is part of what we are preparing in the EBU 5G-EMERGE project.
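To make the role of such a smart edge concrete, here is a toy sketch – the port, the segment framing and the local HTTP interface are invented for illustration and do not describe the 5G-EMERGE design – of an edge node that caches segments arriving on the broadcast feed and serves them to local users over ordinary unicast:

```python
import http.server
import socket
import threading

SEGMENT_STORE = {}  # segment name -> bytes, filled from the broadcast feed

def ingest_broadcast(port=5004):
    """Cache UDP datagrams arriving from the broadcast/satellite feed.

    Assumes a toy framing: a 'name' header line followed by the payload."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    while True:
        datagram, _ = sock.recvfrom(65536)
        name, _, payload = datagram.partition(b"\n")
        SEGMENT_STORE[name.decode()] = payload

class EdgeHandler(http.server.BaseHTTPRequestHandler):
    """Serve cached segments to local clients over unicast HTTP."""

    def do_GET(self):
        payload = SEGMENT_STORE.get(self.path.lstrip("/"))
        if payload is None:
            self.send_error(404, "segment not yet received")
            return
        self.send_response(200)
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

# Ingest the broadcast feed in the background; serve local users in the foreground.
threading.Thread(target=ingest_broadcast, daemon=True).start()
http.server.HTTPServer(("", 8080), EdgeHandler).serve_forever()
```

However the feed reaches the edge – satellite, terrestrial broadcast or multicast over fibre – the content crosses the wide-area network once and is then distributed locally.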
Returning to the question of who should pay, if the metaverse – whatever it turns out to be – is to become a reality, it is essential to develop an infrastructure whose cost can be sustainably shared between those who build the metaverse, the telcos and the end users.
It goes without saying that there are also difficult questions around the environmental sustainability of ubiquitous virtual worlds. The use of a multi-layered infrastructure, with its inherent efficiencies, can be one essential part of the answer, preparing us to face the implications of an ever more data-hungry world.
This article was first published in issue 51 of tech-i magazine.