6G NeXt – A glimpse into the connected world of tomorrow

6G proposes a globally connected, intelligent system in which data transmission, data processing, and AI are first-class citizens, going far beyond traditional data transmission networks to unlock entirely new use cases and lay the foundation for widespread XR adoption.

Over the past three years, Fraunhofer FOKUS has been working with partners in the “6G Native Extensions for XR Technologies” (6G NeXt) project to prove that the next generation of mobile communications will be defined by the convergence of connectivity, computing power, and artificial intelligence. Within the project, our team at the business unit FAME has developed three key solutions to explore novel distributed immersive experiences.

  1. FAMIUM Remote Rendering Framework: Our remote rendering framework is built for split computing and allows compute-intensive 3D applications to be rendered in the cloud, on the network edge, or in any split configuration. It is device-agnostic and allows interaction with photorealistic 3D environments on end devices without graphical compute capabilities. Complex simulation environments can be rendered in the cloud, augmented with data from edge nodes, and displayed on handheld devices or head-mounted displays, offering a platform for industrial and consumer applications alike.
  2. Real-time 3D Head Reconstruction: To enable immersive telepresence with always up-to-date avatars, we developed a real-time 3D head reconstruction pipeline based entirely on commodity hardware. Captured with a 2D camera, the user’s head is reconstructed as a 3D mesh, textured, and streamed into any 3D environment driven by Unity, Unreal Engine, or any other 3D engine. Reconstructed movement sequences can be recorded, stored, and reused as static assets. The real-time head reconstruction is fully compatible with the remote rendering framework.
  3. 5G Media Testbed: To validate developments in real-world settings, we built our 5G Media Testbed, a mobile, configurable 5G network. Applications integrated into the remote rendering framework can be deployed along the cloud-edge continuum and executed under varying network conditions to test real-world feasibility.

Component architecture of the FAMIUM remote rendering framework

Together with our 6G NeXt partners, T-Labs, DFKI, TU Berlin, Volucap, TU Ilmenau, TH Wildau, LogicWay, IDRF, Schönhagen Airfield, and our associated partner Nvidia, we successfully hosted this project’s closing event in September 2025 at Schönhagen Airfield. There, we validated a backbone architecture developed in the project using two challenging use cases: 3D video communication and a live anti-collision system for mixed air traffic consisting of manned and unmanned aircraft.

The challenge: overcoming the limitations of static infrastructures

Conventional network architectures reach their limits with applications such as real-time 3D communication or autonomous drone swarms. Today’s networks are often still structured as passive data lines whose main function is routing data from one place to another. Complex applications, however, benefit from systems that behave proactively, moving away from static configurations toward a flexible, AI-orchestrated architecture that dynamically provides computing power where it is needed.

The complexity of the project was characterized by the interplay of seven dimensions: network technology, 3D reconstruction, aviation, cloud computing, split computing, immersive technologies, and quality of service. To overcome these challenges and build a system where networking and computation converge, close collaboration within an interdisciplinary consortium was essential.

The solution landscape

To cover all seven dimensions, work in the project was aligned along the following topic groups:

Network & cloud infrastructure: To support immersive and real-time applications, we developed a high-performance backbone architecture. This architecture enables distributed computing and network resources to be orchestrated in real time, guaranteeing that applications are always provided with the best possible resource configuration.

3D & Immersion: At the core of the project, we developed the essential software systems for delivering immersive real-time content, including the creation of a range of device-independent immersive technologies.

Aviation: Supported by the IDRF and Schönhagen Airfield as a test area, we developed a novel system for central collision avoidance of unmanned aerial vehicles.

QoS/QoE: We ensured that our technical developments were accompanied by a holistic QoS/QoE concept from the ground up, ensuring real-world usability of all developed systems.

The three major fields of innovation

  1. High-performance Backbone

5G is characterized above all by a new software and protocol stack and by new frequency ranges that enable higher throughput. A notable 5G use case is campus networks: private, localized high-performance networks for organizations. 6G, on the other hand, breaks down the separation between infrastructure and application and treats communication and computing resources as a single entity. Compared to 5G, the network is no longer a pure data transmission medium but a holistic system that dynamically distributes and orchestrates computing power across the entire network. The network no longer just passes data through to the other end; it can decide for itself what the optimal execution configuration is for each application.
To this end, the high-performance 6G backbone layer was developed in 6G NeXt. It forms a middleware between applications and resources and is characterized by dynamic workload optimization, continuous metric collection, and hardware independence. Network and computing resources, as well as applications, register their capabilities and requirements in the backbone layer. When an application is to be executed, the backbone layer evaluates the application and the available computing resources. Computing requirements (minimum performance), hardware requirements (GPU, screen type, etc.), divisibility, and QoS/QoE guarantees are all taken into account in the decision. The backbone layer then decides which part of the application will be executed where, to achieve optimal QoS/QoE.
Through metric collection, the system is continuously optimized by recognizing patterns in past executions and incorporating them into future decisions.
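The decision step described above can be sketched as a constraint filter followed by a latency-minimizing ranking. The data model and field names below are illustrative assumptions, not the actual 6G NeXt interfaces:

```python
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class Resource:
    """A compute node registered with the backbone layer (hypothetical model)."""
    name: str
    gflops: float      # available compute budget
    has_gpu: bool
    rtt_ms: float      # measured round-trip time to the client

@dataclass
class Workload:
    """An application part with its registered requirements."""
    name: str
    min_gflops: float  # computing requirement (minimum performance)
    needs_gpu: bool    # hardware requirement
    max_rtt_ms: float  # QoS guarantee: latency budget

def place(workload: Workload, resources: list[Resource]) -> Resource | None:
    """Filter out nodes that violate a hard requirement, then pick the
    lowest-latency survivor -- a stand-in for the backbone's decision."""
    eligible = [
        r for r in resources
        if r.gflops >= workload.min_gflops
        and (r.has_gpu or not workload.needs_gpu)
        and r.rtt_ms <= workload.max_rtt_ms
    ]
    return min(eligible, key=lambda r: r.rtt_ms, default=None)

nodes = [
    Resource("cloud-gpu", gflops=5000, has_gpu=True, rtt_ms=45),
    Resource("edge-gpu", gflops=800, has_gpu=True, rtt_ms=8),
    Resource("edge-cpu", gflops=200, has_gpu=False, rtt_ms=6),
]
renderer = Workload("vr-renderer", min_gflops=500, needs_gpu=True, max_rtt_ms=20)
print(place(renderer, nodes).name)  # edge-gpu
```

In the real system the ranking additionally weighs divisibility and learned patterns from past executions; the sketch keeps only the hard-constraint and latency dimensions.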

As part of the development activities on the high-performance backbone, Fraunhofer FOKUS evaluated and published its findings on the latencies introduced for applications deployed through the FAMIUM remote rendering framework. We conducted tests for both click-to-photon (CTP) and motion-to-photon (MTP) latency, measuring the delay between a click or head movement, respectively, and the first visual change on the display device. For a fully remotely rendered VR application, the remote rendering framework achieved MTP latencies as low as 30 ms, averaging about 40 ms. CTP latencies were browser-dependent, averaging about 60 ms. The findings validated the usability of the FAMIUM remote rendering framework for both flat-screen displays and head-mounted displays.
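For readers reproducing such measurements, MTP latency reduces to simple timestamp arithmetic once movement and display events share a clock. The sample values below are made up for illustration; only their rough range mirrors the published results:

```python
# Hypothetical (movement_time, frame_displayed_time) pairs in milliseconds,
# recorded against a shared clock; the values are illustrative only.
samples = [(0.0, 31.0), (100.0, 142.0), (200.0, 238.0), (300.0, 345.0)]

# MTP latency per sample: time from head movement to the updated frame.
latencies = [shown - moved for moved, shown in samples]
print(f"min MTP: {min(latencies):.0f} ms, "
      f"avg MTP: {sum(latencies) / len(latencies):.0f} ms")
# min MTP: 31 ms, avg MTP: 39 ms
```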

  2. HOLOCOM: Immersive communication that runs everywhere

3D video communication has gained significant popularity in recent years. From Meta’s codec avatars to Nvidia’s “AI-mediated 3D video conferencing,” Apple’s Personas, and Google Starline, every major tech company seemed to be exploring the topic of 3D video communication. But why are meetings still conducted in 2D?

Because each of the solutions mentioned either requires expensive hardware or is simply not publicly available.
With HOLOCOM, we developed a device-agnostic, modular 3D video communication system based on standard hardware. People are recorded with a 2D camera, such as a cell phone camera or a standard webcam. The resulting video stream is converted into a 3D format and played on a screen-agnostic player designed for both classic 2D and 3D screens. The entire processing pipeline is modular and connected through network communication protocols, allowing it to be efficiently distributed by the underlying backbone layer.

The HOLOCOM system includes a central component storing user data, friend lists, a user’s online status, and other social information. When a call is made, the backbone layer assesses the location and device capabilities of both the caller and the callee, and instantiates all necessary software components where they will lead to the best possible quality of experience.

A dedicated post on the HOLOCOM system will follow soon on this blog, providing detail on individual components and diving deeper into the 3D reconstruction and data transmission.
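The call-setup step can be sketched as follows; the capability fields and component names are assumptions for illustration, not the actual HOLOCOM data model:

```python
# Illustrative participant capability records; field names are assumptions.
participants = [
    {"name": "alice", "region": "eu-west",    "display": "2d"},
    {"name": "bob",   "region": "eu-central", "display": "3d"},
]

def plan_call(participants: list) -> list:
    """Sketch of call setup: place each user's 3D reconstruction on an
    edge node near their camera uplink, and pick a player component that
    matches their display type (2D screen vs. 3D-capable screen)."""
    return [
        {
            "user": p["name"],
            "reconstruction": f"edge/{p['region']}",  # near the uplink
            "player": f"{p['display']}-player",       # screen-agnostic playout
        }
        for p in participants
    ]

for step in plan_call(participants):
    print(step)
```

In the real system this placement is delegated to the backbone layer, which also weighs current network metrics rather than region labels alone.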

Live demonstration of the 3D head reconstruction pipeline

HOLOCOM and a selection of other immersive tech solutions developed by our team will be showcased at this year’s Media Web Symposium at Fraunhofer FOKUS.

  3. Smart drones: safety first in mixed airspace

Mixed traffic at airports is a complex, safety-critical issue. Combining classic surveillance technologies with autonomous vehicles enables multidimensional safety concepts, but the presence of unmanned aircraft also poses a significant collision, and therefore safety, risk.

In 6G NeXt, this problem was addressed with a central anti-collision system. Manned and unmanned aircraft continuously report their positions to a central coordinator in real time. Based on this information, the coordinator calculates probable flight trajectories, the probability of a collision, and possible evasive maneuvers. When a collision becomes likely, the evasive maneuvers are transmitted to the affected unmanned aircraft, which execute them immediately.
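A minimal version of the coordinator's collision check is a closest-point-of-approach (CPA) computation under a straight-line-trajectory assumption, a deliberate simplification of the project's actual trajectory prediction:

```python
import math

def cpa(p1, v1, p2, v2):
    """Time (s) and distance (m) of closest approach for two aircraft on
    straight-line tracks, given position (m) and velocity (m/s) vectors.
    Simplification: real coordinators predict curved trajectories."""
    dp = [a - b for a, b in zip(p1, p2)]   # relative position
    dv = [a - b for a, b in zip(v1, v2)]   # relative velocity
    dv2 = sum(c * c for c in dv)
    # Time at which |dp + t*dv| is minimal, clamped to the future.
    t = 0.0 if dv2 == 0 else max(0.0, -sum(a * b for a, b in zip(dp, dv)) / dv2)
    closest = [a + t * b for a, b in zip(dp, dv)]
    return t, math.dist(closest, (0.0, 0.0, 0.0))

# Drone heading east meets a manned aircraft heading west on the same line:
t, d = cpa((0, 0, 100), (20, 0, 0), (1000, 0, 100), (-30, 0, 0))
print(f"closest approach in {t:.0f} s at {d:.0f} m")  # in 20 s at 0 m
```

If the predicted minimum distance falls below a separation threshold, the coordinator would transmit an evasive maneuver to the affected unmanned aircraft.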

Custom built Manta-X drone used to test 6G NeXt’s anti-collision system

A project with global appeal

6G NeXt was more than just a research project. It was an active dialogue with the international expert community. Over the three-year period, we not only validated our results, but also actively contributed them to the global 6G discussion.

Immersive media & video technology: We presented the HOLOCOM use case and our developments in the field of edge-based immersive technologies at the FOKUS Media Web Symposium in Berlin, the NAB Show in Las Vegas, and the ACM MILE High Video Conference in Denver. The Hannover Messe and the 6G Conference in Berlin served as yearly stages to present regular development progress.

6G network technologies & standardization: We presented the high-performance backbone layer at leading conferences such as IEEE Globecom in Cape Town, the EuCNC & 6G Summit (Gothenburg and Antwerp), and the Berlin 6G Conference. Our reach extended far beyond Europe: with workshops and summits in Tokyo (Japan) and multiple locations in the United States, including Austin, we ensured that German 6G research also gained international attention.

Aviation & drone technology: Our partners presented our approaches to AI-based collision avoidance at AERO Friedrichshafen (2023, 2024, 2025), Europe’s largest trade fair for general aviation, among other events. We also discussed how 6G can revolutionize airspace safety at the German Aerospace Congress and the Winter Satellite Workshop in Espoo.

Conclusion

3 years, 23 papers, 1 patent. These figures mark the end of three years of intensive research. 6G NeXt has proven that intelligent, AI-supported orchestration of network and computing power is no longer a distant theory, but technically feasible and necessary. With the successful validation in Schönhagen and the broad international response, we have set an important milestone. We would like to thank all our partners for their exceptional cooperation and look forward to continuing on the path to a connected future together.

Fraunhofer FOKUS is committed to exploring the integration of AI in immersive media far beyond the 6G NeXt project by engineering specialized tools within our remote rendering framework that bridge the gap between AI and immersive media, unlocking seamless, AI-driven control over 3D environments.

If you want to read more on Fraunhofer FOKUS’ 5G and 6G Media activities, check out our 5G Media Testbed. To experience our work firsthand, including AI-driven immersive experiences and the latest developments from across our business unit, join us at the 13th FOKUS Media Web Symposium, or simply reach out to us today!
