 |
{{id name="editors_note"/}}The 2026 edition of our multi-vendor interoperability test has been a massive undertaking by the **12 leading vendors participating this year**: Arista Networks, Calnex Solutions, Ciena, Cisco Systems, Ericsson, HPE, Keysight, Microchip, Nokia, Raisecom, Ribbon, and ZTE, together with the EANTC engineering team. Jointly, we spent more than 1,600 person days (equivalent to seven person years) designing and implementing test cases with 56 device types, creating more than 1,300 result datasets. More than three tons of equipment were moved across the globe to facilitate this undertaking and are shown live in Paris.
 |
These use case scenarios involved almost all participating vendors, creating realistic architectures that can serve as **blueprints**, guiding operators towards vendor-independent network design. Though very demanding and time-consuming, the use case tests rewarded us with strong results. They are the foundation for the live demos shown at the Upperside Congress in Paris this year, are well documented, and will be expanded next year.
Of course, we must not forget about AI these days. Our tests included both relevant sub-topics this year:
**AI-enabled networking** is governed by standardized provisioning (via PCE, YANG models, and BGP-SR — check), extensive telemetry data (via BGP-LS and TWAMP — check), and automated optimization checks (via digital twins — check). The vendors participating in this test area are on a steady path; that said, it is still a long way to multi-vendor Autonomous Networks. Today, partial automation of specific service aspects in SR and EVPN is possible; it is important to require standardized methods, specified in detail, in RfPs.
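The closed loop behind such automated optimization checks — telemetry in, a check against a service objective, a provisioning action out — can be sketched in a few lines. This is a purely illustrative sketch, not EANTC's tooling or any vendor's implementation; the link names, delay values, and budget are all hypothetical assumptions:

```python
# Illustrative sketch only: a digital-twin-style check that compares
# telemetry-derived per-link one-way delays (as TWAMP would measure them)
# against a path's delay budget. In a real deployment, a violation would
# trigger re-optimization via PCE; here we just report it.
# All link names, delays, and the budget below are hypothetical.

PATH_DELAY_BUDGET_US = 500  # assumed end-to-end one-way delay budget

def check_path(path_links, measured_delay_us):
    """Return (total_delay_us, within_budget) for a candidate path."""
    total = sum(measured_delay_us[link] for link in path_links)
    return total, total <= PATH_DELAY_BUDGET_US

# Hypothetical telemetry snapshot: per-link one-way delay in microseconds.
delays = {"A-B": 120, "B-C": 310, "A-D": 90, "D-C": 150}

primary = ["A-B", "B-C"]    # 120 + 310 = 430 us
alternate = ["A-D", "D-C"]  # 90 + 150 = 240 us

for name, path in (("primary", primary), ("alternate", alternate)):
    total, ok = check_path(path, delays)
    print(f"{name}: {total} us, within budget: {ok}")
```

The point of the sketch is only the loop structure: standardized telemetry feeds a reproducible check, and the check's outcome is what a standardized provisioning interface would act on.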
**Networking for AI workloads** is a topic we wanted to cover more intensively, but it was too early. The next generation of data center transport has been defined by the Ultra Ethernet Consortium (UEC). The implementations naturally take time to mature because they require major hardware innovations. We covered only a small aspect this time and plan to expand the UEC integration next year.
This 16-page report is only a very short version of all results. Please check the QR codes for many more test results. We hope our joint effort is beneficial for WAN, mobile x-haul, and data center network architects!
If you have detailed questions or suggestions for next year's test coverage, or would like to tap our brains for an individual network design, please contact us.
=== EANTC's Mission ===
 |
Preparations for the EANTC Transport & Cloud Networks Interop Test 2026 began in September 2025 with a technical call involving all vendors interested in participating. This initial discussion covered the overall event structure, followed by dedicated technical calls for each test area, led by the EANTC team alongside vendor experts.
During these sessions, potential test cases were identified and refined, with vendors contributing new ideas and draft cases from their teams. The focus remained on exploring innovative testing approaches and ensuring alignment with the latest industry standards.
The Hot-Staging took place in Berlin over three weeks. During the first week, engineers arrived to install devices, with the latest hardware shipped from around the world. From January 26 to February 6, more than 85 engineers participated in intensive on-site testing. Detailed discussions and rapid problem-solving during this period resulted in 1,839 validated outcomes and the preparation of the live demos for the Upperside World Congress in Paris.
=== Interoperability Test Results ===
 |
EANTC engineers closely supported and validated every test combination, following strict procedures and predefined steps. The resulting report presents only results that were consistently logged, submitted, and verified by EANTC specialists, ensuring accuracy and preventing misinterpretations or false positives.
Because our focus is on multi-vendor testing, single-vendor cases are generally excluded. An exception is made if a previously validated multi-vendor test fails during hot staging, leaving only one vendor with a working, standards-compliant implementation. In such situations, EANTC acknowledges that vendor's effort and includes the result in the report.
This test report highlights successful test combinations, clearly identifying the participating vendors and devices. “Tested” in this context refers specifically to multi-vendor interoperability. Combinations that did not pass are not shown in the diagrams but are mentioned anonymously to provide insight into the industry's current state. Maintaining confidentiality is essential to encourage vendors to present their latest, often still in beta, solutions, creating a safe environment for testing, learning, and advancing network interoperability.
The test results will be presented live at the Upperside World Congress (previously the “MPLS World Congress”) in Paris, March 24–26. For 22 years, EANTC has showcased its interoperability testing at Upperside conferences, highlighting the latest advances in network technologies.
(% id="prev-next-links" %) |