Deutsche Telekom Industrial AI Cloud wins SOOFI contract


Industrial AI Cloud win advances sovereign European LLM

Deutsche Telekom’s T-Systems has secured a multi-million-euro contract from Leibniz University Hannover to power SOOFI, a flagship initiative to build a 100-billion-parameter, European-operated large language model.

SOOFI 100B model on EU-sovereign compute

The SOOFI (Sovereign Open Source Foundation Models) project will train a next-generation, open-source LLM focused on European languages and industrial requirements, replacing the current 7-billion-parameter Teuken7B with a model roughly fourteen times larger. T-Systems will host and operate the training environment in its new Industrial AI Cloud—an NVIDIA-powered facility that DT and NVIDIA unveiled as part of a €1 billion partnership. The deployment underscores a pivot toward EU-based compute, data residency, and governance for AI development.


Leibniz-led consortium with T-Systems and NVIDIA

Leibniz University Hannover is leading the effort with scientists from six German research institutions and two start-ups. The initiative is funded by Germany’s Federal Ministry for Economic Affairs and Energy (BMWK) and supported by the Fraunhofer Society, the German AI Association, and the Center for Sovereign AI (CESAI). T-Systems provides the AI factory; NVIDIA supplies the GPU systems; and DT brings telco-grade connectivity and operations discipline.

EU AI Act, data residency, and vendor-neutral options

European enterprises need high-quality, multilingual models that can be trained, tuned, and operated under EU rules and values—especially as the EU AI Act and sector regulations take shape. Sovereign, open foundation models offer an alternative to US and Chinese stacks, reduce vendor lock-in, and allow tighter control over data pipelines, safety methods, and lineage—critical for regulated industries and public sector deployments.

Inside DT’s NVIDIA-powered Industrial AI Cloud

T-Systems is positioning the platform as one of Europe’s largest AI factories with telco-grade connectivity, security, and resiliency.

10,000+ GPUs, 0.5 exaFLOPS, 20 PB storage

The facility aggregates more than 10,000 GPUs, delivers about 0.5 exaFLOPS of compute, and integrates roughly 20 petabytes of storage to handle large-scale pretraining and fine-tuning. Four 400 Gbps fiber links feed the data center to minimize bottlenecks across ingestion, checkpointing, and distributed training. From March 2026, approximately 130 NVIDIA DGX B200 systems—exceeding 1,000 GPUs—will be dedicated to SOOFI, ensuring predictable capacity for long training runs.
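The published figures are internally consistent, which a quick back-of-envelope check shows; the only assumption is NVIDIA's published spec of 8 Blackwell GPUs per DGX B200 chassis:

```python
# Sanity-check the published Industrial AI Cloud figures.
# Assumption: each NVIDIA DGX B200 chassis carries 8 B200 GPUs
# (per NVIDIA's published specification).

GPUS_PER_DGX_B200 = 8
dgx_systems = 130
soofi_gpus = dgx_systems * GPUS_PER_DGX_B200
print(f"SOOFI allocation: {soofi_gpus} GPUs")  # 1040 GPUs, i.e. "exceeding 1,000"

# Aggregate capacity of the four 400 Gbps fiber links feeding the site
links, gbps_per_link = 4, 400
aggregate_tbps = links * gbps_per_link / 1000
print(f"Aggregate fiber capacity: {aggregate_tbps} Tbit/s")  # 1.6 Tbit/s
```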

GDPR-aligned security, EU data residency, controlled access

Operations remain on German soil with a stack engineered for strict data protection, security, and reliability—keys for GDPR compliance and alignment with emerging EU AI governance. The setup supports EU data residency, controlled access to training artifacts, and regulated transfer paths—capabilities enterprises will need for auditability and risk management.

DGX B200 capacity and phased LLM training roadmap

The platform is live and scaling, with a discrete SOOFI allocation starting in 2026 to lock in GPU supply. Expect phased milestones: corpus curation and synthetic data pipelines; multi-lingual tokenizer and pretraining; alignment and evaluation; then domain-tuned variants for industry, SMEs, and public services.

SOOFI: building a 100B open-source European LLM

SOOFI is designed to form the backbone for an open, trustworthy, and technologically independent European AI ecosystem.

100B multilingual LLM for industry, SMEs, and public sector

The program targets a 100B-parameter multilingual LLM trained and operated entirely in Europe, with performance optimized for industrial tasks, public sector workflows, and SME affordability. It aims to raise European benchmarks in safety tooling, evaluation frameworks, and instruction-following for languages underrepresented in global models.

BMWK funding, transparent governance, open research

Backed by BMWK and guided by organizations such as Fraunhofer, the German AI Association, and CESAI, SOOFI blends academic rigor with industry pull. Governance priorities include transparency, traceable data lineage, and open research practices that foster trust and community contribution without compromising security.

Open licensing to power domain adapters and tooling

Open licensing is expected to catalyze a commercial ecosystem of adapters, domain packs, and tooling. Manufacturers, utilities, healthcare providers, and city administrations can benefit from localized models that align with EU standards, reduce legal friction, and support vendor-neutral integration into existing IT and OT systems.

What sovereign LLMs mean for telcos, edge, and enterprise IT

The deal signals how sovereign AI and telco-grade cloud infrastructure will converge across networks, edge, and enterprise workloads.

Telco-grade copilots, 5G edge inference, factory AI

For operators, a sovereign LLM provides a backbone for NOC automation, customer care copilots, and intent-based networking—while meeting data residency needs. Paired with 5G and edge footprints, telcos can deliver low-latency inference for industrial automation, visual inspection, and safety monitoring on factory floors and campuses.

Multilingual RAG, code generation, secure enterprise copilots

European-trained models should better handle technical German, French, Italian, and other EU languages in service manuals, engineering logs, and compliance documents. Expect improvements in code generation for industrial systems, multilingual RAG over proprietary data, and copilots that respect enterprise control planes and security policies.
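The multilingual RAG pattern mentioned above can be sketched in a few lines. This is a hypothetical illustration, not SOOFI's API: the `embed()` stub below uses bag-of-words counts so the sketch runs standalone, whereas a production system would use a multilingual embedding model and a vector database.

```python
# Minimal sketch of multilingual retrieval-augmented generation (RAG).
# embed() is a placeholder: bag-of-words counts stand in for dense embeddings.
import math
from collections import Counter

def embed(text: str) -> Counter:
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    q = embed(query)
    return sorted(documents, key=lambda d: -cosine(embed(d), q))[:k]

docs = [
    "Wartungshandbuch: Schmierintervalle der Spindel alle 500 Betriebsstunden.",
    "Safety notice: lockout-tagout required before panel access.",
    "Manuel d'entretien: vidange hydraulique toutes les 1000 heures.",
]
context = retrieve("Spindel Schmierintervalle Wartung", docs)
# The retrieved German maintenance passage is prepended to the prompt so the
# model answers from enterprise data rather than from pretraining alone.
prompt = "Answer using only this context:\n" + "\n".join(context) + "\nQ: ..."
```

The value for regulated enterprises is that the retrieval corpus, not the model weights, carries the proprietary data, so documents never leave EU-resident storage.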

Competing with hyperscalers on sovereignty and GPU access

This move places DT and T-Systems in direct competition with hyperscalers’ AI offerings by emphasizing sovereignty, open models, and predictable GPU access. For buyers, it broadens choice and strengthens negotiating power while reducing exposure to extraterritorial data regimes.

Risks to track, buyer questions, and 2025–2026 milestones

Execution at 100B scale is non-trivial and requires careful planning across compute, power, data quality, and evaluation.
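A rough estimate shows why. Using the widely cited approximation that training compute is about 6 × parameters × tokens, a 100B-parameter run occupies the platform's headline capacity for months; the 2-trillion-token budget and 40% sustained utilization below are illustrative assumptions, not published SOOFI figures.

```python
# Back-of-envelope training-compute estimate using the common C ≈ 6·N·D
# approximation (FLOPs ≈ 6 × parameters × training tokens).
# Assumptions (illustrative): 2T training tokens, 40% sustained utilization.

params = 100e9                    # 100B parameters
tokens = 2e12                     # assumed 2 trillion training tokens
total_flops = 6 * params * tokens # ≈ 1.2e24 FLOPs

peak_flops_per_s = 0.5e18         # the platform's ~0.5 exaFLOPS
utilization = 0.40                # assumed sustained fraction of peak
seconds = total_flops / (peak_flops_per_s * utilization)
days = seconds / 86_400
print(f"~{days:.0f} days of wall-clock training")  # roughly 69 days
```

Even under these optimistic assumptions, a single run is a multi-month commitment of the dedicated DGX B200 allocation, which is why predictable GPU supply and training stability dominate the risk list.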

GPU, energy, training stability, safety, and community

Key risks include GPU supply and energy costs; training stability and efficiency at 100B parameters; multilingual quality across low-resource languages; reproducibility of safety evaluations; and sustained funding and community engagement beyond initial releases. Integrating with heterogeneous enterprise stacks and OT networks also remains complex.

SLAs, licensing, fine-tuning isolation, and deployment options

Clarify SLAs for model access, uptime, and incident response; data residency and isolation guarantees for fine-tuning; licensing terms for commercial use; roadmap for domain-tuned variants; integration with MLOps pipelines and vector databases; inference options across cloud, on-prem, and edge; and energy efficiency metrics per token trained and served.

Corpus transparency, benchmarks, partners, and DGX ramp

Milestones to watch include corpus composition and transparency; early checkpoints and evaluation results against European benchmarks; partnerships with ISVs and vertical specialists; availability of guardrails and auditing tools aligned with EU AI governance; and the ramp of NVIDIA DGX B200 capacity dedicated to SOOFI from March 2026.
