- Tech News & Insight
- October 20, 2025
- Hema Kadia
Arm and Meta have inked a multi-year partnership to scale AI efficiency from hyperscale data centers to on-device inference, aligning Arm’s performance-per-watt strengths with Meta’s AI software and infrastructure stack. Meta plans to run its ranking and recommendation workloads on Arm Neoverse-based data center platforms as part of an ongoing infrastructure expansion. The companies are co-optimizing AI software components—spanning compilers, libraries, and frameworks such as PyTorch, FBGEMM, vLLM, and the ExecuTorch runtime—so models can execute more efficiently on Arm CPUs in the cloud and on Arm-based devices at the edge. The work includes leveraging Arm’s KleidiAI optimizations to improve inference throughput and energy efficiency, with code contributions flowing back to open source.