CraftRigs / Glossary / LPDDR5X
Memory & Storage

LPDDR5X

Low-power, high-bandwidth RAM used as the unified memory substrate in Apple Silicon chips.

LPDDR5X (Low Power Double Data Rate 5X) is a mobile-class memory standard optimized for power efficiency. It's the physical memory used in current Apple Silicon chips — M4-generation Macs use LPDDR5X as their unified memory, shared by the CPU, GPU, and Neural Engine.

That "Low Power" tradeoff is real: LPDDR5X prioritizes energy efficiency over raw bandwidth. It's designed to deliver sustained performance in compact, thermally constrained systems — laptops, tablets, and small form factor devices — without overheating or rapidly draining the battery.

Bandwidth Numbers

LPDDR5X bandwidth in Apple Silicon varies by chip tier:

  • M4 (base): ~120 GB/s
  • M4 Pro: ~273 GB/s
  • M4 Max: ~546 GB/s
  • M3 Ultra (two M3 Max dies): ~819 GB/s

For comparison, GDDR6X in the RTX 4090 delivers 1,008 GB/s. The M4 Max is competitive with midrange discrete GPUs, while the M3 Ultra approaches RTX 4090 territory.
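To put the tiers in perspective, here's a quick back-of-envelope comparison using the approximate figures above (a rough sketch — these are peak theoretical bandwidths, not sustained measurements):

```python
# Approximate peak bandwidth figures (GB/s) from the list above.
bandwidth_gbs = {
    "M4": 120,
    "M4 Pro": 273,
    "M4 Max": 546,
    "RTX 4090 (GDDR6X)": 1008,
}

baseline = bandwidth_gbs["RTX 4090 (GDDR6X)"]
for chip, bw in bandwidth_gbs.items():
    # Express each chip as a fraction of the RTX 4090's peak bandwidth.
    print(f"{chip}: {bw} GB/s ({100 * bw / baseline:.0f}% of RTX 4090)")
```

Notice the pattern within the Apple lineup: each tier roughly doubles the bandwidth of the one below it.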

The Integration Advantage

Raw bandwidth numbers don't tell the full story. Because LPDDR5X in Apple Silicon is on-package — physically integrated with the chip rather than sitting on separate modules — latency is low, and there's no PCIe bus between the CPU and GPU memory pools: every processor reads the same bytes in place, with no copying. This is part of why Apple Silicon performs better than raw GB/s comparisons against discrete GPUs would suggest.

Capacity Options

Apple Silicon Macs are sold with fixed unified memory configurations — 16GB, 24GB, 36GB, 48GB, 64GB, 128GB, and beyond, depending on the chip. Unlike a desktop where you can add RAM later, unified memory is part of the chip package and cannot be upgraded after purchase. Choosing the right configuration upfront is critical.
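Capacity determines which models fit in the first place. A minimal sizing sketch — the bytes-per-parameter figure and overhead factor below are illustrative assumptions, not Apple specifications:

```python
def model_fits(params_billions: float, bytes_per_param: float,
               unified_memory_gb: int, overhead_factor: float = 1.2) -> bool:
    """Rough check: do the model weights (plus assumed ~20% headroom for
    KV cache, runtime, and the OS) fit in a given memory configuration?"""
    weights_gb = params_billions * bytes_per_param  # 1e9 params cancel 1e9 bytes/GB
    return weights_gb * overhead_factor <= unified_memory_gb

# A 70B-parameter model at 4-bit quantization (~0.5 bytes/param) needs
# roughly 35 GB of weights alone:
print(model_fits(70, 0.5, 48))  # fits in a 48GB configuration
print(model_fits(70, 0.5, 36))  # too tight once headroom is counted
```

The same arithmetic explains why the larger memory tiers matter far more for local AI than for typical desktop workloads.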

Why It Matters for Local AI

LPDDR5X is a key reason Apple Silicon Macs can run large LLMs efficiently in a fanless or near-silent laptop form factor. Its power efficiency means sustained inference workloads don't throttle or overheat the way a gaming laptop GPU would. If you need a portable local AI workstation, the LPDDR5X foundation of Apple Silicon is what makes that viable.
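Bandwidth also sets a hard ceiling on generation speed: LLM decoding is memory-bound, since each generated token requires reading roughly the entire model from memory. A rough upper-bound estimate (real-world throughput is lower due to compute, KV-cache reads, and other overheads):

```python
def est_tokens_per_sec(bandwidth_gbs: float, model_size_gb: float) -> float:
    """Memory-bandwidth ceiling on decode speed: every weight is read
    roughly once per generated token, so tokens/sec <= bandwidth / size."""
    return bandwidth_gbs / model_size_gb

# Hypothetical ~8 GB quantized model on an M4 Max (~546 GB/s):
print(f"ceiling: ~{est_tokens_per_sec(546, 8):.0f} tokens/sec")
```

This is why the bandwidth tiers above translate directly into how fast a local model feels: the same model on an M4 Pro has roughly half the ceiling of an M4 Max.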