Phi-4 14B
- Parameters: 14.7B
- Max Context: 16K
- Architecture: Dense
- Released: Dec 12, 2024
- Modality: Text
About Phi-4 14B
Phi-4 14B is Microsoft's math and reasoning specialist. Despite its modest 14.7B parameter count, it achieves performance competitive with 70B models on mathematical reasoning, logic puzzles, and structured problem-solving. The MIT license makes it safe for commercial use. At ~8 GB VRAM at Q4_K_M, it fits on 12 GB GPUs. The trade-off: it is less strong on creative writing, general chat, and world knowledge compared to general-purpose models of similar size. Best used as a specialized reasoning tool alongside a general-purpose model.
Technical Specifications
System Requirements
Estimated VRAM (GB), assuming 10% runtime overhead, for different quantization methods and context sizes.

| Quantization | Bytes/Weight | Quality | 1K ctx | 16K ctx |
|---|---|---|---|---|
| Q4_K_M | 0.50 | ~97% of FP16 | 7.79 GB (Consumer GPU) | 10.72 GB (Consumer GPU) |
| Q8_0 | 1.00 | ~100% of FP16 | 15.39 GB (Consumer GPU) | 18.32 GB (Consumer GPU) |
| F16 | 2.00 | Reference | 30.59 GB (Datacenter GPU) | 33.52 GB (Datacenter GPU) |
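The figures above follow a back-of-the-envelope formula: weight memory (parameter count × bytes per weight, plus a runtime overhead factor) plus the KV cache, which grows linearly with context length. A minimal sketch, assuming a Phi-4-like configuration (40 layers, 10 grouped-query KV heads, head dimension 128, fp16 KV cache) — these architecture values are assumptions for illustration, and the exact table numbers come from the site's own calculator, so small differences are expected:

```python
# Rough VRAM estimate for Phi-4 14B: weights + KV cache.
# Assumed architecture values (illustrative; check the model's config):
PARAMS = 14.7e9        # parameter count
N_LAYERS = 40          # transformer layers (assumed)
N_KV_HEADS = 10        # grouped-query KV heads (assumed)
HEAD_DIM = 128         # per-head dimension (assumed)
KV_BYTES = 2           # fp16 KV cache

def vram_gb(bytes_per_weight: float, ctx_tokens: int, overhead: float = 0.10) -> float:
    """Estimated VRAM in GB for a given quantization and context length."""
    weights = PARAMS * bytes_per_weight * (1 + overhead)
    # K and V tensors: 2 caches per layer, per KV head, per token.
    kv_cache = 2 * N_LAYERS * N_KV_HEADS * HEAD_DIM * ctx_tokens * KV_BYTES
    return (weights + kv_cache) / 1e9

# Example: Q4_K_M (~0.5 bytes/weight) at 1K and 16K context.
print(f"Q4_K_M @ 1K ctx:  {vram_gb(0.5, 1024):.2f} GB")
print(f"Q4_K_M @ 16K ctx: {vram_gb(0.5, 16384):.2f} GB")
```

The weight term dominates at short context; the KV-cache term is quantization-independent here, which is why every row in the table grows by the same amount between 1K and 16K context.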
Find the right GPU for Phi-4 14B
Use the interactive VRAM Calculator to see exactly how much memory you need at any quantization level, context length, and overhead setting.