Selecting the correct cable size is one of the most critical decisions in electrical engineering and DIY wiring. Too small, and the cable overheats, risking fire or equipment failure. Too large, and the project becomes unnecessarily expensive and difficult to install. Getting it "just right" requires balancing physics, safety regulations, and practical constraints.

Why Does Cable Size Matter?

Electric current flowing through a wire generates heat. Every cable has a maximum operating temperature (e.g., 70 °C for PVC insulation, 90 °C for XLPE). Push too many amps through a thin cable and the heat builds up, melting insulation, starting fires, or causing a dangerous voltage drop at the load end.
Approximate formula for single-phase voltage drop:

\[ V_{\text{drop}} = \frac{2 \times L \times I \times R_{\text{cable}}}{1000} \]

where \( L \) = one-way length (m), \( I \) = actual load current (A), and \( R_{\text{cable}} \) = resistance per km from the cable data sheet (e.g., 7.98 Ω/km for 2.5 mm² copper).

Example: 50 m run, 21.7 A, 2.5 mm² copper (R ≈ 7.98 Ω/km):

\[ V_{\text{drop}} = \frac{2 \times 50 \times 21.7 \times 7.98}{1000} = 17.3\,\text{V} \]

On a 230 V supply, that is a 7.5 % drop – too high. Switching to 4 mm² (R ≈ 4.95 Ω/km) gives \( V_{\text{drop}} = 10.7\,\text{V} \), a 4.7 % drop – acceptable.

For safety-critical installations (submersible pumps, EV chargers, medical equipment), consult a qualified electrical engineer. The formulas above will get you in the ballpark; the regulations will keep you legal and alive.
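The single-phase voltage-drop formula and the 50 m worked example above can be sketched in a few lines of Python. The function name, the 230 V supply assumption, and the 5 % acceptability threshold used in the comment are ours, not part of any standard quoted here; real sizing must still follow your local wiring regulations.

```python
def voltage_drop(length_m: float, current_a: float, r_ohm_per_km: float) -> float:
    """Approximate single-phase voltage drop for an out-and-back run.

    length_m     : one-way cable length in metres (L)
    current_a    : actual load current in amperes (I)
    r_ohm_per_km : conductor resistance per km from the cable data sheet
    """
    return 2 * length_m * current_a * r_ohm_per_km / 1000

# Worked example from the text: 50 m run, 21.7 A, 230 V supply
for label, r in [("2.5 mm²", 7.98), ("4 mm²", 4.95)]:
    vd = voltage_drop(50, 21.7, r)
    pct = vd / 230 * 100  # percentage drop; commonly kept under ~5 %
    print(f"{label}: {vd:.1f} V drop ({pct:.1f} %)")
```

Running this reproduces the figures in the example: 17.3 V (7.5 %) for 2.5 mm² and 10.7 V (4.7 %) for 4 mm².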