$d_T = d_M \pm E$
$d_T = d_M \pm \dfrac{d_M}{L_t} \cdot e$
Where
$d_T$ = true distance
$d_M$ = measured distance
$E$ = total error $= Ne$
$N$ = number of tape lengths $= d_M / L_t$
$L_t$ = length of tape as marked
$e$ = error per tape length (too long or too short)
$\pm$ = (+) if the tape is too long, (-) if the tape is too short
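As a quick sketch of this rule in code (the function name and signature here are my own, not from the original solution):

```python
def corrected_distance(d_measured, tape_length, error_per_tape, too_long=True):
    """Return the true distance for a reading taken with an erroneous tape.

    d_measured      -- distance as read on the tape, in meters
    tape_length     -- nominal (marked) length of the tape, in meters
    error_per_tape  -- absolute error per full tape length, in meters
    too_long        -- True if the tape is too long (correction is added),
                       False if it is too short (correction is subtracted)
    """
    total_error = (d_measured / tape_length) * error_per_tape  # E = N * e, with N = d_M / L_t
    return d_measured + total_error if too_long else d_measured - total_error
```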
Hence,
$d_T = 160.42 + \dfrac{160.42}{50}(0.02)$
$d_T = 160.484 ~ \text{m}$ ← answer
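Plugging the same numbers into the sketch above (using the hypothetical `corrected_distance` helper) reproduces the result:

```python
# 50-m tape that is 0.02 m too long, tape reading of 160.42 m
print(corrected_distance(160.42, 50, 0.02, too_long=True))  # 160.484168, i.e. 160.484 m
```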
I think the answer is not B but A. When the 50-meter steel tape is 0.02 m too long, a 50-m reading on the tape is actually 49.98 m, since the tape is longer by 0.02 m. Therefore, a 160.42-m reading on the tape should be less than 160.42 m, because the 50-meter tape is longer by 0.02 m. Answer B (160.484 m) is greater than 160.42 m; hence, the suggested answer is WRONG.
No. You actually mixed it up. If a 50-meter tape is 0.02 m too long, then the correct distance for a 50-m reading is 50.02 m. Answer B is CORRECT.
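To see it numerically (a check added here, not part of the original reply): every 50-m reading actually spans 50.02 m of ground, so

$d_T = \dfrac{160.42}{50} \times 50.02 = 160.484 ~ \text{m}$

which matches answer B.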