Why is it that in amateur radio (or any radio transmission) we use coax cable for a transmission line? For that matter, what are the distinguishing characteristics of an electrical signal that demand coax instead of hookup wire?
An oversimplified answer is one-tenth: is the length of the run (if hookup wire) longer than one-tenth of the signal's wavelength? Fidelity degrades gradually as the length approaches a tenth of a wavelength, and at that point the reflections from the impedance mismatch can be severe enough that a frequency counter reads a digital signal at double its true frequency.
Let’s think about this. We have a signal source operating at, let us say, 300 MHz. The wavelength of a 300 MHz signal is 1 meter. What is one-tenth of 1 meter? The answer: 10 cm, or 0.1 meter. If you intend to send that signal a distance greater than 10 cm (about 4 inches), you definitely and without question need to use coax as the transmission medium, so as to match impedances and minimize reflections. But let us say that the distance is only 5 cm. You might get away with using hookup wire, depending on the need for fidelity, but the signal distortion may still be intolerable.
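The arithmetic above can be sketched as a small helper. This is a minimal illustration, assuming the one-tenth rule of thumb described here; the function name and default fraction are my own choices, not an established convention.

```python
# Sketch: compute the rule-of-thumb length above which plain hookup
# wire should be treated as a transmission line (assumed lambda/10 rule).
C = 299_792_458.0  # speed of light in vacuum, m/s

def max_hookup_length(freq_hz, fraction=0.1):
    """Return `fraction` of the free-space wavelength at freq_hz, in meters."""
    wavelength = C / freq_hz
    return fraction * wavelength

# The 300 MHz example from the text: wavelength ~1 m, threshold ~10 cm.
print(max_hookup_length(300e6))  # roughly 0.1 m, i.e. about 10 cm
```

Note that this uses the free-space wavelength; on a real cable the velocity factor of the dielectric shortens the wavelength further, which only tightens the limit.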
More will be written on this later to show analytically and with computer simulation why this is so.