03-24-2013, 01:26 PM   #10
NN5I (Carl, nn5i), Tallahassee, FL

Well, I'd say it must be one of two things: (1) an inexpensive SWR meter; (2) operator error.

The SWR is a measure of the fraction of the power traveling up the transmission line toward the antenna that is reflected back down the line by a mismatch at the antenna. It depends on exactly two things: (1) the [complex] characteristic impedance of the transmission line, which in commercially manufactured coaxial cable is quite constant; and (2) the [complex] impedance of the load (the antenna), which is likewise independent of power level. The reflection, and thus the SWR, depends on nothing else; in particular, it doesn't change with transmitter power.
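To make that concrete, here's a quick Python sketch of the arithmetic. The 50-ohm line and the 80 + j30 load are just illustrative numbers, not anyone's actual antenna; notice that power appears nowhere in the computation:

Z0 = 50 + 0j     # characteristic impedance of the coax
ZL = 80 + 30j    # illustrative antenna feedpoint impedance

# Complex reflection coefficient at the load, then SWR from its magnitude
gamma = (ZL - Z0) / (ZL + Z0)
swr = (1 + abs(gamma)) / (1 - abs(gamma))
print(f"|gamma| = {abs(gamma):.3f}, SWR = {swr:.2f}")

Change the transmitter power all day and nothing in that calculation moves.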

If you ran so much power that you damaged the cable or the antenna, the SWR would change; but then it wouldn't change back when you reduced the power.

So, that leaves the instrument or the operator. I'm betting on the instrument, which may not be quite linear with power changes. Nearly all inexpensive SWR meters use diodes to rectify the RF for measurement. Since diodes have a nearly constant forward drop, the meter is less sensitive at low power levels than at high ones; that makes for a large error at low levels and a smaller error at high levels. The SWR measurement is therefore usually less accurate (and lower!) at low levels than at high levels, so you ought to trust the high-power reading more than the low-power one. Note also that the error from the diode forward drop will always make the SWR read artificially low at lower levels.
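Here's a rough model of that effect, assuming a 50-ohm system and a constant 0.3-volt diode drop (real detectors are messier, but the trend is the point):

import math

def indicated_swr(p_fwd_watts, gamma, v_diode=0.3, r=50.0):
    # The detected voltages each lose a constant diode drop (clamped
    # at zero), which shrinks the reflected/forward ratio, and with it
    # the indicated SWR, more at low power than at high power.
    v_fwd = math.sqrt(p_fwd_watts * r)   # forward RMS voltage
    v_ref = gamma * v_fwd                # reflected voltage
    d_fwd = max(v_fwd - v_diode, 0.0)
    d_ref = max(v_ref - v_diode, 0.0)
    g = d_ref / d_fwd if d_fwd > 0 else float("nan")
    return (1 + g) / (1 - g)

gamma = 1.5 / 3.5    # a true SWR of 2.5
for watts in (1, 5, 25, 100):
    print(f"{watts:3d} W: meter shows about {indicated_swr(watts, gamma):.2f}")

With these assumed numbers the meter shows about 2.35 at 1 watt and creeps up toward the true 2.5 as power rises: always erring low, and erring worst at the lowest power.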

So I'd say that your actual SWR is nearer to 2.5 than to 2.0. That's a little high for good performance at 144 MHz with 35 feet of coax.
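If you're curious how much the coax itself hides, here's one more sketch. The 1.5 dB one-way matched loss is an assumed figure for 35 feet of mid-grade coax at 144 MHz; check your cable's datasheet for the real number:

def swr_at_rig(swr_at_antenna, one_way_loss_db):
    # The reflected wave crosses the lossy line twice, so the meter at
    # the rig sees an attenuated reflection and reads a lower SWR than
    # actually exists at the antenna.
    g = (swr_at_antenna - 1) / (swr_at_antenna + 1)
    g_rig = g * 10 ** (-2 * one_way_loss_db / 10)
    return (1 + g_rig) / (1 - g_rig)

print(f"{swr_at_rig(2.5, 1.5):.2f}")   # about 1.55 at the transmitter

In other words, whatever the meter says at the rig, the mismatch at the antenna end is worse.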
__________________
-- Carl