MAX2870 Signal Generator
Frequency Accuracy


A MAX2870 Signal Generator has been modified to accept an external reference derived from a Rubidium Frequency Source (RFS), allowing it to provide stable signal outputs in the GHz frequency range. The frequency accuracy of those GHz output signals also needs to be known.

This page investigates the frequency accuracy of the MAX2870 Signal Generator.

Establishing the Accuracy of the RFS

Before attempting to measure the accuracy of the MAX2870 Sig Gen, the accuracy of the RFS used as an external frequency reference needs to be established.

As a GPS-derived standard is not at hand, two RFS units are compared against each other. Both are set to nominally 10 MHz, but due to different internal reference frequencies they are programmed with slightly different divider values as shown below in the table.

Note that the actual frequency programmed is 9.999999998 MHz, as this is the closest frequency to 10 MHz that can be set with the 0.0117 Hz resolution afforded by the 32-bit divider.
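As a sketch of that rounding, assuming the tuning step is exactly the quoted 0.0117 Hz (the real hardware step differs slightly, so the computed offset below is illustrative only):

```python
# Nearest settable frequency given a fixed tuning step (step value from the text,
# rounded to 0.0117 Hz; the exact hardware step is slightly different)
step_hz = 0.0117
target_hz = 10e6

n_steps = round(target_hz / step_hz)
nearest_hz = n_steps * step_hz
offset_hz = nearest_hz - target_hz   # always within half a step of the target
```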

By summing the 10 MHz outputs of the two RFS units on an oscilloscope and timing the delay between anti-phase epochs (where the sum of the two signals equals zero), the frequency difference between the sources can be calculated.

The delay between minimums (anti-phase) was timed as 1 m 46 s (106 s), which is equivalent to a difference of 1/106 = 9.4 mHz = 0.0094 Hz, or 0.00094 ppm at 10 MHz. This is within the 0.0012 ppm frequency-setting resolution of the RFS units. Barring the unlikely event that both RFS units have drifted by the same amount over their lifetimes, it is reasonable to assume that the RFS units are accurate to 0.005 ppm. When funds are available, a GPS-derived 10 MHz standard will be purchased to verify both RFS units.
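The beat arithmetic can be checked in a few lines, using the numbers quoted above:

```python
# Beat period between anti-phase minima -> frequency difference (values from the text)
t_antiphase_s = 106            # 1 m 46 s between successive anti-phase epochs
f_nominal_hz = 10e6            # nominal output of both RFS units

delta_f_hz = 1 / t_antiphase_s                  # ~9.4 mHz
delta_ppm = delta_f_hz / f_nominal_hz * 1e6     # ~0.00094 ppm at 10 MHz
```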

Establishing the Accuracy of the PLL-based Clock Multiplier

This exercise is also necessary before testing the MAX2870 for accuracy because the clock multiplier generates the reference for the MAX2870. The version on hand is shown on the right and uses an NB3N501 PLL Clock Multiplier IC. The ratios are selected by 'blobbing' the S1 and S2 pads (top-right in the image on the right). Each selection line has three states - tied high (H), tied low (L), or left floating (M) - giving nine combinations rather than the four available if only the H and L states were valid.
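The state count can be enumerated directly; a minimal sketch:

```python
# Two select pins, three states each (H = tied high, L = tied low, M = floating)
from itertools import product

two_state = list(product('HL', repeat=2))     # 4 combinations with H/L only
three_state = list(product('HLM', repeat=2))  # 9 combinations, e.g. ('M', 'L')
```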

The flip-side of the PCB has a screen print table of the various valid combinations and the associated frequency multiplication ratios as shown in the second image on the right.

The other version ('in the mail' as of 25th April '23) uses the NB3N502 IC - which has 6 selectable ratios - including x2.5 and x3.  The x2.5 ratio would be useful as many instruments use 25 MHz for the reference frequency.

Both ICs have a quoted feature "Zero ppm Clock Multiplication Error". This would seem to indicate that the ratios are exact - so a 10 MHz signal from an RFS will be multiplied to a frequency of 30 MHz with the same accuracy as the 10 MHz RFS signal. 
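In other words, an exact multiplication ratio preserves the fractional (ppm) error of the reference. A quick sketch with the figures used above (the 0.005 ppm RFS accuracy is the assumption from the previous section):

```python
ref_hz = 10e6
ref_err_ppm = 0.005      # assumed RFS accuracy established earlier
ratio = 3                # exact PLL multiplication: 10 MHz -> 30 MHz

out_hz = ref_hz * ratio
out_err_hz = (ref_err_ppm * 1e-6 * ref_hz) * ratio   # absolute error scales with the ratio
out_err_ppm = out_err_hz / out_hz * 1e6              # fractional error is unchanged
```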

Nonetheless, this will be tested.

Test Setup

Two signal paths will be compared to check the frequency accuracy of the Clock Multiplier.

The two 30 MHz signals were compared by feeding the output of the comb generator into CH1 of the oscilloscope (used as the trigger) and the output of the clock multiplier into CH2. The phase relationship did not change from the initial state, as shown below, even after an hour. This shows the 30 MHz output of the Clock Multiplier can be taken as having RFS-level accuracy.

NOTE: the whole 'phase-locked' composite display has moved to the right slightly due to the analogue trigger level drifting over the hour. This has no bearing on the phase relationship between the two input signals.

Establishing the Accuracy of the FFT Analysis Process

From this stage onwards use is made of a software FFT analyser backend to measure frequency. It consists of a Nooelec NESDR Smart dongle (RTL-SDR) driven by an application which captures data and performs an FFT - essentially a software spectrum analyser.

As the RTL-SDR can be programmed to a range of input frequencies, it is reasonable to expect some offset between the actual tuned centre frequency and the programmed centre frequency, with a magnitude inversely proportional to 2^N, where N is the number of bits in the internal tuning frequency multiplier (presumably driven by the onboard 28.8 MHz crystal oscillator).
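A hedged model of that expectation (the step size and the rounding behaviour are assumptions, not measured characteristics of the tuner):

```python
# Model: the tuner can only land on multiples of a finite step f_step, so the
# tuned frequency differs from the programmed one by up to half a step.
def tuned_frequency_hz(f_prog_hz, f_step_hz):
    return round(f_prog_hz / f_step_hz) * f_step_hz

# With a hypothetical 1 kHz step, programming 1500.0004 MHz lands on 1500.0 MHz
```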

To test this, a function was coded which searches for a number of comb marker generator harmonics of the 30 MHz RFS-derived clock (1500, 1530 and 1560 MHz) and scans each one by incrementing the centre frequency in steps of 10 kHz such that the selected harmonic moves through about 1.8 MHz of the 2 MHz bandwidth of the RTL-SDR.  At each step the frequency of the harmonic is measured and results output to a *.CSV file for further analysis in a spreadsheet.
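The per-step frequency measurement can be sketched as follows. This is not the author's code: the RTL-SDR capture is replaced by a synthetic complex tone, and only the FFT peak-search step is shown.

```python
import numpy as np

fs_hz = 2.048e6            # typical RTL-SDR sample rate (assumed)
n_fft = 2**18              # ~7.8 Hz bin width at this sample rate
tone_hz = 123_456.0        # tone offset from the tuned centre frequency

# Synthetic complex baseband capture standing in for the dongle samples
t = np.arange(n_fft) / fs_hz
iq = np.exp(2j * np.pi * tone_hz * t)

# Measure the tone as the frequency of the strongest FFT bin
spectrum = np.abs(np.fft.fft(iq))
freqs_hz = np.fft.fftfreq(n_fft, d=1/fs_hz)
peak_hz = freqs_hz[np.argmax(spectrum)]   # within half a bin of the true offset
```

In the real test this measurement is repeated at each 10 kHz increment of the centre frequency and the results written to the *.CSV file.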

The result for each harmonic (1500, 1530 and 1560 MHz) had similar characteristics, so only the 1500 MHz result is shown above. Three runs of the test were done, spaced two hours apart. The results of the three runs are identical (run #1 overlays and obscures runs #2 and #3), showing that the shape of the plot is not the result of noise or some random offset. There is evidence of cyclic behaviour as the finite resolution of the frequency tuning hardware attempts to land on the programmed frequency. The offset ranges between +0.6 ppm and -0.4 ppm, or 1.0 ppm peak-to-peak. This would seem to indicate that the frequency-setting hardware uses 20 bits (1/2^20 = 0.95 ppm). The average value of +0.09 ppm would appear to be the error in the RTL-SDR sample clock.
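The bit-count inference can be made concrete; a small sketch using the model above, in which the peak-to-peak quantisation is 1/2^N of the tuned frequency:

```python
def step_ppm(n_bits):
    # peak-to-peak tuning quantisation implied by an N-bit tuning word
    return 1 / 2**n_bits * 1e6

# 20 bits -> ~0.95 ppm, the closest match to the observed ~1.0 ppm peak-to-peak
best_n = min(range(16, 25), key=lambda n: abs(step_ppm(n) - 1.0))
```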


The frequency tuning error of the RTL-SDR can be of the order of +/-0.5 ppm - but this error is deterministic.  Accordingly - to calibrate out the error, the same tuning frequency must be used for both the calibration against the RFS-derived output of the comb marker generator and the measurement of the target frequency.  That is, both frequencies must fit inside the approx. 2 MHz bandwidth of the RTL-SDR. This in turn means that frequencies to be measured around 1.5 GHz must lie within +/-1.0 MHz around 1.5, 1.53 or 1.56 GHz - or any harmonic of 30 MHz. There is a case for changing to a 25 MHz RFS-derived clock - if only to have nice round harmonic frequency numbers, four for each 100 MHz.
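The constraint can be expressed as a small check. This is a sketch of the rule as stated (tuning centred on a 30 MHz comb harmonic, roughly ±1 MHz usable either side); the exact usable bandwidth of a given dongle is an assumption:

```python
def nearest_harmonic_hz(f_target_hz, comb_spacing_hz=30e6):
    # closest RFS-derived comb line to the target frequency
    return round(f_target_hz / comb_spacing_hz) * comb_spacing_hz

def can_calibrate(f_target_hz, bw_hz=2e6, comb_spacing_hz=30e6):
    # target must fall within +/- bw/2 of a comb line to share one tuning frequency
    f_cal_hz = nearest_harmonic_hz(f_target_hz, comb_spacing_hz)
    return abs(f_target_hz - f_cal_hz) <= bw_hz / 2
```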

Measuring MAX2870 Accuracy

Having established the frequency accuracy of the external 30 MHz reference signal and the frequency offset of the software FFT frequency measurement, the frequency accuracy of the MAX2870 can now be examined. As the test output frequency is about 1.5 GHz, neither the 40 MHz oscilloscope nor the 3 GHz spectrum analyser on hand can be used: the first because 1.5 GHz is outside its bandwidth, and the second because its minimum resolution bandwidth (RBW) is 30 kHz, which is ~20 ppm at 1.5 GHz. Instead, an RTL-SDR dongle receiver (tuned to 1.5 GHz) is used with a software FFT analyser backend set to 10 Hz RBW (0.0065 ppm at the 1.53 GHz measurement frequency).
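The RBW comparison works out as follows (numbers from the text):

```python
def rbw_ppm(rbw_hz, f_hz):
    # resolution bandwidth expressed as ppm of the measurement frequency
    return rbw_hz / f_hz * 1e6

spec_an = rbw_ppm(30e3, 1.5e9)   # ~20 ppm: too coarse for this test
sdr_fft = rbw_ppm(10, 1.53e9)    # ~0.0065 ppm: usable
```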

Test Setup


Frequency Measurement Results

The 'flat-top' of the spectrum is probably caused by the RTL-SDR sampling-clock jitter, the FFT algorithm, or the RFS. In any case, the 50 Hz width is equal to 0.033 ppm, and the symmetry allows accepting the mean as the frequency of the peak.

Comb Marker Generator - 1530 MHz harmonic: In this case the 1530.000000 MHz RFS-derived frequency is read as 1530.000587 MHz, giving an error of +587 Hz at 1.53 GHz, or about +0.384 ppm. This is the RTL-SDR calibration correction to be used for MAX2870 frequency measurements.

MAX2870 1530.000 MHz: The results for the MAX2870 Signal Generator also displayed the 'flat-top' spectrum, but this effect is not due to the MAX2870, as is evident from the RFS-derived result above.

Here the 1530.000 MHz frequency is read as 1530.000572 MHz - giving an error of -15 Hz at 1.53 GHz when taking into account the frequency error of the RTL-SDR, or about -0.01 ppm.

This is of the order of the RBW of the measurement (10 Hz) - so the actual error might range between about 0 to -0.02 ppm.

MAX2870 1530.001 MHz: Here the 1530.001 MHz frequency is read as 1530.001312 MHz - giving an error of -275 Hz at 1.53 GHz when taking into account the frequency error of the RTL-SDR, or about -0.18 ppm.

MAX2870 1528.700 MHz: Here the 1528.700 MHz frequency is read as 1528.700075 MHz - giving an error of -512 Hz at 1.528700 GHz when taking into account the frequency error of the RTL-SDR, or about -0.33 ppm.
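The three corrections above can be tabulated in a few lines (all figures taken from the measurements above):

```python
cal_offset_hz = 1_530_000_587 - 1_530_000_000   # RTL-SDR error at this tuning: +587 Hz

def corrected_error_hz(f_read_hz, f_set_hz):
    # subtract the comb-derived RTL-SDR calibration from the raw reading error
    return (f_read_hz - f_set_hz) - cal_offset_hz

errors = [
    corrected_error_hz(1_530_000_572, 1_530_000_000),  # -15 Hz
    corrected_error_hz(1_530_001_312, 1_530_001_000),  # -275 Hz
    corrected_error_hz(1_528_700_075, 1_528_700_000),  # -512 Hz
]
```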


The MAX2870 Signal Generator frequency accuracy is observed to be of the same order as the RTL-SDR (or possibly worse). Getting the finest frequency resolution of the MAX2870 involves finding the best combination of such things as VCO frequency and division ratios. It is unclear whether this signal generator has that optimisation function built into the control software.

This means that the MAX2870 Signal Generator will also need to be calibrated for each frequency setting used.