Real Time Kinematics
| Fundamentals | |
| --- | --- |
| Title | Real Time Kinematics |
| Author(s) | GMV |
| Level | Basic |
| Year of Publication | 2011 |
Real Time Kinematic (RTK) satellite navigation is a DGNSS technique used in land survey and in hydrographic survey. It is based on carrier phase measurements of the GPS, GLONASS and/or Galileo signals, with a single reference station providing the real-time corrections, and achieves up to centimetre-level accuracy. When referring to GPS in particular, the technique is also commonly known as Carrier-Phase Enhancement (CPGPS).
Introduction
The RTK technique follows the same general principle as classical DGNSS, but instead of using corrections to C/A code pseudoranges, it uses the carrier phase as its signal.
The accuracy of the resulting range measurement is generally a function of the ability of the receiver's electronics to compare the incoming satellite signal with its internally generated replica. In general, receivers are able to align the signals to about 1% of one bit-width. For instance, the coarse-acquisition (C/A) code sent on the GPS system sends a bit every 0.98 microseconds, so a receiver can measure to about 0.01 microseconds, or about 3 metres in terms of distance. Other effects introduce errors much greater than this, and accuracy based on an uncorrected C/A signal is generally about 15 m. The military-only P(Y) signal sent by the same satellites is clocked ten times as fast, so with similar techniques the receiver will be accurate to about 30 cm. An RTK system instead uses the satellite's carrier as its signal, and the improvement this offers is potentially very large if one continues to assume 1% accuracy in locking: the GPS coarse-acquisition (C/A) code broadcast in the L1 signal changes phase at 1.023 MHz, but the L1 carrier itself is at 1575.42 MHz, over a thousand times as fast. This frequency corresponds to a wavelength of 19 cm for the L1 signal, so a ±1% error in L1 carrier phase measurement corresponds to a ±1.9 mm error in baseline estimation.[1]
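The figures quoted above follow directly from the chip rate and the carrier frequency. The following minimal Python sketch, assuming the 1% signal-alignment rule of thumb used in the text, reproduces them:

```python
# Back-of-the-envelope range resolution for GPS C/A code vs. L1 carrier phase,
# assuming a receiver aligns signals to about 1% of a chip / cycle (rule of thumb).

C = 299_792_458.0  # speed of light, m/s

# C/A code: chipping rate 1.023 MHz -> one chip is ~293 m long
ca_chip_rate = 1.023e6                 # chips per second
ca_chip_length = C / ca_chip_rate      # metres per chip (~293 m)
ca_resolution = 0.01 * ca_chip_length  # ~2.9 m at 1% alignment

# L1 carrier: 1575.42 MHz -> wavelength ~19 cm
l1_frequency = 1575.42e6                 # Hz
l1_wavelength = C / l1_frequency         # metres (~0.19 m)
phase_resolution = 0.01 * l1_wavelength  # ~1.9 mm at 1% alignment

print(f"C/A chip length:        {ca_chip_length:6.1f} m")
print(f"C/A 1% resolution:      {ca_resolution:6.2f} m")
print(f"L1 wavelength:          {l1_wavelength * 100:6.1f} cm")
print(f"Carrier 1% resolution:  {phase_resolution * 1000:6.2f} mm")
```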
The use of carrier phase data, however, comes at a cost in terms of overall system complexity, because the measurements are ambiguous (i.e. every cycle of the carrier looks like every other). This makes it extremely difficult to know whether the signals have been properly aligned or whether they are "off by one" cycle, thus introducing an error of 20 cm, or a larger multiple of 20 cm. Solving this problem requires that ambiguity resolution (AR) algorithms be incorporated as an integral part of the data processing. This integer ambiguity problem can be addressed to some degree with sophisticated statistical methods that compare the measurements from the C/A signals and the resulting ranges between multiple satellites. However, none of these methods can reduce the error to zero.
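The ambiguity can be seen directly in the observation equation. One common way of writing the carrier-phase observable (in cycles; the notation below is a generic textbook form, not taken from this article's references) is

\phi = \frac{\rho + T - I}{\lambda} + \frac{c}{\lambda}\left(\delta t_{r} - \delta t^{s}\right) + N + \varepsilon

where ρ is the geometric range, T and I the tropospheric and ionospheric delays, δt_r and δt^s the receiver and satellite clock offsets, λ the carrier wavelength, c the speed of light, ε the measurement noise, and N the unknown integer number of whole carrier cycles. N is the integer ambiguity referred to above: the phase measurement pins down the fractional part of the range in cycles with millimetre-level precision, but leaves the whole number of 19 cm wavelengths to be resolved by the AR algorithms.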
In practice, RTK systems use a single base station receiver and a number of mobile units. The base station re-broadcasts the phase of the carrier that it measures, and the mobile units compare their own phase measurements with the ones received from the base station. This allows the units to calculate their position relative to the base station to within millimetres, although their absolute position is only as accurate as the position of the base station itself. The typical nominal accuracy for these dual-frequency systems is 1 centimetre ± 2 parts-per-million (ppm) horizontally and 2 centimetres ± 2 ppm vertically.
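Because the quoted accuracy has a fixed component plus a component proportional to the baseline length, the expected error grows with the rover-to-base distance. A minimal sketch using the nominal 1 cm + 2 ppm (horizontal) and 2 cm + 2 ppm (vertical) figures above:

```python
def rtk_error_m(baseline_m: float, fixed_m: float, ppm: float) -> float:
    """Nominal RTK error: a fixed part plus a part proportional to the baseline."""
    return fixed_m + ppm * 1e-6 * baseline_m

# Evaluate the nominal figures for a few baseline lengths.
for baseline_km in (1, 10, 30):
    b = baseline_km * 1000.0
    h = rtk_error_m(b, 0.01, 2.0)  # horizontal: 1 cm + 2 ppm
    v = rtk_error_m(b, 0.02, 2.0)  # vertical:   2 cm + 2 ppm
    print(f"{baseline_km:3d} km baseline: horizontal ~{h * 100:.1f} cm, "
          f"vertical ~{v * 100:.1f} cm")
```

For example, at a 10 km baseline the nominal horizontal error is about 1 cm + 2 cm = 3 cm, which is one reason RTK is normally used with a base station (or network) within a few tens of kilometres.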
Although the need for a nearby base station limits the usefulness of the RTK technique for general navigation, it is perfectly suited to roles like surveying. RTK has also found uses in autodrive/autopilot systems, precision farming and similar roles. The Virtual Reference Station (VRS) method extends the use of RTK to the whole area covered by a reference station network; operational reliability and the accuracies achieved depend on the density and capabilities of this network.[2]
RTK Related Articles
The following articles include further information about different important topics related to RTK:
Notes

1. ↑ RTK in Wikipedia: http://en.wikipedia.org/wiki/Real_Time_Kinematic