Insights | 24 Nov 2023
From standardization to optimization
The accurate measurement of time in the modern world is crucial for numerous human activities: coordinating global communication networks, synchronizing intricate technological processes, ensuring data accuracy in financial transactions, supporting a wide range of scientific endeavors, and enabling precise navigation and positioning systems. Positioning receivers, primarily used to locate and track objects, people, or animals, rely on accurate GNSS time synchronization for correct operation.
For these applications, the often overlooked time variable is considered even more critical than positioning information. Accurate time measurement, essential for the proper functioning of positioning receivers, wouldn’t have been possible without the development of three key temporal concepts over the past 150 years: time standardization, synchronization, and optimization.
Global standardization of time laid the foundation for consistent and synchronized communication, seamless international cooperation, and accurate coordination of activities across various sectors and regions.
This process commenced in the 19th century, when European and American railway systems required precise timing coordination. Because trains covered vast distances, significant differences in local time could arise, for instance between the US East and West Coasts.
The global synchronization of time began in the 1940s. It holds immense relevance for a wide range of technologies and industries – from navigation and telecommunications to finance and power grids.
Lastly, time optimization focuses on achieving precision, made possible by the development of sophisticated atomic clocks. These highly accurate clocks play a crucial role in the precise measurement of time, enabling satellite navigation systems to locate objects on Earth with extreme accuracy.
To some extent, global time standardization has its roots in the coordination of train arrivals and departures in local systems. Before the industrialization era and the advent of railway systems, local times presented only minor inconveniences. Travelers, such as those journeying from London to Manchester by coach, would simply adjust their watches upon arrival.
This changed significantly as the train network expanded and demanded greater organization. Coordinating train traffic between towns and cities without time standardization posed a considerable logistical challenge. Consequently, countries like France and the UK adopted time standardization following the International Meridian Conference (1884).
This was likely one of the earliest instances where the concept of time became crucial for a transnational organization that encompassed transportation and international communication. In this case, though, time remained separate from technological advancements, as it did not play any role within the trains’ mechanisms.
A century later, global developments in industry, finance, science, and technology triggered the demand for precision that surpassed what conventional clocks could offer. To keep pace with this progress, these sectors needed more than just standardized time and the relative precision of quartz clocks. As engineers overcame technical challenges, a more sophisticated method of measuring time emerged.
Over the years, time synchronization has become essential to global technological advancements. Precise time measurement was the first step. In 1967, the General Conference on Weights and Measures redefined the second in terms of atomic transitions, with atomic clocks playing a fundamental role in this achievement.
The concept of measuring time based on the frequency of an electromagnetic wave originated in the 1870s. James Clerk Maxwell was the first to suggest that the period of vibration of a particular kind of light could serve as the unit of time, with its wavelength serving as the unit of length.
Later, in the 1910s, Niels Bohr proposed that electrons have quantized energy states. These two ideas form the foundations of how an atomic clock functions.
Atomic clocks operate based on the principle of atomic resonance, utilizing atoms' unique energy levels and transitions. A chosen atom, such as cesium-133, is exposed to microwave radiation at its resonance frequency, causing its electrons to change energy levels. The microwave frequency locked to this resonance becomes the ticking mechanism of the clock: counting a fixed number of its cycles defines the second.
Maxwell and Bohr’s theoretical contributions materialized years later, when Isidor Rabi first proposed building an atomic clock. In the late 1940s, Harold Lyons and his team made further progress with an early ammonia-based device. But it wasn’t until the following decade that Louis Essen created the first accurate cesium atomic clock (1955). Twelve years later, the second was defined as the duration of 9,192,631,770 oscillations of the radiation associated with a transition in the cesium atom.
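As a rough illustration of this counting principle, the short Python sketch below converts a count of cesium transition cycles into elapsed seconds using the frequency fixed by the SI definition. It is a toy model of the arithmetic only, not of how a real clock’s electronics work.

```python
# Toy illustration of the counting principle behind the SI second: a cesium
# clock effectively counts cycles of the radiation tied to the cesium-133
# hyperfine transition.
CESIUM_FREQUENCY_HZ = 9_192_631_770  # cycles per SI second, fixed by definition

def cycles_to_seconds(cycle_count: int) -> float:
    """Convert a number of counted cesium cycles into elapsed seconds."""
    return cycle_count / CESIUM_FREQUENCY_HZ

# Counting exactly 9,192,631,770 cycles corresponds to one elapsed second.
print(cycles_to_seconds(9_192_631_770))  # 1.0
print(cycles_to_seconds(4_596_315_885))  # 0.5: half the cycles, half a second
```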
With this technology in place, terrestrial applications were the first target. Yet time synchronization on board satellites, crucial for positioning, quickly emerged as the next frontier.
A combination of scientific and technological advancements in space and satellite technology influenced the idea of using precise time synchronization for global positioning and navigation. While there isn't a single individual who can be pinpointed as the sole originator of this concept, the U.S. Naval Research Laboratory (NRL) played a significant role in exploring the use of atomic clocks for accurate timekeeping in satellites.
Scientists and engineers at NRL recognized the potential of precise time measurement in shaping a revolutionary navigation system. While the initial satellite under the TIMATION program didn’t incorporate an atomic clock, the program laid the groundwork for developing the Global Positioning System (GPS). Following the merger of the TIMATION program with the Air Force 621B program, the subsequent NAVSTAR GPS program featured satellites equipped with the first atomic clocks.
The NAVSTAR GPS program launched the Navigation Technology Satellite 1 (NTS-1) in 1974. This marked the debut of a satellite equipped with an atomic clock to test global positioning through precise time synchronization.
The mission demonstrated that atomic clocks in orbit, specifically the rubidium frequency standard atomic clock used, could achieve highly accurate timekeeping and precise Earth positioning.
The success of NTS-1's launch contributed significantly to establishing the GPS network. NTS-1 was the precursor of GPS, which relies on atomic clocks to provide exact time and, hence, position information. Today, GNSS satellites use three types of atomic clocks: rubidium vapor cells, cesium atomic beams, and hydrogen masers.
While atomic clocks are highly accurate, they are not infallible timekeepers. Time measurement via satellite atomic clocks is susceptible to clock imperfections, temperature variations, oscillator characteristics, aging, relativistic effects, and atmospheric delays.
Temperature variations affect atoms and the oscillators within atomic clocks, leading to clock frequency and stability fluctuations, which in turn impact the clock’s accuracy.
Positioning satellites orbit at an altitude of approximately 20,200 km above Earth. At this height, gravitational forces differ slightly from those experienced on the Earth’s surface.
This variation in gravity affects the passage of time for satellites relative to Earth observers: the weaker gravity makes the satellites’ clocks run faster, an effect only partly offset by the slowing caused by their orbital velocity, so on balance time flows faster for satellites. This relativistic time dilation has practical implications for time measurement. Since their orbits are not perfectly circular, the satellites also experience slightly varying gravitational forces and velocities, adding to the complexities of timekeeping.
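To get a sense of the magnitudes involved, here is a back-of-the-envelope Python estimate using textbook values for Earth’s gravitational parameter and mean radius and a nominal circular orbit at 20,200 km altitude. It is a simplified sketch, not the correction model that GNSS systems actually apply.

```python
# Rough estimate of the daily relativistic clock offset for a GPS-like satellite.
GM = 3.986004418e14          # Earth's gravitational parameter, m^3/s^2
C = 2.99792458e8             # speed of light, m/s
R_EARTH = 6.371e6            # mean Earth radius, m
R_ORBIT = R_EARTH + 20.2e6   # orbital radius for ~20,200 km altitude, m
SECONDS_PER_DAY = 86400

# General relativity: weaker gravity in orbit makes the satellite clock run faster.
gravitational_rate = GM / C**2 * (1 / R_EARTH - 1 / R_ORBIT)

# Special relativity: the orbital velocity makes the satellite clock run slower.
v_orbit = (GM / R_ORBIT) ** 0.5
velocity_rate = -v_orbit**2 / (2 * C**2)

offset_us_per_day = (gravitational_rate + velocity_rate) * SECONDS_PER_DAY * 1e6
print(f"Satellite clock gains roughly {offset_us_per_day:.1f} microseconds per day")
# Prints about +38 microseconds per day, which is why satellite clock
# frequencies are slightly offset before launch to compensate.
```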
Atmospheric delays are another source of error in accurately measuring time. In a world without an atmosphere, calculating the time it takes for a signal to travel from a satellite to a GNSS receiver would be a walk in the park. Unfortunately, the ionosphere delays and bends radio signals as they pass through it, leading to time estimation inaccuracies.
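The sketch below, with made-up numbers, shows why such delays matter: the receiver converts travel time into distance by multiplying by the speed of light, so even a few nanoseconds of unmodeled delay translate into meters of ranging error.

```python
# Why nanoseconds matter: the receiver turns signal travel time into distance
# by multiplying by the speed of light. Numbers below are illustrative only.
C = 2.99792458e8  # speed of light in vacuum, m/s

geometric_range_m = 20_200e3            # satellite roughly overhead, ~20,200 km away
travel_time_s = geometric_range_m / C
print(f"Ideal travel time: {travel_time_s * 1e3:.1f} ms")        # ~67 ms

ionospheric_delay_s = 10e-9             # a hypothetical 10 ns unmodeled delay
range_error_m = ionospheric_delay_s * C
print(f"Range error from a 10 ns delay: {range_error_m:.1f} m")  # ~3 m
```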
Due to these errors, the timekeeping of satellite atomic clocks needs constant comparison with reference clocks on Earth.
‘We have to bear in mind that all our propositions involving time are always propositions about simultaneous events.’ Albert Einstein, On the electrodynamics of moving bodies, 1905.
To determine the location of a person or an object on Earth using satellite technology, a minimum of four satellites must communicate their positions and times to a positioning receiver. Accurate positioning can’t occur if the satellites’ time readings are inconsistent with one another, a situation arising from the phenomena described in the previous section.
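The minimal Python sketch below, using entirely made-up satellite coordinates and receiver state, illustrates why four measurements are needed: the receiver must solve for three position coordinates plus its own clock bias, four unknowns, which it does here with a simple iterative least-squares fit to the pseudoranges.

```python
import numpy as np

C = 2.99792458e8  # speed of light, m/s

def solve_position(sat_positions, pseudoranges, iterations=10):
    """Iterative least-squares fit for receiver position (m) and clock bias (s)."""
    state = np.zeros(4)  # [x, y, z, c * clock_bias], starting at Earth's center
    for _ in range(iterations):
        diffs = sat_positions - state[:3]
        ranges = np.linalg.norm(diffs, axis=1)
        residuals = pseudoranges - (ranges + state[3])
        # Jacobian: negative unit line-of-sight vectors plus the clock-bias column.
        H = np.hstack([-diffs / ranges[:, None], np.ones((len(ranges), 1))])
        state += np.linalg.lstsq(H, residuals, rcond=None)[0]
    return state[:3], state[3] / C

# Synthetic scenario: four satellites roughly 26,570 km from Earth's center and a
# receiver near the Earth's surface whose clock is 100 microseconds off.
true_position = np.array([1.2e6, -4.5e6, 4.2e6])
true_clock_bias = 1e-4  # seconds
satellites = np.array([
    [ 5.1e6, -19.1e6, 17.8e6],
    [16.7e6, -15.1e6, 14.1e6],
    [-6.6e6, -15.3e6, 20.7e6],
    [-4.3e6,  -0.7e6, 26.2e6],
])
pseudoranges = np.linalg.norm(satellites - true_position, axis=1) + C * true_clock_bias
position, clock_bias = solve_position(satellites, pseudoranges)
print(position, clock_bias)  # recovers the assumed position and 100 us clock bias
```

With only three pseudoranges the system would be underdetermined, which is why a fourth satellite is required even though only three spatial coordinates are ultimately of interest.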
The synchronization of time information among satellites is crucial for computing precise positioning data. This synchronization is continuously monitored and adjusted by ground-based GNSS monitoring stations.
Apart from continuously observing and collecting data on the signals transmitted by satellites, ground-based monitoring stations also measure atmospheric data, such as ionospheric and tropospheric delays, which primarily impact the accuracy of GNSS signals. The collected data is then processed to identify errors and variations in the GNSS signals.
GNSS monitoring stations are pivotal in refining the accuracy of calculated positions by providing corrections to GNSS receivers. This process ensures synchronization and is contingent on precise timekeeping by atomic clocks and GNSS time servers.
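As one concrete example of such corrections, GPS satellites broadcast polynomial clock-correction coefficients in their navigation message, which receivers evaluate to adjust the satellite’s reported time. The Python sketch below applies a correction of that general form; the coefficient values are made up purely for illustration.

```python
# Applying a broadcast satellite clock correction of the polynomial form used by
# GPS: an offset, a drift, and a drift-rate term relative to a reference epoch.
def satellite_clock_offset(t: float, t_oc: float, af0: float, af1: float, af2: float) -> float:
    """Return the satellite clock offset in seconds at time t (seconds of week)."""
    dt = t - t_oc
    return af0 + af1 * dt + af2 * dt**2

# Illustrative values: a clock running 12 microseconds fast with a slight drift.
offset = satellite_clock_offset(t=302_400.0, t_oc=302_000.0,
                                af0=1.2e-5, af1=3.0e-12, af2=0.0)
print(f"Correct the broadcast satellite time by {offset * 1e6:.3f} microseconds")
```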
The development of GNSS monitoring stations parallels the history and expansion of satellite navigation systems. The first satellite monitoring station was established at the Johns Hopkins University Applied Physics Laboratory (APL) in the 1950s. This station played a crucial role in the development of the Transit system, the world's first operational satellite navigation system, which paved the way for developing civilian satellite navigation systems like GPS.
GNSS monitoring stations are just one part of the larger ground segment. This segment, crucial to GNSS satellite constellations, comprises diverse facilities such as control centers, ground antennas, and monitoring stations. Over the past 70 years, the GNSS ground segment has evolved, yet one of its core tasks has persisted: ensuring the integrity of the timing signal.
There is more to accurately positioning an entity on Earth than just spatial information. Assigning different time zones worldwide, precisely measuring the fundamental unit of time, and deploying atomic clocks on satellites were all essential factors in the development of current GNSS time measurement, which, in turn, enabled the accuracy of positioning receivers.
That said, even the most precise atomic clocks may experience drift. Discrepancies or drift in satellite clocks and atmospheric conditions introduce errors that alter positioning calculations, impacting the system's accuracy. Since differences in elapsed time ultimately affect positioning measurements, they must be corrected by ensuring timekeeping accuracy on Earth.
Many positioning applications depend on time synchronization to function correctly. Now you have a clearer picture of how the organization of railway systems, Maxwell’s insights, and atomic clocks have shaped the GNSS time measurement that today allows objects and living beings to be located accurately.
Samuli Pietila
Director of product management for Timing and Infrastructure GNSS