Radar mile

Radar mile or radar nautical mile is an auxiliary constant used to convert a delay time into the corresponding distance on a radar display.[1]

Radar timing is usually expressed in microseconds. To relate radar timing to the distance traveled by the radar energy, the propagation speed is used. Radar waves travel at approximately the speed of light in vacuum, 299,792,458 metres per second (300 m/μs; 984 ft/μs), and a nautical mile is 1,852 metres (6,076 ft), so the round-trip delay per nautical mile is:

\[ t_\text{radar mile} = \frac{2 \times 1852\ \text{m}}{299{,}792{,}458\ \text{m/s}} \approx 12.355\ \mu\text{s} \]
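
As a minimal sketch, this constant can be computed directly from the two exact values above (the variable names are illustrative, not from the cited sources):

```python
# Minimal sketch: compute the radar-mile constant from the exact values
# quoted above (speed of light in vacuum, length of a nautical mile).

SPEED_OF_LIGHT = 299_792_458.0   # m/s, speed of light in vacuum
NAUTICAL_MILE = 1852.0           # metres

# Two-way (round-trip) delay for a target one nautical mile away.
radar_mile_us = 2 * NAUTICAL_MILE / SPEED_OF_LIGHT * 1e6
print(f"1 radar nautical mile = {radar_mile_us:.3f} microseconds")
# -> 1 radar nautical mile = 12.355 microseconds
```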

A certain amount of time elapses between transmitting the sounding pulse and receiving its echo; if the target is exactly one nautical mile away, that time is one radar mile.

A pulse-type radar set transmits a short burst of electromagnetic energy. The target range is determined by measuring the time that elapses while the pulse travels to and returns from the target. Because two-way travel is involved, a total time of approximately 12.35 microseconds per nautical mile elapses between the start of the pulse from the antenna and its return to the antenna from a target at a range of 1 nautical mile. In equation form, this is:

\[ R = \frac{c_0 \cdot t}{2} \]

where R is the target range, c_0 is the speed of light, and t is the measured round-trip time; t ≈ 12.35 μs thus corresponds to R = 1 nautical mile.[2]
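
The same relation can be sketched in code; the function name and sample delays below are illustrative assumptions, not from the cited sources:

```python
# Illustrative sketch: convert a measured round-trip echo delay into a
# target range in nautical miles, using R = (c0 * t) / 2.

SPEED_OF_LIGHT = 299_792_458.0  # m/s, speed of light in vacuum
NAUTICAL_MILE = 1852.0          # metres

def range_from_delay_us(delay_us: float) -> float:
    """Target range in nautical miles for a round-trip delay in microseconds."""
    return SPEED_OF_LIGHT * (delay_us * 1e-6) / 2 / NAUTICAL_MILE

print(range_from_delay_us(12.355))   # ~1.0  (one radar mile -> one nautical mile)
print(range_from_delay_us(123.55))   # ~10.0
```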

References

  1. ^ "NEETS - Naval Electrical Engineering Training Serie". Retrieved 2020-12-31.
  2. ^ "Radartutorial". C. Wolff. November 1998. Retrieved 2021-01-01.