Question:

Wave motions?


A radar signal takes 2.7 s to go to the Moon and return. How far away was the Moon at that time?


4 ANSWERS


  1. Assuming the radar signal travels at 186,000 miles per second, the total path length was 502,200 miles. But that was to the Moon and back, so the Moon's distance was half of that, or 251,100 miles.


  2. 510,000 miles away. Not so sure about this answer.

  3. What is the wavelength of that signal? Because you can find it with the formula:

                    distance = (frequency)(wavelength)(time)

    t = 2.7 s

    f = 0.37 Hz

    But no wavelength is known. Also, you have to consider that both objects are continuously moving, the radar waves as well as the Moon, so you have to provide some extra details.

  4. All light travels at 186,000 miles per second, or 300,000 km per second. Since 2.7 s is a round trip (there and back), we have to divide the result by 2. So 186,000 x 2.7 = 502,200.

    502,200 / 2 = 251,100 miles away.

    Or:

    300,000 x 2.7 = 810,000. 810,000 / 2 = 405,000 km away.
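The arithmetic in answers 1 and 4 can be sketched as a short script. This is a minimal illustration, assuming the signal travels at the speed of light (rounded to 186,000 mi/s or 300,000 km/s, as the answers do); the function name is made up for the example.

```python
# One-way distance to the Moon from a radar round-trip time.
# The signal covers the Earth-Moon distance twice, so divide by 2.

C_MILES_PER_S = 186_000  # speed of light, rounded, miles per second
C_KM_PER_S = 300_000     # speed of light, rounded, km per second

def one_way_distance(round_trip_s: float, speed: float) -> float:
    """Distance to the target given a round-trip echo time and signal speed."""
    return speed * round_trip_s / 2

print(one_way_distance(2.7, C_MILES_PER_S))  # about 251,100 miles
print(one_way_distance(2.7, C_KM_PER_S))     # about 405,000 km
```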

