Kaplan Physics Q: Light waves


A source emits light of wavelength 600 nm for 0.01 microseconds. How many complete waves are sent out?

The way I tried to do the problem was with d = vt, v being the speed of light (3 × 10^8 m/s) and t = 0.01 μs. After solving for the distance, I divided it by the wavelength to get the number of waves in that time, but my answer didn't match Kaplan's. Their solution found the period first and used it to get the number of complete waves, which makes sense, but why is my solution wrong?
 
Find the frequency of that wavelength.

Frequency = cycles / second ... multiply that frequency by 0.01 x 10^-6 seconds. Seconds will cancel out and you're left with however many waves were completed.
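A minimal sketch of that frequency method in Python (the numbers are from the problem statement; the ≈ value is the textbook-style result):

```python
# Frequency method: f = c / wavelength, then N = f * t.
c = 3e8              # speed of light, m/s
wavelength = 600e-9  # 600 nm converted to meters
t = 0.01e-6          # 0.01 microseconds converted to seconds

f = c / wavelength   # frequency in cycles/second (Hz), ~5e14 Hz
n_waves = f * t      # seconds cancel, leaving a pure count: ~5e6 waves
```

The key step is the unit conversions: nm to m for the wavelength and μs to s for the time, so that the seconds genuinely cancel.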
 
I understand that by finding the frequency I can figure out the answer, but why doesn't the equation d = vt work, with v the speed of light and t the given time? I thought dividing the distance d by the wavelength would give me the correct answer. Why is this method incorrect?
 

Are all your units correct? What's the answer they give?
 