Question
When a tiger spots its prey at a distance of \( 200\,\text{m} \), what is the minimum time it will take to reach its prey such that its average velocity is \( 90\,\text{km h}^{-1} \)?
Solution
Hint: In order to solve this question, we first take the distance and the speed that are given. The distance is given in metres and the speed in kilometres per hour, so we convert the speed into metres per second and then calculate the time by dividing the distance by the speed.
To convert a speed from \( \text{km h}^{-1} \) to \( \text{m s}^{-1} \), we multiply the given speed by 1000 to convert kilometres into metres and divide it by 3600 to convert hours into seconds,
i.e. \( 1\,\text{km h}^{-1} = \dfrac{1 \times 1000}{3600}\,\text{m s}^{-1} \)
Time taken by the tiger can be calculated as
\( \text{time} = \dfrac{\text{distance}}{\text{speed}} \)
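As a quick check of the conversion factor and the formula above, here is a minimal Python sketch; the function names are illustrative and not part of the original solution.

```python
# Minimal sketch of the hint's method (function names are illustrative).

def kmh_to_ms(speed_kmh):
    """Convert a speed from km/h to m/s: multiply by 1000 (m per km)
    and divide by 3600 (s per hour)."""
    return speed_kmh * 1000 / 3600

def time_to_cover(distance_m, speed_kmh):
    """time = distance / speed, with the speed first converted to m/s."""
    return distance_m / kmh_to_ms(speed_kmh)
```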
Complete Step By Step Answer:
In order to solve this question, we first take the given distance and speed and convert them into SI units.
As it is given that,
\( \text{distance} = 200\,\text{m} \)
\( \text{speed} = 90\,\text{km h}^{-1} \)
Converting \( \text{km h}^{-1} \) to \( \text{m s}^{-1} \):
\( \text{speed} = \dfrac{90 \times 10^3}{60 \times 60}\,\text{m s}^{-1} = 25\,\text{m s}^{-1} \)
Now we know that the time taken can be calculated by dividing the distance covered by the speed of the tiger:
\( \text{time} = \dfrac{\text{distance}}{\text{speed}} = \dfrac{200}{25} = 8\,\text{s} \)
Hence, the minimum time the tiger will take to reach its prey, such that its average velocity is \( 90\,\text{km h}^{-1} \), is \( 8\,\text{s} \).
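Evaluating the sketch from the hint with the given values reproduces this result:

```python
print(kmh_to_ms(90))            # 25.0 m/s
print(time_to_cover(200, 90))   # 8.0 s
```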
Note :
It is important to note that a question of this kind should never be solved without first checking the units. In this problem, before finding the time, the distance and the speed must be expressed in the same system of units; only then can the time be found, in hours or in seconds depending on the conversion used.