A computer can process 100 records of data in 200 milliseconds. If a millisecond is 1/1000 of a second, how many seconds does it take to process a file of 1000 records?
Solution 1 (using step-by-step reasoning):
• If it takes 200 milliseconds to process 100 records, it will take 10 times as long (2000 milliseconds) to process a file of 1000 records.
• 2000 milliseconds is the same as 2 seconds, because 1000 milliseconds (each 1/1000 of a second) make up one second.
Thus, it takes 2 seconds to process the file.
Notice: the reasoning above restated the given information in reciprocal form.
• If 100 records are processed in 200 milliseconds, then it takes 200 milliseconds to process 100 records.
• If a millisecond is 1/1000 second, then one second is 1000 milliseconds.
Thinking in reciprocal terms makes the reasoning easier.
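A minimal sketch of the same calculation in Python, using only the figures stated in the problem; the constant and function names are illustrative, not part of the original problem:

```python
# Proportional reasoning: scale the measured rate up to the full file,
# then convert milliseconds to seconds.

RECORDS_MEASURED = 100   # records processed in the measured run
MS_MEASURED = 200        # milliseconds taken for that run
MS_PER_SECOND = 1000     # a millisecond is 1/1000 of a second

def processing_time_seconds(records: int) -> float:
    """Seconds needed to process `records` records at the measured rate."""
    ms_per_record = MS_MEASURED / RECORDS_MEASURED   # 2 ms per record
    total_ms = records * ms_per_record               # scale to the file size
    return total_ms / MS_PER_SECOND                  # convert ms to seconds

print(processing_time_seconds(1000))  # 2.0, matching the worked answer
```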