A search engine is a web server that responds to client requests to search in its stored indexes and (concurrently) runs several web crawler tasks to build and update the indexes. What are the requirements for synchronization between these concurrent activities?

What will be an ideal response?


The crawler tasks could build partial indexes of new pages incrementally, then merge them with the active index (including deleting invalid references). This merging can be done on an offline copy. Finally, the environment for processing client requests is switched to the new index. This switch might need some concurrency control, but in principle it is just a change to a single reference to the index, which should be atomic. As a result, client request handlers never need to block on the crawlers: they always read a complete, consistent snapshot of the index.
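A minimal sketch of this publish step is shown below, assuming an immutable in-memory index keyed by term and a single merger task at a time; the names (SearchEngine, activeIndex, publishMerged) are illustrative, not part of any particular system. Client handlers read the current snapshot through an AtomicReference, and the crawler-side merge publishes a new snapshot with one atomic reference change, as described above.

import java.util.HashSet;
import java.util.Map;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicReference;

public class SearchEngine {
    // Immutable index snapshot: term -> set of page URLs (hypothetical layout).
    record Index(Map<String, Set<String>> postings) {
        Set<String> lookup(String term) {
            return postings.getOrDefault(term, Set.of());
        }
    }

    // Single shared reference to the active index. Reads and the final
    // publish step are atomic, so queries never observe a half-merged index.
    private final AtomicReference<Index> activeIndex =
            new AtomicReference<>(new Index(Map.of()));

    // Client request path: dereference the current snapshot and query it.
    public Set<String> search(String term) {
        return activeIndex.get().lookup(term);
    }

    // Crawler path: merge a partial index into an offline copy of the
    // current snapshot, then publish it with one atomic reference change.
    // Assumes only one merger runs at a time; concurrent publishers would
    // need a compareAndSet loop or a lock around this method.
    public void publishMerged(Map<String, Set<String>> partial) {
        Index old = activeIndex.get();
        Map<String, Set<String>> merged = new ConcurrentHashMap<>(old.postings());
        partial.forEach((term, pages) ->
                merged.merge(term, pages, (a, b) -> {
                    Set<String> union = new HashSet<>(a);
                    union.addAll(b);
                    return union;
                }));
        activeIndex.set(new Index(Map.copyOf(merged)));
    }
}

The "might need some concurrency control" caveat in the answer corresponds to the single-merger assumption here: with several crawler tasks publishing concurrently, the final set would have to become a compareAndSet retry loop (or be serialized by a lock) so that no merge result is silently overwritten.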

