Seamlessly Selecting the Best Copy from Internet-Wide Replicated Web Servers

Yair Amir, Alec Peterson, and David Shaw
Department of Computer Science
Johns Hopkins University
{yairamir, chuckie, dshaw}@cs.jhu.edu
Abstract. The explosion of the web has led to a situation where a majority of the
traffic on the Internet is web-related. Today, practically all of the popular web sites
are served from single locations. This necessitates frequent long-distance network
transfers of data (potentially repeatedly), which results in high response times for
users and wastes available network bandwidth. Moreover, it commonly creates a
single point of failure between the web site and its Internet provider.
This paper presents a new approach to web replication, where each of the replicas
resides in a different part of the network and the browser is automatically and
transparently directed to the "best" server. Implementing this architecture for popular
web sites will result in better response times and higher availability of these sites.
Equally important, this architecture could eliminate a significant fraction of the
traffic on the Internet, freeing bandwidth for other uses.
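
How a browser decides which copy is "best" is the crux of the approach, and this
summary does not spell out the metric. As a rough illustration only, the sketch below
(Python, with hypothetical replica hostnames) probes the TCP connect latency to each
copy and picks the lowest. It is a simple stand-in for whatever selection mechanism
the paper actually uses, not the authors' implementation.

    import socket
    import time

    # Hypothetical replica hostnames; stand-ins, not from the paper.
    REPLICAS = [
        "us.replica.example.com",
        "eu.replica.example.com",
        "asia.replica.example.com",
    ]

    def probe_rtt(host, port=80, timeout=2.0):
        """Return the TCP connect time to one replica, or None if unreachable."""
        start = time.monotonic()
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return time.monotonic() - start
        except OSError:
            return None

    def best_replica(hosts):
        """Choose the reachable replica with the lowest measured latency."""
        timings = [(probe_rtt(h), h) for h in hosts]
        reachable = [(rtt, h) for rtt, h in timings if rtt is not None]
        return min(reachable)[1] if reachable else None

    if __name__ == "__main__":
        print("best copy:", best_replica(REPLICAS))

For the selection to be transparent to unmodified browsers, as the abstract requires,
such logic would have to live outside the client, for example at name-resolution time,
so that ordinary browsers benefit without modification.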
1. Introduction
The explosion of the web has led to a situation where a majority of the traffic on the
Internet is web-related.

Source: Amir, Yair - Department of Computer Science, Johns Hopkins University

Collections: Computer Technologies and Information Sciences