by Chris Adams and Mark L. Chambers
Once you've designed the content and structure of your Web site or company Intranet, it's tempting to jump directly into the hands-on work: creating HTML, writing Java or CGI code, adding graphics, testing your site, and so on. After all, the design phase is finished, isn't it?
Once your site is up and running, you may be satisfied that it works as planned, but have you really done everything necessary to ensure that your users can retrieve information from it as efficiently as possible? Are you using the right hardware, software, and design techniques to handle a large number of hits without your site slowing to a crawl? In this chapter, we'll discuss how you can optimize your site design for the fastest speeds and highest capacity, even before you go online!
The idea behind optimization is simple in theory; you need to streamline each component of your site for fastest access time and lowest system load by intelligently using the resources you have. Any user on the Internet knows just how important efficiency is; transfer speeds have plummeted as the Internet has grown to a size its designers would have thought impossible. If your site is slow as molasses, your users will find the information they're looking for elsewhere or even give up the search entirely. Even if your site is currently working well under full load, optimization is still important. Even a 10 percent gain in efficiency will improve service to every one of your users!
As a first step in optimizing your site design, you must determine who your target audience is. Unfortunately, those who will access your site are not created equal. The resources available to Internet users vary widely:
When you design your site, you must consider which section of the market is most important. One approach is simply to appeal to the lowest common denominator and use a minimum of extras on your site: no animated .gif images, forms, or frames, and no MIDI music. Don't force multimedia or advanced HTML on your users by default!
However, your target audience may determine whether your site must use the most advanced features; for example, a company that develops multimedia games may decide the company site needs the impact of animated graphics; a record company will probably want sound clips from its latest offerings available on its Web pages.
Tip
You can limit the impact of multimedia on users whose PCs aren't powerful enough to display it, and on users who simply don't want to wait the extra time. Offer multimedia extras on an alternate menu system and let the user pick text or graphical mode on your site's welcome page. If you'd rather keep a single menu system, keep graphics small and offer hotspots that play sound bites or video clips.
Your design and optimization process will inevitably lead to trade-offs. You can add features, but you will run the risk of locking out many potential users. After you have determined which users you are targeting and decided what resources are available on your end, you can assign a relative priority to the following considerations:
It's important to look at your projected site as your users will. Follow the same usability procedures that software developers follow; test your site using a computer similar to the average machine available to your target user. Try your site with several browsers. Have friends or coworkers use the site. What may seem intuitive to a computer programmer may be confusing to others. Watch the reactions of your test users and ask why they paused at certain menus or seemed uncertain about what to do.
Suppose that you are planning a corporate Web site to provide marketing material and technical support for a company that creates commercial animation. Based on reviews of competitors' sites, your site design has enthusiastic support from high-level management. The advertising department has just created a very good promotional video, and one of the VPs has asked you to consider putting an excerpt on the Web site. What would you do?
Establish the priorities:
Based on the above, you should recommend to management a T1 or faster connection hooked up to a fast server. The material online should be more than just raw technical data; you should have enough variety to convince a potential customer that you are delivering a complete solution to their needs. Demonstrating high-quality support is also vital.
As for the video, there's no reason why you shouldn't include it on the site; your target audience has the equipment to view it. However, make sure that you identify its size in megabytes and its video format (.avi or .mov) so that users can decide for themselves whether they want to spend the time to download it. As we mentioned earlier, never force multimedia as an automatic default! Illustrate the video link with a few high-quality captured images as well so that potential customers get a preview of what the video contains.
After prioritizing, your Web site has already taken on a certain character. It will have some high-quality images to better demonstrate a graphics product and will provide solid information and support. Those users with more powerful PCs and faster connections can see the best your company has to offer. Clarifying the focus of the site early simplifies the rest of the design phase.
No matter what you end up putting on your site, a few techniques can offer dramatic improvements. These generic techniques also demonstrate the mindset necessary for successful optimization.
When you design your site, you have many goals and considerations in mind. Every decision carries an expense with it. This expense may be in the form of increased bandwidth, higher end-user requirements, server slow-downs, or increased storage space requirements. When optimizing your Web site, you must consider the relative expenses of each decision. Does the added functionality outweigh the performance hit?
In a typical environment, everyone connected to the organization will have features they feel must be present. The trouble with these dream sites is that many of the most requested features tend to be the most expensive in terms of efficiency. After the first version of the site has been created, consider it from an outside perspective. Are the features still worth it? The easiest way to improve a site involves trimming the features lowest on value and highest on expense.
Suppose that you decided it would be worthwhile to include a real-time Real Audio feed on your radio station's home page. Now your server is on its knees under the strain of handling the site. Time to consider whether that audio is still worth the trouble!
Other choices might not be so easy. To help you make these choices, ask users for their input. Having a prominently located mailto: link to a comments e-mail address is one idea. Better yet would be a form allowing users to rate their favorite features on a scale of one to ten. Such a list could be implemented as shown in Figure 46.1. The code for this form is in Listing 46.1.
Figure 46.1 : This form was created with the HTML code in Listing 46.1.
Listing 46.1. The HTML source code for a user survey designed to rate favorite features of a Web site.
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2//EN">
<HTML>
<HEAD>
<TITLE>Features Survey</TITLE>
<META NAME="GENERATOR" CONTENT="Mozilla/3.0b6Gold (Win16; I) [Netscape]">
</HEAD>
<BODY>
<FORM METHOD="POST" ACTION="http://www.megafoobar.com/cgi-bin/post-query">
<P>Please rate the following features:</P>
<P>Live Video Camera of our Accounting Department:<BR>
<INPUT TYPE="radio" NAME="video" VALUE="bad" CHECKED>Hoo-boy, does it stink
<INPUT TYPE="radio" NAME="video" VALUE="soso">So-so
<INPUT TYPE="radio" NAME="video" VALUE="okay">It's okay
<INPUT TYPE="radio" NAME="video" VALUE="good">It's good
<INPUT TYPE="radio" NAME="video" VALUE="great">I love it!</P>
<P>Document Search Engine:<BR>
<INPUT TYPE="radio" NAME="searchengine" VALUE="bad">Hoo-boy, does it stink
<INPUT TYPE="radio" NAME="searchengine" VALUE="soso">So-so
<INPUT TYPE="radio" NAME="searchengine" VALUE="okay">It's okay
<INPUT TYPE="radio" NAME="searchengine" VALUE="good" CHECKED>It's good
<INPUT TYPE="radio" NAME="searchengine" VALUE="great">I love it!</P>
<P>Hypertext Source Code Examples:<BR>
<INPUT TYPE="radio" NAME="sourcecode" VALUE="bad">Hoo-boy, does it stink
<INPUT TYPE="radio" NAME="sourcecode" VALUE="soso">So-so
<INPUT TYPE="radio" NAME="sourcecode" VALUE="okay">It's okay
<INPUT TYPE="radio" NAME="sourcecode" VALUE="good">It's good
<INPUT TYPE="radio" NAME="sourcecode" VALUE="great" CHECKED>I love it!</P>
<P>Technical Support email links:<BR>
<INPUT TYPE="radio" NAME="supportemail" VALUE="bad">Hoo-boy, does it stink
<INPUT TYPE="radio" NAME="supportemail" VALUE="soso">So-so
<INPUT TYPE="radio" NAME="supportemail" VALUE="okay" CHECKED>It's okay
<INPUT TYPE="radio" NAME="supportemail" VALUE="good">It's good
<INPUT TYPE="radio" NAME="supportemail" VALUE="great">I love it!</P>
<P>Links to 3rd Party Information:<BR>
<INPUT TYPE="radio" NAME="thirdparty" VALUE="bad">Hoo-boy, does it stink
<INPUT TYPE="radio" NAME="thirdparty" VALUE="soso">So-so
<INPUT TYPE="radio" NAME="thirdparty" VALUE="okay">It's okay
<INPUT TYPE="radio" NAME="thirdparty" VALUE="good" CHECKED>It's good
<INPUT TYPE="radio" NAME="thirdparty" VALUE="great">I love it!</P>
<P>User surveys:<BR>
<INPUT TYPE="radio" NAME="surveys" VALUE="bad">Hoo-boy, does it stink
<INPUT TYPE="radio" NAME="surveys" VALUE="soso">So-so
<INPUT TYPE="radio" NAME="surveys" VALUE="okay">It's okay
<INPUT TYPE="radio" NAME="surveys" VALUE="good">It's good
<INPUT TYPE="radio" NAME="surveys" VALUE="great" CHECKED>I love it!</P>
<P><INPUT TYPE="submit" NAME="Send" VALUE="Send"></P>
</FORM>
</BODY>
</HTML>
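The form in Listing 46.1 posts its results to a CGI program at /cgi-bin/post-query, which is not shown in the listing. As a rough illustration of what such a handler could look like, here is a minimal Python sketch; the script name, tally file location, and response page are assumptions for this example, not part of the original site:

#!/usr/bin/env python3
# survey-tally.py -- hypothetical CGI handler for the survey in Listing 46.1.
# Reads the POSTed form data from standard input and appends one line per
# rating to a plain-text tally file for later analysis.

import os
import sys
import time
from urllib.parse import parse_qs

TALLY_FILE = "/var/tmp/survey-tally.txt"   # assumed location

def main():
    length = int(os.environ.get("CONTENT_LENGTH", 0) or 0)
    body = sys.stdin.read(length)
    fields = parse_qs(body)

    stamp = time.strftime("%Y-%m-%d %H:%M:%S")
    with open(TALLY_FILE, "a") as tally:
        for feature, values in fields.items():
            for rating in values:
                tally.write("%s\t%s\t%s\n" % (stamp, feature, rating))

    # Every CGI response starts with a header block followed by a blank line.
    sys.stdout.write("Content-Type: text/html\r\n\r\n")
    sys.stdout.write("<HTML><BODY>Thanks for rating our features!</BODY></HTML>\n")

if __name__ == "__main__":
    main()

Because the ratings land in a simple tab-separated file, tallying the results later is a one-line sort or a short spreadsheet import.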
Avoid putting complex graphics on a feedback page. Although it may be attractive, the excessive load time could scare off potential users, and getting an accurate sample is vital! With the information you receive, deciding what to trim should be easy. Don't be afraid to completely redo the site, however, if it turns out that your users will be significantly better served. On the Web, potential users can pick what suits them best; if you don't provide what they're looking for, someone else will.
One of the most powerful tools in any programmer's kit is a profiler. A profiler tracks the time it takes a certain section of code to execute, and examining a profiler log shows where the bottlenecks within a program lie. Used wisely, profiling can be one of the most important tools for a Webmaster as well: it tells you what to optimize, and on a large site, merely knowing where to start is a great help.
To profile your Web site, check to see whether your Web server includes usage log capabilities; most servers do. Examine the logs over a statistically useful period, perhaps a week or two, so that minor abnormalities average out. It's also helpful to see whether your server can log the total number of bytes sent over that period; this lets you express each resource's load as a percentage of the total.
When you have your processed server logs, examine them carefully and identify the slowest resources. There are utilities that can help you sort through a log file; capabilities like sorting by transfer type, location, or speed are invaluable. Generally, if you have any one resource consuming a significant amount of time or space, you should reconsider whether it's worth the expense.
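How you process the logs depends on your server, but the idea is simple enough to sketch. The following Python fragment assumes the server writes its access log in the widely used Common Log Format; it totals hits and bytes per resource and prints the heaviest items first. The log path and the report length are assumptions for this example.

# profile-log.py -- rough sketch of profiling a Web server access log.
# Assumes Common Log Format: host ident user [date] "request" status bytes

import re
from collections import defaultdict

LOG_FILE = "access.log"   # assumed path to your server's log
LINE = re.compile(r'"(?:GET|POST|HEAD) (\S+)[^"]*" \d{3} (\d+|-)')

hits = defaultdict(int)
bytes_sent = defaultdict(int)

with open(LOG_FILE) as log:
    for line in log:
        match = LINE.search(line)
        if not match:
            continue
        resource, size = match.groups()
        hits[resource] += 1
        if size != "-":
            bytes_sent[resource] += int(size)

total = sum(bytes_sent.values()) or 1
# Report the resources responsible for the most traffic, heaviest first.
for resource in sorted(bytes_sent, key=bytes_sent.get, reverse=True)[:20]:
    share = 100.0 * bytes_sent[resource] / total
    print("%6d hits %10d bytes (%4.1f%%)  %s"
          % (hits[resource], bytes_sent[resource], share, resource))

A report like this makes the next step obvious: the handful of files at the top of the list are the ones worth shrinking, caching, or moving to another machine.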
Load reduction involves looking for a more efficient substitute for existing resources. For instance, it may look good if you use Real Audio data streams. Certainly, it's an improvement over using a .wav file. Or is it? A .wav file can be cached. As soon as it is transferred (at the full bandwidth of the connection), the server's job is done. Playback characteristics are entirely dependent on the local computer, not the remote server or a network link. Any compression or processing is done once, eliminating a potential performance hit. Plus, a .wav file is usually much higher fidelity than a Real Audio feed!
If your existing server is too slow, you may be able to off-load most of the processing to the client side, as shown in this example. In this day of Pentium-level client machines, this solution is certainly feasible. The following sections deal with some potential trouble spots you may want to examine. It would be a good idea to use profiling techniques to see which of these items, if any, are important concerns on your Web site.
You can reduce a lot of overhead by converting images to different file formats, so picking the right format is very important. By altering the bit depth (the number of bits stored to record the color value of each pixel), you may reduce file size considerably. By default, most scanners produce 24-bit images, which store three bytes per pixel; a single 640 by 480 image therefore occupies 921,600 bytes uncompressed. In many cases, the same image can be reduced to 16-bit (two bytes per pixel) or 8-bit (one byte per pixel) mode, cutting the raw size by one third or two thirds. Further reduction may be possible by using the right dithering algorithm. Dithering approximates colors by displaying clusters of pixels drawn from a small palette instead of giving every pixel its own exact color; it reduces the sharpness of an image, but it also reduces the number of colors needed.
Compression is also very important. .gif files use a form of compression that is well-suited for work with computer-generated images (which contain large areas of a single color). .jpg files, which use the JPEG compression scheme, are best for working with scanned images; in Figures 46.2 and 46.3, you can see the effects of compression. JPEG compression discards the least noticeable information for dramatic savings. For instance, the image in Figure 46.3 is only five percent of the size of the bitmap file shown in Figure 46.2! JPEG has a quality factor ranging from 1 to 100; at level 100, almost no compression is performed, and the maximum amount of detail is preserved. At level 1, compression is high, but there is noticeable degradation.
As you can see from the examples shown in Figures 46.2 and 46.3, picking the right image format can save considerable space and time. Converting image formats and reducing the size of images with a program such as Paint Shop Pro or Adobe Photoshop can result in further time savings for those using your site.
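If you have many images to convert, the same reductions can be scripted rather than performed one at a time in a paint program. The following Python sketch uses the Pillow imaging library (an assumption on our part; any batch-capable image tool will do) to cut a 24-bit scan down to a dithered 8-bit GIF and a moderately compressed JPEG. The file names and the quality setting are illustrative only.

# shrink-image.py -- reduce bit depth and recompress an image for the Web.
# Requires the Pillow imaging library (pip install Pillow).

from PIL import Image

SOURCE = "scan.bmp"          # assumed 24-bit source image

img = Image.open(SOURCE).convert("RGB")

# 8-bit adaptive palette with Floyd-Steinberg dithering: well suited to
# computer-generated art and screen shots, saved here as a GIF.
palette_img = img.convert(
    "P", palette=Image.ADAPTIVE, colors=256, dither=Image.FLOYDSTEINBERG)
palette_img.save("scan.gif")

# JPEG with a mid-range quality factor: better for scanned photographs.
# Lower the quality number for smaller files, raise it for more detail.
img.save("scan.jpg", quality=60)

Run against a directory of scans, a script like this pays for itself the first time you update the site's image library.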
CGI scripts allow you to do almost anything, but this functionality comes at a cost. Figure 46.4 is a diagram of a CGI process; notice that all but the first and last two steps are performed on the server. With a CGI program, even a small optimization is a big improvement. For proof, just multiply a one-percent increase in speed by a million users.
Figure 46.4 : A typical CGI setup.
The following are some general rules concerning CGI optimization:
Consider changing the language your CGI program is written in. Picking a more efficient compiler or interpreter can lead to big savings in time and memory.
Optimize your CGI program for speed; make sure that all unneeded steps are removed. Depending on your machine, even something as simple as removing most screen output can be important. Cache any results you use more than once; a short sketch of this technique appears after this list.
If possible, configure your CGI processes to run on a separate machine. If the Web server isn't bogged down running CGI scripts, it will immediately be more responsive.
Depending on what your CGI script does, you may be able to replace it with a Java applet or even some JavaScript or VBScript. Off-loading processing to the client will save the server machine considerable work.
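As an illustration of the caching rule mentioned above, here is a minimal Python sketch of a CGI program that regenerates an expensive report only when its cached copy is more than five minutes old. The report contents, cache location, and timing are all assumptions for this example.

#!/usr/bin/env python3
# cached-report.py -- hypothetical CGI program that caches its own output.

import os
import sys
import time

CACHE_FILE = "/var/tmp/report-cache.html"   # assumed location
MAX_AGE = 300                               # regenerate after 5 minutes

def build_report():
    # Stand-in for the expensive work (database queries, file scans, ...).
    rows = "".join("<LI>Item %d</LI>" % n for n in range(10))
    return "<HTML><BODY><H1>Report</H1><UL>%s</UL></BODY></HTML>" % rows

def cached_report():
    try:
        age = time.time() - os.path.getmtime(CACHE_FILE)
        if age < MAX_AGE:
            with open(CACHE_FILE) as cached:
                return cached.read()
    except OSError:
        pass                                 # no cache yet; fall through
    page = build_report()
    with open(CACHE_FILE, "w") as cached:
        cached.write(page)
    return page

sys.stdout.write("Content-Type: text/html\r\n\r\n")
sys.stdout.write(cached_report())

If a thousand users request the report in a five-minute window, the expensive work runs once instead of a thousand times; only the cheap file read is repeated.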
On any site, there are some files that get used more frequently than others, such as backgrounds or pictures on the first page. If you supply a custom plug-in or ActiveX control to view some of your content, this file will be downloaded quite frequently.
With such high-demand files, even the best attempts at optimization may not take enough load off of the server. Consider adding a separate machine that is devoted to serving these high-traffic items. Because of the nature of the Web, you can easily change your pages to reflect a new location. You may also want to consider establishing a mirror site, which is a site that contains identical contents but is located on another machine. Depending on the scale of your site, it may be best to have a server for certain geographic locations, particularly since the slowest links are often those that connect different continents or widely separated locations.
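If you want to measure the effect of such an arrangement before committing to new hardware, a throwaway static-file server on a spare machine is enough for an experiment. The following Python sketch serves a single directory of high-traffic items using only the standard library; the directory path and port number are assumptions.

# static-server.py -- minimal sketch of a dedicated server for high-traffic
# files (backgrounds, plug-in downloads, and so on).

from functools import partial
from http.server import HTTPServer, SimpleHTTPRequestHandler

HOT_FILES_DIR = "/export/hot-files"   # assumed directory of popular items
PORT = 8080                           # assumed port

handler = partial(SimpleHTTPRequestHandler, directory=HOT_FILES_DIR)
server = HTTPServer(("", PORT), handler)
print("Serving %s on port %d" % (HOT_FILES_DIR, PORT))
server.serve_forever()

Your pages then point their image and download links at the new machine's address, which is exactly the kind of easy page change mentioned above.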
After you've tried all the various techniques mentioned in the last section, you may still need to upgrade the server machine. Before changing anything, however, find out where the greatest performance hit is coming from! In this section, we discuss physical changes you can make to the server itself to increase performance.
Not all computers are created equal (advertising claims to the contrary)! There are well-documented examples showing that a better-quality system can be significantly faster than a seemingly identical machine. As an example, this chapter was composed on a 100 MHz Pentium with a fast hard drive and a PCI video card; several benchmarks have rated its performance as on par with some 133 MHz Pentiums.
On a server, the most important feature is high bandwidth. On many servers, the capability to pass large amounts of data to and from the disk subsystem is as important as the speed of the network connection. One factor in disk speed is the drive interface. In the desktop world, there are two main types of drives: EIDE/IDE and SCSI. SCSI is usually the better choice for a server: a SCSI bus can handle requests for several devices at once, whereas with more than one (E)IDE device on a chain you can access only one drive at a time, and advanced SCSI drives support multiple outstanding transactions as well. All of this increases speed while serving multiple users. Finally, SCSI drives tend to be aimed at the high-performance market; some capabilities simply aren't available in EIDE.
A RAID (Redundant Array of Inexpensive Disks) controller physically connects several drives but presents a single logical device to the operating system. RAID levels can be selected to increase speed (RAID level 0) or to provide redundancy and fault tolerance (RAID levels 1 through 5), which is very important for mission-critical machines. In general, the maximum throughput of a RAID 0 array is the number of drives times the speed of the slowest drive; a two-disk RAID 0 array built from 2.5MB/sec drives can be expected to deliver around 5MB/sec. Couple this with the large volume sizes a RAID array makes possible, and an array of six or more drives becomes a compelling option. If you have configured your drives to use something other than RAID 0 (which provides no redundancy) and your controller supports it, you can even replace a failed drive without powering down the server, a feature that can be important if your server must be up 24 hours a day, 7 days a week.
Caching will dramatically improve performance on a typical Web site, where certain resources are used far more than others. Many advanced disk controllers come with on-board cache memory; upgrade this memory to the largest size supported. In addition, because your server machine should have plenty of RAM (more on this in a moment), be generous when configuring the operating system disk cache.
More RAM will give you the largest available performance increase. Your server should contain as much RAM as the motherboard supports. More RAM generally provides a higher performance increase than even a next-generation processor upgrade. Also, some types of RAM are faster than others. Extended Data-Out (EDO) is currently the best choice, but technologies like Synchronous DRAM (SDRAM) are potentially much faster.
Symmetric Multiprocessing (SMP) machines contain several CPUs that share a common memory space. Generally, adding a second processor to an SMP machine provides a performance boost of at least 75 percent, which is cheap compared to the cost of equipping an auxiliary server. To use this capability, your operating system needs to support SMP; current SMP-capable operating systems include Windows NT, OS/2, Novell NetWare, Macintosh, and many UNIX variants. Remember that not all operating systems are created equal: in a recent PC Week benchmark test, a beta version of IBM's OS/2 Warp Server SMP was up to 4.5 times faster than Windows NT and 30 times faster than Novell NetWare on identical hardware.
If you have more than one machine available for use as a server and your network is sufficiently fast, you may also be able to link multiple servers across your network. This setup spreads the demand for file access and program execution across more than one machine. The end result is similar to the dedicated CGI machine mentioned earlier: a faster site with a more balanced workload on each server.
Network connections come in many different flavors, and each link in the network connection presents its own set of potential bottlenecks. Profiling techniques pay off here as well, so it's time to examine those logs again. In many cases, the connection to the Internet will be the slowest point, and investing in a T1 (1.544 Mbps) or T3 (45 Mbps) connection can pay off through improved speed. Remember, if you are running a commercial site, it's wise to use a high-quality, high-speed connection.
The gateway is next on the list of bottlenecks. A gateway machine or router must be capable of handling every byte transferred beyond the LAN, so this is not the place to skimp! The security provided by a sophisticated firewall also requires a hefty machine to run on; if a firewall's hardware requirements seem excessive, you may want to evaluate other firewall packages.
Finally, the LAN can be a problem. If your LAN has not only the server but also a significant number of users, peak usage may exceed the capacity of your underlying link. This problem will not only slow site access but may also reduce local productivity. Isolating the server by connecting it directly to the gateway/firewall may be necessary. In any case, a fast fiber-optic network will certainly be welcome.
What if you find that you need a faster connection than you can afford? For most sites, there will be a certain time (or times) of the day where usage is the highest. During the rest of the day, usage is considerably lower. If the normal usage is well within the capabilities of your connection, but the peak load is well beyond them, try one of the following solutions:
The range of features offered in server software varies widely, and each software publisher is continuously expanding this range. Unfortunately, this variety means that performance will also vary just as widely. Before setting up a Web site, you should evaluate as many different servers (or server suites) as possible.
What should you compare when evaluating server software? Regardless of what service it provides (HTTP, FTP, Gopher, and so on), you need to consider each of the following issues:
Anyone using the Internet has noticed how slow some areas have become. The Web has been the heaviest load on the Internet for quite some time, and there is no indication that this will change in the future. This situation is a direct result of the way the Web developed: because it grew in popularity so rapidly, its protocols evolved with more concern for functionality and development speed than for efficiency. Now that things have cooled down somewhat, more thought is being given to these neglected issues. One of the most promising technologies involves making caching an intrinsic part of the protocol.
Consider a popular site such as Yahoo!, Excite, or Netscape; these sites can be almost impossible to use at times because of the number of simultaneous users. However, the vast majority of these users are accessing only a few specific pages. How many people use more than the query form at AltaVista? Figure 46.5 shows what a typical network map of a subsection of the Internet might look like. Every network link is a different speed; in some cases, a long series of faster links might be worse than a shorter slow link.
Without caching, when any of the six clients requests a file off of the server, the same process is completed:
Every step, with the possible exception of step 4, involves a network transmission across the entire link between the client and the server. There are a few spots, such as routers or bridges, where connections will bottleneck. The end result is that a connection between Client 2 and the server slows down every connection with the other clients, even Client 1!
Next, imagine that several network administrators have implemented a cache system. The regional network, the Internet service provider (ISP), and Client 4's subnet have all added cache servers. When Client 5 asks for the main page of the Web site on the server, it is routed through the high-speed connection to the regional network. The cache servers at the ISP and regional network add the IP address and the page to their cache records.
A few minutes later, Client 2 asks for the same page. The requested page is loaded directly off of the regional network's cache server, avoiding the traffic caused by Client 1 and the other machines on the local network. Later, Client 6 initiates the same request. Because the ISP is part of the cache hierarchy, the request is met locally. However, by the time Client 4 asks for the page, the ISP machine has already purged the local copy, so it is reloaded from the next level, the regional network.
Finally, Client 3 requests the same page from this popular site. Normally, this request would go through the two remote networks and the router directly to the server, but the second network has been suffering from a partial overload, so the routing software sends the request the other way instead. Ordinarily the ISP's cache would have quickly returned the page; unfortunately, the copy has now been sitting in the cache for an hour, and the cache software has expired it. This time the page must be reloaded from the server, so the entire request chain is followed. Even so, with the caching software installed, the server handled only the local load and two long-distance requests. Caching allowed faster access for all but the first and last users.
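The logic behind such a cache is straightforward to sketch. The following Python fragment models the lookup order of the walkthrough above, with each cache level expiring pages after an hour and falling back first to its parent cache and then to the origin server. The class name, the one-hour lifetime, and the example wiring are assumptions chosen to match the scenario, not part of any real cache package.

# cache-hierarchy.py -- sketch of a hierarchical page cache with expiry.

import time

EXPIRY = 3600          # seconds a cached page stays valid (one hour)

class CacheNode:
    """One level in the hierarchy: a subnet, ISP, or regional cache."""

    def __init__(self, name, parent=None, origin_fetch=None):
        self.name = name
        self.parent = parent                # next cache level up, if any
        self.origin_fetch = origin_fetch    # used only at the top level
        self.store = {}                     # url -> (fetch_time, page)

    def get(self, url):
        entry = self.store.get(url)
        if entry is not None:
            fetched, page = entry
            if time.time() - fetched < EXPIRY:
                print("%s: cache hit for %s" % (self.name, url))
                return page
            del self.store[url]             # expired; treat as a miss
        # Miss: ask the next level up, or go all the way to the server.
        if self.parent is not None:
            page = self.parent.get(url)
        else:
            print("%s: fetching %s from origin server" % (self.name, url))
            page = self.origin_fetch(url)
        self.store[url] = (time.time(), page)
        return page

# Example wiring that mirrors the scenario above (names are invented):
regional = CacheNode("regional", origin_fetch=lambda url: "<HTML>page</HTML>")
isp = CacheNode("isp", parent=regional)
print(isp.get("http://www.megafoobar.com/"))   # miss at both levels
print(isp.get("http://www.megafoobar.com/"))   # served from the ISP cache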
You might have noticed the hierarchical nature of the cache system. Because of the way the cache is designed, a failed local request (at the ISP) went only to the next level (the regional network) instead of going all the way to the server. Coupling this with advanced routing software can yield substantially faster access patterns.
This example was not idle speculation, either. According to the results of "A Case for Caching File Objects Inside Internetworks" (Peter Danzig, Michael Schwartz, and Richard Hall, ACM SIGCOMM '93, Sept. 1993), hierarchical caching could reduce FTP traffic by 50 percent. Similar results are projected for the Web.
The Harvest Object cache is a hierarchical system that has been in operation for over two years at around 100 sites. Tests conducted by its authors show that the Harvest cache system can be up to an order of magnitude faster than the standard CERN cache. The Harvest cache represents the work of Anawat Chankhunthod, Chuck Neerdaels, and Peter Danzig of the University of Southern California and Michael Schwartz and Duane Wessels of the University of Colorado at Boulder. For more information about this cache, point your Web browser to http://excalibur.usc.edu/ at the University of Southern California.
In this chapter, you've learned the basic rules of optimization: