The original top-ten mistakes of Web design

1. Using Frames

Splitting a page into frames is very confusing for users since frames break the fundamental user model of the web page. All of a sudden, you cannot bookmark the current page and return to it (the bookmark points to another version of the frameset), URLs stop working, and printouts become difficult. Even worse, the predictability of user actions goes out the door: who knows what information will appear where when you click on a link?

2. Gratuitous Use of Bleeding-Edge Technology

Don't try to attract users to your site by bragging about use of the latest web technology. You may attract a few nerds, but mainstream users will care more about useful content and your ability to offer good customer service. Using the latest and greatest before it is even out of beta is a sure way to discourage users: if their system crashes while visiting your site, you can bet that many of them will not be back. Unless you are in the business of selling Internet products or services, it is better to wait until some experience has been gained with respect to the appropriate ways of using new techniques. When desktop publishing was young, people put twenty fonts in their documents: let's avoid similar design bloat on the Web.

As an example: Use VRML if you actually have information that maps naturally onto a three-dimensional space (e.g., architectural design, shoot-them-up games, surgery planning). Don't use VRML if your data is N-dimensional since it is usually better to produce 2-dimensional overviews that fit with the actual display and input hardware available to the user.

3. Scrolling Text, Marquees, and Constantly Running Animations

Never include page elements that move incessantly. Moving images have an overpowering effect on the human peripheral vision. A web page should not emulate Times Square in New York City in its constant attack on the human senses: give your user some peace and quiet to actually read the text!

Of course, BLINK is simply evil. Enough said.

4. Complex URLs

Even though machine-level addressing like the URL should never have been exposed in the user interface, it is there and we have found that users actually try to decode the URLs of pages to infer the structure of web sites. Users do this because of the horrifying lack of support for navigation and sense of location in current web browsers. Thus, a URL should contain human-readable directory and file names that reflect the nature of the information space.

Also, users sometimes need to type in a URL, so try to minimize the risk of typos by using short names with all lower-case characters and no special characters (many people don't know how to type a ~).

5. Orphan Pages

Make sure that all pages include a clear indication of what web site they belong to since users may access pages directly without coming in through your home page. For the same reason, every page should have a link up to your home page as well as some indication of where it fits within the structure of your information space.

6. Long Scrolling Pages

Only 10% of users scroll beyond the information that is visible on the screen when a page comes up. All critical content and navigation options should be on the top part of the page. Note added December 1997: More recent studies show that users are more willing to scroll now than they were in the early years of the Web. I still recommend minimizing scrolling on navigation pages, but it is no longer an absolute ban.

7. Lack of Navigation Support

Don't assume that users know as much about your site as you do. They always have difficulty finding information, so they need support in the form of a strong sense of structure and place. Start your design with a good understanding of the structure of the information space and communicate this structure explicitly to the user. Provide a site map and let users know where they are and where they can go. Also, you will need a good search feature since even the best navigation support will never be enough.

8. Non-Standard Link Colors

Links to pages that have not been seen by the user are blue; links to previously seen pages are purple or red. Don't mess with these colors, since the ability to see which links have been followed is one of the few navigational aids that is standard in most web browsers. Consistency is key to teaching users what the link colors mean.
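If you must declare link colors at all, a minimal sketch is to restate the browser defaults rather than invent new ones (the hex values below approximate the common defaults; the page content is illustrative):

```html
<!-- Illustrative page body: link/vlink restate the typical browser
     defaults (approximate values); omitting them works just as well. -->
<body link="#0000EE" vlink="#551A8B">
  <a href="products.html">Products</a>       <!-- unvisited: blue -->
  <a href="archive.html">Article archive</a> <!-- visited: purple -->
</body>
```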

9. Outdated Information

Budget to hire a web gardener as part of your team. You need somebody to root out the weeds and replant the flowers as the website changes, but most people would rather spend their time creating new content than on maintenance. In practice, maintenance is a cheap way of enhancing the content on your website, since many old pages keep their relevance and should be linked into the new pages. Of course, some pages are better off being removed completely from the server after their expiration date.

10. Overly Long Download Times

I am placing this issue last because most people already know about it; not because it is the least important. Traditional human factors guidelines indicate 10 seconds as the maximum response time before users lose interest. On the web, users have been trained to endure so much suffering that it may be acceptable to increase this limit to 15 seconds for a few pages.

Even websites with high-end users need to consider download times: we have found that many of our customers access Sun's website from home computers in the evening because they are too busy to surf the web during working hours. Bandwidth is getting worse, not better, as the Internet adds users faster than the infrastructure can keep up.

The top-ten new mistakes of Web design

1. Breaking or Slowing Down the Back Button

The Back button is the lifeline of the Web user and the second-most used navigation feature (after following hypertext links). Users happily know that they can try anything on the Web and always be saved by a click or two on Back to return them to familiar territory.

Except, of course, for those sites that break Back by committing one of these design sins:

  • opening a new browser window (see mistake #2)
  • using an immediate redirect: every time the user clicks Back, the browser returns to a page that bounces the user forward to the undesired location
  • preventing caching, such that Back navigation requires a fresh trip to the server; all hypertext navigation should be sub-second, and this goes double for backtracking

    2. Opening New Browser Windows

    Opening up new browser windows is like a vacuum cleaner salesperson who starts a visit by emptying an ashtray on the customer's carpet. Don't pollute my screen with any more windows, thanks (particularly since current operating systems have miserable window management). If I want a new window, I will open it myself!

    Designers open new browser windows on the theory that it keeps users on their site. But even disregarding the user-hostile message implied in taking over the user's machine, the strategy is self-defeating since it disables the Back button which is the normal way users return to previous sites. Users often don't notice that a new window has opened, especially if they are using a small monitor where the windows are maximized to fill up the screen. So a user who tries to return to the origin will be confused by a grayed out Back button.

    3. Non-Standard Use of GUI Widgets

    Consistency is one of the most powerful usability principles: when things always behave the same, users don't have to worry about what will happen. Instead, they know what will happen based on earlier experience. Every time you release an apple over Sir Isaac Newton, it will drop on his head. That's good.

    The more users' expectations prove right, the more they will feel in control of the system and the more they will like it. And the more the system breaks users' expectations, the more they will feel insecure. Oops, maybe if I let go of this apple, it will turn into a tomato and jump a mile into the sky.

    Interaction consistency is an additional reason it's wrong to open new browser windows: the standard result of clicking a link is that the destination page replaces the origination page in the same browser window. Anything else is a violation of the users' expectations and makes them feel insecure in their mastery of the Web.

    Currently, the worst consistency violations on the Web are found in the use of GUI widgets such as radio buttons and checkboxes. The appropriate behavior of these design elements is defined in the Windows UI standard, the Macintosh UI standard, and the Java UI standard. Which of these standards to follow depends on the platform used by the majority of your users (good bet: Windows), but it hardly matters for the most basic widgets since all the standards have close-to-identical rules.

    For example, the rules for radio buttons state that they are used to select one among a set of options but that the choice of options does not take effect until the user has confirmed the choice by clicking an OK button. Unfortunately, I have seen many websites where radio buttons are used as action buttons that have an immediate result when clicked. Such wanton deviations from accepted interface standards make the Web harder to use.
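    As a sketch of the rule, assuming a hypothetical subscription form (the action URL and field names are invented), the radio buttons merely record a choice, and nothing happens until the user clicks OK:

```html
<!-- Hypothetical form: selecting a radio button has no immediate
     effect; the choice is committed only by the OK (submit) button. -->
<form action="/subscribe" method="post">
  <p>Delivery format:</p>
  <input type="radio" name="format" value="html" checked> HTML<br>
  <input type="radio" name="format" value="text"> Plain text<br>
  <input type="submit" value="OK">
</form>
```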

    4. Lack of Biographies

    My first Web studies in 1994 showed that users want to know the people behind information on the Web. In particular, biographies and photographs of the authors help make the Web a less impersonal place and increase trust. Personality and point-of-view often win over anonymous bits coming over the wire.

    Yet many sites still don't use columnists and avoid by-lines on their articles. Even sites with by-lines often forget the link to the author's biography and a way for the user to find other articles by the same author.

    It is particularly bad when a by-line is made into a mailto: link instead of a link to the author's biography. Two reasons:

  • it is much more common for a reader to want to know more about an author (including finding the writer's other articles) than it is for the reader to want to contact the author - sure, contact info is often a good part of the biography, but it should not be the primary or only piece of data about the author
  • it breaks the conventions of the Web when clicking on blue underlined text spawns an email message instead of activating a hypertext link to a new page; such inconsistency reduces usability by making the Web less predictable

    5. Lack of Archives

    Old information is often good information and can be useful to readers. Even when new information is more valuable than old information, there is almost always some value to the old stuff, and it is very cheap to keep it online. I estimate that having archives may add about 10% to the cost of running a site but increase its usefulness by about 50%.

    Archives are also necessary as the only way to eliminate linkrot and thus encourage other sites to link to you.

    6. Moving Pages to New URLs

    Anytime a page moves, you break any incoming links from other sites. Why hurt the people who send you free customer referrals?

    7. Headlines That Make No Sense Out of Context

    Headlines and other microcontent must be written very differently for the Web than for old media: they are actionable items that serve as UI elements and should help users navigate.

    Headlines are often removed from the context of the full page and used in tables of content (e.g., home pages or category pages) and in search engine results. In either case the writing needs to be very plain and meet two goals:

  • tell users what's at the other end of the link with no guesswork required
  • protect users from following the link if they would not be interested in the destination page (so no teasers - they may work once or twice to drive up traffic, but in the long run they will make users abandon the site and reduce its credibility)

    8. Jumping at the Latest Internet Buzzword

    The web is awash in money and people who proclaim to have found the way to salvation for all the sites that continue to lose money.

    Push, community, chat, free email, 3D sitemaps, auctions - you know the drill.

    But there is no magic bullet. Most Internet buzzwords have some substance and might bring small benefits to those few websites that can use them appropriately. Most of the time, most websites will be hurt by implementing the latest buzzword. The opportunity cost is high from focusing attention on a fad instead of spending the time, money, and management bandwidth on improving basic customer service and usability.

    There will be a new buzzword next month. Count on it. But don't jump at it just because Jupiter writes a report about it.

    9. Slow Server Response Times

    Slow response times are the worst offender against Web usability: in my survey of the original "top-ten" mistakes, major sites had a truly horrifying 84% violation score with respect to the response time rule.

    Bloated graphic design was the original offender in the response time area. Some sites still have too many graphics or too big graphics; or they use applets where plain or Dynamic HTML would have done the trick. So I am not giving up my crusade to minimize download times.

    The growth in web-based applications, e-commerce, and personalization often means that each page view must be computed on the fly. As a result, the experienced delay in loading the page is determined not simply by the download delay (bad as it is) but also by the server performance. Sometimes building a page also involves connections to back-end mainframes or database servers, slowing down the process even further.

    Users don't care why response times are slow. All they know is that the site doesn't offer good service: slow response times often translate directly into a reduced level of trust and they always cause a loss of traffic as users take their business elsewhere. So invest in a fast server and get a performance expert to review your system architecture and code quality to optimize response times.

    10. Anything That Looks Like Advertising

    Selective attention is very powerful, and Web users have learned to stop paying attention to any ads that get in the way of their goal-driven navigation. That's why click-through rates are being cut in half every year and why Web advertisements don't work.

    Unfortunately, users also ignore legitimate design elements that look like prevalent forms of advertising. After all, when you ignore something, you don't study it in detail to find out what it is.

    Therefore, it is best to avoid any designs that look like advertisements. The exact implications of this guideline will vary with new forms of ads; currently follow these rules:

  • Banner blindness means that users never fixate their eyes on anything that looks like a banner ad due to shape or position on the page.
  • Animation avoidance makes users ignore areas with blinking or flashing text or other aggressive animations.
  • Pop-up purges mean that users close pop-up windoids before they have even fully rendered, sometimes with great viciousness (a sort of getting-back-at-GeoCities triumph). I don't want to ban pop-ups completely, since they can sometimes be a productive part of an interface, but I advise making sure that there is an alternative way of using the site for users who never see the pop-ups.


    If you mention meta tags to most Web builders, they immediately think of creating keyword lists for search engines to use when indexing Web pages. That's a natural response to years of conventional wisdom, which held that the keyword meta tag and its brethren were essential ingredients in any search engine optimization effort.

    In their quest for higher search engine rankings, some unscrupulous Webmasters would load meta tags with high-ranking keywords that were unrelated to the content of the Web pages on which they appeared. Because of this abuse of the keyword meta tag, major search engines no longer use those keywords as a significant factor in their rankings. Now Web builders find that meta tags aren't worth the time and effort it takes to add them to Web pages.

    Metadata is data that describes other data. In the case of a Web page, the metadata describes the contents and characteristics of the Web page. Meta tags provide a convenient convention for storing that descriptive data in tag attributes. Meta tags are embedded between the head tags of an HTML document, so they're accessible to search engine robots but aren't displayed in the Web browser window.
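    For example (the page title, description, and keywords here are hypothetical), the tags sit between the head tags like this:

```html
<!-- Hypothetical page: the description and keywords meta tags live
     in the head, visible to robots but never rendered in the window. -->
<html>
<head>
  <title>Widget Care Guide</title>
  <meta name="description" content="How to clean, store, and maintain widgets.">
  <meta name="keywords" content="widgets, widget cleaning, widget maintenance">
</head>
<body>
  ...page content...
</body>
</html>
```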

    Strictly speaking, meta tags are optional. Your Web pages will display in a browser without any meta tags, so why bother with meta tags if they don't boost search engine rankings?

    Some search engines use the description meta tag to display a short description of your page in the search results. Support for the description meta tag isn't universal, so the title tag is used in its place.

    Major search engines no longer use the keyword tag; however, your internal site search can make good use of the information in the keyword tag (only if your site has an internal search feature).

    It used to be that all search engines would pull this information and use it as part of their search results. Not only would your search engine listing include information from your title tag, but also below it would be a copy of your Meta description tag.

    With this predictable structure, search engine marketers could manipulate the way their website listing was displayed in the search engine results by changing these tags.

    In addition, the Meta description tag would have great importance when determining which position your website showed up in the search results. The number of keywords, their relevance and density within the description tag could be manipulated to help achieve the elusive #1 position on the search engine results.

    The importance of meta tags has diminished considerably over the past couple of years.

    Then along came Google

    When Google became the search engine of preference, it ushered in a new era of how search engine listings are displayed. Google chose not to use the Meta description tag and instead to rely on the content contained within a website.

    The biggest impact of this decision was that the Meta description had no significance whatsoever on where a website was positioned within Google’s results.

    A website owner could optimize their description tag to the highest degree, and it would have little effect on how their website was positioned in Google.

    The other effect of choosing to ignore the description tag was that Google did not use this information as part of the website’s listing, instead formulating its own description using content extracted from the web page itself (only if there is very little page content will you see Google display the meta description).

    When it became apparent that Google’s approach was successful, other search engines started following suit to the point that few search engines today spider and display the Meta description tag.


    The "pop2.domain.com" triggered my logic into believing that its a piece of foul spyware that must be eradicated without mercy. Thank you for the details by the way.

    To do this, we'll need a program called Spybot Search and Destroy. Don't worry, it's free. Here we go (read the steps through first so you have a feel for what's going to happen):

    1. Go to http://fileforum.betanews.com/detail/Spybot_Search_and_Destroy/1043809773/1

    2. Click the "Download Now" link in the upper/middle/right quadrant of the website. It will take you to a new page, wait a moment and your browser should ask you if you want to save or open the file. Either will do fine, but save (say... to the Desktop) might be preferred.

    3. If you chose to open, then the installer will begin to run. Otherwise, go to the desktop and double-click the spybotsd14d.exe file.

    *note* If you have Windows XP with Service Pack 2 (SP2) installed, the Windows Firewall will ask your permission to run the file. You do wish to give it this permission (the box with the question will offer you the choice to run it or not, so choose to run it - it is safe).

    4. I don't perfectly recall the installer, but it seemed fairly generic to me. Click through it (Next button, etc.), letting it do what is needed to get the program installed. If it asks you to restart your computer at the end, then do it.

    5. Upon reboot (if it was needed), run the Spybot program. Look for an entry in the program called "search for/look for updates" and click it. The program *might* even do this automatically [I'm sorry to use so many "mights," but I've gone through so many of these programs that they sort of blend together with their procedures] and proceed to acquire any new updates. And there will be plenty; it might take several minutes to update the software. This is good, since new spyware, like viruses, is created daily.

    6. Once the update is complete, look for a "Search for problems" button. It probably looks like a pair of binoculars. Click it and let it do its thing. This often takes at least 5-10 minutes.

    7. Once completed, it will present a list of its search results. Everything in the list can be repaired by finding and clicking the "Fix selected problems" button.

    If you have an antivirus, it would be a good time to run its update program and perform a scan as well.

    This should take care of your problem. If it doesn't, don't hesitate to contact Tim, Bill, or me (or simply all of us).

    However, you shouldn't see this as wasted time if it doesn't fix the problem. Spyware is most heinous and incredibly widespread (as in... more than 70% of internet-surfing computers have it). So even if your machine didn't have anything serious on it, it's a good program to have lying around as a first line of defense. I just ran a scan and found about 5 little buggers sitting in my Internet temp files, heh.

    Keep me posted, and good luck.


    Hi Harold, I hear you're in the market for a video card.

    There are a great many cards to choose from, and there are more than I can list here in terms of their mixing and matching of features. I will recommend two manufacturers as the best: ATI and NVIDIA. For extremely high-end gaming, both manufacturers offer cards in the $500-600 range. Any one of those cards will kick some serious butt and will probably also offer neat-o features like TV-out.

    But in a more reasonable range...


    Top end gaming performance:

  • $226 - ATI RADEON X800 AGP 256MB
  • $226 - NVIDIA GeForce 6800 GT 256MB

    Mid Grade Gaming:

  • $165 - ATI RADEON 9800 Pro 256MB
  • $137 - NVIDIA GeForce 6600 AGP 256MB

    Budget Gaming:

  • $62 - NVIDIA GeForce FX 5500 256MB (good all around budget card)

    Some of the above cards have video-out features allowing you to attach a TV to your computer if you want to watch movies on something larger than a computer monitor. All of them are well suited for CAD applications as well as normal 2D (in windows) performance.


    If gaming and multimedia are less your thing and you need something for CAD purposes, ATI's FIRE GL series (ranging from $100 to $1000 - differentiated on application compatibility and ultimate horsepower) and NVIDIA's QUADRO series (same range) are the way to go. But I'm not familiar with these cards.


    A better version of my current card is my personal recommendation for multimedia and gaming applications (some CAD): $229 - ATI RADEON 9800


    This one has a built in TV tuner, has video-input capabilities (allowing you to record home videos onto the computer, and if you have a DVD burner, then transfer the videos to DVD), and HDTV output support allowing you to hook an HDTV up to the thing if need be. Darn thing even came with a remote so I can control some of my media applications (like video and audio players) from the couch. So, lotsa bells and whistles.


  • RAM: 128+ MB - needed especially for gaming
  • GPU: ATI: 9600 - 9800, X600-X850
    NVIDIA: GeForce - FX, 5200 - 6800+
  • TV/Video output (usually an RCA jack, sometimes SVIDEO)
  • Video IN (expensive feature - lets you record stuff onto the computer)
  • Multiple Monitors - (expensive usually if on above cards)
  • DVI (usually expensive) - DVI is for DVI-flatpanel LCD displays, while analog is for a regular CRT monitor.


    Flat panels are expensive, but the DVI models have good color and decent response times. The response time is in ms and affects the "smoothness" of moving images - especially important for movies and games. Regular, analog monitors are still better in quality, in my opinion, when it comes to size (they're cheaper, so you can get 'em bigger), brightness, color, and longevity. Most DVI cards come with a DVI-to-analog adapter so you can still plug a regular monitor into the thing.

    It's really all about your needs. What about your old video card do you not like? What is it?

    Good luck, and have some games ready (be ready to crank their in-game detail settings) to test out the new card.

    Glad to be of assistance, Harold.

    The "budget" card, as well as all the other cards, are fully DirectX 9.0 (with shaders) compatible. Doom 3 will not install on my sister's machine with her GeForce 2.

    $150 on a gaming card (minus the video in features and the tv tuners etc..) is a solid investment in terms of showing off to Bill in video games. Within that price range, the NVIDIA cards are superior to the ATI's. $300 is only justifiable if you are going to have your machine become the entertainment center of the household or - move the card to the "new" machine you plan to build a year from now (assuming current gaming trends apply).

    The FX 5200 is a good deal, and does play Doom 3. With this card, your machine is centered well in today's gaming market. The power requirements are also low on that card, which is good for heat and stability. But do avoid the lowest of low-end 5200's (64mb and around $35-40) - their performance is seriously affected by those few dollars.

    Any 256mb NVIDIA within your price range should impress you nicely. (Beats my card, and I'm still impressed by it).

    The prices I gave you were from http://www.pricewatch.com. They deal with bargain shops. But when I acquired my 9700pro from shopping at the site, I had to exchange it due to a malfunction of the card (I actually saw a boot print on the thing!), which really only set me back a week and some shipping charges.


    Since you're probably not installing this thing on a freshly formatted computer, you'll want to remove the old NVIDIA drivers BEFORE you power down and swap the cards. Old drivers tend to cause some problems (especially if you end up getting another NVIDIA card). Here's a pretty surefire procedure:

    1. Go to http://www.drivercleaner.net/ and scroll down to the bottom of the page. Click the little harddrive icon below the word "download". Save it somewhere you can find it, like the desktop.

    2. Install that program. It's a driver cleaner, meant to be run right after you run the normal driver uninstaller, but before you reboot/shut off the machine to put the new card in.

    3. Go to Add/Remove Programs from within your computer's control panel (probably on the Start menu, under Settings, then Control Panel). Select the NVIDIA drivers from the list and click the Remove button. Click whatever's necessary to help the uninstaller proceed. However, do not let it reboot the machine; if it gives you an ultimatum kind of prompt like "press OK to reboot," simply leave the window open and proceed to the next step. Ignore step 4 if NVIDIA or GeForce 2 is not in the list (it just means you're running Windows drivers).

    4. Run that driver cleaner software. Click OK on the "important info" dialog box that pops up. Select "nVidia" from the list and click the "start" button in the program's window. When it asks "Continue with cleaning?" select "Yes". Wait a moment and it should finish right up. Click Close.

    5. Shut down the machine (don't worry about the reboot question still left open by the manufacturer's driver uninstaller).

    6. Remove the power cord from the back of the tower and get the machine ready to open up.

    7. Open the case, ground yourself by touching the power supply (if you have a static strap, put it on before touching anything inside the machine). Touch the power supply every once in a while.

    8. Unscrew the mounting screw holding your old video card in (ensure the monitor cable is unplugged from the card, of course). Pull out the old card. If you have a static bag, put it in there; use the new video card's static bag if need be. A cardboard box, or somewhere in your computer case, is a good temporary spot if you're using the new card's static bag.

    9. If your current card is in a white slot (PCI), remove it, and set it aside. Chances are the dark brown, smaller slot (AGP) closest to the processor is not in use and might still have its cover on. Remove that cover and use it for the PCI slot of the old card. If your current card is AGP, you're just performing a direct swap. Push the new card into the AGP slot. Screw it in.

    10. If it's a card that needs an external power supply cable, plug this in now (see * below for more information about this). Use a splitter if you need one.

    11. Close up the case, set the machine back up, turn it on and have your driver CD ready.

    12. Ignore/cancel Windows' attempt to install the software and go into your CD-ROM drive (from My Computer) directly and run the installation from there. You will have to restart your computer one more time before you get to use your card.

    If a game is not working (or not working properly), you might need to update the drivers for the video card, in which case you'll have to visit either http://www.nvidia.com or http://www.ati.com to get them. Message me back if you have issues.

    *This was a first for me, but my Radeon All-in-Wonder 128mb 9700pro (the pro gives just a notch better in gaming performance by the way) required a plug from my power supply to connect _directly_ to the card, for its extra power needs. I don't think the FX has this, but do keep an eye on this since the card will not run without this cable plugged in. Luckily it takes a cable that you should have a spare of - an extra floppy power cable I believe. Sometimes the manufacturer includes a splitter in case you're out of extras in your machine. Otherwise your local ma and pa computer shop would have one (and probably even Best Buy actually - they are selling these kinds of cards) for cheap (less than 5-6 bucks).

    Rock on,



    I'm assuming you're using Windows XP.

    The most common font problem with laptops (lappies) is due to the font smoothing Windows neglects to turn on by default. To turn this on, try this:

  • 1. Right click some clear spot on the desktop (not an icon) and select Properties.
  • 2. Click the Appearance tab along the top of the window.
  • 3. Look for and click an Effects... button near the bottom right of the window.
  • 4. A new window pops up. Check the box "Use the following method to smooth edges of screen fonts" and then, from the little drop-down list, select ClearType.
  • 5. Click OK on each window and wait for the settings to take effect.

    If icons, text, and images are still looking a little cruddy/blurry, then this is a side effect of the nature of LCDs (laptop displays). Each LCD was manufactured with a "hardwired" resolution, meaning things will look clearest only at this resolution; anything lower gets blown up to fill the laptop's hardwired resolution. Unless you already know what this resolution is (it looks like "a number x another number," like 1280x1024 or 1600x1200), the key is finding it. A hint: it's usually the highest resolution the laptop display can support.

    Try this:

  • 1. Right click some clear spot on the desktop again and select Properties.
  • 2. Click the Settings tab and look for a section called "Screen Resolution". There's a little slider in it where if you move the slider to the right, the resolution numbers will increase.
  • 3. Start small and move the slider one notch above what it is currently at and click apply. Repeat the process until the clarity becomes acceptable.
    The common screen resolutions are:

  • 800x600
  • 1024x768
  • 1280x1024
  • 1600x1200
    ...but only for "square" displays. If you have a widescreen laptop, these numbers will simply look different and often break into the 2000s range.
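
    As an aside, the "shape" of any of these resolutions can be worked out by dividing both numbers by their greatest common divisor. A quick sketch using the list above:

```python
from math import gcd

def aspect_ratio(width, height):
    """Reduce a resolution to its simplest width:height ratio."""
    d = gcd(width, height)
    return (width // d, height // d)

# The common "square" display resolutions listed above:
for w, h in [(800, 600), (1024, 768), (1280, 1024), (1600, 1200)]:
    ratio = aspect_ratio(w, h)
    print(f"{w}x{h} -> {ratio[0]}:{ratio[1]}")
```

    Notice that 1280x1024 actually comes out to 5:4 - slightly more square than the 4:3 of the others.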

    Once you find that magic number, if everything on the screen simply becomes too small, it's up to you (and mostly your daughter) to decide whether or not you want to deal with the image blurring caused by lower screen resolutions. My dad's lappy naturally sits at 1600x1200, and it's sharp as hell, but too small for his eyes, so he simply has to resort to 1024x768 and deal with the blurring caused by the LCD.

    Hope this helps,




    Important Note: *Some 3D accelerator cards with the chipsets listed here may not be compatible with the 3D accelerator features utilized by Doom 3. Please refer to your hardware manufacturer for 100% DirectX 9.0b compatibility. This product does not support Microsoft® Windows® 95/98/ME or NT.

  • SUPPORTED CHIPSETS: ATI® Radeon™ 8500
  • ATI® Radeon™ 9000
  • ATI® Radeon™ 9200
  • ATI® Radeon™ 9500
  • ATI® Radeon™ 9600
  • ATI® Radeon™ 9700
  • ATI® Radeon™ 9800
  • All nVidia® GeForce™ 3/Ti series
  • All nVidia® GeForce™ 4MX series
  • All nVidia® GeForce™ 4/Ti series
  • All nVidia® GeForce™ FX series
  • nVidia® GeForce™ 6800

    6/20/2005

    Unlike Quake 2 and the original DOOM, Doom 3 is hardware accelerated from the get-go. If you have anything older than the ATI 8500 or the GeForce 3, it simply won't run. Don't even try to get something to "emulate" the special effects; it will be a slide show on any hardware below their requirements (they were really serious about these). It really is a machine killer.

    Also, only Windows 2000 and XP (and greater) are supported. If you want to try it on Windows 98 (to say nothing of 95), then you can try this link: http://www.flexbeta.net/forums/lofiversion/index.php/t4091.html - at the bottom there's a "patcher" that lets it run. That's provided the installer even lets you get that far if you don't have the proper hardware.

    To give you an idea of performance: on my Athlon XP 1800 (1.5GHz) with a gig of RAM and an ATI 9700 Pro, it ran "acceptably" - that is to say, between 15fps and 30fps. Quake 2 runs at well over 260fps on this machine, and Quake 1 is well over 800fps. It's playable if I turn some settings down. It was all dependent on the video card, really. "Chop", "loading time", and "clutter on screen" were all dependent on the rest of the hardware - especially the RAM. This game is HUGE, and recommends 2 gigs of RAM. So even if you go out and grab a new video card for your dad's machine (a 900MHz, if I remember), it will still be quite irritating and certainly not worth the expenditure of both the video card and the game.
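
    A handy way to read those fps numbers is as milliseconds spent on each frame - a quick sketch using the framerates quoted above:

```python
def frame_time_ms(fps):
    """Milliseconds the machine spends rendering each frame at a given framerate."""
    return 1000.0 / fps

# The framerates quoted above - Doom 3's low/high and the older Quakes:
for fps in (15, 30, 260, 800):
    print(f"{fps:>4} fps = {frame_time_ms(fps):.2f} ms per frame")
```

    So at 15fps the card has about 67ms per frame, while the old engines at 800fps are done in just over a millisecond - that's how much heavier Doom 3 is.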

    That's doom3 for ya.



    Hi, I am now playing DOOM LEGACY, and I am having some problems configuring your WAD files properly for the game.

    Would you help me out with it? Tell me the right way to do it, or point me to the best files to use with LEGACY.


    Hello Mario,

    I'm not familiar with your level of experience with Windows-based computers, so forgive me if I seem like I'm not giving you enough help - or even too much help. Message me back at doomkid@hotcity.com if you need more explanation.

    I find the easiest way to execute stuff through the Legacy software is through LAUNCHER.EXE, so I'm going to guide you through that procedure. This file is located in the same folder as your Legacy installation. Feel free to use the Windows SEARCH utility (Start menu --> Search --> For Files or Folders) to look for launcher.exe (case doesn't matter) on all of your local hard drives.

    I'm going to assume that if DooM Legacy runs on your computer, then at least part of it is configured correctly, so we won't need to do much. If DooM doesn't even run, then we have an entirely different thing to fix. Now, all of the levels that you should find at the PIR website are for use with Doom II, so as a first step we need to make sure that DooM II is being used under Legacy. To check this, fire up Launcher.exe, click on the "Game Mode..." button on the right side of the Legacy window, and make sure that under the "Select Game WAD File" section, the DOOM2.WAD file location is bulleted/selected.

    To load DOOM WAD files into the game, click on the "Single Map" button found on the main menu of the Legacy launcher window, look down under the "Customize" section, and check the "Additional files..." checkbox. Then look over to the right side of the drop-list box and click the button with a "+" icon on it. Locate the file on your hard drive that you wish to load and click OK. Once this is complete, the only thing stopping you from starting the game is the OK button at the bottom of the window you have now been brought to.

    Be forewarned, though, that many levels don't necessarily start on MAP01 of DooM II. There should be a text file that comes with the level (or level pack) that explains which map you are to begin on. The Launcher should let you specify which level you want to start on in the same 'Single Map' window of Legacy.

    Also, if you've downloaded a TC, or total conversion, for DooM or DooM II, then getting it to work under Legacy will require a lot more effort and time. I can guide you, but I would need more feedback before doing so.

    Good Luck,


    ADDENDUM to my previous message:

    In regards to playing old-skool DOOM, you will need to play it under Win95 through a DOS box or something. If you want to use a hardcore DOS mode, then you will be forced to MANUALLY install and START the IPX protocol in DOS prior to starting the game -- which is a BITCH.

    I like bringing new meaning to the term overkill when I can.



    Aight, here's the full dilly-o:

    If all those computers can see each other through the network through file and print sharing, then they sure as hell can see each other in any other way.

    You can:

  • 1.) Use regular Doom2 and any WAD of your choice to play with all of those computers over your network, using just the stock Doom2 software (and a modification that I'm going to divulge below).
  • 2.) Use Legacy just as described above - except Legacy is more compatible network-wise, but needs some modification to get the launcher to work properly (I will also divulge that below).

    I'm sorry, but I'm going to have to lay down the ejacashun on ya - and I'm going to do it by assuming that you know NOTHING - no offense intended - (it's for reusing this email later): [and it makes me feel like a big man =)]


    node = a computer on the network.

    ip address = a unique number used to identify every node on the network. To find out your IP, just type winipcfg from either the RUN menu or from a DOS box. It's different for Win2k and XP (ipconfig is used for those OSes).
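
    For what it's worth, checking whether a string is a well-formed IP address is easy from Python's standard library. A quick sketch (the addresses below are just made-up examples):

```python
import ipaddress

def is_valid_ipv4(text):
    """Return True if text is a well-formed dotted-quad IPv4 address."""
    try:
        ipaddress.IPv4Address(text)
        return True
    except ipaddress.AddressValueError:
        return False

print(is_valid_ipv4("192.168.0.10"))  # True  (a typical private LAN address)
print(is_valid_ipv4("256.1.1.1"))     # False (each of the four parts must be 0-255)
```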

    HISTORY 101:

    Computers can only communicate through a network medium (modem or Ethernet) if they share the same protocol (namely, a language they use to communicate with each other). For Windows, there are 2 main "languages" available: TCP/IP (Transmission Control Protocol/Internet Protocol - it was invented in Hawaii in conjunction with Stanford University [it was invented in Hawaii because the guy who did it LOVED surfing]) and the IPX/SPX/NetBIOS-Compatible Protocol. Every computer in the world that is on the Internet uses the TCP/IP protocol.

    The IPX/SPX protocol is a specific one designed by Novell, based on networking work from Xerox (the paper-copying company you know actually invented Ethernet), and it just so happens that DOOM and DOOM2 and other OLD DOS games run off that protocol, mainly because modems were too laggy over a network (a network like the Internet) for Doom back then, and it would have required many more lines of network code to make it work perfectly on the Internet (IPX is an Ethernet-only protocol, by the way).

    Windows95 was also one of the first OSes (aside from Unix, which has always been a network-oriented OS) to actually incorporate TCP/IP as a default setting in its networking options - which is why, if you look at the network properties for your cable modem/Ethernet/modem/anything used to access websites, you'll see TCP/IP listed. Win95 also has the Microsoft Network protocol, which allows Windows machines to see each other over the network. At the time of Doom, TCP/IP was still too new for them to rely on, so they stuck to something they knew worked, namely IPX (see above paragraph).

    Macs have their own protocol called AppleTalk that they use religiously amongst themselves. So to talk to an Apple computer over your Windows network, you will need a Windows version of the AppleTalk protocol to see/share files/printers. If you've ever played Doom on a Mac over a network, you would see AppleTalk as one of the gameplay options alongside IPX (yes, Macs have IPX available to them, which is why it has always been "possible" to play old games Mac/PC - but because the software had differences in design, they could never play long enough without crashing).

    Back to Doom and How it Affects You:

    Old-skool DOOM/DOOM2 needs the IPX/SPX protocol to run over the network inside Win95. The Win95 CD has this thing. You need to install it on each and every one of your computers before you can play old-skool Doom. You can do this by:

  • 1.) Right Click on the Network Neighborhood Icon on the Desktop, select Properties.
  • 2.) You will be presented with a window that shows you which network components are installed, namely: Client for Microsoft Networks (usually there); TCP/IP protocol (maybe you have 2, depending on whether you have 2 Ethernet cards, or 1 Ethernet and 1 modem, etc.); IPX/SPX-Compatible Protocol (maybe, maybe not - definitely not by Win95 default); and File and Print Sharing for Microsoft Networks (usually there if you're sharing files). ***You may have more, depending on special software you have installed, like NetWare etc.***
  • 3.) If you do NOT have the IPX/SPX protocol there, you will need to install it. If you DO have it, you're done - go to the DOOM LAUNCHING instructions I have below.
  • 4.) If you're at this step, you DON'T have IPX installed. So, from the network config window click the button ADD. Click PROTOCOL from the list and click another button called ADD that is "awakened" from selecting the word "protocol".
  • 5.) You will need the windows cd handy for these steps here. Choose MICROSOFT from the list of Manufacturers, and then choose IPX/SPX-Compatible Protocol from the list of "Network Protocols".
  • 6.) Click OK. Some files should start to be copied over to the machine - this is where the CD is needed, and it will ask for it here if you don't already have it in the drive.
  • 7.) Just restart the computer. Done. Not brain surgery. You'll get it down to an art form soon.
  • 8.) If you check the network properties window after the restart, you should see the IPX protocol listed there and "bound" to your hardware adapters (ONLY if you have multiple network devices). For performance, IPX only needs to be bound to the Ethernet controller, as it's useless for a 56k (analog-style) modem (but may be useful for a cable modem!) - so if you have a 56k (or slower) modem, delete the IPX-->modem binding to prevent slowdowns.
    DOOM LAUNCHING: You can use DM.exe (a front end that I find is a lot better for old Doom network games) or you can use Doom's SETUP.exe.

    In either case, you need to fire up each computer's game-network front end individually (people cannot connect to a Doom game while the game is already running; they're only able to connect by starting all together - as I'm sure you know from old-skool Doom). Just select how many computers there are, make sure "already connected" is selected (if the front end even has that option), and then make sure that IPX/SPX is selected. Then hit GO! or whatever launches the net game session. The computers should find each other and you'll be good to go. This will also become an art form when you get the hang of it.

    For running WADs, I don't know if Doom's setup.exe can handle it, so I always use DM. You should have this program; I got my first copy of it with The Ultimate Doom. It's on my FTP if you want it, too. Oh, and every computer has to have the same copy of the WAD in its Doom directory in order to play the WAD. The WAD versions have to be exactly the same, or the game could crash randomly.


    I put Legacy Doom into its own category because it's a revamped version of the old thing. I say Legacy is more network-compatible because they've updated the launcher with the ability to communicate through the TCP/IP protocol - meaning that:

  • 1.) you don't need to hassle with IPX if you don't want to
  • 2.) you can play over the Internet with anyone who also has Legacy - you'll just need their IP address handy.

    Legacy achieves this by rigging up a SERVER/CLIENT setup with Doom - meaning that people can join, exit, and rejoin the game at any given time during gameplay, and it also means support for more than 4 players, because the code has been upgraded.

    So when playing using Legacy, one computer needs to be the server and every other one is a client machine. The server needs to fire up the game first too. This is all from memory though as I haven't done it in a while.

    The most hassleful part of the launcher is configuring it to work with Legacy. Do so as follows:
  • 1.) Fire up Launcher.exe
  • 2.) Click the GAME SETUP button.
  • 3.) Under the "Doom Legacy Program" you will need to click the "..." button and search for and select the LEGACY.EXE file.
  • 4.) Under "Select Game WAD file" you will need to choose a version of DOOM that you want to play (there are multiple spots, so you can switch between Doom/Doom2/Final Doom just by the bullets on the left of the white boxes).
  • 5.) Click OK. To test it, just click LAUNCH. It should run.
  • 6.) If you want to add WADs and stuff, and play single player, then that's what the Single Player button is for.
  • 7.) Same for the Multiplayer button.

    Things should be pretty self-explanatory for the launcher. Help files are available. I'm usually available.

    If you have any problems or questions, don't hesitate to fire off an email to me.

    Good Luck. Have fun with it! Experiment....



    Subject: Intro to Logic: The Fruits of the Chootie Tree

    Hello Chris,

    Your previous emails (regarding true premises leading to a false conclusion) have been illuminating; thank you. If you have time, I'd like your wisdom on a slightly different viewpoint (though it conveys the same idea in a much better way) on that same problem.

    The following can be credited to my colleague Gabriel Ilgiovine and myself.

    The problem:

    "Is it possible to produce an example of an argument having the first form that also has premises that are all obviously true and a conclusion that is obviously false? If so, produce such an argument."

    The form is:

  • 1. X or Y.
  • 2. It is not true that X.
  • 3. Therefore Y.
    First, we'll present the general form of the argument and then lead to specific examples.

    General Argument Form:

    The book's question mentions the word obvious. So, to make the premises obvious, we'll define the statements {X,Y} as being both X and Y, but not X xor Y. That is, the statements X and Y work only together, and not as separate parts. Since the OR allows them to work together, the truth value of the first premise still holds as true. (I am exploiting the "weird case" of the logical OR.)

    And so an example of an argument of that form can be produced by merely defining the statements used within the premises to be false when they stand alone.

    A real world example:

  • 1. Light is a wave or Light is a particle.
  • 2. It is not true that Light is a wave.
  • 3. Therefore Light is a particle.
    The truth of the second premise makes the conclusion impossible, even though the first premise is true (light is BOTH a wave and a particle, but neither one individually).
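
    For what it's worth, in plain two-valued logic the form itself checks out as valid - a brute-force truth-table sketch in Python. (The "both at once, neither alone" readings above are precisely what this two-valued model cannot express, which is the point of the examples.)

```python
from itertools import product

def disjunctive_syllogism_valid():
    """Check every truth assignment: whenever (X or Y) and (not X) both hold, Y must hold."""
    for x, y in product([True, False], repeat=2):
        premises_hold = (x or y) and (not x)
        if premises_hold and not y:
            return False  # found a counterexample row
    return True

print(disjunctive_syllogism_valid())  # True: no two-valued counterexample exists
```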

    Another example:

    Let's define an imaginary tree and call it a "Chootie Tree."

    A Chootie Tree is a theoretical tree such that:

  • 1. it always grows apples and simultaneously
  • 2. it never grows apples
    Now, it seems intuitive to say that these two properties cannot simultaneously exist, but there is no concrete reason to assume they can't. Just because we cannot comprehend it doesn't make it untrue. (Science is full of such contradictions - e.g., 4-dimensional space-time is incomprehensible, yet it exists. Light is both all particle and all wave. Gravity is both the curvature of space-time and the emission of graviton particles. Some subatomic particles must be rotated more than 360 degrees before returning to their original orientation. There are infinitely many rational numbers with no spaces in between, and yet infinitely many more irrational numbers between them. Etc.)

    Similarly, a Chootie Tree cannot be comprehended, but CAN (possibly) exist. The logical form breaks down here, though. It cannot withstand self-contradictory situations, yet such situations exist abundantly in the universe. So this logical form is unfortunately only valid in the relatively small number of cases in the universe in which the subject is conceptually distinct from its opposite, and thus comprehensible to humans. Usually, though, the argument form is doomed to failure.



    Hey Bill -

    There's something I've come across that I think you'll have some interest in. It's a hunka chunka shareware (still fully functional for what you'll want it for, with no nag screens) called 98Lite that takes Windows 98's uber-beefy shell and slims that thing down to practically nothing. It basically shaves Internet Exploder from the shell, and even has the power to remove it from ever existing in your current install of Windows.

    There are three levels of shaving that you can do to your machine. The most efficient one removes the current beefy 98 shell and installs the Windows 95 one in its place (a lot more stable, less memory, a hell of a lot faster). The core of your system will be Win98 (with the added benefits of better hard drive performance using FAT32 and such), but it will have the classic Win95 look and feel. The other two levels of shaving just give you a cross between the features (they let you keep some of the 98 features you might like or might have grown accustomed to). One downside (expected) to using the 95 interface is how it detects the size of your hard drives; remember that Win95 didn't allow for drives over 2GB in size (being that 2GB is the limit of a FAT16 partition) - and although there are no problems related to it, it's a slight inconvenience to have to fire up a DOS window just to see how much free space you have left.

    To do it, you will just need to grab the software, and you will also need both the 98 and 95 CDs handy (I can provide them on my FTP if you wish). The software is completely automated and stable. It's DOS-based, so you know it has to be of some quality (in the sense that the guy made a hardcore interface for it and all, so he knew what he was doing), and it has full uninstall features. It doesn't dick with any of your current programs and doesn't affect how they run (except that they're noticeably faster).

    I figured that you would like it because I know how much you used to bitch about 98 - and especially the internet exploder integration. =)

    I have done it to my second machine, a 333 with 48MB of RAM, and find that the performance of the machine boosted as if I threw in another 166MHz. Today I just tossed a stick of 128MB RAM in it, and the machine is even SNAPPIER than my 733 running XP with 512MB of RAM. Plus the thing hasn't given me shit, and it has also increased the network browsing speed on that machine. Games seem to run even smoother than before. Hell, I'm running Counter-Strike at 800x600 on a 333 with a damned dirty Voodoo2 card, sheesh.

    Anyways, I was thoroughly impressed. It will be a culture shock unless you're really friendly with the Win95 interface.

    Give 'er a shot on one of the older machines that you might have "upgraded" to 98....




    I suppose I could never learn to accept the Real Media format, only because it basically molested the video we did and doesn't really show off the effort. I've seen Real Video CNN stuff, even on a fast connection, and found it to only trigger a bowel movement (personal opinion there). Something I've found, although it ain't no 1/4MB, is a new video/audio codec. I have yet to figger out how to get the thing streaming, but it's called MPEG-4.

    A little background: DVDs are simply 5GB CDs with an AVI (or MPG) on them containing a low-compression codec called MPEG-2. Uncompressed it looks awesome, near gawdlike - hence the reason DVDs are so popular - but the second you compress them for file size, well, you can see how our video came out to be 36MB in size, and it didn't look too good... (granted, it wasn't bad visually).

    This new MPEG-4 format has been adopted by Microshaft, and their media player is capable of streaming this stuff. I have a copy of that Wassup thing (36MB file) and re-encoded it into MPEG-4 (code-named DivX), which shrunk the file size down to 9MB. This is still unfeasible for a modem user, but after being zipped it goes to 1.4MB. Thus there is still room for more compression elsewhere.

    Why should you give a damn? Because it's DVD quality.

    That's the kicker. 9MB DVD-quality video. Nice. I've taken 5GB DVD movies and compressed them down to 630MB in size so they can fit on a single CDR to be distributed, and they look gawd-like - perfect motion and superior quality (if you have a beefy enough CPU, you can play back these great movies at no quality loss from the original DVD).
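
    Fitting a movie onto a single 630MB CDR boils down to simple bitrate arithmetic: total bits divided by runtime. A sketch, assuming a hypothetical 2-hour runtime for illustration:

```python
def avg_bitrate_kbps(size_mb, runtime_minutes):
    """Average combined audio+video bitrate (kilobits/sec) that fills size_mb in runtime_minutes."""
    total_bits = size_mb * 1024 * 1024 * 8   # megabytes -> bits
    return total_bits / (runtime_minutes * 60) / 1000

# A hypothetical 2-hour movie squeezed onto one 630MB CDR:
print(f"{avg_bitrate_kbps(630, 120):.0f} kbps")  # 734 kbps
```

    So the encoder has roughly 734 kilobits per second to spend on picture and sound combined - that's the budget DivX has to hit to look "gawd-like" at CDR size.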

    While the current media player has MPEG-4 support, I've never known a player to come with the DivX codec by default, so it has to be downloaded and installed separately (~300K). The main reason people are using it is that, if they have a high-bandwidth connection, they are capable of downloading entire movies (up to 630MB) to their drives and watching them on their computers at DVD quality - without the usage of a DVD player, a decoder, or even a Blockbuster card to rent the DVDs with. It's used primarily for pirating DVDs now.

    Unfortunately, MSNBC decided to make this information public, so more people are ripping and making movies, and the movie industry is getting boned because of it. They say it's not a threat JUST yet, because not everyone can pull down 600MB with their 56K modems, but because DSL and CABLE are so widely available, the probability of it becoming an issue in the near future is grand. I would encourage you to download as many of these things as feasible (if/when you get that ISDN) and burn them onto CD for private storage.

    You really must see this thing to believe it. It's amazed the hell out of anyone who's seen it over here. I've tricked many of my less-computer whizzy friends into thinking that I have a DVD Rom.... hehe.

    This is just something to consider - it is new technology - it is capable of being streamed, but I haven't figgered that part out just yet. Ifilms.com does it all the time... using the MPEG-4 type of encoding.

    Just a breakdown of File Formats to avoid confusion:


  • 1.) MP2 - low-quality audio (small file size)
  • 2.) MP3 - high-quality audio (small file size)
  • 3.) MP4 - higher-quality audio (same file size as MP3)

  • 1.) AVI - decent-to-sh*t quality, large file size (expect 50MB out of 10 seconds)
  • 2.) MPEG-1 - file sizes shrunk from AVI while deteriorating quality
  • 3.) MPEG-2 - file sizes smaller than AVI and looks 2x better; requires more CPU than AVI - when lightly compressed, it can supersede any quality an AVI could ever reach.
  • 4.) MPEG-4 - think maximum MPEG-2 compression at an 18:1 ratio... it utilizes MP3 as the audio format for the video, and doesn't require anything more than a decent CPU (minimum 233 to be safe).

    Consider this just another "Tech Update".

    Questions, Threats, comments?

