From pshuang Wed Jul 24 18:54:09 1991
To: PCTECH-L%TREARN.BITNET@uga.cc.uga.edu
cc: RATLIFF%IPFWCVAX.BITNET@uga.cc.uga.edu
In-reply-to: Deacon Blues's message of Wed, 24 Jul 91 13:55:00 EST <9107241915.AA23148@ATHENA.MIT.EDU>
Subject: Display's in general...
BCC: pshuang

> There are so many display schemes out there now that I have never
> understood all the differences. Could someone with the knowledge of the
> advancement of display's through the original pc's to the current SVGA
> stuff please write up a quick advancement chart.

-------- cut here --------

Some of the standard products widely available for DOS video:

(1981) IBM Monochrome Adaptor = 80x25 text only, no graphics. Did
support text attributes (bold, blink, underline).

(1982?) Hercules Monographics Card (HGC) = 80x25 text + 720x348
bit-addressable graphics. Compatible with software and monitors
expecting IBM's monochrome adaptor. Lotus 1-2-3, because it was adapted
early on to work with the Hercules, helped make it the first popular
non-IBM standard.

(1981) IBM Color Graphics Adaptor (CGA) = 40x25 & 80x25 text modes in
sixteen colors, 160x100 in sixteen colors (non-standard?), 320x200
graphics in four colors, 640x200 in two colors. Prone to snowy effects
on screen when applications wrote to the video memory without taking
refresh timing into account.

(1984) IBM Enhanced Graphics Adaptor (EGA) = 40x25, 80x25, 80x43 text
modes in sixteen colors chosen from a palette of sixty-four. Includes
CGA modes plus 640x350, again 16 colors out of 64.

(1984) IBM Professional Graphics Adaptor (PGA) = not very common; I
don't know specific details; it has been an orphaned product for a long
time.

[Above this paragraph, the cards send TTL signals (discrete levels) to
the monitor; the cards below use analog signals instead (so monitors are
no longer limited to a fixed number of color levels, in theory).]

(1987) IBM Multi-Color Graphics Array (MCGA) = first introduced as a
part integrated onto "low-end" PS/2 motherboards (Model 30).
Offers CGA modes plus selection of 256 colors from a palette of 262,144,
with 640x480x2 and 320x200x256 graphics modes.

(1987) IBM Video Graphics Array (VGA) = first introduced as a part
integrated onto "high-end" PS/2 motherboards (Models 50/60/70/80).
Offers MCGA modes plus 640x480x16. First tolerable graphics standard.
{grin}

(1987) IBM 8514/A = introduced as a solution envisioned for high-end
PS/2 workstations being used intensively for CAD and graphics work.
First IBM card to contain intelligence (its coprocessor is capable of
doing simple graphics primitives, like drawing lines or filling
regions). Interlaced at 1024x768x16... a bad move for market
acceptance, as was the price. Supports 1024x768x256 if you upgrade the
video memory from 512KB to 1MB.

(1991) IBM eXtended Graphics Array (XGA) = first introduced as a part
integrated onto PS/2 Model 90 and 95 motherboards. Offers VGA modes
plus non-interlaced 1024x768x256 (and probably 800x600x256 as well), as
well as a 640x480x65,536 (16 bits/pixel) direct-color mode.

========

Other terms and comments:

(1989) VESA = industry consortium formed to set standards (both in
terms of the kind of monitor hardware needed, and how software should
access the new capabilities) for the many clone cards available at the
time that provided higher resolutions, more colors, or both, than the
IBM VGA (commonly known as... {drumroll please} SuperVGA!). Relevant
when buying a card, but not critical anymore if you're buying a card
with a popular chipset (e.g. Tseng ET4000, Trident).

Texas Instruments 34010/34020 = graphics coprocessors which are
extremely powerful (the 34020 provides tens of MIPS... unless it sits
next to a fast 486, it is likely to be more powerful than the host CPU),
but software needs good drivers written to take advantage of cards with
these chips (i.e. drivers which know to offload graphics operations onto
the graphics card to perform).
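The memory sizes mentioned above fall out of simple arithmetic. A quick
sketch (the helper function below is just an illustration, not anything
from IBM's documentation):

```python
def vram_bytes(width, height, bits_per_pixel):
    # Bytes of video memory needed to hold one full frame.
    return width * height * bits_per_pixel // 8

# 8514/A at 1024x768 in 16 colors (4 bits/pixel) fits in 512KB...
assert vram_bytes(1024, 768, 4) == 384 * 1024
# ...but 256 colors (8 bits/pixel) needs 768KB -- hence the 1MB upgrade:
assert vram_bytes(1024, 768, 8) == 768 * 1024
# XGA's 640x480 direct-color mode at 16 bits/pixel needs 600KB:
assert vram_bytes(640, 480, 16) == 600 * 1024
```

The same arithmetic explains why more colors per pixel gets expensive
quickly: every extra bit of color depth is another bit of video memory
for every pixel on the screen.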
24-bit color: cards that provide more than 8-bit (256-color) output
were outrageously expensive several years ago, and are now merely
expensive. Macintoshes have the advantage of operating-system support
and device independence, so applications can more easily take advantage
of more colors (16,777,216, to be exact), but PC's are catching up
(with some help from Windows 3.0), although Macs are still a moving
target. This is actually an arena where PC's don't have as huge a price
advantage as they do elsewhere.

Portrait/landscape displays: monitor/card combinations which provide
high resolutions (well past 1024x768) so that you can see more on the
screen at once; often used for CAD/DTP purposes (seeing the entire
drawing or a two-page spread helps you judge the overall design).
Again, Macs have better support for these through the OS, and PC
solutions aren't much cheaper. It is somewhat surprising that, with
Windows 3.0 out, there aren't more such products being marketed for
the PC.

---
Above text where applicable is (c) Copyleft 1991, all rights deserved by:
UNIX:/etc/ping instantiated (Ping Huang) [INTERNET: pshuang@athena.mit.edu]

-------- cut here --------

Any questions?

Signing off, UNIX:/etc/ping

From pshuang Thu Jul 25 10:47:02 1991
To: AS.AMM@Forsythe.Stanford.EDU
cc: PCTECH-L%TREARN.BITNET@uga.cc.uga.edu
In-reply-to: "Angel M. Mayorga"'s message of Wed, 24 Jul 91 23:25:48 PDT <9107250625.AA10817@ATHENA.MIT.EDU>
Subject: Interlace versus non-interlace
BCC: pshuang

In e-mail, Angel M. Mayorga asks:

> I've never kept close touch with these ... what do the terms
> "interlaced" and "non-interlaced" refer to?

Someone already posted some good comments on interlacing to PCTECH-L.
In case you missed that: the difference between the two is that an
interlaced display "paints" only every other line on the screen on each
sweep, whereas a non-interlaced display paints every single line on
every sweep. Most VGA monitors and below are non-interlaced;
televisions are interlaced.
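The tradeoff is easy to put in numbers. A rough sketch, with assumed
figures (the ~35.5 kHz horizontal scan rate and ~800 total scan lines
per 1024x768 frame are ballpark values I'm supplying for illustration,
not exact timings for any particular monitor):

```python
def rates(h_scan_hz, lines_per_frame, interlaced):
    # Returns (sweep_rate, full_frame_rate) in Hz. An interlaced
    # display covers only half the lines on each sweep (field), so at
    # the same horizontal scan frequency it completes sweeps twice as
    # often -- but repaints the complete image no faster.
    frame_rate = h_scan_hz / lines_per_frame
    sweep_rate = frame_rate * (2 if interlaced else 1)
    return sweep_rate, frame_rate

# Ballpark 1024x768 timing: ~35.5 kHz horizontal scan, ~800 lines/frame.
sweep, frame = rates(35_500, 800, interlaced=True)
# The screen sweeps ~88 times/sec (so large areas look steady), but any
# one scan line is refreshed only ~44 times/sec -- which is why fine
# one-line details appear to flicker on interlaced displays.
```

A non-interlaced monitor refreshing the full 1024x768 image 70 times a
second would need roughly double that horizontal scan rate, which is
where the extra hardware cost comes from.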
Cheaper SuperVGA monitors tend to be interlaced at their highest
resolutions (1024x768); the more expensive ones are not. (When I bought
my computer system, for example, I had the choice of paying an extra
$100 for a better, non-interlaced monitor.) Since interlacing means
that at the same scanning frequency the monitor doesn't have to put as
many pixels on the screen per second, it makes sense that there's a
hardware price differential.

Interlacing reduces image clarity: unless you have *REALLY*
long-duration phosphors (which have the side-effect that movement
leaves ghost streaks on the screen for a few seconds), you can perceive
a jumpiness in fine details on the screen, especially with fine lines
and horizontal borders between different areas. Some people also
believe that interlaced displays give them headaches.

---
Above text where applicable is (c) Copyleft 1991, all rights deserved by:
UNIX:/etc/ping instantiated (Ping Huang) [INTERNET: pshuang@athena.mit.edu]