
Matrox G400

From Wikipedia, the free encyclopedia

The G400 is a video card made by Matrox, released in September 1999. The graphics processor combines a 2D GUI accelerator, a video engine, and a Direct3D 6.0 3D accelerator. Codenamed "Toucan", it was a more powerful and refined version of its predecessor, the G200.

Overview

A Matrox G400 Max

The Matrox G200 graphics processor had been a successful product, competing with the various 2D & 3D combination cards available in 1998. Matrox took the technology developed for the G200, refined it, and essentially doubled it up to form the G400 processor. The new chip featured several additions, such as multiple-monitor output support, a high-performance 32-bit rendering pipeline throughout, further improved 2D and video acceleration, and a new 3D feature known as Environment Mapped Bump Mapping.

Internally the G400 is a 256-bit processor, using what Matrox calls a "DualBus" architecture. This is an evolution of G200's "DualBus", which had been 128-bit. A Matrox "DualBus" chip consists of twin unidirectional buses internally, each moving data into or out of the chip. This increases the efficiency and bandwidth of data flow within the chip to each of its functional units. G400's 3D engine consists of 2 parallel pixel pipelines with 1 texture unit each, providing single-pass dual-texturing capability. The Millennium G400 MAX is capable of 333 megapixels per second fillrate at its 166 MHz core clock speed. It is purely a Direct3D 6.0 accelerator and, as such, lacks support for the later hardware transform and lighting acceleration of Direct3D 7.0 cards.
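
The quoted fillrate follows directly from the pipeline count and clock: two pixels per clock at 166 MHz works out to roughly 333 megapixels per second.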

The chip's external memory interface is 128-bit and is designed to use either SDRAM or SGRAM. Matrox released G400 boards in both 16 MiB and 32 MiB versions, using both types of RAM. The slowest models are equipped with 166 MHz SDRAM, while the fastest (the G400 MAX) uses 200 MHz SGRAM. The G400 MAX had the highest memory bandwidth of any card until the release of the DDR-equipped version of the NVIDIA GeForce 256.
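
That peak bandwidth likewise follows from the bus width and memory clock: a 128-bit (16-byte) interface at 200 MHz moves about 16 B × 200 MHz ≈ 3.2 GB/s, the figure listed for the G400 MAX in the models table below.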

Perhaps the most notable feature of the G400 is its ability to drive two separate monitors displaying a single desktop. This feature, known as "DualHead", was a decisive edge for Matrox over the card's competitors at the time. DualHead offered not only desktop widening but also desktop cloning (two screens showing the same image) and a special "DVDMAX" mode which outputs the video overlay on the second monitor. Matrox's PowerDesk display drivers and control panel integrated DualHead in a flexible, functional way that became well regarded for its effectiveness. However, contrary to the video mode's name, the G400 does not provide full hardware acceleration of DVD decoding. It accelerates part of the DVD video decoding process but does not perform the inverse discrete cosine transform (IDCT) or motion compensation in hardware, the two most demanding steps of the process.

Matrox G400 Tech Demo with EMBM

The G400 chip supports, in hardware, a texture-based surface detailing method called Environment Mapped Bump Mapping (EMBM). EMBM was developed by Bitboys Oy and licensed to Matrox. EMBM was not supported by several competitors such as NVIDIA's GeForce 256 through GeForce 2, which supported only the simpler Dot3 bump mapping, but it was available on the ATI Radeon 7200. Because of this lack of industry-wide support, and its performance cost on the limited graphics hardware of the time, EMBM saw only limited use during the G400's lifetime. Only a few games supported the feature, such as Dungeon Keeper 2 and Millennium Soldier: Expendable. EMBM requires either specialized hardware within the chip for its calculations or a more flexible, programmable graphics pipeline, such as that of later DirectX 8.0 accelerators like the GeForce 3 and Radeon 8500.
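
As a rough illustration of what the hardware does per pixel, the following C sketch models the EMBM operation as exposed by Direct3D 6: a signed (du, dv) offset read from a bump map is transformed by a 2x2 matrix and added to the environment-map texture coordinates before the lookup. The toy maps, helper names, and nearest-neighbour sampling are illustrative assumptions, not Matrox's actual implementation, which operates on fixed-point texel data in hardware.

#include <stdio.h>

#define W 4
#define H 4

/* Toy 4x4 environment map (grey levels) and bump map of signed (du, dv) offsets. */
static const float env[H][W] = {
    {0.1f, 0.2f, 0.3f, 0.4f},
    {0.2f, 0.4f, 0.6f, 0.8f},
    {0.3f, 0.6f, 0.9f, 1.0f},
    {0.4f, 0.8f, 1.0f, 1.0f}
};
static const float bump[H][W][2] = {
    {{0, 0}, {0, 0}, {0, 0},       {0, 0}},
    {{0, 0}, {0, 0}, {0.3f, 0.3f}, {0, 0}},
    {{0, 0}, {0, 0}, {0, 0},       {0, 0}},
    {{0, 0}, {0, 0}, {0, 0},       {0, 0}}
};

/* Nearest-neighbour texel lookup, with indices clamped to the map edges. */
static float lookup(const float map[H][W], float u, float v)
{
    int x = (int)(u * (W - 1) + 0.5f), y = (int)(v * (H - 1) + 0.5f);
    if (x < 0) x = 0; if (x > W - 1) x = W - 1;
    if (y < 0) y = 0; if (y > H - 1) y = H - 1;
    return map[y][x];
}

/* One EMBM pixel: m[] is the 2x2 bump-environment matrix (D3D's BUMPENVMAT00..11). */
static float embm_pixel(float bu, float bv, float eu, float ev, const float m[4])
{
    int x = (int)(bu * (W - 1) + 0.5f), y = (int)(bv * (H - 1) + 0.5f);
    float du = bump[y][x][0], dv = bump[y][x][1];
    /* Perturb the environment-map coordinates by the transformed offsets, then sample. */
    return lookup(env, eu + m[0] * du + m[1] * dv,
                       ev + m[2] * du + m[3] * dv);
}

int main(void)
{
    const float m[4] = {1.0f, 0.0f, 0.0f, 1.0f};  /* identity perturbation matrix */
    printf("unperturbed: %.2f  perturbed: %.2f\n",
           lookup(env, 0.5f, 0.25f),
           embm_pixel(0.5f, 0.25f, 0.5f, 0.25f, m));
    return 0;
}

With the sample data the perturbed lookup lands on a brighter part of the environment map, which is the effect used to suggest rippled or embossed reflective surfaces.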

The G400's rendering pipeline uses what Matrox called "Vibrant Color Quality 2" (VCQ2), in which all internal 3D calculations are performed with 32-bit precision. The goal was to prevent dithering and other artifacts caused by inadequate precision in intermediate calculations. The result was the best-quality 16-bit and 32-bit color modes available at the time.
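
A small, purely illustrative C example of the kind of artifact VCQ2 avoids: if intermediate blending results are repeatedly re-quantized to a 5-bit-per-channel (16-bit RGB565) value instead of being kept at higher precision until the final framebuffer write, rounding error accumulates. The blend factors and channel values here are arbitrary assumptions.

#include <stdio.h>

/* Quantize a [0, 1] channel value to 5 bits (one RGB565 channel) and back. */
static unsigned to5(float c)      { return (unsigned)(c * 31.0f + 0.5f); }
static float    from5(unsigned q) { return q / 31.0f; }

int main(void)
{
    float hi = 0.37f, lo = 0.37f;  /* same starting red channel value */
    for (int i = 0; i < 8; ++i) {
        hi = hi * 0.9f + 0.05f;                  /* blend kept at full precision       */
        lo = from5(to5(lo * 0.9f + 0.05f));      /* blend re-quantized after each pass */
    }
    /* The high-precision path is quantized only once, for the final write. */
    printf("final 5-bit value, high precision: %u, low precision: %u\n", to5(hi), to5(lo));
    return 0;
}

Run repeatedly across neighbouring pixels, that kind of accumulated rounding shows up as the banding and dithering artifacts the text describes.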

Matrox was known for the quality of its analog display output on prior cards, and the G400 is no exception. The G400 was a benchmark for signal quality for several years, significantly outperforming some competitors (notably pre-GeForce4 NVIDIA cards). Where many cards produced blurry output, especially as resolution and refresh rate increased, the Matrox cards delivered very sharp and clear images.

The G400 was the first Matrox board compatible with AGP 4X. Most (revision A) G400 boards support only 2X mode, but later revisions (revision B) are fully 4X compliant and run at the higher speed if the motherboard supports it as well.

Performance


The G400 was known for being particularly dependent on the host system's CPU for high 3D performance. This was attributed both to its architecture and to the poor drivers it relied on for much of its life (especially the OpenGL ICD). On the hardware side, the G400's triangle setup engine, somewhat ironically called the "Warp Engine", was slower than its counterparts on competing cards. The Warp Engine was, however, programmable, which theoretically enhanced the flexibility of the chip. Matrox never described the functionality of this component in depth, so little is known about it.

As noted above, the G400 suffered at launch from driver problems. While its Direct3D performance was respectable, its OpenGL installable client driver (ICD) was very poor, a situation similar to the older G200's near-total lack of credible OpenGL support. Matrox made clear that it was committed to supporting OpenGL, however, and development progressed rapidly. Like the G200, the G400 initially shipped with an OpenGL-to-Direct3D wrapper driver that translated an application's OpenGL calls into Direct3D, a slow and buggy solution. A native OpenGL driver called "TurboGL" was later released, but it supported only a handful of popular games of the time (e.g. Quake III Arena). TurboGL was an interim measure to improve performance as quickly as possible while a fully functional OpenGL ICD was developed; because it did not support all OpenGL applications, it was essentially a "mini ICD", much like 3dfx had used with its Voodoo boards. TurboGL included support for then-new SIMD instruction sets from AMD and Intel, including SSE and 3DNow!. In mid-2000 the G400 received a fully compliant OpenGL ICD that offered capable performance in most OpenGL software. The G400 continued to receive official driver updates into 2006.

Even with the initial driver difficulties, the Matrox G400 was very competitive. Its 2D and Direct3D performance held up well against the NVIDIA RIVA TNT2, 3dfx Voodoo3, and ATI Rage 128 Pro. In fact, prior to the release of the NVIDIA GeForce 256, which supported Direct3D 7.0 transform and lighting acceleration, the Millennium G400 MAX was a respectable Direct3D card, competitive with the Voodoo3 3500 and TNT2 Ultra. 3dfx had an edge in some games with its low-overhead Glide API, and NVIDIA long held the lead in OpenGL performance.

Marvel G400-TV – Zoran chip


Matrox stopped supporting the Marvel G400-TV early because there was no way to make it fully functional under Windows 2000. The problem lay with the Zoran chip used for hardware MJPEG video compression on the Marvel G400 card. Matrox spent several months trying to produce stable drivers, without success. A Matrox user going by the name Adis hacked the original drivers to make the card work under Windows 2000.[1][2][3] The driver was later updated for Windows XP, and then for Windows Server 2003. Video capture was possible, but the drivers remained based on Video for Windows (VfW). Hardware MJPEG capture could be unstable, whereas software compression with a good video codec gave much better results. No WDM drivers were ever released for this card.

Matrox G450


In the fall of 2000, Matrox introduced the G450 chip (codenamed Condor) as a successor to the G400 line. As the G250 had been to the G200, the G450 was primarily a die shrink of the G400 core from a 250 nm to a 180 nm semiconductor fabrication process. Shrinking the core reduces costs, because more chips are produced per wafer, and it gave Matrox the opportunity to fix earlier errata and to trim or add functionality. Matrox clocked the G450 core at 125 MHz, the same as the plain G400. Overclocking tests showed that the core could not reach higher speeds than the G400 even though it was manufactured on a newer process.[4]
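
As a rough rule of thumb, die area scales with the square of the feature size, so a straight shrink from 250 nm to 180 nm yields dice about (180/250)² ≈ 0.52 times as large, allowing nearly twice as many chips per wafer before yield effects are considered.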

Perhaps the biggest addition in the G450 was that Matrox moved the previously external second RAMDAC, used for the second monitor output (DualHead), into the G450 chip itself. The two RAMDACs still ran at different speeds, with the primary at 360 MHz and the secondary at only 230 MHz, so the primary display could run at much higher resolutions and refresh rates than the secondary, as had also been the case with the G400. The G450 had native support for TMDS signaling, and thus DVI, but this was not a standard connector; boards shipped with dual analog VGA connectors.

The G450 was adapted to use a DDR SDRAM memory interface instead of the single-data-rate (SDR) SGRAM and SDRAM used on the G400. This allowed Matrox to switch to a 64-bit memory bus and still match the previous memory bandwidth by clocking the DDR RAM at the same 166 MHz. A 64-bit bus reduces board complexity (and cost) because fewer traces are needed, and the pin count of the graphics processor can be significantly reduced if the chip is designed only for a 64-bit bus. However, DDR has a higher inherent latency than SDR at the same bandwidth, so performance dropped somewhat.[4]
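
The arithmetic works out even: a 64-bit (8-byte) DDR bus at 166 MHz transfers data on both clock edges, giving roughly 8 B × 166 MHz × 2 ≈ 2.7 GB/s, the same as the G400's 128-bit (16-byte) single-data-rate bus at 166 MHz.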

The new G450 again supported AGP 4X, like some later-production G400 boards. The 3D capabilities of the G450 were identical to those of the G400. Because of the identical core clock and the DDR memory's lower effective performance, the G450 was slower than the G400 in games.[5]

The Marvel G450 eTV not only had a TV tuner but also served as the launch platform for Matrox's new eDualHead dual-display enhancement, which added features to DualHead that worked with Internet Explorer to display web pages across both screens at once.[6]

Matrox G550


The MGA-G550 processor added a second pixel pipeline, hardware transform and lighting, and the HeadCasting Engine, a hardware implementation of a vertex shader used to accelerate matrix palette skinning. It does this by expanding the 96 constant registers specified by DirectX 8.0 to a total of 256. The feature is nevertheless inaccessible through the DirectX driver; Matrox supported HeadCasting only through the bundled Matrox Digimask software, which never became popular.[7]
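
For context, the following C sketch shows the matrix palette skinning operation that such a vertex unit accelerates: each vertex is transformed by several bone matrices drawn from a palette (held in the GPU's constant registers) and the results are blended by per-vertex weights, so a larger constant store allows more bones per draw call. The structures and function here are illustrative assumptions, not Matrox's hardware or driver interface.

typedef struct { float m[3][4]; } Bone;   /* 3x4 affine bone matrix        */
typedef struct { float x, y, z; } Vec3;   /* object-space vertex position  */

/* Blend a vertex against up to n palette entries; `palette` stands in for the
   constant-register file, while bone indices and weights come from the vertex. */
static Vec3 skin_vertex(Vec3 v, const Bone *palette,
                        const int *index, const float *weight, int n)
{
    Vec3 out = {0.0f, 0.0f, 0.0f};
    for (int i = 0; i < n; ++i) {
        const float (*m)[4] = palette[index[i]].m;
        out.x += weight[i] * (m[0][0]*v.x + m[0][1]*v.y + m[0][2]*v.z + m[0][3]);
        out.y += weight[i] * (m[1][0]*v.x + m[1][1]*v.y + m[1][2]*v.z + m[1][3]);
        out.z += weight[i] * (m[2][0]*v.x + m[2][1]*v.y + m[2][2]*v.z + m[2][3]);
    }
    return out;
}

With 256 four-float constant registers rather than 96, a palette can hold considerably more 3x4 bone matrices at once, which is what makes skinning a whole character in a single pass practical.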

On July 13, 2005, Matrox Graphics Inc. announced the availability of the Millennium G550 PCIe, the world's first PCI Express x1 graphics card.[8] The card uses a Texas Instruments XIO2000 bridge chip to provide PCI Express support.[9]

Unreleased products


Findings within a release of Matrox graphics drivers (MGA64.sys v4.77.027) mentioned a never-released Matrox Millennium G800.[10][11] The MGA-G800, codenamed Condor 2, would have been clocked at 200 MHz with 200 MHz DDR memory (6.4 GB/s of bandwidth). The chip had three pixel pipelines with three texture units each, and a hardware transform and lighting unit capable of processing 20–30 million triangles per second. Further speculation included a memory controller supporting DDR SDRAM and DDR FC-RAM, DirectX 8.0 compliance, and a faster version running at 250 MHz. These specifications are somewhat reminiscent of the Matrox Parhelia, a four-pipeline DirectX 8 GPU with four texture units per pipeline.

Models

| Board Name | Core Type | Process | Core (MHz) | Memory (MHz) | Pipe Config | T&L | Memory Interface | Notes |
|---|---|---|---|---|---|---|---|---|
| Millennium G400 | Toucan | 250 nm | 125 | 166 | 2x1 | N | 128-bit | 32 MiB SGRAM or 16 MiB SGRAM/SDRAM |
| Millennium G400 MAX | Toucan | 250 nm | 150 | 200 | 2x1 | N | 128-bit | 32 MiB SGRAM. Needs fan. Highest memory bandwidth (3.2 GB/s) until GeForce 256 DDR. |
| Marvel G400-TV | Toucan | 250 nm | 125 | 166 | 2x1 | N | 128-bit | 16 MiB SGRAM. Video capture & TV tuner. |
| Millennium G450 | Condor | 180 nm | 125 | 166 | 2x1 | N | 64-bit | DDR SDRAM. Integrated second RAMDAC into core. TMDS/DVI option. |
| Marvel G450 eTV | Condor | 180 nm | | | 2x1 | N | 64-bit | TV tuner. eDualHead. |
| Millennium G550 | Condor | 180 nm | 125 | 166 | 2x2x1 | Y | 64-bit | 32 MiB DDR SDRAM |

References

  1. ^ "Matrox user forum". Matrox.[permanent dead link]
  2. ^ Adis (May 3, 2004). "Matrox Marvel G400-TV Support page". www.adis.szm.com. Archived from the original on 2004-07-28.
  3. ^ lordsmurf (August 6, 2010). "Matrox G400 Windows XP capture drivers [DOWNLOAD]". Forum > Digital Video > Video Project Help > Capture, Record, Transfer. The Digital FAQ / digitalFAQ Forum. Archived from the original on 2024-04-17. Retrieved 2024-09-23.
  4. ^ a b Matrox Millennium G450
  5. ^ Matrox Millennium G450 Review Archived December 1, 2005, at the Wayback Machine
  6. ^ Matrox G450e-TV Review – Page 1 – Introduction & Specs
  7. ^ Matrox Millennium G550
  8. ^ Matrox Graphics – Matrox announces world's first PCI Express x1 graphics card
  9. ^ X-bit labs – Matrox Unveils World’s First PCI Express x1 Graphics Card [UPDATED] Archived 2006-01-12 at the Wayback Machine
  10. ^ Matrox Millennium G800?
  11. ^ Specs (?) Matrox G800, 3dfx Specter und nVidia NV20/NV25