GeForce 6800 GT

The PixelView GeForce 6800 GT is a desktop graphics card built on a 130 nm fabrication process and based on the NVIDIA GeForce 6800 GT GPU. The core is clocked at 350 MHz and provides 16 pixel shaders, 16 ROPs, and 16 texture units. The card comes with 256 MB of GDDR3 memory; since the memory runs at 500 MHz (1000 MHz effective) over a 256-bit interface, peak memory bandwidth is 32 GB/s. The card uses a PCI Express 1.0 interface and occupies a single motherboard slot. 3D graphics and gaming performance of the PixelView GeForce 6800 GT can be improved by pairing it with a second identical card in SLI.

More detailed specifications of the GPU can be found below.

PixelView GeForce 6800 GT GPU specifications

Name / Brand / Architecture

Manufacturer: PixelView
Model: GeForce 6800 GT
Based on: NVIDIA GeForce 6800 GT [Compare]
Target market segment: Desktop
Die name: NV45
Architecture: Third Generation CineFX Shading Architecture
Fabrication process: 130 nm
Transistors: 222 million
Bus interface: PCI-E 1.0 x16

Frequency

Graphics clock: 350 MHz
Vertex Shader Clock: 350 MHz

Memory specifications

Memory size: 256 MB
Memory type: GDDR3
Memory clock: 500 MHz
Memory clock (effective): 1000 MHz
Memory interface width: 256-bit
Memory bandwidth: 32 GB/s
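The bandwidth figure above follows directly from the other memory specs. A minimal sketch of the arithmetic (effective clock × bus width, with GDDR3 transferring data on both clock edges, hence the doubled effective rate):

```python
def memory_bandwidth_gbs(effective_clock_mhz: float, bus_width_bits: int) -> float:
    """Peak theoretical memory bandwidth in GB/s (decimal gigabytes).

    effective_clock_mhz: effective transfer rate in MHz (for GDDR3,
                         twice the base clock, since it is double data rate)
    bus_width_bits: memory interface width in bits
    """
    bytes_per_transfer = bus_width_bits / 8   # 256-bit bus -> 32 bytes per transfer
    return effective_clock_mhz * bytes_per_transfer / 1000

# GeForce 6800 GT: 500 MHz GDDR3 -> 1000 MHz effective, 256-bit bus
print(memory_bandwidth_gbs(1000, 256))  # -> 32.0 GB/s
```

The function name and signature are illustrative, not part of any library; the formula itself matches the spec sheet's figures.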

Cores / Texture

ROPs: 16
Pixel shader processors: 16
Vertex shader processors: 6
Texture units: 16
RAMDACs: 400 MHz

SLI / Crossfire

Maximum SLI options: 2-way

Video features

Maximum digital resolution: 1920 x 1200
Maximum VGA resolution: 2048 x 1536
Anti-Aliasing technologies: Transparency AA

Performance

Pixel fill rate: 5.6 Gigapixels/s
Texture fill rate: 5.6 Gigatexels/s
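Both fill rates are simply the unit count multiplied by the graphics clock. A hedged sketch of that calculation (the helper name is illustrative only):

```python
def fill_rate_gigaunits(units: int, clock_mhz: float) -> float:
    """Peak fill rate in Giga-operations/s: one operation per unit per clock.

    For pixel fill rate pass the ROP count; for texture fill rate
    pass the texture unit count.
    """
    return units * clock_mhz / 1000

# GeForce 6800 GT: 16 ROPs / 16 texture units at 350 MHz
print(fill_rate_gigaunits(16, 350))  # -> 5.6 Gigapixels/s (and 5.6 Gigatexels/s)
```

With 16 ROPs and 16 texture units at the same clock, the two rates coincide, which is why the table lists identical numbers.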

External connectors

Standard display connectors: 2 x Single-Link DVI-I, S-Video
Power connectors: 1 x 6-pin

Dimensions

Width: Single-Slot

Other features / Support

Other features: PureVideo HD
OpenGL support: 2.1
DirectX support: 9.0c
Pixel shader model: 3.0
Vertex shader model: 3.0