GeForce 8600 GT / GV-NX86T256H-ZL

Specifications of the GIGABYTE GeForce 8600 GT GPU


The GIGABYTE GeForce 8600 GT (part number GV-NX86T256H-ZL) is a desktop graphics card based on the NVIDIA GeForce 8600 GT GPU, fabricated on an 80 nm process. The card runs at a 600 MHz graphics clock and features 32 CUDA cores, 16 texture units, and 8 ROPs. It carries 256 MB of GDDR3 memory on a 128-bit bus; the memory is clocked at 720 MHz (1.44 GHz effective), giving 23.04 GB/s of memory bandwidth. The card uses a PCI Express 1.0 x16 interface and occupies a single slot. For higher gaming and 3D performance, it can be paired with a second GeForce 8600 GT in SLI mode.
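
As a quick check, the 23.04 GB/s bandwidth figure follows directly from the memory clock, the GDDR3 double-data-rate factor, and the 128-bit bus width. A minimal Python sketch of that arithmetic (the function and variable names are illustrative, not taken from this page):

# Theoretical memory bandwidth = effective clock (Hz) * bus width (bytes)
def memory_bandwidth_gb_s(clock_mhz, ddr_factor, bus_width_bits):
    effective_clock_hz = clock_mhz * 1e6 * ddr_factor  # GDDR3 transfers data twice per clock
    bus_width_bytes = bus_width_bits / 8
    return effective_clock_hz * bus_width_bytes / 1e9  # bytes/s -> GB/s

# GIGABYTE GeForce 8600 GT: 720 MHz GDDR3 (double data rate), 128-bit bus
print(memory_bandwidth_gb_s(720, 2, 128))  # prints 23.04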

More detailed specifications of the GIGABYTE GeForce 8600 GT GPU are available below.

Name / Brand / Architecture

Manufacturer: GIGABYTE
Model: GeForce 8600 GT
Part number: GV-NX86T256H-ZL
Based on: NVIDIA GeForce 8600 GT
Target market segment: Desktop
Die name: G84
Architecture: Unified Shader Architecture
Fabrication process: 80 nm
Transistors: 289 million
Bus interface: PCI-E 1.0 x16

Frequency

Graphics clock: 600 MHz

Memory specifications

Memory size: 256 MB
Memory type: GDDR3
Memory clock: 720 MHz
Memory clock (effective): 1.44 GHz
Memory interface width: 128-bit
Memory bandwidth: 23.04 GB/s

Cores / Texture

CUDA compute capability: 1.1
CUDA cores: 32
ROPs: 8
Texture units: 16
RAMDACs: 400 MHz

SLI / Crossfire

Maximum SLI options: 2-way

Video features

Maximum digital resolution: 2560 x 1600
Maximum VGA resolution: 2048 x 1536
Anti-Aliasing technologies: 16x CSAA
16x FSAA
HDMI: Via adapter
HDMI version: 1.3a

Performance

Pixel fill rate: 4.8 Gigapixels/s
Texture fill rate: 9.6 Gigatexels/s
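
Both fill-rate figures above are simply the 600 MHz graphics clock multiplied by the relevant unit count. A short Python sketch under that assumption (the function name is illustrative):

def fill_rate_gops(units, clock_mhz):
    # operations per second -> Giga-operations per second
    return units * clock_mhz * 1e6 / 1e9

print(fill_rate_gops(8, 600))   # 4.8  Gigapixels/s from 8 ROPs
print(fill_rate_gops(16, 600))  # 9.6  Gigatexels/s from 16 texture units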

External connectors

Standard display connectors: 2 x Dual-Link DVI-I
S-Video

Dimensions

Width: Single-slot

Other features / Support

Other features: HDCP
High Dynamic Range (HDR) support
PhysX
PureVideo HD
OpenGL support: 2.0
DirectX support: 10.0
Shader model: 4.0