Four Multi-GPU Z77 Boards from $280-$350 - PLX PEX 8747 featuring Gigabyte, ASRock, ECS and EVGA
by Ian Cutress on August 22, 2012 9:15 AM EST

Gigabyte G1.Sniper 3 Overview
Extrapolating up from our Gigabyte Z77X-UD5H review to the G1.Sniper 3, we should be in for a real treat. In our testing the G1.Sniper 3 certainly threw up a few surprises, and given its feature set it is one of the more competitive motherboards in this combined review in terms of price versus functionality.
Visually, the G1.Sniper 3 has a good balance of green and black on the motherboard, although this aesthetic may be lost inside a case packed with PCIe devices – the intended market for this product. The G1.Sniper 3 is aimed at gamers, specifically those who want an Ivy Bridge platform with two or more GPUs. Thanks to the PLX chip, it runs four-way GPU setups at x8/x8/x8/x8, and is wired such that dual-GPU use retains x16/x16.
The G1.Sniper 3 currently retails for $280 on Newegg, and for the money there is a lot of added functionality that people will use. We have a Killer NIC for gaming, an Intel NIC for networking features, a Creative CA0132 audio chip, an mSATA port, a total of ten USB 3.0 ports, all four video outputs, and a PS/2 port, all wrapped up in an E-ATX form factor (one inch wider than ATX).
The package really shines when we open the box, which contains a WiFi PCIe x1 card, a dual antenna, eight SATA cables, a USB 3.0 front panel and rigid SLI connectors. There is a small question mark over the choice to include a WiFi card, given that the dual Killer/Intel NIC arrangement already adds to the cost.
Performance also gives us an extra point to think about. Some Z77 manufacturers are taking advantage of what has been called 'MultiCore Enhancement', where instead of running an i7-3770K at 39x/39x/38x/37x under 1/2/3/4-threaded load, the CPU runs at 39x/39x/39x/39x. Gigabyte does this on its main channel boards, the UD3H and UD5H. It goes a step further on the G1.Sniper 3 by applying a default 40x multiplier at any load (the CPU still idles at 16x). Out of the box, this gives the G1.Sniper 3 essentially a clean sweep in all our CPU tests, and it sometimes shows up in our GPU testing as well. Without the user touching a single setting, the G1.Sniper 3 is the first board we have seen to ship with a 'better than the highest Turbo multiplier' overclock.
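To put those multipliers into perspective, here is a minimal sketch (not vendor code; it assumes the i7-3770K's stock 100 MHz base clock) of the effective frequency each scheme produces as the threaded load increases:

```python
# Minimal sketch: effective CPU frequency under each multiplier scheme,
# assuming the i7-3770K's stock 100 MHz base clock (BCLK).
BCLK_MHZ = 100

multiplier_tables = {
    "Intel stock Turbo":     [39, 39, 38, 37],  # multipliers at 1/2/3/4 threads loaded
    "MultiCore Enhancement": [39, 39, 39, 39],
    "G1.Sniper 3 default":   [40, 40, 40, 40],  # still idles at 16x when unloaded
}

for name, multipliers in multiplier_tables.items():
    freqs_ghz = [m * BCLK_MHZ / 1000 for m in multipliers]
    print(f"{name:22s}: " + ", ".join(f"{f:.1f} GHz" for f in freqs_ghz))
```

Under the stock scheme a fully threaded load settles at 3.7 GHz, whereas the G1.Sniper 3 holds 4.0 GHz regardless of load, which is where its out-of-the-box lead in CPU tests comes from.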
Visual Inspection
The first thing to notice about the Gigabyte G1.Sniper 3 is the size of the board. It is one of two boards in this roundup that uses the E-ATX form factor, which adds an extra inch of width to the motherboard. This gives Gigabyte room for a larger chipset heatsink if it wanted, more controllers, or an improved layout. Real estate is often at a premium on a high-end product, so jumping up to the next motherboard size can make sense, given that most of the gamers this product is aimed at will have full-sized ATX cases.
The green and black livery is accented by a gold skull and knife on the chipset heatsink. All the heatsinks are solid and finned, each designed to channel air in one direction to aid cooling, and they are connected by heatpipes to distribute heat away from the warm areas. The socket area is surrounded on three sides by these heatsinks, which measure ~30mm up from the PCB at their highest point, though they stay outside the Intel minimum designated socket area. The socket has access to three fan headers nearby – a 4-pin system fan header at the top left near the 8-pin CPU power connector, a 4-pin CPU fan header to the right of the top heatsink, and a second 4-pin system fan header next to the two-digit debug display on the right.
Along the right hand side of this six-layer PCB (a number on the reverse of all Gigabyte boards tells us the layer count) is a large number of connectors that Gigabyte has added using the extra PCB space. At the top is a trio of power/reset/ClearCMOS buttons, with the power button being big and red. I would like Gigabyte in the future to make the reset button more distinct from the ClearCMOS button as well, so that enthusiasts do not accidentally clear their BIOS settings when they mean to reset the system.
Underneath this is a two-digit debug display, useful for diagnosing POST issues, and a series of voltage read points for users willing to hook up their own meters. Beside the 24-pin ATX power connector is a USB 3.0 header, powered by a nearby VIA chip; this header is in the perfect place for the USB 3.0 front panel included in the box, and the VIA chip also powers the USB 3.0 header at the bottom right of the board. Beside the SATA ports is a SATA power connector, for supplying extra power to multi-GPU setups. I prefer having this type of connector here, as opposed to the molex or 6-pin connectors we sometimes find just above the PCIe slots.
Gigabyte has thankfully split the SATA ports by color, making them easier to understand – the white ports are the chipset SATA 6 Gbps ports, the black ports are the chipset SATA 3 Gbps ports, and the grey ports are SATA 6 Gbps ports powered by two Marvell controllers. The mSATA port just beside the normal SATA ports is shared with one of the chipset SATA 3 Gbps ports (SATA2_5), so this is something to bear in mind when using an mSATA device.
The bottom of most motherboards is often populated with USB 2.0 headers (the Z77 chipset allows plenty of them) or power/reset buttons, but Gigabyte takes a slightly different approach here. Gigabyte uses a VIA VT6308P chip (the large one at the bottom left) to power two IEEE1394 headers, should anyone require these ports. Alongside these are two USB 2.0 headers, the front panel header block, a BIOS switch (to toggle between the two onboard BIOSes), a pair of fan headers, and a TPM header. As discussed in previous Gigabyte reviews, Gigabyte likes to include TPM support as it is often requested by their business partners; how useful it is on their top-of-the-range gaming board, I am not sure.
The PCIe slots are what the motherboards in today's review are all about, and Gigabyte marks the ones intended for GPUs in bright green. Using the PLX PEX 8747 chip underneath the middle heatsink, the PCIe lanes are split as follows:
One GPU: x16/-/-/-
Two GPUs: x16/-/x16/-
Three GPUs: x16/-/x8/x8* or x8/x8/x16/-
Four GPUs: x8/x8/x8/x8*
*These configurations use a GPU in the bottom PCIe slot. If that GPU is dual slot, it can restrict some of the space available for the headers along the bottom edge, such as my power connectors. The allocations are summarized in the sketch below.
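As an illustrative sketch only (none of this is Gigabyte code), the allocation table above can be expressed as a simple lookup from GPU count to per-slot lane widths:

```python
# Illustrative sketch: PLX PEX 8747 lane allocation on the G1.Sniper 3,
# mapping GPU count to lane widths of the four green x16-length slots
# (top to bottom; 0 means the slot is left empty).
PLX_LANE_ALLOCATION = {
    1: (16, 0, 0, 0),
    2: (16, 0, 16, 0),
    3: (16, 0, 8, 8),   # the alternative population is (8, 8, 16, 0)
    4: (8, 8, 8, 8),
}

def describe(gpu_count: int) -> str:
    """Render an allocation in the article's x16/-/x8/x8 style."""
    lanes = PLX_LANE_ALLOCATION[gpu_count]
    return "/".join(f"x{n}" if n else "-" for n in lanes)

for count in sorted(PLX_LANE_ALLOCATION):
    print(f"{count} GPU(s): {describe(count)}")
```

The key point is that the PLX switch always feeds the GPUs 32 lanes in total, so no card drops below x8 even with all four slots populated.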
The audio on the G1.Sniper 3 is one point that sets this board apart from the others in this review – Gigabyte has chosen a Creative CA0132 chip, separated into its own area of the PCB with an EM shield to help reduce interference. The chip is also paired with audio-specific Nichicon capacitors, which are reputed to be better than the standard capacitors used for onboard audio.
The rear IO panel is covered in a myriad of features – from left to right we have a combination PS/2 port, two USB 3.0 ports, D-Sub, DVI-D, DisplayPort, HDMI, four more USB 3.0 ports, an Intel 82579V NIC, a Qualcomm Atheros Killer NIC (2201-B), audio jacks and an optical S/PDIF output. Gigabyte has included all four video outputs, although D-Sub and PS/2 are largely legacy items at this point. There is no USB 2.0 on the rear, which can make life difficult as only some of the USB 3.0 ports work on a first install of an OS – two of the six should work, as they come from the chipset. One other point: Gigabyte has two VIA USB 3.0 controllers on board providing eight USB 3.0 ports, and the chipset could provide four more for twelve in total, but Gigabyte only wires up two of the chipset ports, as per their chipset diagram.
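A quick back-of-the-envelope tally of those numbers (assuming, as the block diagram suggests, that each VIA controller supplies four ports) shows where the board's ten USB 3.0 ports come from:

```python
# Port tally based on the figures in the text above.
chipset_available = 4   # USB 3.0 ports the Z77 chipset can provide
chipset_wired = 2       # chipset ports Gigabyte actually routes out
via_controllers = 2     # VIA USB 3.0 controllers on the board
ports_per_via = 4       # assumption: four ports per VIA controller

theoretical_total = chipset_available + via_controllers * ports_per_via  # 12
actual_total = chipset_wired + via_controllers * ports_per_via           # 10
print(f"theoretical maximum: {theoretical_total}, on this board: {actual_total}")
```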
Board Features
Gigabyte G1.Sniper 3
Price | $280
Size | E-ATX
CPU Interface | LGA-1155
Chipset | Intel Z77
Memory Slots | Four DIMMs, supporting up to 32GB DDR3 1066-2666 MHz, Non-ECC
Video Outputs | D-Sub, DVI-D, HDMI, DisplayPort
Onboard LAN | Qualcomm Atheros Killer E2200, Intel 82579V GbE
Onboard Audio | Creative CA0132
Expansion Slots | 2 x PCIe 3.0 x16 (x8 when the slots underneath are occupied), 2 x PCIe 3.0 x8, 2 x PCIe 2.0 x1, 1 x PCI
Onboard SATA/RAID | 2 x SATA 6 Gbps (Intel), RAID 0/1/5/10; 4 x SATA 3 Gbps (Intel), RAID 0/1/5/10; 1 x mSATA 3 Gbps (Intel), shared with SATA2_5; 4 x SATA 6 Gbps (Marvell 9172), RAID 0/1
USB | 2 x USB 3.0 (Intel) [2 back panel]; 8 x USB 3.0 (VIA VL810) [4 back panel, 4 via headers]; 4 x USB 2.0 (Intel) [4 via headers]
Onboard | 6 x SATA 6 Gbps, 4 x SATA 3 Gbps, 1 x mSATA 3 Gbps, 5 x fan headers, 2 x USB 3.0 headers, 2 x USB 2.0 headers, 2 x IEEE1394 headers, power/reset buttons, ClearCMOS button, BIOS switch, voltage measurement points, TPM header
Power Connectors | 1 x 24-pin ATX, 1 x 8-pin CPU, 1 x SATA power connector (for PCIe)
Fan Headers | 1 x CPU, 4 x SYS
IO Panel | 1 x PS/2 combination port, 6 x USB 3.0, 1 x D-Sub, 1 x DVI-D, 1 x HDMI, 1 x DisplayPort, 2 x network ports (Intel, Killer), 1 x optical S/PDIF output, audio jacks
Warranty Period | 3 Years
Product Page | Link
The main appeal of the Gigabyte G1.Sniper 3 is functionality for the price. For $280 of your hard earned cash, we get dual NICs (one Intel, one Killer), ten USB 3.0 ports, ten SATA ports, 4-way SLI and CrossFireX through a PLX PEX 8747 chip, every video output the iGPU offers, TPM support, and improved audio through the Creative CA0132 chip. On the legacy side, we have a PS/2 port, D-Sub, a PCI slot and IEEE1394 headers.
24 Comments
ultimatex - Wednesday, August 22, 2012 - link
I got this MOBO from Newegg the first day they had it available, I couldn't believe the price since it offered x8/x8/x8/x8. Picked it up the first day and haven't looked back. Doesn't look as cool as the ASRock Extreme9 but it still looks good. Awesome job Gigabyte, AnandTech should have given them a Gold not Bronze though since the fan issue is a minor issue.

Arbie - Wednesday, August 22, 2012 - link
For gaming, at least, how many people are really going to build a 2x GPU system? Let alone 3x or 4x. There are so few PC games that can use anything more than one strong card AND are worth playing for more than 10 minutes. I actually don't know of any such games, but tastes differ. And some folks will have multi-monitor setups, and possibly need two cards. But overall I'd think the target audience for these mobos is extremely small.

Maybe for scientific computing?
Belard - Wednesday, August 22, 2012 - link
Yep.... considering that most AAA PC games are just ports from consoles... having 3-4 GPUs is pointless. The returns get worse after the first 2 cards.

Only those with 2~6 monitors can benefit with 2-3 cards.
Also, even $80 Gigabyte boards will do 8x x 8x SLI/CF just fine.
But hey, someone wants to spend $300 on a board... more power to them.
cmdrdredd - Wednesday, August 22, 2012 - link
"Only those with 2~6 monitors can benefit with 2-3 cards."Oh really? 2560x1440 on a single card is garbage in my view. I am not happy with 50fps average.
rarson - Wednesday, August 22, 2012 - link
If you're going multi-GPU on a single monitor, you're wasting money.

Sabresiberian - Wednesday, August 22, 2012 - link
Because everyone should build to your standards, O god of all things computer.

Do some reading; get a clue.
Steveymoo - Thursday, August 23, 2012 - link
Incorrect. If you have a 120 Hz monitor, 2 GPUs make a tonne of difference. Before you come back with a "no one can see 120 Hz" jibe: that is also incorrect.... My eyes have orgasms every once in a while when you get those ultra detail 100+ fps moments in Battlefield, that look great!
von Krupp - Friday, August 24, 2012 - link
No. Metro 2033 is not happy at 2560x1440 with just a single HD 7970, and neither are Battlefield 3 or Crysis. The Total War series also crawls at maximum settings.

I bought the U2711 specifically to take advantage of two cards (and for accurate colours, mind you). I have a distaste for multi-monitor gaming and will continue to have such as long as they keep making bezels on monitors.
So please, don't go claiming that multi-card is useless on a single monitor because that just isn't true.
swing848 - Monday, December 8, 2014 - link
At this date, December 2014, with maximum eye candy turned on, there are games that drop a reference AMD R9 290 below 60 fps on a single monitor at 1920x1080 [using an Intel i5-3570K at 4GHz to 4.2GHz].

Sabresiberian - Wednesday, August 22, 2012 - link
This is not 1998, there are many games built for the PC only, and even previously console-oriented publishers aren't just making ports for the PC, they are developing their games to take advantage of the goodness only PCs can bring to the table. Despite what console fanboys continue to spew, PC gaming is on the rise, and console gaming is on the relative decline.