Intel DG1 GPU Teardown, Failed Benchmarks, and Why It Won't Work on Most Systems
#1
Quote:
[Image: eaJAozAG8Kmjn8w8XSgqSi-1024-80.jpg.webp]

DG1 in the nude

German publication Igor's Lab has nailed a world-exclusive look at Intel's DG1 discrete graphics card. The chipmaker showcased the DG1 last year at CES 2020, running Warframe, but Intel's entry-level Iris Xe development graphics cards are exclusively available to system integrators and OEMs. In fact, Intel has put barriers in place to make sure that the DG1 only works on a handful of selected systems, so you can't simply pull the DG1 out of an OEM system and run it in another PC. The teardown images help explain why. 

Igor Wallossek managed to get his hands on a complete OEM system with the original DG1 SDV (Software Development Vehicle). In order to protect his sources, Wallossek only shared the basic specifications of the system, which include a Core i7 non-K processor and a Z390 mini-ITX motherboard.

First up, let's look at why the card won't work on most motherboards. 

Intel DG1

Intel has limited support for the card to a handful of OEM systems and motherboard chipsets, sparking speculation about why the company isn't selling the cards on the broader retail market. It turns out there's a plausible technical explanation. 

Hardware hacker Cybercat 2077 (@0xCats) recently tweeted that the DG1 cards lack the EEPROM chip that holds the firmware, largely because they were originally designed for laptops and thus don't have the SPI lines required for the connection. These EEPROM chips are present on the quad-GPU XG310 cards for data centers that use the same graphics engines, but as we can see in the naked PCB shot from Igor's Lab above, those same chips aren't present on the DG1 board. 

According to Cybercat 2077, that means the card's firmware has to be stored on the motherboard, hence the limited compatibility. Intel hasn't confirmed this hypothesis, but it makes perfect sense. 
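One way to see the practical consequence of a missing firmware EEPROM: per the PCI specification, a card's expansion ROM (the blob such a chip would hold) always begins with a fixed 0x55 0xAA signature. The sketch below is illustrative only; the header check follows the PCI spec, but the DG1 behavior described in the comments is an assumption based on Cybercat 2077's hypothesis, not something Intel has confirmed.

```python
# PCI expansion ROMs start with a fixed 0x55 0xAA signature (PCI spec).
# A minimal check of whether a ROM blob read from a card looks valid.
def has_expansion_rom(rom_bytes: bytes) -> bool:
    """Return True if the blob begins with the PCI option-ROM signature."""
    return len(rom_bytes) >= 2 and rom_bytes[0] == 0x55 and rom_bytes[1] == 0xAA

# A card with on-board firmware would expose a blob like this...
print(has_expansion_rom(bytes([0x55, 0xAA]) + bytes(510)))  # True
# ...while a card whose firmware lives on the motherboard, as hypothesized
# for the DG1, would expose nothing readable here.
print(has_expansion_rom(b""))                                # False
```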

The DG1 SDV reportedly features a DirectX 12 chip produced on Intel's 10nm SuperFin process node and checks in with 96 Execution Units (EUs), which amounts to 768 shaders. That's 20% more shaders than the cut-down version that Asus and other partners will offer. The DG1 features 8GB of LPDDR4 memory with a 2,133 MHz clock speed. The memory is reportedly connected to a 128-bit interface, and the card supports PCIe 4.0, albeit limited to x8 speeds. 
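The shader count and memory figures above can be sanity-checked with some quick arithmetic. This is a back-of-the-envelope sketch: the 8 ALUs per Xe-LP EU is standard for the architecture, the 80-EU retail variant is inferred from the "20% more shaders" figure, and the bandwidth estimate assumes LPDDR4 transfers data on both clock edges.

```python
# Back-of-the-envelope numbers for the DG1 SDV, based on the figures above.
ALUS_PER_EU = 8          # each Xe-LP Execution Unit contains 8 shader ALUs

sdv_eus = 96
retail_eus = 80          # cut-down retail variant (inferred from the 20% figure)

sdv_shaders = sdv_eus * ALUS_PER_EU        # 96 * 8 = 768
retail_shaders = retail_eus * ALUS_PER_EU  # 80 * 8 = 640
extra = sdv_shaders / retail_shaders - 1   # 0.20 -> "20% more shaders"

# Memory bandwidth, assuming double-data-rate transfers on the 2,133 MHz clock:
clock_mhz = 2133
bus_bits = 128
bandwidth_gbs = 2 * clock_mhz * 1e6 * (bus_bits / 8) / 1e9  # ~68.3 GB/s

print(sdv_shaders, retail_shaders, round(extra, 2), round(bandwidth_gbs, 1))
```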

At idle, the graphics card runs at 600 MHz with a power consumption of 4W. The fans spin up to 850 RPM and keep the graphics card relatively cool at 30 degrees Celsius. Under full load, the clock speed jumps to 1,550 MHz, and the power consumption scales to 20W. In terms of thermals, the graphics card's operating temperature reached 50 degrees Celsius with the fan spinning at 1,800 RPM. Wallossek thinks that the DG1's total power draw should be between 27W and 30W.

The DG1 is equipped with a light alloy cover with a single 80mm PWM cooling fan and an aluminum heatsink underneath. Design-wise, the DG1 leverages a two-phase power delivery subsystem that consists of a buck controller and one PowerStage for each phase. The Xe GPU is surrounded by four 2GB Micron LPDDR4 memory chips.

Given the low power consumption, the DG1 draws what it needs from the PCIe slot alone and doesn't depend on any PCIe power connectors. Display outputs include one HDMI 2.1 port and three DisplayPort outputs. 
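The claim that no auxiliary power connector is needed follows directly from the numbers: the PCIe CEM specification allows a standard full-size slot to deliver up to 75W, comfortably above even the top of Wallossek's estimate. A trivial check, assuming his upper bound of 30W:

```python
# A standard full-size PCIe slot can supply up to 75 W (PCIe CEM spec),
# so even the upper end of the 27-30 W estimate leaves ample headroom.
SLOT_LIMIT_W = 75
estimated_draw_w = 30   # Wallossek's upper estimate for the DG1

headroom_w = SLOT_LIMIT_W - estimated_draw_w
print(headroom_w)  # 45
```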

However, Wallossek noted that while you can get an image from the HDMI port, it causes system instability. He thinks that the firmware and driver prevent you from establishing a direct connection with the DG1, which explains why Intel recommends using the motherboard display outputs instead. The DG1 in Wallossek's hands is a test sample. Despite the many driver updates, the graphics card is still finicky, and its display outputs are unusable.

The DG1's performance should be roughly on par with Nvidia's GeForce GT 1030, but there are no benchmarks or tests to support this claim. Wallossek couldn't provide any, either. Apparently, benchmarks simply crash the system or end up in an infinite loop. Wallossek could only get AIDA64's GPGPU benchmark to run, but that doesn't tell us anything meaningful about graphics performance.
...
Continue Reading
[-] The following 1 user says Thank You to harlan4096 for this post:
  • silversurfer

