AMD Aldebaran: Professional graphics cards get two GPU dies and 224 CUs

The new AMD professional graphics card with two GPU dies on one package is taking shape in the form of Aldebaran. Current indications point to 224 CUs, which means AMD would not ship the maximum configuration of the new Instinct-family accelerators, but would probably start one size smaller for yield reasons.

The code name Aldebaran has shown up in various software entries since the beginning of the year, the usual first sign that support for new GPUs is being prepared. These entries are steadily being expanded, and the latest finding is confirmation that the card from the Radeon Instinct family, presumably called MI200, is a multi-chip module combining at least two GPU dies on one package. This is shown by patch notes from a Senior Member of Technical Staff at AMD.

On aldebaran, only primary die fetches valid power data. Show power/energy values as 0 on secondary die. Also, power limit should not be set through secondary die.
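The behavior described in the patch note can be illustrated with a small, purely hypothetical sketch in C: the struct and function names below are made up for illustration and are not taken from the actual amdgpu driver. The sketch only shows the idea that power telemetry and power limits are routed exclusively through the primary die, while the secondary die reports zero and rejects limit changes.

#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

/* Illustrative only, not actual amdgpu code: one entry per GPU die
 * on the package; only the primary die owns valid power telemetry. */
struct gpu_die {
    bool is_primary;
    uint32_t power_uw;   /* sensor reading in microwatts */
};

/* The secondary die always reports 0, as described in the patch note. */
static uint32_t read_power_uw(const struct gpu_die *die)
{
    return die->is_primary ? die->power_uw : 0;
}

/* Power limits are only accepted through the primary die. */
static int set_power_limit_uw(const struct gpu_die *die, uint32_t limit_uw)
{
    if (!die->is_primary)
        return -1;       /* reject on the secondary die */
    (void)limit_uw;      /* a real driver would program the limit here */
    return 0;
}

int main(void)
{
    struct gpu_die primary   = { .is_primary = true,  .power_uw = 300000000u };
    struct gpu_die secondary = { .is_primary = false, .power_uw = 123456u };

    printf("primary:   %u uW\n", (unsigned)read_power_uw(&primary));
    printf("secondary: %u uW\n", (unsigned)read_power_uw(&secondary));
    printf("limit via secondary accepted: %d\n",
           set_power_limit_uw(&secondary, 250000000u) == 0);
    return 0;
}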

Full configuration should offer a few more CUs

Fittingly, there are rumors that the full configuration will be 128 CUs (8,192 FP32 ALUs) per GPU, a small step up from the current series, which offers 120 CUs and thus 7,680 shaders. With two dies, the count would rise to a maximum of 256 CUs and a remarkable 16,384 ALUs. At launch, however, AMD will probably bring a variant with slightly fewer CUs to market, a common approach for high-end GPUs and CPUs to improve yield and keep power consumption in check. That implementation would then come with two times 112 CUs and thus 14,336 FP32 ALUs.
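For reference, the ALU figures follow from the 64 FP32 ALUs per CU that CDNA uses today (120 CUs × 64 = 7,680 on the MI100); that CDNA 2 keeps this ratio is an assumption consistent with the numbers above:

128 CUs × 64 ALUs/CU = 8,192 FP32 ALUs per die
2 × 128 CUs = 256 CUs → 16,384 ALUs (full configuration)
2 × 112 CUs = 224 CUs → 14,336 ALUs (rumored launch configuration)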

New memory type HBM2e

In addition to the two GPU dies, HBM2e will also find its place on the package, as memory already does on the Instinct MI100, its direct predecessor. This memory is not per se considered energy-saving and contributes its share to the card's board power. It is not yet clear whether the memory controller and the amount of memory will change for the "MI200". The MI100 uses only four stacks of 8 GB each, while Nvidia's Ampere solution uses five stacks of 16 GB each, which at 80 GB of HBM2e represents the current maximum on the market. Nvidia could also populate the sixth stack, so far a dummy die, in a next generation, which puts pressure on AMD to act.
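For the capacities mentioned, the math is straightforward; the 96 GB figure is only the hypothetical result of populating the sixth stack, not an announced product:

MI100: 4 stacks × 8 GB = 32 GB
Nvidia Ampere (80 GB model): 5 stacks × 16 GB = 80 GB HBM2e
6 stacks × 16 GB = 96 GB (if the current dummy die were populated)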

AMD CEO Lisa Su recently stated that the CDNA 2 architecture for the professional segment will be presented later this year. The steadily growing number of entries in the corresponding software underpins this and should paint an increasingly accurate picture in the run-up to launch.