
Myths and realities of integrated graphics: did they fulfill their promise?

When people talk about gamers, they usually have in mind those high-performance PCs that run games at very high resolutions and frame rates with enormous levels of detail. Those gamers take advantage of the power of dedicated NVIDIA and AMD graphics cards, but the truth is that they are a niche segment: the vast majority of users do not need those cards.

That’s where integrated graphics come in, a segment in which both Intel and AMD have been investing for years. The promise of this alternative to dedicated graphics from NVIDIA and AMD was clear: if you are an occasional player who doesn’t need so much resolution or detail, these solutions will give you a great experience. Has that promise been fulfilled?

Intel has been working in this segment for more than a decade

The truth is that Intel has evolved remarkably in this segment. At Tom’s Hardware they reviewed the evolution of the company’s graphics solutions. In 1998 Intel launched its first dedicated graphics card, the i740, and in 1999 came the i810 (“Whitney”), followed by the i815 (“Solano”), chipsets that integrated the GPU into the Northbridge. It was the prehistory of Intel’s integrated graphics, the first time the company offered such solutions.

Later developments such as Intel Extreme Graphics – the “Extreme” was a bit exaggerated, we fear – did not last long: in 2004 came the Graphics Media Accelerator (GMA), the integrated Intel GPUs that became the absolute protagonists of the company’s strategy, even though their performance remained clearly behind that of dedicated graphics.