The biggest problem with gaming on MCM is software. While Koduri might not solve it, Intel does have enough manpower to solve it. Nvidia is another alternative and they have a lot more interaction with game developers.
One thing I'd like to know is if they borrowed any AI processing features from NNP-I that would make the GPU implementation equally efficient.
Another question I had relates to the recent announcement about porting FB Glow for the NNP chips, and how that effort fits into their oneAPI. Perhaps that was just done to get their foot in the door at FB?
I thought that when people talked about Larrabee as a failure, they meant it failed as a PC graphics card competing with NVidia and the like - not that it became Xeon Phi, a higher-end specialized product rather than a direct competitor to mainstream graphics cards. I also saw Phi as cool technology, but rather out of reach as a device.
An important thing to clear up is the relationship between Gen graphics and Xe. It seems logical that Gen will be replaced, but I would even think it possible that Gen 11 borrows some of the stuff coming from Xe. I think Xe was initially referred to as Gen 12.
I'd be interested in how far up the performance scale they plan to go in the mobile/laptop market. Of course integrated will be there, but I would think they will have products out there even at the gaming level. Packaging like EMIB would be perfect for this, with dedicated memory and such.
Larrabee *was* supposed to be a consumer GPU on an add-in card. Their marketing people were even claiming that it would be competitive with the then highest-end GeForce products. In reality, the performance wasn’t remotely close to competitive (competing GPUs were multiple times faster than the hardware Intel demo’d), so the GPU plans were canned and it was repurposed and transformed into Xeon Phi
I kept up with video cards for a long time - I even had an old VooDoo 5 and many GeForce cards - but Larrabee seems to have escaped me. Back in the days it came out I was solidly on NVidia cards, but in the last decade I went mobile and, honestly, GPU cards became less important to me.
But in the late 80s and 90s, external GPUs were more important, whereas now even integrated GPUs have as much power as the external GPUs of that era.
Boy, does this guy have the managerial speak down - no wonder he has been able to land plum jobs everywhere, yet nowhere he went considers him indispensable. The interview is full of self-promotion mixed in with lots of buzzwords and some flattery, but reveals very little substance.
You are giving Intel too much publicity, fellas. It's starting to get awkward. Perhaps change the site's name to anandintelchs? Come on. Regardless of whether you're doing your job OR NOT, be less partial.
Papermaster interview probably won't happen, in much the same way as the Intel security/performance article we were promised months ago has failed to materialise - are you still waiting for Intel to answer your questions LOL? Dr Intel could have written a second security/vulnerability/performance article by now on JCC, with a third on TSA, and no doubt more down the pike.
Intel security issues are the gift that keeps on giving if you are a tech writer with a focus on CPU design, yet we get nothing - zero coverage - from Anandtech. Why is that - is it some sort of Intel "Quid Pro Quo"? Is losing your integrity and impartiality really the price to pay for tours of a fab and a few interviews?
For TSA I meant TAA ("ZombieLoad TAA"). Next up is "iTLB Multihit". Every single Intel vulnerability means ever-increasing performance penalties to work around the design flaws, sometimes up to a 40% reduction in performance. And yet not a peep from Anandtech on the subject. I do hope Anandtech benchmarks Intel vs AMD only with full mitigations these days - any results without the latest mitigations (in particular for Intel) are no longer worthy of comparison.
Why don't you have a conversation about this increasingly negative performance issue? I'm sure your readers would be very interested.
extide - Wednesday, November 20, 2019 - link
Excellent interview Ian, nice work. Now see if you can get Jim Keller :)
Ian Cutress - Wednesday, November 20, 2019 - link
You mean https://www.anandtech.com/show/13048/an-anandtech-...
? 😁
extide - Wednesday, November 20, 2019 - link
Oh snap, looks like I forgot about that one. Thanks for the link :)
eek2121 - Thursday, November 21, 2019 - link
Raja does throw a hint in there for discrete GPUs. However, if they aren't even willing to discuss a timeframe yet, even though they've announced the HPC offering and general availability, it is indeed looking like 5 years before their first discrete (non-HPC) launch. That is rather sad. If the performance is there, even AMD enthusiasts would be willing to buy an Intel GPU.
peevee - Friday, November 22, 2019 - link
But why no questions about the scale of microarchitectural differences between the 3? Are they all ground-up designs (or divergent from Gen)? Or is it just a matter of how many 64-bit ALUs they have per 32-bit ALU? Or anything in between?
Ian Cutress - Friday, November 22, 2019 - link
I asked about Gen and Xe similarities. The rest you suggest are a list of questions that he wouldn't be answering at this time.
alufan - Wednesday, November 20, 2019 - link
Ohh look, another Intel promo - what's that, 3 in a month now? Seriously, if you guys can't see how Intel is playing you then you really need to wake up. Let's see how long this stays on the front page.
Ian Cutress - Wednesday, November 20, 2019 - link
We cover Intel announcements and disclosures. We cover AMD announcements and disclosures. We cover arm announcements and disclosures. Sometimes a company has a lot at once. There's no favoritism here. At the show I also had an interview with Papermaster, which will be going up soon.
Jorgp2 - Wednesday, November 20, 2019 - link
I blame /r/pcmasterrace and /r/amd
Smartcom5 - Wednesday, November 20, 2019 - link
Don't worry, you're just doing your job Ian!
Hifihedgehog - Thursday, November 21, 2019 - link
No reason to be so diplomatic, Ian. There are flat-earthers, and then there are deluded tech fanboys. If Intel's onto something, it's good you're covering it. Healthy competition, you know. I'm still skeptical that they'll get 7nm out by their announced release date, but if they do, more power to them!
iAPX - Thursday, November 21, 2019 - link
Really great interview about Xe at this point in time. The recurring answers about memory coherency give us all a hint about what Intel is trying to solve and how it could set it apart from many other solutions.
alufan - Friday, November 22, 2019 - link
Really? Is that why this morning there are 3 articles about Intel in one form or another, up front and featured? I understand news gets released and broadcast, but you do the same job as a lot of other sites, yet this site is the only one to feature this slew. For example, where is the article about Intel's supply issues?
https://newsroom.intel.com/news/intel-supply-updat...
There has to be balance - put the negative news up as well, or frankly you start to look like you do... biased.
Ian Cutress - Friday, November 22, 2019 - link
Right on the front page in our news section. https://www.anandtech.com/show/15132/intel-publish...
If you can't be bothered to look, I can't help you.
alufan - Friday, November 22, 2019 - link
If you say so, but I'm afraid that for what substance there was, this might as well have been 3 lines: "Intel is working on a GPU, plans to release it in the next 2 years" - done. Didn't need all the waffle about Raja himself!
Johan Steyn - Thursday, December 12, 2019 - link
Ian, I'm still remembering how incredibly biased your very first Threadripper review was. It has unfortunately clouded my view of any article or review that you do. That review was so extremely bad that I'm not sure if you can be trusted.
smilingcrow - Wednesday, November 20, 2019 - link
This is a tech site and they announce new tech stuff, and Intel is one of the largest tech companies, so it kinda seems like a very good fit to me! But feel free to ignore this article if it's a bit techie for you or if it doesn't feature a company on your approved list. Yawn.
peevee - Friday, November 22, 2019 - link
"But feel free to ignore this article if it's bit techie for you"Techie, really? There are literally NO tech pieces in the article, just personal promotion of Raja.
yeeeeman - Wednesday, November 20, 2019 - link
Why the heck have people all of a sudden started to believe Intel can't design chips/GPUs/CPUs? Intel has fabrication issues, not design-talent issues. Get that straight into your head. Also, I don't understand this hatred. OK, they charged high prices in the past. But don't think AMD would still sell you amazing CPUs for peanuts if Intel went bust. Also, a lot of the hatred comes from security issues, and people don't understand that researchers focused on Intel simply because their chips are more common and their architecture is better known. If the focus shifted to AMD, be sure that Zen also has lots of security vulnerabilities - they just haven't been found yet. So yeah, you have a problem dude.
Smartcom5 - Wednesday, November 20, 2019 - link
Why do people believe Intel can't design GPUs? That's easy to answer:
The single biggest reason they couldn't keep up with AMD and nVidia in the past wasn't money, but the fact that those two, together with PowerVR (and thereby Imagination Technologies), hold the overwhelming majority of patents on graphics IP across the whole global market.
The market's graphics have been patented from top to bottom, for every fundamental, efficient principle of drawing/rendering graphics, for ages – and Intel isn't one of the companies holding any majority of them, or even a significant share; AMD, nVidia and PowerVR are.
→ Today, you just can't do graphics without having to license from AMD, nVidia and PowerVR.
And sure enough, that's what Intel is trying to avoid at all costs (for the sake of revenge?).
It isn't that they wouldn't be capable of doing proper graphics, but that Intel just isn't (legally) allowed to. No one these days can make any decent graphics without violating a good dozen of AMD's, nVidia's or PowerVR's graphics patents – or all of theirs at the same time – not even Intel. … at least not without playing dirty, I might add.
The graphics market really is virtually split between ATi/AMD, nVidia and PowerVR (in order of share), with ATi (and with that, AMD) holding the majority of those patents, about 65 per cent or so (feel free to correct me!). Qualcomm bought AMD's embedded and mobile graphics division back then, and there's a reason Qualcomm's mobile graphics implementation is called Adreno – it's just an anagram of Radeon. It's essentially AMD vs. ATI (Qualcomm) in the mobile and embedded space, though I'm not talking about notebooks here.
Something to note here is that the patent portfolio ATi/AMD holds has grown significantly since AMD bought ATi, through new patents from AMD itself (and former ATi staff now working for AMD).
𝑻𝒉𝒆 𝑩𝒊𝒈𝒈𝒆𝒓 𝑷𝒊𝒄𝒕𝒖𝒓𝒆
That's why they went after that patent cross-licensing agreement with nVidia in 2011 (which allegedly and officially was just about chipsets, so they say…), paying nVidia $1.5B of Intel money to get hold of their precious IP. However, Jen-Hsun Huang was smart and paranoid enough to see through what Intel was trying to achieve, and just joyfully held his hand open for the money. After it proved rather fruitless (for obvious reasons) for Intel, they cancelled that now-unnecessary agreement, knowing nVidia had smelled the rat.
And that's also why informed people instantly suspected that the quick side-trip away from their own iGPU on Kaby Lake-G, with its integrated AMD graphics, was just a way to get their hands on viable, precious, patented graphics IP they could reverse-engineer afterwards for their own use – get sued for patent infringement, and drag the proceedings out just long enough to come up with something new in the meantime (helped along by some highly useful, and by then necessarily de-classified, insights into the competitor's technology during the lawsuits). It's that easy.
Unfortunately, AMD – the next company Intel got in touch with on that matter – wasn't as bright (or paranoid) as nVidia. They in fact were stupid (or short-sighted/naive) enough to actually give away their precious crown jewels by teaming up with Intel on Kaby Lake-G. That being said, we shouldn't be at all surprised if Intel's adventures on Kaby Lake-G incorporating AMD's graphics IP first and foremost served as a trial balloon for getting their hands on viable patented IP (to be reverse-engineered afterwards) – or even served that very purpose against AMD alone.
That being said, even the very roots of Intel's graphics trace back to things they bought up or licensed rather than invented themselves. Take their first venture into graphics: they licensed NEC's µPD7220 in 1982 and called it the Intel 82720 Graphics Display Controller. Its performance, while good, wasn't stellar, and it became outdated rather quickly (largely because Intel needed a fairly long time to integrate even such a ready-to-use licensed graphics chip with their own chipsets).
The next stop: GE Aerospace had invented a flight-simulation system – and with it, graphics, for the obvious reason of needing it in that sector (to train Apollo astronauts to dock the Command Module and Lunar Module), I might add. That was Project Apollo. However, General Electric's infamous CEO Jack Welch was aggressively downsizing/restructuring GE at that time, and sold virtually all of its independent aerospace division, GE Aerospace, to Martin Marietta in 1992. In 1995, Martin Marietta merged with Lockheed to form Lockheed Martin.
In January 1995, Lockheed Martin re-organized their divisions and formed Real3D in order to bring their 3D experience to the civilian market. Real3D had some success, even selling some bigger designs to Sega. Later on they teamed up with Intel and Chips and Technologies in a joint venture, which gave birth to Intel's infamous dedicated i740 graphics card back in '98. The card was a major failure and Intel took it off the market within months.
Curiously enough (or not at all, for people familiar with this sort of thing…), already at the design stage of the card Intel found itself facing claims of patent and copyright infringement from Real3D. They settled those lawsuits by buying a roughly 20% stake in Real3D for a fair bunch of millions ($40m). It shouldn't seem too far-fetched to say that Intel's interest lay mostly in Real3D's extensive intellectual property and patent portfolio relating to real-time computer image generation and 3D graphics, rather than in actually working with them on graphics.
Anyway, in 1999 Lockheed Martin regrouped itself and closed Real3D. Intel purchased the company's intellectual property, as part of a series of ongoing lawsuits, but laid off the remaining skeleton staff. Some staff were picked up as contractors within Intel, while a majority were hired by ATI and moved to a new office.
They really didn't invent that much themselves, but bought their way into the market early on, to no great avail – only to sit on a bunch of rather basic graphics IP (basic ≠ fundamental!). … and now you know why their iGPUs never really made their way out of the early stages of development.
It's all about power density, and thus about the patented graphics IP behind it. You have to have the magic.
You either have it (like AMD, nVidia, PowerVR and Qualcomm) or you don't (like Intel).
𝒕𝒍;𝒅𝒓: Intel can't make any (decent) graphics and they never could, for a particular reason.
UltraWide - Wednesday, November 20, 2019 - link
Very interesting read, I enjoyed it more than the actual article. Thank you.
wishgranter - Wednesday, November 20, 2019 - link
Woow, I've been waiting for a long explanation like this the whole time... THANX for summarizing the problem for us.
Arsenica - Wednesday, November 20, 2019 - link
Patents expire; by 2021, any patent filed before 2001 will have expired (so basic patents such as US7,548,238 will be public domain).
And besides that, Intel has hardly been sitting idle regarding graphics IP. For example, of all post-2001 granted US patents in classification G06T15/80, 9.2% belong to Nvidia and 8.9% belong to Intel. So effectively both companies have comparable IP portfolios, which kind of invalidates your point, doesn't it?
mdriftmeyer - Saturday, November 23, 2019 - link
Patents get renewed - or did you think there's some absurd reason they never seem to disappear but just get amended?
Irata - Thursday, November 21, 2019 - link
Nice and very long post. Thanks!
I think the "Let's go for it and drag it out in court if necessary" approach is probably what Intel will go for. It is no coincidence they hired employees with technical knowledge away from their competition.
And sadly, I also agree on the AMD being naive part (generally speaking).
JayNor - Thursday, November 21, 2019 - link
Raja is apparently claiming a big increase in performance from Xe, although it isn't clear if that claim is with respect to HPC, AI and data analysis.
Tiger Lake's integrated graphics is the first Xe-architecture GPU, so we won't have to wait for the HPC Xe to get more info on the Xe architecture.
"So we preserved lots of the good elements of Gen, but we had to get an order of magnitude increase in performance."
peevee - Friday, November 22, 2019 - link
"The market's graphics are already patented from top to bottom for every fundamental efficient principle of drawing/rendering graphics since ages"Exactly, ages. You know that patents expire, right?
mdriftmeyer - Saturday, November 23, 2019 - link
Do some research: patents get augmented/modified and tied to new revisions. They continue indefinitely when possible. Graphics IP and CPU IP don't just expire.
mdriftmeyer - Saturday, November 23, 2019 - link
An actual comment worthy of article status which drastically overshadows the fluff pieces Ian seems to have for Raja--the master bs artist extraordinaire.
JayNor - Saturday, December 7, 2019 - link
I thought AMD manufactured the gpu chiplets for the Kaby Lake G, and Intel just connected them up via a local pcie interconnect. How does Intel swipe any IP from that relationship?
https://spectrum.ieee.org/tech-talk/semiconductors...
Smartcom5 - Wednesday, November 20, 2019 - link
Intel doesn't have fabrication issues, but first and foremost design-talent issues. Get that straight into your head! All the flaws of the past decade and more are the result of talent issues, heavily compounded by a rogue-flavoured carelessness.
The claim that Intel's major flaws are solely down to its products' market share and prevalence is just what it always was: a flimsy excuse that aims to negligently downplay the severity of the next security flaw.
For instance, their Hyper-Threading was found to be effectively pretty close to broken, and rather inefficient, from the very beginning of its implementation back then. That was also pretty clear and well known (not only) to Intel. They just didn't care about it, as the cash swept in.
Also, as has been pointed out countless times: a) Intel was very well aware of the issues and flaws their implementations might bring at any point in the future, and b) independent, third-party security researchers warned them about it fairly shortly after Intel implemented them. Intel ignored them deliberately! They gave NIL fucks.
𝑱𝒖𝒔𝒕 𝒇𝒐𝒓 𝒖𝒏𝒅𝒆𝒓𝒔𝒕𝒂𝒏𝒅𝒊𝒏𝒈 …
E.g. the specific security flaw Meltdown is not new, not even a tad. Anyone who claims the contrary – in defiance of glaring sources stating and proving the exact opposite – either (hopefully) doesn't know any better, or deliberately and wilfully suppresses these facts.
The notion that everyone was suddenly surprised by the danger of such risks and was hit completely unprepared doesn't correspond to the facts one bit, not even slightly. The whole topic, its theoretical foundations and so forth, has been hotly debated for years within the security industry and among processor experts.
Heck, the very basics of timing-based and thus side-channel attacks were worked out back in 1992 and have been repeatedly explained by security experts ever since. Just because such methods and attack vectors – known for many years – were only used 'publicly' in '17 doesn't mean they weren't used under the radar for many years before that.
… and yes, the particular way Intel handled its caches was not only known, but a frequently discussed crux and a central subject of security research. This means that, as a collective, the chip-engineering industry was very well aware of these – at least theoretically – highly security-critical exploits, and this was raised with Intel some time ago, more than once.
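Purely to illustrate the primitive in question – a minimal sketch of a cache-timing measurement, not any specific exploit; the buffer, the helper name and the x86-only intrinsics are all illustrative assumptions of mine – the whole side channel boils down to a cached load being measurably faster than one served from DRAM:

/* Minimal cache-timing sketch (x86 only; e.g. gcc -O2 timing.c). Illustrative only. */
#include <stdio.h>
#include <stdint.h>
#include <x86intrin.h>          /* __rdtscp, _mm_clflush */

static uint64_t time_read(volatile uint8_t *p) {
    unsigned aux;
    uint64_t start = __rdtscp(&aux);   /* timestamp before the access */
    (void)*p;                          /* the memory access being timed */
    return __rdtscp(&aux) - start;     /* elapsed cycles */
}

int main(void) {
    static uint8_t probe[64];

    (void)probe[0];                          /* touch the line: now cached */
    uint64_t hot = time_read(&probe[0]);     /* fast: cache hit */

    _mm_clflush(&probe[0]);                  /* evict the line */
    uint64_t cold = time_read(&probe[0]);    /* slow: must come from DRAM */

    /* The measurable gap between 'hot' and 'cold' is the side channel:
       it reveals whether a given line was touched beforehand. */
    printf("cached: %llu cycles, flushed: %llu cycles\n",
           (unsigned long long)hot, (unsigned long long)cold);
    return 0;
}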
𝐾𝑒𝑦𝑤𝑜𝑟𝑑 ‚𝑹𝒊𝒔𝒌 𝒎𝒂𝒏𝒂𝒈𝒆𝒎𝒆𝒏𝒕‘
... and yes, Intel always considered these attack scenarios too insignificant, and the resulting speed advantages too great, to give them up in favour of increased security. If I recall correctly, the topic is almost as old as Intel's implementation in those very processors. At least since '06, the way Intel addresses and manages its caches has been considered se-ri-ous-ly critical. Intel knew that and ignored it.
𝑩𝒍𝒂𝒄𝒌 𝑯𝒂𝒕 𝑩𝒓𝒊𝒆𝒇𝒊𝒏𝒈𝒔
By '16 at the very latest, the issues that eventually resulted in Meltdown (or at least parts of it) were brought up again and made public: they were a major agenda item, openly discussed in great detail, at the well-known Black Hat '16 on the 3rd and 4th of August that year – while the very same subject was at least broached at the same security conference in '14. Wasn't it already known even before that?
Reading:
Joseph Sharkey, Ph.D. – Siege Technologies: „Breaking Hardware-Enforced Security with Hypervisors“
Yeongjin Jang et al. – „Breaking Kernel Address Space Layout Randomization with Intel TSX“
John Harrison - Formal Verification at Intel – Katholieke Universiteit Nijmegen, 21 June 2002
John Harrison - Formal Methods at Intel: An Overview – Second NASA Formal Methods Symposium, Washington DC, 14 April 2010
𝒕𝒍;𝒅𝒓: Intel (and some prime employees) knew, at least from 2002 onwards, about the potential risk.
They gave no fucks.
Rudde - Thursday, November 21, 2019 - link
I read the 2002/2003 source, but I didn't find anything about side channels, hyperthreading or caches. What did I miss? What were you trying to prove?
DigitalFreak - Thursday, November 21, 2019 - link
Somebody's figured out how to do italics.
Rudde - Thursday, November 21, 2019 - link
The italic letters show up as boxes for me (on mobile).
GreenReaper - Friday, November 22, 2019 - link
There are lots of Unicode options out there. For example, here's Mathematical Italic Small T: 𝑡
Of course, you may need to have a font that implements it on your device as well.
Just because a codepoint has existed since March 2001 doesn't mean it's been used.
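For anyone curious, that character is the single codepoint U+1D461, which sits outside the Basic Multilingual Plane; a tiny sketch (assuming a C compiler and a UTF-8 terminal, both my assumptions) that just emits it:

#include <stdio.h>

int main(void) {
    /* U+1D461 MATHEMATICAL ITALIC SMALL T, written out as UTF-8.
       Whether it renders depends on the installed fonts, exactly as noted above. */
    printf("\U0001D461\n");
    return 0;
}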
mdriftmeyer - Saturday, November 23, 2019 - link
Colleagues of mine would beg to differ with your comments on fab issues. They know they have issues.
WaltC - Wednesday, November 20, 2019 - link
It takes thousands of engineers and other employees to bring new mass-market computing products like GPUs to fruition, and I've always found it amusing that when Intel grabs a couple of AMD's people that the journalist tech-world immediately sees something huge in the deal (or when AMD grabs high-profile employees from other companies...;)) It got ridiculous of late when Intel hired some very nice employee from AMD's PR department...;) Indeed, it got the "Intel grabs another one from AMD" treatment. It was pretty funny, I thought.
In this case, however, my thoughts about Raja's departure have been along the lines that he simply had no more to give to AMD--that he'd reached his zenith and knew that the upcoming expectations were likely to be a bit beyond his reach. More or less, I just think he'd "shot his load" so to speak in advanced GPU tech @ AMD and preferred the notion of getting in with Intel on a basic, new product line not nearly so challenging as where AMD was headed.
Intel's earlier foray into 3d-compatible discrete GPUs--the horrid i7xx series that looked to AGP texturing for performance that was never there--while both nVidia and 3dfx at the time were using the almost indescribably faster local bus ram from which to texture (it's still being used today, of course!) So Intel got into that market by buying out "Real 3d" (I think it was) and then Intel very quickly abandoned that market as it was obvious their products were not competitive--not even a little bit. I believe I bought (and returned) at least three of them, myself.
Talking about hatred and so on--my thoughts about Intel generating bad vibes had to do mostly with how it behaved itself as a company 20-25 years ago--Intel worked very hard to monopolize the markets and to crush every would-be competitor with highly unethical market behavior. Indeed, the *only* CPU company to actually succeed in dodging the Intel corporate axe--was AMD--and the company AMD merged with early on--can't recall the name. But anyway--that explains Intel's bad reputation in certain quarters, I think.
And last here--yeeeeman--try and come up with something different from a comparison with AMD's Zen security holes that *haven't yet been found* and the plethora of Intel's vulnerabilities that *have been found*--indeed it seems like more are found every week! It's just not a convincing argument...;)
Irata - Thursday, November 21, 2019 - link
I almost got a PC with an i7xx GPU in it back when they were released. Ads had benchmark results that showed them far ahead of competing products in the same price range. Thankfully, I did some more checking with friends and on the web (not as easy as it is now).
I honestly cannot say what I ended up with, but thankfully it wasn't an i7xx.
WaltC - Sunday, November 24, 2019 - link
Yes! In those days my thing was buying through a local Best Buy because of its 14-day return policy on hardware! That's why I kept buying them--but wound up keeping none of them. I think I must have come close to buying at least one of every 3d accelerator made in those days--at least that looked interesting enough to try and which I could find on a BB shelf.....;)
None of the i7xx GPUs were competitive with what 3dfx and nVidia were selling in those days--especially 3dfx. IIRC, the most local ram any of them had was 8 MBs...;) As soon as the GPU needed to reach out through the AGP bus to access the dog-slow system ram for texturing--bam--performance just died--slide-show city, instantly. And that was all Intel wrote on discrete 3d accelerators, and has remained in the iGPU markets ever since.
With the ~25-year lead both nVidia and AMD have on Intel in the discrete 3d API GPU product space, I'd say it will be many years from now before Intel can bring something competitive to market--maybe, if they don't drop out of that market again, a la the i7xx debacle. CPUs are enormously different from discrete GPUs, the markets, the manufacturing, and all connected, of course, and I'm not sure that Intel has the drive of ATi/AMD or nVidia in their discrete GPU market. I only know that if Intel once again tries to tie its discrete GPUs to its chipsets and its DDR/5 system--thinking that PCIe5 will be "fast enough"--then they'll fail with it, just like they did when they made their i7xx GPUs *dependent* on system ram and system-wide buses.
Smell This - Sunday, November 24, 2019 - link
If I recall, the 'guts' of the discrete i740 became the basis of the original integrated graphics chipset – the Intel i810 (which quickly became the i815 chipset, for Socket 370 and 'old' Slot-1 processors) ...
willis936 - Thursday, November 21, 2019 - link
No one is upset about the discovery of security issues. They are upset that even when given overly generous timeframes to patch the security flaws, they still ask for egregious extensions. The universities should adopt the Project Zero model and give every exploit 90 days, full stop, no exceptions.
Smartcom5 - Wednesday, November 20, 2019 - link
It seems they are largely playing them, yes. All that fluffy, wishy-washy and insubstantial talk about great ideas, architectures and visions and stuff doesn't really impress, like, at all.
It's telling that Intel is still biting its lip about anything technical while feeding the masses slideshows about imaginary products yet to come, and yet he's talking about reeling in the excitement?! What kind of bad joke is that, when Intel itself is the one continuously hyping Xe with nothing substantial but community renderings? What's also telling is that they rehired most of the Larrabee team for Xe!
The fact that they're talking so darn much about that ‘exascale for everyone’ vision and architecture and whatnot, together with those bits on different modes between SIMT and SIMD and how it gives their 'software guys lots of tools to do more', how they're trying to ‘Leverage, Optimize, Scale’ only their 'existing integrated software stack and integrated graphics IP', trying to 'optimize their existing IP', and how they see it's 'needed to scale over 1000x' – without doing ANYTHING new in terms of graphics IP – tells me exactly one thing and one thing only:
Xe will be Larrabee 3.0 – pardon me, Xeon Phi 2.0 – with Intel manically hoping that the third time's the charm, while trying to fix the massive flaws, and the reasons both of its predecessors failed spectacularly, on the software side of things.
… and if I may say so:
Combine that with the fact that, once again (!), an Intel graphics product is being hyped publicly on virtually nothing but marketing, and it doesn't paint a very bright picture – especially if you consider that both of their previous (failed) attempts at graphics were accompanied by massive marketing of alleged performance too, which evidently collapsed into thin air the moment people finally got hold of the products after release. It paints a very grim picture for Xe, to say the least – it was exactly the same the last two times, with Larrabee and the infamous Xeon Phi disasters. The i740 comes to mind too.
I want them to have a solid graphics product too (also to bring prices down through competition), though it really does not look like they'll come up with anything meaningful in the near future yet.
But with that …?
Good luck, Intel. 𝑌𝑜𝑢'𝑙𝑙 𝑛𝑒𝑒𝑑 𝑖𝑡!
Cygni - Wednesday, November 20, 2019 - link
Ah yes, I remember Ian's great interview a few months ago with famous Intel shill... Lisa Su
uefi - Wednesday, November 20, 2019 - link
A more obvious sign is the near-zero coverage of Intel-related security vulnerabilities from Anandtech. But to be fair, other tech sites made little to no mention of it as well.
AshlayW - Thursday, November 21, 2019 - link
Huge AMD fan(boy) here, but I had to comment because I disagree with your comment, and it gives us a bad name. As Ian mentioned below, AT covers pretty much every company's major events, and yeah, Intel's been pretty damn busy lately with all the Xe stuff. AT doesn't come across as a marketing mouthpiece for Intel, or any company. Some of the editors probably have favourites or whatever, but we're all human, that's fine. It doesn't strike me as bias.
Hifihedgehog - Thursday, November 21, 2019 - link
HAHAHAHAHA....
.
Tin foil hat much?
Dragonstongue - Wednesday, November 20, 2019 - link
I wonder if they will do as they historically have done (and been sued for), that is, release as they see fit even if they KNOW full well it runs right against the laws in many countries, which is "OK" because they can still sell a boatload of them elsewhere even if very much patent-infringing, i.e. hedging losses against wins. Hopefully they do not do such things, though I am sure it would not at all surprise many "in the know"; everyone else will end up thinking "Intel, have to have it", not knowing much else.
IMHO based on many years of seeing/hearing/researching such stuff, they might/might not have already decided "fudge everyone, we need sales while we can get them"
---------------
Will be interesting either way; that One thingy referenced will likely be extremely interesting, as the marketplace (globally) is super entrenched in DX / Vulkan and the like. They may have the $$$$$ and the x86 patents and such, though one cannot bully their way over EVERYONE'S yard constantly and get away "scot-free" (at least they bloody well should not).
-- thanks for the read Ian
(^.^)
imaheadcase - Wednesday, November 20, 2019 - link
That is the thing about laws: they are easily interpreted however you like. Remember the rule: you can sue anyone for anything; whether you win or not is a different story.
yeeeeman - Wednesday, November 20, 2019 - link
Cross-licensing, dude. AMD didn't invent the GPU, and neither did Nvidia. Apple copy-pasted Imagination Tech's GPU and got away with it. So don't worry, Intel is not the first company to use some common IP.
mdriftmeyer - Saturday, November 23, 2019 - link
Apple has a massive pool of GPU IP from before Imagination Tech, or do you think Imagination Technologies was just too stupid to sue?
Smartcom5 - Wednesday, November 20, 2019 - link
Given their shady behaviour in the past, we shouldn't be at all surprised if their adventure with Kaby Lake-G, incorporating AMD's graphics IP, first and foremost served as a test balloon for getting their hands on viable patented IP (to be reverse-engineered afterwards) – or even solely served that very purpose against AMD.
Adonisds - Wednesday, November 20, 2019 - link
Raja "Great Question" KoduriGreat interview, btw
eva02langley - Wednesday, November 20, 2019 - link
"Great question Ian... now just take a cookie so you can see why Intel is so wonderful!"eva02langley - Wednesday, November 20, 2019 - link
What I'm feeling about it is that Raja has his butt on the line there. He is doing 12-hour shifts and is under pressure to deliver, probably because Intel made clear what it expects of him. And on a side note, this pretty much confirms that MCM is not for gaming yet; it is for compute and server.
1idd0kun - Wednesday, November 20, 2019 - link
"And on a side note, pretty much confirm that MCM is not for gaming yet"Of course it isn't. No one has solved that problem yet, and I very much doubt Koduri of all people will be the one to do it.
Rudde - Thursday, November 21, 2019 - link
The biggest problem with gaming on MCM is software. While Koduri might not solve it, Intel does have enough manpower to solve it. Nvidia is another candidate, and they have a lot more interaction with game developers.
JayNor - Wednesday, November 20, 2019 - link
One thing I'd like to know is whether they borrowed any AI processing features from NNP-I that would make the GPU implementation equally efficient. Another question relates to the recent announcement about porting Facebook's Glow compiler to the NNP chips, and how that effort fits into their oneAPI. Perhaps that was just done to get their foot in the door at FB?
HStewart - Wednesday, November 20, 2019 - link
Ian, this interview cleared up a couple of things. I thought that when people talked about the failed Larrabee, they meant a failed PC graphics card competing with NVidia and the like, but it actually ended up as Xeon Phi, aimed at a higher-end specialized market rather than being a direct competitor to mainstream graphics cards. I also saw Phi as cool technology, but out of touch for such a device.
An important clarification is about Gen graphics and Xe: it seems logical that Gen will be replaced, but I would think it's even possible that Gen 11 borrows some of the stuff coming from Xe. I think Xe was initially referred to as Gen 12.
I'd be interested in how far up the performance scale they plan to go in the mobile laptop market. Of course integrated will be there, but I would think they'll have products out there even at the gaming level. Packaging like EMIB would be perfect for this, with dedicated memory and such.
Guspaz - Thursday, November 21, 2019 - link
Larrabee *was* supposed to be a consumer GPU on an add-in card. Their marketing people were even claiming that it would be competitive with the then highest-end GeForce products. In reality, the performance wasn’t remotely close to competitive (competing GPUs were multiple times faster than the hardware Intel demo’d), so the GPU plans were canned and it was repurposed and transformed into Xeon Phi.
HStewart - Thursday, November 21, 2019 - link
I kept up with video cards for a long time - I even have an old Voodoo 5 and such, and many GeForce cards - but Larrabee seems to have escaped me. Back when it came out I was solidly on NVidia cards, but in the last decade I went mobile and, honestly, GPU cards became less important to me. In the late 80's and 90's, external GPUs were more important, but now even integrated GPUs have as much power as the external GPUs of that era.
Qasar - Thursday, November 21, 2019 - link
Which one were you able to get? The 5000, 5500, or the 6000?
Gasaraki88 - Wednesday, November 20, 2019 - link
Ian, stop asking all those great questions.
Sychonut - Wednesday, November 20, 2019 - link
Yo, what up doc? It's a pity Intel's GPU will not be fabricated on the 14+++++ node. Had high hopes for that.
wr3zzz - Wednesday, November 20, 2019 - link
Boy, does this guy have the managerial speak down; no wonder he has been able to land plum jobs everywhere, yet nowhere he went considers him indispensable. The interview is full of self-promotion mixed with lots of buzzwords and some flattery, but reveals very little substance.
Sychonut - Wednesday, November 20, 2019 - link
Who do you think gets promoted to the top of the ladder? Only a two-faced bullshit artist.
iranterres - Wednesday, November 20, 2019 - link
You are giving Intel too much publicity, fellas. It's starting to get awkward. Perhaps change the site's name to anandintelchs? Come on. Regardless of whether you're doing your job OR NOT, be less partial.
alufan - Friday, November 22, 2019 - link
Hmm, 11 articles on the front page featuring Intel? You won't give a stuff, but I guess it's time for me to look elsewhere for actual "news".
Slash3 - Wednesday, November 20, 2019 - link
Appreciate the interview as always, Ian. Looking forward to the Papermaster writeup.
CityBlue - Friday, November 22, 2019 - link
The Papermaster interview probably won't happen, in much the same way as the Intel security/performance article we were promised months ago has failed to materialise - are you still waiting for Intel to answer your questions, LOL? Dr Intel could have written a second security/vulnerability/performance article by now on JCC, with a third on TSA, and no doubt more down the pike. Intel security issues are the gift that keeps on giving if you are a tech writer with a focus on CPU design, yet we get nothing - zero coverage - from AnandTech. Why is that? Is it some sort of Intel "quid pro quo"? Is losing your integrity and impartiality really the price to pay for tours of a fab and a few interviews?
CityBlue - Saturday, November 23, 2019 - link
For TSA I meant TAA ("ZombieLoad TAA"). Next up is "iTLB Multihit". Every single Intel vulnerability means ever-increasing performance penalties to work around the design flaws, sometimes up to a 40% reduction in performance. And yet not a peep from AnandTech on the subject. I do hope AnandTech benchmarks Intel vs AMD only with full mitigations these days - any results without the latest mitigations (in particular for Intel) are no longer worth comparing. Why don't you have a conversation about this increasingly negative performance issue? I'm sure your readers would be very interested.
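For anyone wanting to sanity-check what a given test system is actually running, here is a minimal sketch (my own illustration, not something from the article, and assuming a Linux kernel new enough to expose the standard sysfs vulnerabilities directory, roughly 4.15 and later) that prints the mitigation status the kernel reports for each known flaw:

    # Minimal sketch: list the CPU vulnerability/mitigation status Linux reports.
    # Assumes /sys/devices/system/cpu/vulnerabilities exists (kernel ~4.15+);
    # the exact set of files varies by kernel version and CPU.
    from pathlib import Path

    VULN_DIR = Path("/sys/devices/system/cpu/vulnerabilities")

    for entry in sorted(VULN_DIR.iterdir()):
        # Each file holds a one-line status such as "Mitigation: ..." or "Vulnerable".
        print(f"{entry.name}: {entry.read_text().strip()}")

Quoting that output next to any Intel vs AMD numbers would at least make clear which mitigations were in play.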
29a - Thursday, November 21, 2019 - link
I can't believe you all still think Intel will release a dGPU.
ballsystemlord - Sunday, November 24, 2019 - link
@Ian You didn't ask him if we'd get consumer-level compute GPUs. Do you know which way this will go?
Thanks!