I've got to give AMD credit here. They've been struggling in the desktop CPU space (Intel), aren't competitive in the mobile CPU space (Intel), didn't even show up to the tablet/smartphone space (Samsung, Qualcomm, NVIDIA, Intel), in addition to losing a number of design wins against NVIDIA for GPU superiority. I mean, if their new flagship GPU can't consistently beat Titan (rumored)... ouch. Clearly, being the nice guy on the block doesn't pay off anymore: http://www.ngohq.com/news/16519-amd-senior-manager...
Uniting their multi-platform GCN architecture under a single API is a good move. I have a mix of NVIDIA and AMD GPUs under my roof (mostly NVIDIA), so I'm not completely excited about it if it means Mantle-enabled games don't play well on my non-GCN hardware. However, I assume the measurable, real-world performance will be a lot more like TWIMTBP and less like Glide for the first couple years.
"didn't even show up to the tablet/smartphone space (Samsung, Qualcomm, NVIDIA, Intel),"
Note: Adreno GPU technology in Qualcomm SoC's originally came from ATi. They sold the division to Qualcomm. IOW, some of the best mobile GPU tech originated with ATi/AMD. :)
DX and OpenGL are too high level to expose some of the features that Mantle likely exposes, but that doesn't mean they cannot benefit from some of them. Someone below mentioned that Mantle is probably just a "scene graph". Very unlikely. More likely, the enhanced draw calls they're referring to are the same thing Nvidia presented a few years ago via OpenGL extensions: bindless graphics. With enough work, the API can then also start treating GPU memory as pointers instead of data bound to a set of registers, which is where OpenGL and DX mostly break down; they cannot do this today. Direct memory access can have huge benefits.
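To illustrate the binding-model difference, here's a toy Python simulation (no real graphics API calls; every class and resource name is invented): in a slot-bound model, a draw pays a bind call whenever the active resource changes, while a bindless model just passes a handle with each draw.

```python
# Toy contrast between slot-bound resources and bindless handles.
# All names are illustrative; this is not real OpenGL or Mantle code.

class SlotBoundGPU:
    """Resources must be bound to a small set of numbered slots
    before each draw, so changing textures costs an API call."""
    def __init__(self):
        self.slots = {}
        self.bind_calls = 0

    def draw(self, texture):
        if self.slots.get(0) != texture:   # rebind only if the slot changed
            self.slots[0] = texture
            self.bind_calls += 1

class BindlessGPU:
    """The shader is handed a 64-bit handle (effectively a pointer),
    so no per-draw bind traffic is needed."""
    def __init__(self):
        self.bind_calls = 0

    def draw(self, texture_handle):
        pass  # the handle travels with the draw; no binding step

# 400 draws that alternate textures every time (worst case for slots)
textures = ["brick", "grass", "brick", "grass"] * 100

bound = SlotBoundGPU()
for t in textures:
    bound.draw(t)

bindless = BindlessGPU()
for t in textures:
    bindless.draw(t)

print(bound.bind_calls, bindless.bind_calls)   # 400 vs 0
```

The point is only that per-draw binding traffic scales with how often state changes, while handle-based access removes that traffic entirely; real bindless extensions obviously involve far more than this.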
Ok, thanks. Another question: doesn't Mantle kill off game consoles? Given the same hardware, the console will always be faster, because there is less bloat on the console: you program directly to the metal. With Mantle, you program directly to the metal on the PS4 and Xbone. But when Mantle is ported to PCs, you also program directly to the metal on the PC, which means the consoles will have no advantage anymore. Mantle will equalize PCs and consoles with respect to bloat, so consoles will fall even further behind. And according to rumors, both Sony and MS have said this console generation will be the last.
An AMD rep recently said in an interview, "our new graphics card will ridicule NVIDIA Titan in BF4 when running Mantle, it will be much faster". As the new AMD graphics card will target the GTX 780 but run faster than a Titan with Mantle, AMD will give very good value for the money. This is a very interesting move by AMD. As Mantle is open, anyone can join.
Consoles will continue to take advantage of a handful of specific hardware optimizations that software cannot replicate. Other things, like having a unified pool of system and video memory, are incredibly valuable. Today, video memory is special, and data has to be paged in and out of a system-memory-backed region by the OS. In most simple cases this is not an issue, as there is a one-time cost to upload the data and make it resident on the GPU, but it makes it incredibly expensive to read results back from the GPU, which is one of the reasons compute work on the GPU is still pretty difficult today. Games that use compute typically have to either ensure that the results of the calculations are kept on the GPU in a format usable by other stages of their processing, or that the bulk of the work is done in one pass and read back in a single operation.
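A back-of-the-envelope sketch of why batched readback matters. All the numbers here are invented purely for illustration; the shape of the model (a fixed per-transfer latency on top of a bandwidth cost) is the point, not the values.

```python
# Rough cost model for GPU readbacks. Numbers are hypothetical:
# each transfer pays a fixed latency plus a bandwidth cost, so
# reading many small compute results back one at a time costs far
# more than a single bulk readback.

TRANSFER_LATENCY_US = 100   # hypothetical fixed per-transfer cost
BYTES_PER_US = 8            # hypothetical transfer bandwidth

def readback_cost_us(transfers, bytes_each):
    return transfers * (TRANSFER_LATENCY_US + bytes_each / BYTES_PER_US)

results = 1000                                  # 1000 results, 64 bytes each
one_by_one = readback_cost_us(results, 64)      # a readback per result
bulk = readback_cost_us(1, results * 64)        # one batched readback

print(one_by_one, bulk)   # 108000.0 vs 8100.0
```

With these made-up numbers the batched path is over 13x cheaper, which is why the comment above says compute-heavy games try to read results back in a single operation.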
For the most part, though, this is where Mantle is headed: low-level access right down to the hardware. It will make it far more difficult for the average developer to use. Consider the difficulty some people already have today with DX and OpenGL, and realize that on Windows the DX kernel does tons and tons of work for you behind the scenes: complicated operations that even some experienced graphics developers don't know exist.
Wait. I think the question is whether DX and OpenGL drivers could be written to use Mantle rather than direct hardware access. The answer is yes. Mantle is providing a virtual GPU that is a direct 1:1 mapping to hardware functions. It might not be quite as fast, but it should be pretty close, because it wouldn't need to make any more mode switches than it would for direct hardware access. The benefit would be in driver development. Once a particular OpenGL or D3D version was implemented using Mantle, it would work for any AMD GCN hardware well into the future. It would give AMD a set of stable, optimized D3D and OpenGL drivers that would work with any future GCN devices for quite some time and require far less code maintenance.
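The layering argument can be sketched like this (a toy Python model; every name is hypothetical and nothing here is real Mantle or OpenGL code): only the thin low-level backend would need rewriting per GPU generation, while the GL-style front end translates calls 1:1 and never changes.

```python
# Sketch of the driver-layering idea: a stable "Mantle-like" backend
# with a thin GL-style front end translating calls 1:1. Every name
# here is invented; it only illustrates the maintenance argument.

class MantleLikeBackend:
    """Stands in for the low-level, per-architecture layer. Only this
    class would need rewriting for a new GPU generation."""
    def __init__(self):
        self.commands = []

    def submit(self, command):
        self.commands.append(command)

class GLStyleFrontend:
    """Stands in for a D3D/OpenGL implementation written once on top
    of the low-level layer and reused across GPU generations."""
    def __init__(self, backend):
        self.backend = backend

    def glDrawArrays(self, mode, first, count):
        # one high-level call maps directly onto one backend command
        self.backend.submit(("draw", mode, first, count))

backend = MantleLikeBackend()
gl = GLStyleFrontend(backend)
gl.glDrawArrays("TRIANGLES", 0, 36)
print(backend.commands)   # [('draw', 'TRIANGLES', 0, 36)]
```

Because each front-end call maps to exactly one backend command, the translation adds little overhead, which is the "pretty close but not quite as fast" claim above.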
Because we don't have the inside information it's fully possible that this is how their drivers have always worked, they're just exposing the middle layer now.
The problem with Glide wrappers was mostly that 3dfx hardware was that much better than anything else. Of course, by the time NVIDIA bought 3dfx, they really should have shipped the wrapper (previously banned by 3dfx), since by then anything shipping could outrun a Voodoo. The other big problem was the general kludge of DirectX (1-4) specifically and MS software in general.
Also it depends on the coding and the wrapper. QuakeGL was pretty much a high level wrapper around low-level calls, but that was coded to the wrapper. Using a dumb (and any complete wrapper has to act dumb to be correct in general use) wrapper with software that was written with low level hardware in mind is a bad idea.
http://msdn.microsoft.com/en-us/library/windows/ha... As documented by Microsoft. The reason for the split is that there is a barrier between user mode and kernel mode, and crossing it is often the most expensive part. (And that's one thing AMD won't be able to avoid.)
No. They are almost certainly completely different APIs that do different things. On the other hand, if you wanted to write your own OpenGL (or Wayland) drivers using Mantle, it should certainly be far easier (but the results will be slower than what AMD could do by writing directly to the hardware) than writing them for nvidia (without any API or other information). Linux users are salivating at the thought.
On second thought, I suspect that AMD will do just that (write directX and OpenGL in Mantle). Hopefully it will make debugging a little easier. Just make sure that you look at both Mantle and non-Mantle benchmarks when comparing cards (because directX will suffer slightly if they do this).
Writing OpenGL on top of Mantle will not give a performance advantage; that makes no sense, otherwise OpenGL would just have been implemented that way to begin with. OpenGL and DX drivers must obey a specific set of interfacing rules. DX, for example, has separate user-mode and kernel-mode drivers that must interact in a specific way. Bypassing these is not an option. Same goes for OpenGL, which is a standardized interface on top of the OS. The best you could do is expose a few extensions that call into Mantle.
There is nothing really preventing nvidia from doing the same, although you can already customize the pipeline significantly since it is pretty much entirely programmable, even if you lack some features implemented by the driver you can still write your own code and get hardware accelerated custom render features.
I suspect Mantle is nothing more than a "scenegraph" with implemented out of the box optimizations for the drawing calls to minimize the need for context switching and data transfer plus a bunch of pre-compiled ready-to-use compute kernels.
This alone is more than enough to make up the numbers AMD claims: the difference between drawing arbitrary stuff in sequential order and drawing selectively, in a special order, employing instancing and offloading everything possible to compute shaders will easily yield 9x, if not significantly more, rendering throughput. The problem is that if you want efficient drawing, you have to implement that yourself in your engine, while AMD offers to have it done for you, but only if you use an API that works exclusively with their products. I don't think AMD even went for a hardware implementation of the scene graph, which would have yielded more improvement, so most likely it is just a bit of code that could just as well be hooked to OpenGL and work for code that runs on every platform.

So while a good effort, Mantle comes with a less-than-optimally-positive intent. It does far more good for AMD than for developers, and even does some harm, because besides competition it is also fragmentation, forcing extra work on developers. What is good for people are open and portable standards, so if AMD really wanted to do good for users, it would simply make a good graphics API with optimizations on top of OpenGL that still markets AMD but works with non-AMD products too. That would give OpenGL the healthy boost it needs, since it is capable, low level enough, and perfectly extendable with hardware-accelerated features, and unlike Direct3D it works everywhere, not only on M$$$ platforms.

It was a heck of a b1tch move from M$ to disable OpenGL in Windows Phone, even though all the chips come with hardware support for it, once again creating more work for developers just for the sake of a platform-vendor-limited API, which is BAD if it doesn't conform to an open standard.
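For what it's worth, the "drawing selectively, in special order" optimization described above can be demonstrated with a toy simulation (names and numbers invented; real engines sort on full render-state keys, not strings): grouping draws by state key cuts the number of expensive state switches.

```python
# Toy illustration of sorting draws by state/material key to reduce
# state switches. A state change between draws is the expensive part
# in this model; grouping draws by state minimizes how often it occurs.

draws = [("metal", 1), ("wood", 2), ("metal", 3),
         ("wood", 4), ("metal", 5), ("wood", 6)]

def count_state_switches(sequence):
    switches, last = 0, None
    for state, _ in sequence:
        if state != last:      # only pay when the bound state changes
            switches += 1
            last = state
    return switches

naive = count_state_switches(draws)            # submit in arrival order
batched = count_state_switches(sorted(draws))  # grouped by state key

print(naive, batched)   # 6 vs 2
```

Six draws submitted naively pay six state switches here; sorted, they pay two, one per distinct state. That kind of reordering is exactly the sort of thing an engine (or a driver layer) can do purely in software.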
'There is nothing really preventing nvidia from doing the same'
Except the fact that nvidia are in none of the new consoles, while AMD hardware is powering the PS4 and Xbox One. Both of which will have the ability to use AMD's Mantle with game developers already making use of it when creating games for the consoles. Making it a no brainer to use it for carrying the game over to PC with the Mantle API available there as well.
Nvidia already has a close-to-metal API; it's called NVAPI and they've had it for years. It literally spits out basic assembly code (ADD, MUL, MAD, MOV) to run on their GPUs. If they really wanted to do something like Mantle, I'm sure they could dress up CUDA to be the front end instead of DX, but I'm not sure they'd want to, and I'm really not sure the PC market wants that (two vendor-specific gaming APIs?).
Yes but it is not nvidia or AMD coding the games, it's the game developers. They'll be using Mantle already for developing console games, so using the implementation for the PC port is already work that is done.
There is zero reason to do additional work just for an nvidia low level API on PC only that is additional work and expense.
Nvidia is dead in the water with any attempt at their own version of Mantle.
Sorry, I doubt that is the case. The developer will need to compile a new runtime that uses this new API, but AMD will most likely still need to provide some high-level compiler or tools that take OpenGL, C, or whatever high-level language they are using for source code and convert it to AMD's close-to-metal Mantle machine code.
There is no way Devs are going to take on the additional burden of hand coding machine code for Mantle, that's a fact.
The idea is that instead of writing something in OpenGL, developers write it in Mantle, which is harder but gets better performance. They're already doing this for the consoles, where they need to eke out every last drop of performance, and AMD is already making developer tools to support this there. AMD is now giving them a way to take these more hand-tuned routines and port them to PC, bypassing OpenGL/DX. There's certainly some work involved, but it's not all new code: it's letting devs reuse high-quality code they've already written on a new platform, where before they were limited to DX and a performance hit.
For a PC-only title, it's a much harder sell, granted, since there still needs to be a DX or OpenGL codepath, but for a console port it's at least an interesting option.
Not sure where you all get the idea of developers writing code in Mantle since we already know Devs are writing code in OpenGL (PS4) or DirectX (XB1) or both (PC) as the source code. The only thing Mantle does is allow the source code to be compiled and executed in machine code at runtime, thus stripping out the middle API (DX, OpenGL).
The 4th slide that talks about "simplified porting" touches on this.
If devs actually had to code to a low-level API, you're going to have a lot of devs who need to brush up on their matrix math. Drawing a simple diagonal line in 3D space, for example, goes from one instruction in HLSL to at least 3-4 in machine code. There's simply no way any dev is going to undertake that task; maybe Carmack, but he takes that kind of challenge on for fun more than efficiency.
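To make the instruction-count point concrete: a single HLSL mul(vertex, matrix) intrinsic expands, in scalar terms, to a 4x4 matrix-vector product, i.e. 16 multiplies and a chain of adds. Here is that expansion written out in plain Python (illustrative only; this is not generated shader code):

```python
# Expanded form of what one HLSL mul(float4, float4x4) performs:
# each output component is a 4-term dot product, so one high-level
# intrinsic hides 16 multiply-accumulate operations.

def transform(vertex, matrix):
    out = []
    for col in range(4):
        acc = 0.0
        for row in range(4):
            acc += vertex[row] * matrix[row][col]   # one MAD each
        out.append(acc)
    return out

# identity matrix: transforming by it should return the vertex unchanged
identity = [[1.0 if r == c else 0.0 for c in range(4)] for r in range(4)]
v = [1.0, 2.0, 3.0, 1.0]
print(transform(v, identity))   # [1.0, 2.0, 3.0, 1.0]
```

One line of HLSL versus a double loop of scalar multiply-adds is the gap between what shader authors write and what the hardware actually chews through.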
Except for maybe indie games, almost all games are low level on consoles. OpenGL will not be the common API used on the PS4. The PS3 also sorta supports OpenGL, but nobody uses it. The 360 also sorta supports DirectX, but it lets you go much lower than PC DirectX. Low-level APIs have been and will always be common on consoles.
Sorry, again highly unlikely, the big AAA multi-platform titles are more likely to be written in OGL/DX for the simple reason of portability. They may be compiled to run at low level on the consoles, but they are without a doubt, created in HLSL source code and then ported/compiled to each target platform.
Naughty Dog games aren't a very good example of portability. They've done a lot of low level work on the PS3. They made and use the low level Edge engine. They develop on PCs but they do about as much PS3 specific optimizations as possible. The common multi platform engines like Unreal still have plenty of console specific bits.
Did you not look at the video and pictures? Last of Us is clearly being developed on the PC, written in C++ and OpenGL most likely, just like most every major game, then it is ported to whatever PS3 specific code they use.
I never disputed that they develop on PC, so I'm not sure why you had to repeat it. They have a version that works fine on PC hardware, but they definitely do a lot of porting work to get good performance out of the PS3. They made Edge and were the first to do deferred SPU lighting.
I don't think most devs will do much low-level work on the newer consoles, but Naughty Dog will be one of those who do, and will make low-level PS4-specific tools.
I don't know, I agree with itchy. What about game engine developers? They might take that on, especially since they know they have a couple consoles to work on that will be around a long time.
I mean, even a good old vertex or fragment shader allows explicit access to data and instructions for what to do with it, and it all runs in hardware. Any lower access would mean assembly code. Generic OpenGL allows low-enough-level access to anything the hardware is computationally capable of; in fact, enough to implement an optimized graphics API on top of OpenGL and still get maximal rendering throughput, making Mantle entirely possible as a mere software implementation.
I suspect AMD is marketing it as some low-level magic feature in a hopeless attempt to misguide the competition and send them into doing something crazy, when in reality what AMD is doing is giving a software shoulder to developers who commit to developing against something that only runs on AMD hardware. It is just AMD, having lost its fabs and being even less competitive with Intel, trying to make it up using this apparently new thing to them called "software". Too bad the competition can do that too, so AMD will only have a brief window to exploit that advantage until others begin to "pollute" that niche, until hopefully someone releases something open and portable that doesn't cost $1000 per target device platform.

That is the big problem of OpenGL: no big players pour money into it like MS does into their APIs, which aren't really any better. It is just that companies like Intel, m$ or nvidia are only willing to "lend" you help in the form of a vendor loyalty pledge, but not give any effort to anything that can run everywhere. That is why Direct3D is pretty much dominating gaming: M$ paid quite a lot to make using their API easier and at the same time moved to make OpenGL development harder, at this point even disabled on platforms that come with hardware support for it. So technically, D3D is better directly because of m$ being greedy evil monopolists. And since Linux fails at anything but running the most critical and large-scale supercomputers of the world, most PCs came with Windows, which enabled m$ to erect (also figuratively) the pillar of D3D when another open API was already present and in need of extra development.
If M$ were even a tiny bit decent, they'd have poured all the effort of creating and promoting D3D into OpenGL instead; it would be a much easier API to use today. I don't say better, because it already comes close enough to the hardware, but it could have come earlier if M$ had been capable of allowing developers to breach the barrier that is developing expressly for Micro$oft.
Some people say we should thank companies like MS, Intel, AMD and Nvidia, but in reality it is only those companies that should thank us for our money. Each of them has done it, and the bigger the fist, the more shameless the exploits: helping developers exclusively in ways that egoistically benefit only their own product range, while making it harder, and for a long time almost impossible, for developers to be vendor independent, which is a basic consumer right. The little they do to ease developing exclusively for their products is tiny compared to the damage and delay they have caused the IT revolution, the result of which obviously works in favor of the industry: ridiculously powerful devices infest people's lives not as the powerful tools for creativity and productivity that they are, but as useless toys, good only for milking people out of their money and privacy. This is where pioneers like Apple come in, the inventor of the rounded corner, a company which struggled to make a profit on high-quality HPC workstations but made it to the top selling ego-boosting trinkets at ridiculous profit margins.
So, while some people see AMD doing good here, it is effectively just the usual kind of bad that companies like that do all the time. Hurray for fragmentation and proprietary standards.
At the end you actually spelled MS without $. What happened, did the chip temporarily fall off your shoulder or something? Take a chill pill dude. AMD and Nvidia are neck and neck. All this will do is potentially help developers better optimize for AMD... IF THEY CHOOSE TO DO SO! It will still be CHEAPER and FASTER for PC companies to write as they have been. This only HELPS console developers who don't already port to PC. Man, you have to be a real pessimist to find something that negative about this story.
I decided to leave one in to avoid ambiguity. And this has nothing to do with pessimism; it is more that you see butterflies and candy where in reality lie sneaky corporate extortion tactics. Sorry for having higher standards - that's what you wanted to hear, right? ;)
Yeah, the only thing I could see upon reading this article, is "And yet another proprietary, platform-locked, system." At the time OpenGL and OpenCL seem to make slow progress in their respective domains, here comes another new proprietary standard that might, again, postpone the development and wider adoption of open and multi-platform technologies. I'm not sure anyone should feel rejoiced by reading that.
You have to remember who said it was low margins: Nvidia. Did they see this coming? Probably yes. But they didn't have an x86 CPU. Did AMD know Nvidia didn't have an x86 CPU? Well, yes. Is AMD in a situation where they can do anything other than earn money for new products? No. They don't have any options. It's good profit from consoles. It's good profit from GPUs. It did take some years to get here.
Did you notice that even according to their propaganda slides, this is only going to improve things by about 10%?
I think Microsoft has more to worry about than nvidia. The basic assumption (a huge problem: if it is so good for marketing, then why didn't they come out and say it?) is that you need only code for:
Windows (with Mantle)
Xbone (near-exact Mantle interface)
PS4 (near-exact Mantle interface: hey, coding to the metal is going to look similar with the same metal)
SteamBox (soon-to-arrive Mantle interface, at least if it ships with AMD)
Yes, while this isn't good for nvidia (you get a port, while AMD users get the native code that developers sweated over for the Xbone and PS4 consoles), it certainly takes a lot of MS's control of the desktop & living room away from them.
The flip side of this is that I'm much less sure that anyone other than AAAA-game developers will be using this. This means that anything pre-Mantle and remotely indie will be using the directX drivers. Do you really think that AMD (especially considering its financial situation) will be really sweating the non-mantle drivers like they need to?
This is really fascinating. I am not sure this will work out for AMD. If it does though, they should make up a lot of ground in the market (I would think).
The main limitation they'll face here is that it's apparently coupled closely with GCN, and GPU cores change design much more frequently than consoles do. GCN's already been around for a while, under normal circumstances I'd expect it to be replaced in another year or two, but if successful Mantle would highly limit their ability to do so for the remainder of the decade. This gives nVidia and even Intel plenty of time to push something that needs a major change and torpedo AMD.
They'd likely have to re-implement parts of OpenGL and Direct3D on any sufficiently different new architecture anyway, right? If they implement OpenGL and Direct3D on top of Mantle, it could mean less overall work: they'd only have to implement Mantle for each radically different new architecture.
What the hell, Microsoft? Have you lost so much control of the direction of your platform, that an alternate 3D API like Mantle can even be considered? You brought all the GPU makers in line with DirectX, cementing that as a primary development target and now the situation has deteriorated to this. AMD powering PS4, Xbox One and PCs... give them a common AMD-owned API and THEY become a platform target instead. So much time spent chasing the iPad and Android and all this Metro/Modern nonsense and not defending their desktop foundation.
Five years ago any hope for Steam OS and the idea of Glide 2013 Edition would have sounded insane. Now someone could write games for Mantle that would run faster and could potentially be easily ported over to alternate operating systems like Linux if/when that gets Mantle.
I'm not even a huge fan of Microsoft, but dammit, get up and fight for your standardised platform!
There are many advantages to DirectX, portability being a major one. The thing is that, as stated in the article, DirectX comes with a lot of overhead. You can get more performance out of a lower level API. There is only so much that Microsoft can do to limit overhead while still allowing portability of code.
And when did that ever stop Bill Gates (although I think that he had to decide between retiring and eventually losing to the Justice Department. So he retired with MS intact)?
Err, what do you expect MS to do here? DirectX is a high-level API, and it needs to be in order to work with all the different GPUs for PC. MS couldn't make something to compete with Mantle, as it is a low-level API specifically designed for AMD's GCN architecture.
Neither DX nor OpenGL could ever compete with a low-level graphics API when it comes to performance and getting the most out of a GPU. Even if games use Mantle, they will also HAVE to continue having DX support as well, in order to work on Nvidia and Intel GPUs.
MS already made improvements in Win8 to the desktop, including new DX versions in both 8 and 8.1, with 8.1 having the same DX 11.2 version as the Xbox One. So don't mention that typical uninformed BS about MS focusing just on Metro, because Win8 has more desktop improvements over Win7 than 7 had over Vista.
There have been some improvements to the desktop in Windows 8, but it's clearly being treated as a legacy environment and Microsoft sees the future as being Modern/Metro. That's why we have Surface, why all of their platforms are taking their interface cues from Modern/Metro (X1/WP/W8). That's where the focus is. The traditional PC desktop is no longer seen as a growth market.
"Neither DX or OpenGL could ever compete with a low level graphics API when it comes to performance and getting the most out of a GPU."
Sure, and that's always been the case. So how did interoperability win out before, when performance would have been more important when it was older, weaker hardware?
"Even if games use Mantle, they will also HAVE to continue having DX support as well, in order for them to work on Nvidia and Intel GPU's."
Yeah, so you have Mantle and DX; one of those two ends up being the primary platform it gets developed towards, and it gets ported to the other. If you played Glide and DirectX games, you remember how rubbish the DX versions were. I remember Wing Commander: Prophecy when I sold my Voodoo 1 to get a TNT2, and it looked terrible afterwards. Part of that is how immature DX was, but also the developer is only going to put in so much work to support a less popular platform.
"Err what do you expect MS to do here?"
What I expect is for them to have developers for the Xbox One write to DirectX rather than writing to Mantle just like they have for all their previous consoles. It allows for easier ports to standard Windows and gives DirectX traction over Mantle. They certainly have control over that as part of their certification process, but from the tone of this article it sounds like they're making Mantle a core API on the console itself.
You say developers would "HAVE to continue having DX support", but that DirectX support could get pretty poor. Sure of the total game playing market on PC, AMD and NVIDIA may be 30% of the market each with Intel being the other 40% (thumb suck numbers), and based on that the developer would split their resources between DX and Mantle. But if they're targeting Xbox One, PS4 and PC the non-AMD portion of that starts looking very small and the amount of work they do on the DX path could correspondingly suffer.
"have developers for the Xbox One write to DirectX rather than writing to Mantle"
Any other issues aside, I doubt Microsoft would want to hand Sony a performance advantage like that. If Sony happily allowed usage of a lower level API with potential performance boosts on their similar hardware, while Microsoft denied it, Sony's exclusive titles would likely begin to outshine Xbone's fairly quickly I would imagine, which would hurt Xbone's perception quite a bit.
Except that since "the metal" on both consoles looks astonishingly similar to each other and to shipping AMD GPUs, any API that "writes to the metal" will be astonishingly close.
I get the impression that they DIDN'T write all of the code in DX on previous consoles. They always had to write some in low-level APIs (similar to Mantle) in order to get the most performance out of a fixed set of hardware. How else would games developed at the end of a console's life look better? The difference is that this time it's been given a name and talked about publicly, and of course it (or something similar) will be applied to desktop PCs.
PS - I would have quoted you, but the website's Po.st widget made it so I couldn't copy your text on my phone...
I AGREE 100% with you. Anyone who says Win8 or 8.1 isn't a good OS basically doesn't understand the meaning of stability or efficiency/utilization.
Win8.1 is what Windows 8 should have been from the get-go. DX11.2 tiled resources....
AMD hUMA.... look at history and watch as MS will champion OS performance again and remain the leading API champion.
Who says Microsoft needs to compete? They will probably implement a way to use Mantle as an alternative to Direct3D, while still being able to use all the other DirectX components (Sound, Input, Network, Fonts, 2D, etc.).
I thought that DirectSound, DirectInput, and DirectPlay were deprecated some time ago. Since Vista, you no longer get direct access to the hardware with DirectSound; you now need to use WASAPI to do that.
They were deprecated because they were replaced with XInput, XAudio and XACT. That is, except DirectPlay which has no equivalent (mostly because one was not needed). These APIs were directly ported from the Xbox 360.
Games could use OpenGL and be easily ported to Linux, too. That's the most ridiculous thing. OpenGL is used for Android, OpenGL is used for Linux, OpenGL is used for the PS4 and PS Vita. Hell, I think OpenGL is used by Nintendo's consoles, too.
OpenGL is already out there for people who want to port between multiple OS's. This API has nothing to do with that. This API has to do with AMD trying to make HSA work without getting cooperation with Intel or nVidia in creating a new API that includes it.
Neither nVidia or Intel has an interest in helping AMD recover any lost ground, so they're pointing a nuke at the entirety of PC gaming and are saying, "Now. We're asking again. Join us... or we'll take the whole thing down with us."
This is a WAR. Windows vs Android vs iOS vs Steam OS. Direct3D vs OpenGL (iOS, Android and SteamOS) vs Mantle. x86 vs ARM. Console+PC games (AMD) vs mobile+PC games (Nvidia, and not even that strong in mobile). And don't forget sound!
Either *some* standard will emerge on top, which may not be the classic x86-Windows-Direct3D-Nvidia we used to know, or in this new age of multi-platform there will be a place for everybody. The burden will then be on developers.
A wild guess: AMD will capture more PC GPU market share in the next 5 years than it has held before.
Mantle is cross platform. I strongly suspect that we'll be seeing it on Linux (and hence Steam OS) as well as OS X. It wouldn't surprise me if Mantle also appeared on the Xbox One and PS4. The only platform that would be left out is iOS, due to the lack of AMD hardware. Similarly, the x86 version of Android *might* get Mantle, but without AMD graphics in the ARM space, there is little motivation to do an Android port.
While AMD is clearly aiming Mantle at their own hardware (ie the Glide comparisons), I haven't read where they'd block out other vendors. (And if they have, please post a link!).
Did they explicitly state that it was from the PS4 or Xbox One? From the live stream all I heard is that this is a console-*like* API due to how thin the software layer is. Having said that, DirectX exists on both the PC and Xbox One but the Direct3D and driver layers are noticeably smaller due to only having one hardware configuration to account for.
Gabe's been talking up the open nature of Steam OS and has even stated they worked with NV on some things (the recent Linux driver announcement might've been spurred by that)... Somehow Mantle doesn't seem like the kind of approach he'd wanna get mixed up in.
It might not have been his plan, but if AMD is laying out a widely-used (at least for AAA and higher games) API on a silver platter, I'm sure he will take it.
I suspect that Gabe's take on this (next big thing or just more marketing) will likely be the most accurate one you will hear, especially in the early days.
What part of "low-level API" don't you get? Unless you want to design a GCN device, you might as well use directX or OpenGL. Using this with any other hardware (unless it is a really careful/lucky fit*) is a bad idea.
* I seem to recall that a large part of the reason that QuakeGL happened was that Glide was extremely similar (for sufficiently low level calls) to OpenGL. Restrict yourself to those calls and it is easy to write a "restricted OpenGL" driver for glide. Even if you went this way, you would need devs to restrict themselves to only the calls that could be efficiently mapped to your hardware.
This will certainly be in AMD's favor. They are likely to get a nice bonus in speed essentially "free" (assuming that it is paid for by the sweat of devs writing the game on a console).
The other side is to expect SteamBoxes (and boxes bought with SteamOS in mind) to be largely (or entirely) AMD. I might be writing this on a Linux box, but I really don't expect this to be big any time soon (when tablets are up to SteamOS and GCN, expect things to get interesting. Don't know if Moore's Law will scale that far).
It is a bit exciting, isn't it? I'll probably keep running Windows for a long time yet, but if you can get substantially better fps out of Steam OS, I might dual boot.
AMD just admitted that they have lost the OpenGL and DirectX war. I mean, they can't win against Nvidia with these two, so they are making their own. :D AMD is too small; I think only Battlefield 4 will use Mantle, just like only Tomb Raider used TressFX.
Umm what? I think you missed where AMD has beaten Nvidia out of the gate since the 5000 series. It's only this current gen that Nvidia finally got their refresh out before AMD.
I appreciate AMD moving to a lower-level API to improve performance. Think about it: we are starting to stall when it comes to increases in compute, so the next place to look for more performance is the software. Now let's see applications start being written in assembly for some serious performance improvements.
On the other hand, writing in OpenCL (and wishing you had CUDA :)) is another story. I suspect it is even slightly easier than assembly (I wrote a fair share of assembler back in the day; haven't gotten much further than "hello world" in CUDA).
this is very interesting. I hope it's not going to become exclusive, meaning a game is made with/for mantle, can't play it otherwise. That would be a huge disappointment :/.
I highly doubt any games will be Mantle-exclusive. They'll just have a DirectX or OpenGL renderer as an option alongside Mantle, just like many old games did.
Interesting piece, but all of the emphasis on the Xbox One and Direct3D were a little annoying. Considering that there is a large emphasis on creating cross platform games these days, Direct3D is becoming less and less of the Goliath that it used to be. "Why write in Direct3D when OpenGL can cover Windows *and* everything else?" That's the question many games studios are focusing on today. With the Steambox running Linux, the ever-growing market share of Mac OS X, and the complete dominance of OpenGL-based tablets and phones, Direct3D Just Doesn't Matter. Not like it used to, at least.
I know Anandtech loves Microsoft, but this was a little ridiculous. Do you really think AMD is going to create a low level API for the Xbox One and another one for the PS4? No. They don't have that kind of software team. Their software teams have always been on the smaller side. Mantle is more relatable to OpenGL than Direct3D -- regardless of Microsoft announcing a coincidental version -- unless someone at AMD has just gone crazy. Adding support for another shader language is insignificant.
I'm actually most curious about compute. Mantle seems like a great opportunity for AMD to work in some GPGPU functions to accelerate certain tasks even more... I wonder what they will do in regards to this.
One interesting thing worth noting is that EA games for the next gen (consoles and new PC games) are to be built on one of two engines: either Frostbite 3 or the EA Sports one (Ignite, I think). If Frostbite 3 supports this, that means most of EA's games will benefit from it for the foreseeable future. Say what you want about EA, but they have a major role in deciding what will happen in the industry as a whole.
So now Nvidia will have to make their own proprietary low-level API, and developers will be even more fragmented. I guess they just aim to take a bite (ironic, I know) out of M$'s big apple, which is the product of an ever more entrenched Direct3D.
Maybe this fragmentation is a good thing by being bad, right now most games target D3D with the exception of mobile games, which are mostly OpenGL, throw in additional 2-3 low level APIs from major GPU vendors and it will get even harder to port games to different hardware. In the end, this may help people realize that OpenGL suffices, and with the introduction of compute shaders, you can pretty much generate render features on demand, still being executed in hardware.
As a developer I feel safest putting my eggs in the OpenGL basket, since I am not in a position to learn 5 different ways of doing the same thing just for the sake of corporations' monopolistic aspirations. A low-level API will be more than welcome and useful for the community implementing good OpenGL-compliant drivers, and this applies to other potential APIs as well. So while there is a niche where this extra low-level API will be useful without causing fragmentation, it should be utilized the same way as assembly and kept away from application developers. A public low-level API means more control, which is good on its own. It is good for vendors to expose their hardware capabilities, but they should come up with an abstraction layer to unify and standardize an API, so that it is more portable and works across different device, OS, and driver vendors.
Devs will likely have to use Mantle if they develop for consoles. It should be easy to port that code to PC. Doing it again just for nVidia's share of PC market might not be something that most devs would do.
Developers should be free to choose between using higher abstraction with more coverage, but worse performance, or lower abstraction with less coverage, but better performance.
Using Mantle won't reduce coverage that much though as two of the three consoles will definitely be able to use it and even the WiiU may see some benefit. That's a large portion of 'proper' gaming. This will probably remain the case for several years, as I doubt any of the console makers will be in any rush to get to the next 'gen'.
This is some really fascinating stuff. I've been buying both AMD and Nvidia cards in the past, the last two being Nvidia parts. I think it's time to move to the AMD camp the next time I upgrade my PC. I really hope this helps AMD gain more market share in the PC world, because we need proper competition to keep prices at a sane level.
We already know the XB API is DirectX-based. We know the XB OS is based on Windows technologies. The XB1 API can be specifically targeted and refined for its own hardware, but does that make it any higher or lower level than what we already have in Windows, given those facts?
Of course I'm not a game developer but on paper this doesn't sound like its actually anything new and groundbreaking.
But the point is that game devs can also use the lower level APIs on XBOX One and PS4 to get more performance instead of the higher level APIs offered. And they will use them because consoles have rather limited performance compared to desktop PC hardware. So now that Mantle is here, devs can use (almost exactly) the same code they use in consoles for AMD GPUs on PCs as well.
This is probably the best move AMD has made in the last 5 years or even longer. Leverage the console API to bring in a performance monopoly on PC gaming. If they can indeed extract significantly more performance using a low-level API, then everyone is obviously going to buy their GPUs. Now if only they had the foresight to add a special block of microcode to their x86 cores to accelerate these API calls, it would be that much faster. Unfortunately, knowing history, the implementation could end up being so bad that even with this advantage, games might end up running faster on Intel+Nvidia hardware.
The writer believes that the Mantle API is being brought over from the Xbox One; how does he know it's not from the PS4, which supposedly has 2x the graphics grunt of the Xbox One?
I think Mantle will be great. I can't quite understand why they didn't just go for OpenGL, though, as it is in a resurgent state.
Also, AMD is having to resort to this because Microsoft has not been interested in PC gaming for many years now, with DirectX and GFWL being massively neglected.
Good on AMD! If this brings anywhere between a 2x and 10x frame rate increase, it will spur Nvidia and Microsoft to get off their asses! Great for the consumer!
I would have thought that Mantle stemmed from the PS4 as well. The main reason is that MS wants to standardize development across both the Xbox One and PC, which means DirectX. The PS4 is its own beast, running a BSD variant and its own Sony-developed graphics API, reportedly based upon OpenGL.
I'm sure I read somewhere they are claiming up to 10x! Agreed, wait and see the BF4 benchmarks in December. Exciting though; even if you "only" get double the performance, that's still incredible.
Microsoft should be very worried about this and start optimising and enhancing DirectX as a matter of urgency.
Two years ago AMD said that they were working on this, and they were throwing around the 10x number then. 10x for specific draw calls, though; who knows how that translates into real-world performance.
'It's funny,' says AMD's worldwide developer relations manager of its GPU division, Richard Huddy. 'We often have at least ten times as much horsepower as an Xbox 360 or a PS3 in a high-end graphics card, yet it's very clear that the games don't look ten times as good. To a significant extent, that's because, one way or another, for good reasons and bad - mostly good, DirectX is getting in the way.' Huddy says that one of the most common requests he gets from game developers is: 'Make the API go away.'
Not a lot. It's a console problem: fast GPU, really weak CPU but lots of cores. You need a new way of splitting the draw calls across the cores or your graphics performance gets bottlenecked. PCs don't have the same bottleneck, as the CPU cores are much faster.
Fundamentally, Mantle is a console API, and its use is designed around removing bottlenecks for the completely fixed console architectures. PCs have a huge variety of configurations and consequently completely different bottlenecks. If Mantle really is a low-level API, then to really be useful you'd need to write different code for each different specification of PC to address its particular constraints. That obviously won't happen, so what we'll get is the normal high-level DirectX implementation and a few quick code paths written for the consoles that, with minimal tweaking, might add a little to some subset of the PCs out there. E.g. the draw calls: if you just happen to have a very fast GPU and a very slow CPU with lots of cores, this could help a bit.
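Mantle's actual interface isn't public, but the draw-call-splitting idea described above can be sketched roughly like this, with all types and names hypothetical: many weak cores each record their slice of the frame's draw calls into a private command list, so no single thread is the submission bottleneck.

```cpp
#include <algorithm>
#include <functional>
#include <thread>
#include <vector>

// Hypothetical command-list type: each worker thread records its share
// of the scene's draw calls into its own buffer, with no shared locks.
struct CommandList { std::vector<int> draws; };

// Record draw calls [begin, end) into one thread-local command list.
static void recordRange(CommandList& cl, int begin, int end) {
    for (int d = begin; d < end; ++d)
        cl.draws.push_back(d);  // stand-in for a real low-level draw call
}

// Split totalDraws across 'workers' threads; a driver would then submit
// the finished lists back-to-back, in order.
std::vector<CommandList> recordParallel(int totalDraws, int workers) {
    std::vector<CommandList> lists(workers);
    std::vector<std::thread> pool;
    int chunk = (totalDraws + workers - 1) / workers;
    for (int w = 0; w < workers; ++w) {
        int begin = w * chunk;
        int end = std::min(totalDraws, begin + chunk);
        pool.emplace_back(recordRange, std::ref(lists[w]), begin, end);
    }
    for (auto& t : pool) t.join();
    return lists;
}
```

On a fast PC CPU, a single thread can usually keep up, which is exactly why this buys less on desktops than on consoles.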
Or you can just buy the Nvidia products that are faster out of the box and not bother with the vendor specific optimizations used in a handful of titles that choose to implement Mantle.
From what I know, the API is actually open (no idea what the license is, though), so Nvidia could always license it (I am sure AMD would like PhysX and/or a pile of money) and build its own driver to be compatible with the Mantle API, if Mantle really makes a huge difference in performance and takes hold too firmly for Nvidia to simply weather the storm.
But being a system/network administrator, I admit that what little development skill I have is limited to server-side scripting APIs at best, so all this low-level programming is a bit beyond me; this is mostly wild speculation on my part.
AMD's Mantle initiative is a fantastic idea to bring low-level cross platform development to a huge user base. And I don't think anybody saw it coming, least of all Nvidia. Because if they had, they would never have shrugged off AMD getting both the XBox One and PS4 contracts.
Back when Glide was used, it was only on the PC. And even then it was a force to be reckoned with. But today with AMD hardware not only in PCs but also in BOTH of the most popular consoles that will most likely be in production for the next 6-10 years, it's a brilliant move. Developers of both PC and console games will have low level access across both platforms (PC and console). So now they will be able to develop games with increased performance that will also be easier to port. Whoever came up with this strategy at AMD deserves a raise!
Nvidia has every right to be very nervous by this unexpected move from AMD. Because now any game designed for a console will automatically have increased low-level performance and graphics enhancements built in that will work with any GCN equipped (AMD) PC.
The company that pushed Glide was very successful until they fancied themselves retailers and not virtual chip makers. Their plant for mass-producing chips was not in a well-established area and had many quality control problems. Then there was that whole "making your customers your competitors, who then flocked to Nvidia" thing. That was ultimately their downfall. Developers really liked working in Glide from what I can remember. It was far easier to use than DirectX at the time (and, to a lesser extent, OpenGL).
There was also the issue that they never made a new core after the Voodoo 1 (everything else was shrinks and multi-core copies of those chips). Not sure if the guys who designed it weren't interested in another design because they liked their rewards or because they felt ripped off, but I suspect that other startups have the same issue of lightning not striking twice.
This move really shouldn't be that much of a surprise. If you look at how things are rated in the PC gaming world, where a few extra FPS or a few extra ms of frame latency can make or break a game as well as drive sales of hardware, it doesn't shock me in the least that AMD would look to leverage their work with the game consoles to provide this type of functionality.
From the hardware side of things, GPUs are big, powerful, fast chips, and soon they're going to run into the same issues as CPUs: dropping down process nodes (28nm -> 22nm -> 14(?)nm) will follow the same path, and eventually you'll hit a limit as to how fast/hot these chips can be pushed. Eventually, they'll run into what I like to call "the Haswell Limit", and the chips will become exponentially more complicated, and thus more expensive to develop and produce.
What I see AMD doing here is seeing how this has already played out and recognizing that if they want to increase their performance, they need to do things -differently-. By eliminating some of the overhead (or a lot of the overhead - we're not sure yet), it's unlocking 'free' performance. If they could (theoretically) eliminate 15% of unnecessary overhead on the GPU, that's a 15% increase in performance. That's being able to lower thermals, extract more performance, drop power usage, however they want to utilize it, and all it requires (again) is software.
Comparatively speaking, software is cheap. Hardware is not. AMD recognizes this and is asking itself a very logical question: is it better to optimize our hardware/software interface, or to continue in a GPU arms race with nVidia?
Apparently, they've decided on option A, and it makes perfect sense to me.
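As a back-of-the-envelope illustration of the "eliminated overhead is free performance" argument above (the numbers here are made up for illustration, not AMD's):

```cpp
// Illustrative frame-time math: if a frame takes 'frameMs' and
// 'overheadMs' of that is API/driver overhead, removing a fraction
// 'cut' of the overhead shortens the frame and raises the frame rate.
double fpsAfterCut(double frameMs, double overheadMs, double cut) {
    double saved = overheadMs * cut;     // milliseconds given back per frame
    return 1000.0 / (frameMs - saved);   // shorter frame => higher fps
}
```

For example, a 20 ms frame (50 fps) carrying 3 ms of overhead that gets fully removed becomes a 17 ms frame, i.e. roughly 58.8 fps, and all of that came from software.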
There are simply two limiting factors in GPU hardware design today: power consumption and die size. Modern GPUs are very programmable, and while designers could improve performance per clock, they can extract greater benefits from increased parallelism. Thus GPUs won't approach the "Haswell Limit" until after they've exhausted the usefulness of parallelism in their designs. Adding more parallelism is mainly a copy/paste job of a shader cluster, ROP, TMU, etc., plus a bit of work connecting them all together. With each die shrink, they'll be able to add more units into the same area.
nVidia is willing to approach die size limitations of around 550 mm^2**. Beyond that figure, chips greatly suffer from yield issues. AMD had a small-die strategy in place with the Radeon 4000 and 5000 series. They loosened that up a bit with the high-end Radeon 6900 and 7900 series. It is formally gone now with the R9 and the estimated 425 mm^2 of the new chip.
The other limitation has been power. Since the Radeon 2900 introduced the supplementary 8-pin power connector, the upper limit of the PCIe spec has been 300W. Modern cards like the Radeon 7970 and nVidia's Titan are capable of drawing that much power with a modest overclock. Only recently have we seen AMD and nVidia flirt with cards consuming more power, and those are strictly dual-GPU solutions. Make no mistake, both AMD and nVidia could design a GPU to consume 375W or 450W with ease (mainly by increasing voltage a notch and raising clock speeds on current designs). The problem is cooling such a chip. They are already using rather exotic vapor chambers to quickly move thermal energy from the die to the heat sink. Either moving to liquid cooling or ditching the PCIe card format for a motherboard socket is likely necessary to permit further increases in power consumption.
**For reference, the lithography reticle limit is a bit over 700 mm^2. The largest die I've heard of is the Tukwila-based Itanium, which weighed in at 699 mm^2.
Yup, from the 7000 series and up. This is the best move AMD has made in its last 5 years. If this works like they say it will, it will be truly revolutionary for PC gaming.
Are PS4 and Xbox One low level graphics APIs the same? If not, developers still have to do 2 different things, and I don't follow the reasoning behind "this is a port of Xbox One API to PC" (i.e. why don't you guess "this is a port of PS4 API to PC" instead).
The API is not the same, but the PS4 and Xbox One use the exact same GCN architecture; that's why AMD brought that architecture to the PC with the 7000 series. Devs still have to do two different things, but the PC API is very easy to develop for since it targets the same architecture as the X1 and PS4 (console-level language). Now what they're trying to do is make a somewhat similar low-level API for multiple AMD GPUs that is a lot easier to support and optimize.
Love the article. I think you are exactly correct that this is an Xbox One API port. I think what is missing is the implication for Microsoft. MS had to agree to AMD's licensing terms for the X1's CPU/GPU, not just the other way around. They knew full well this was coming; the question is why?
AMD does not, at this point, have a mobile play. Over the last year they have strategically set themselves up for mobile, server, and desktop success, in part by bringing back old talent from the likes of NVIDIA, Apple, and Qualcomm. The other part is owning both x86 and ARM licenses. They know better than anyone that graphics is driving the mobile and PC business, that the CPU has been commoditized, and that the GPU landscape is wide open and fragmented.

In comes MS as a devices-and-services company, with the X1 and Xbox Live as the centerpiece to be leveraged in the consumer space. The X1 is just a virtualized mesh of 3 OS's that runs on common PC hardware (I am looking at you, Steam). Who says they won't release an Xbox OS for PC gamers? Not too hard to think that they could just give you a download disk that would virtualize your Win8.1 and then install the Xbox OS in another VM alongside it. Hardcore gamers could update their hardware as they wanted and play against all console users at once via Live. As a side note, what better way to extend a console's life: you could update the hardware any time you wanted if all or most developers use Mantle. Wasn't that the original idea behind DirectX? The X1 name: the ONLY One you need?

AMD's ARM license gives them access to tablets and smartphones. Qualcomm's Adreno was once AMD/ATI's. MS has stuck with Qualcomm in their phones; could it be because the hardware is still very similar, i.e. easy to port software to? The X1 has low-power cores; AMD could easily add GCN to an ARM core. X1 on the tablet or phone? An AMD x86 X1 Surface tablet, maybe?

This last part is, I think, the ultimate play for MS. The Xbox is clearly the consumer-facing brand for MS, and the engine for unification across all platforms. Platforms being the key word. If AMD can leverage Mantle and get the backing of most software developers, I see MS using the X1 as a strategic partnership with AMD. Think Nokia, and you get my meaning.
Intel CPUs are clearly better, but the CPU is commoditized, and their Iris Pro is the same thing MS did with the 360 and now the X1. There is clearly a benefit to adding local DRAM on the die (sorry, Sony PS4). Intel's GPU is getting better but is nowhere near AMD's, and as Apple has shown, CPU power is just not that important. I would not be surprised at all if this was a dry run for MS to become more vertically integrated. AMD has the license, the patents, the talent, the roadmap, and aligns very well with MS's vision. It's no coincidence that AMD let this loose; this is MS's vision as well, cross-platform everything. It might not be today, or this year, but I would not doubt MS will pick up AMD for a penny, make their own hardware, and still license out all their software for everyone else. At the end of the day, software is the one thing that isn't commoditized; it's just unifying the hardware underneath that has kept MS from realizing their dream, and Mr. Gates's dream.
I remember a different Bill Gates, one who simply assumed that the goal was 100% marketshare. Letting AMD peddle Mantle to Steamboxes simply doesn't seem like any goal likely to fly in Redmond.
Funny, but I don't remember all these concerns about "fragmentation" being brought up when Nvidia came up with CUDA and PhysX.
If Mantle is available on Linux and therefore allows easy porting of games to Steambox, then in a way it is more "cross-platform" than DirectX. Sure, you lose the ability to choose an Nvidia card, but you gain the ability to choose a different OS.
I don't think there was as much concern because Nvidia has a much better track record when it comes to driver support and feature integration, not to mention they have the resources (ie. money) to expand their scope of services.
If you read what is being promised in this article, the increased burden on AMD is TREMENDOUS. I just don't think they can do it (well), I'm sorry.
And I'm sure you are going to challenge me on my 1st comment, but in the meantime, we got NO updates whatsoever on CrossFire framepacing, EyeFinity support (even though they were teased), and in the last few years we have seen AMD do nothing but cut support for graphics families (HD4000 and older no longer supported in every release, moved to legacy). And the latest WHQL driver release, no more Win8 or Vista support? All that on top of the big news lately with the broken CF drivers, framepacing issues.
And we really expect AMD to now create and implement a new API ecosystem for PC, XB1, PS4, Linux and also work with each dev that wants to implement their API with timely driver updates? Is anyone who owns either AMD and Nvidia cards truly satisfied with their drivers to the point they want all that added to the mix? Fragmentation concerns, indeed.
You're working pretty hard to poo-poo this, chizow. I see Wreckage is out and about on damage control as well. You Nvidia fans seem pretty nervous about Mantle.
Nervous lol, if I were an AMD user I would be nervous. AMD's driver teams need 10 more checkboxes to fulfill like I need a hole in my head. Did you even bother to keep track of how many additional driver engineering positions alone AMD is planning to undertake? Figure 2-3 heads x $100K per seat across 3-4 new platforms...that's a lot to ask for a company that just laid off a lot of people in the last year or so.
But yes Creig, it is amazingly ironic coming from an AMD fan like yourself that spent the last few years poo-poo'ing other "proprietary" APIs and standards like CUDA and PhysX, suddenly do an about-face to welcome Mantle as the greatest thing ever invented for GPUs. :)
Actually, we did get Crossfire frame pacing support but only for single monitor setups and resolutions up to 2560 x 1600. AMD has said they're working on frame pacing improvements for Crossfire + Eyefinity and CrossFire + 4k but haven't disclosed a time frame for release. They haven't given up but the problem is more complex by having to drive multiple displays at the same time.
Dropping support for the Radeon 4000 series and older shouldn't be that concerning now that the hardware is going on five years old. At some point, support will end for a device.
Dropping Windows 8 support isn't going to be that big of a deal since Windows 8.1 is a free upgrade and that will be supported. Dropping Vista shouldn't be too alarming considering its market share: most have moved on to either Windows 7 or 8.
The expectation was there would be big news about CF and EyeFinity, and there was none. I am aware CF framepacing was addressed in a limited capacity in 13.8 beta, but that support has regressed and is curiously missing in subsequent WHQL drivers. Also, as mentioned, there is still no CF fixes for DX9 or EyeFinity and it's been nearly 3 months since the initial driver launched.
Also, I misspoke about Win8 support, only 7 and 8 are supported, Vista and XP support was dropped. You might think it's OK to drop Vista, but it still has higher marketshare than 8 currently and XP still has more marketshare than both Vista and 8 combined.
Same deal for HD4000 series, idk, I think that chipset is still completely relevant and capable of supporting current games, I know for a fact the same cards from Nvidia in that generation (GTX 260/280) are perfectly viable gaming cards today, so I would certainly be disappointed if support were suddenly dropped.
Again, you might not think any of these things are a big deal, but as a whole, you can clearly see there should be healthy concern and skepticism for AMD to take on MORE driver responsibilities (Audio now too on top of that) given their rather poor track record of supporting and sustaining new and existing features.
WHQL drivers will likely follow suit. Remember that the current beta drivers are essentially testing a new feature that doesn't work in all scenarios yet.
Despite its market share, MS is dropping support for XP themselves, so why continue to support an OS that even its maker doesn't support? As great as XP was back in 2001 when it launched, it is increasingly legacy. Similarly, Vista came out in 2006, and mainstream support for it ended in 2012. Dropping support for Vista is to be expected.
As far as piling on additional driver work, AMD is essentially doing the same thing in the console world as far as Mantle drivers go. AMD has been developing audio drivers since the Radeon 4000 series could output audio over HDMI. I see AMD's audio work as more of a precursor to integrating audio into their SoC line than an additional GPU feature. They're clearly going to be using audio as a differentiating feature down the road in the SoC area.
Oh well, I look forward to your addendum explaining away why AMD has dropped support for Mantle in the near future. I'm sure it will be an interesting read.
The only reason to drop support for it would be if the Xbox One and the PS4 both have relatively short life spans. Otherwise, it'll hang around for years just on the merit of its console connection. Long term, I also see the console connection as the means of Mantle's eventual demise, as the PC/console divide widens again over time.
With DICE committed to adding Mantle to the Frostbite engine, that alone will carry Mantle forward as EA uses it across several projects over the short term.
They presumably built the costs of building the API for the XB1 and PS4 into their bids and are being paid to do so. Also, it isn't nearly as big an issue when you're targeting completely frozen hardware. The other API costs should be pretty minimal, as you might expect from a low-level API (i.e. the code should be little more than stubs; initialization will be as fun as always, but should be the same regardless of the API).
The issue isn't fragmentation (GPU families change pretty slowly; I expect GCN will be around a while), but success. With AMD getting all the major console ports to use Mantle, will they rest on their "commitment" to DirectX? I suspect that future DirectX drivers will use Mantle more and more, and thus gain that minimal overhead. Also, AMD could easily become less concerned with such less-used drivers (especially if you stop seeing them in the latest benchmarks) and let the bugs creep in.
Frankly, if I were in AMD's shoes, I wouldn't worry too much about trading some bugs in older and lower-profile games for high performance and lower driver costs.
The Windows 8.1 update is free and better in many ways. Why should anyone still support Windows 8? Why would anyone still be running Windows 8?
Were drivers and software still supporting the original WinXP after XP SP2 came out? I sure hope not. I wouldn't have. If someone asked me to support that in my software I'd just mail them a SP2 install disc.
On the one hand, I hate anything that introduces new divisions among hardware and that negates standards, like Mantle seems to do. On the other hand, if it is already being widely used on XBone hardware, it could very well offer a lot of nice performance advantages (AMD 7970 owner here), which I wouldn't mind picking up. I'm interested to see how this plays out eventually.
The idea is that if you make a console game (PS4 or XBOX One or both), you can use the same API on PC hardware as well. Of course you still need to make a DirectX or OpenGL implementation for the PCs that don't have a GCN GPU. But the point is you can, with very low cost, make the game work extremely well on some of the PCs as well.
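In practice, the dual-code-path setup described above is usually an abstract renderer interface with the backend picked at startup. A minimal sketch, with all class and function names invented for illustration:

```cpp
#include <memory>
#include <string>

// The game talks to one abstract renderer; the backend is chosen once,
// based on what hardware is present.
struct Renderer {
    virtual ~Renderer() = default;
    virtual std::string name() const = 0;
    virtual void drawFrame() = 0;
};

struct MantleRenderer : Renderer {       // thin path, GCN hardware only
    std::string name() const override { return "Mantle"; }
    void drawFrame() override { /* low-level submission would go here */ }
};

struct D3DRenderer : Renderer {          // portable fallback for everyone else
    std::string name() const override { return "Direct3D"; }
    void drawFrame() override { /* high-level submission would go here */ }
};

// Pick the fast path when a GCN GPU is detected, else the fallback.
std::unique_ptr<Renderer> pickRenderer(bool hasGcnGpu) {
    if (hasGcnGpu)
        return std::make_unique<MantleRenderer>();
    return std::make_unique<D3DRenderer>();
}
```

Since console-derived games would already have the low-level path written, the PC build mostly inherits it for free; only the fallback needs separate tuning.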
NVIDIA could have done this years ago when they were in ATI's position. But the biggest thing they could dream up was CUDA, PhysX, and overcharging customers for their cards.
If Mantle is a copy of the Xbox One API, what does that mean for the Xbox GPU? Is there anything that hasn't been disclosed? Is it a derivative of the R9-290?
The Xbox One GPU is based upon the GCN architecture with HSA extensions. Essentially the additions are to allow the Jaguar CPU cores and the GPU to share a common memory address space. To quickly transfer data between the CPU and GPU, a simple pointer needs to be passed between these two units. This ability to share the same memory address space is a nice efficiency boost for CPU-GPU communications and has further implications for GPGPU workloads.
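As a rough illustration of why that shared address space helps (hypothetical types only, not an actual HSA or Mantle API): without it, the CPU must copy data into GPU-visible memory; with it, "handing data to the GPU" is just passing the pointer.

```cpp
#include <cstddef>
#include <cstring>
#include <vector>

// Discrete-GPU style: a separate GPU-visible buffer that data is
// staged into by copying.
struct GpuBuffer { std::vector<unsigned char> storage; };

GpuBuffer uploadByCopy(const void* src, std::size_t bytes) {
    GpuBuffer buf;
    buf.storage.resize(bytes);
    std::memcpy(buf.storage.data(), src, bytes);  // bytes actually moved
    return buf;
}

// HSA-style shared address space: zero bytes moved, just a pointer
// and a size handed to the other side.
struct SharedView { const void* ptr; std::size_t bytes; };

SharedView uploadByPointer(const void* src, std::size_t bytes) {
    return { src, bytes };
}
```

The second path is the efficiency win for CPU-GPU round trips and GPGPU work: no staging copy, no synchronization around a transfer queue, just agreeing on an address.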
OK. But if Mantle, the API for the new R9-290X, is based on the Xbox's API, couldn't we infer that the GPU in the Xbox has more in common with the R9-290X than with the 7700 series that was speculated?
Nope. The Xbox One's GPU specs are indeed on par with the Radeon 7700 series in terms of computational throughput. The Xbox One's GPU does have some HSA features which the Radeon 7700 series does not. This will help the Xbox One's efficiency, but it is still bound to the same upper limit that the Radeon 7700 series shares.
I think the timing of this is fantastic. AMD has been refining/emphasizing the "APU" space, and they are ready to move GCN into APUs. As is, APU graphics are getting closer to mainstream viability for gaming... maybe this is the plan AMD had all along: to provide the boost the GCN core in APUs needs to make it big and provide a better gaming experience sans dedicated graphics.

At the same time, we're hearing about SteamOS and Steamboxes, which are looking to compete with console gaming, where Mantle would allow console games to be ported to SteamOS quickly, and AMD is fully on board with that, it seems. Within SteamOS, we expect less overhead from Linux than from Windows, and now even less overhead with Mantle than with the other APIs. Seems like the SteamBox is getting even more desirable for gamers, as they will get the benefits of a console with the ability to upgrade the hardware.

I don't see Mantle being a problem in the PC space. Dual API writing won't be *as big of an issue* as it was with Glide, since the games will already be somewhat written for Mantle, having come from consoles. And for games not coming from consoles, the developer can simply avoid Mantle. Either way, it seems like the stage is being set for a bridging of PC and console gaming one way or the other.
Nvidia's response to Mantle is a built-in ARM chip. Not only does it have dedicated hardware doing the draw call management and physics, but the ARM development environment is already mature, with plenty of experienced developers not needing to relearn a new API syntax. The game is not over... oh no... far from it in fact. I for one bought a GTX 690 because I don't like where all this is going. Sure, there's nothing innovative about the GTX 690, but it's the culmination of all the tried and true standards refined to the brink of performance perfection. PC gaming was just about to have a renaissance, but now all these intricately tied suppliers are going for the win, causing a total incompatibility cluster bonk.
Huh, very interesting. Makes sense for AMD, I guess. It may not actually be a ton of work for developers to support, as they can presumably use Mantle on the PlayStation 4, Xbox One... and AMD-based PCs. DirectX 11 or OpenGL might end up being the port in some cases!
My only negative thought about this in terms of AMD: don't they have enough trouble getting their drivers to work? My avoidance of notebooks with AMD GPUs has actually increased this past year rather than decreased. I really WANT them to get their act together, but I've literally been waiting on that since 1998.
Hey Ryan, why not ask the good chaps at AMD about Roy Taylor's quote from not long ago: "I think CUDA is doomed. Our industry doesn’t like proprietary standards. PhysX is an utter failure because it’s proprietary. Nobody wants it. You don’t want it, I don’t want it, gamers don’t want it. Analysts don’t want it." Then comes Mantle. LOL.
Mantle is an open standard, not proprietary, just no idea what the licensing is yet.
All Nvidia and Intel have to do is develop their own architecture and driver, so they can interface with the API.
But yes, for now AMD is the only one who will be able to run Mantle. But they control two important hardware platforms, so they can permit themselves to be the one who sets the standard and gives themselves a short/medium-term advantage, like they did back in the early x86_64 days.
That doesn't jibe with the concept of a low-level API. When they say that the Mantle API is open, they just mean the interface that game engines would use into Mantle. Otherwise, if this is just another abstract API where NVIDIA and Intel can just plug in their own stuff, how would it be any different than Direct3D and OpenGL?
The article indicates that Mantle is explicitly tied to the GCN architecture.
Current Nvidia and Intel hardware most likely cannot plug into the API, but nothing is preventing them from modifying their architecture and drivers to be compatible with the API, as the standard is open.
All this does is give AMD a short-term advantage, as they are the first ones with compatible hardware on the market (the HD 7700, HD 7800 and HD 7900 series are GCN), and a long-term advantage in the capacity of drawing FRAND licensing revenue for implementations of the standard, if their standard catches on.
I don't understand how that makes sense. A low-level API that is completely tied to a graphics architecture does not translate into Intel and NVIDIA having the ability to change their drivers to be "compatible" with the API. And there is no "standard" here. It's an API into GCN architecture. That's the opposite of the word "standard". Where are you getting this from?
Nothing is stopping Nvidia and Intel from making a Mantle driver. Nothing but the fact that it will be slower than using DirectX unless you are going straight to GCN hardware. Possibly much slower, depending on how different your hardware is (gods help you if you are using a tiled mobile architecture).
Why are you guys so convinced it's the Xbox One low level API instead of, say, the PS4's? I thought that most of the API work done for the Xbone was done by MS and most of the API work done for the PS4 was done by AMD and Sony.
That's what I've read when I've read about such things. So why act like Xbone is the only next gen console with GCN hardware coming out? Hell, last I checked, the Xbone doesn't even seem to have a fully HSA-spec'ed part unlike the PS4's fully HSA-capable hardware.
MS had their own version that made it so they didn't require HSA. That's what I was reading a few weeks ago...
So what information is making you lock onto just the Xbone's low level API instead of including PS4 in there?
I'm also curious how you aren't seeing the real problem with this. It isn't just that AMD does it and everyone else stays with Direct3D; okay, so that's fine. The real problem is that this is going to redefine how TWIMTBP vs. Gaming Evolved plays out. It's especially going to affect any title that Intel decides to throw its rather HUGE budget toward "owning."
Because if AMD can do Mantle, then nVidia can do something they'll probably tie into CUDA, and Intel will gladly do something to lock out AMD and nVidia. Imagine having to disable your discrete GPU to run your integrated GPU just to play the new Angry Birds. Why aren't you seeing the obvious here?
If AMD, which is as cash-strapped as any of these three companies get, can afford to buy a few developers, what do you think nVidia, whose entire existence relies on being perceived as the high end (and who has a long history of practically co-developing games, re: Batman AA), is going to do? What do you think Intel, which just needs a way to "compel" gamers to use its GPUs, is going to do with all that money they have just sitting around in money vaults?
Both of the latter have shown a renewed focus on gaming, especially Intel. They aren't going to sit by and let Mantle take away the high performance crown. They're going to either work together to make their own variant or they're going to individually split the industry in a threeway. How long then before games are ignoring the DirectX codepath except as a vaguely baseline "safemode?" How long before it's dropped entirely if Intel wants to REALLY own that game with a lot of money?
Do you really think the very fragile PC gaming market can handle a sudden and abrupt threeway fragmentation battle over the API's? There's just no way this is a good thing. No way. It might seem like a good thing for AMD users for a brief period of time, but then what happens when the hardware changes in the future and going down to "the metal" isn't the same in future products as it is in current ones? Let's hope no games go exclusively Mantle or we're going to be back to hoping someone "ports" a PC title so it runs in the future on more generalized GPU's that forego "specialized functions" again.
I really don't get why you're so upbeat about this. This is more than just shock and awe. You should be horrified. You should be telling them how horrible an idea this is, how badly this is going to go if it were to gain any traction, and how you pray to the gaming gods that no developer actually uses this.
Because look at what nVidia did with PhysX and imagine what they'll do with the idea of owning an API. Look at the sheer amount of money that both nVidia and Intel have to throw around. AMD is going to lose this war even if they land an opening salvo, but all gamers will lose because there will be a day when all this crap will splash out and hit us. One game or another.
And creating uncertainty in PC gaming right now is snatching defeat out of the jaws of victory.
I actually think you're wrong, as Mantle is an open standard.
Worst case scenario, there is FRAND licensing attached to implementation, which is normal for standards in the electronics and computer hardware industry.
Basically, due to controlling the Xbox One and PS4, while also having interests in the ARM and x86 markets, AMD is trying to implement a low-level GPU API standard that would give them a short-term advantage in other markets and possibly, if it isn't a royalty-free standard, long-term FRAND licensing revenue from those licensing the implementation.
Basically, AMD is possibly trying to create the GPU API equivalent of Wi-Fi in wireless networking.
I suspect that a great interviewing test for marketing shills is to get them to explain how a system that connects to AMD parts can be somehow sold as an "open system".
I can only hope that this means that SteamOS will soon have Mantle drivers available, and better yet, Mantle drivers in the Linux kernel (which would make OpenGL and Wayland development take off).
If what AMD said about Mantle is totally true, it's a fucking revolution in PC gaming. Cheer up and hope for the best. The competition will get a lot more critical for Nvidia.
What uncertainty? That if you buy an AMD (GCN) system that you will be able to play console ports without overhead? nVidia (and Intel) can make all the APIs they want, the hard part is how to convince developers to code for them. AMD *knows* that there are plenty of developers that already have coded games "to the metal" of GCN-based systems, and is giving them the chance to use that code with Mantle. All nVidia (and Intel) have to do is convince those developers to do the same all over again without all the benefits of directX for 10% (or less) benefit.
Sounds like the only "uncertainty" is how much this will benefit AMD. There is also the issue of Steamboxes. I fail to see how selling Steamboxes will harm "PC gaming" (the idea of a port to Steambox without a PC port seems unlikely).
It is; that's actually the main reason it exists. As AMD controls the hardware of those two important platforms, they can permit themselves to develop an open standard that gives them a short-term advantage in the PC market while the competition fumbles at implementing compatibility in their own architectures, if the standard takes hold among developers and the advantages are obvious.
"Mantle" is a new level of optimization; the PS4 and Xbone use the same GCN architecture found on the 7000 series and above. The only difference is the API (Mantle), which is not a whole lot different. That's why AMD introduced the GCN architecture on the PC: they were already producing it for the PS4 and Xbone.
Good write up, with only one mistake that I saw. I clearly heard in the presentation that Mantle is an open API--that is where the comparison with Glide breaks down. AMD is not going to sit on it as proprietary. nVidia will be free to use it and so will anyone else.
Noob question: is it in the realm of possibility that Nvidia could be forced to license the GCN architecture from AMD, the same as Intel licenses 64-bit tech? Perhaps that is the master plan from the red team. If so, it's brilliant, but only if Mantle is adopted by the majority of developers, I suppose, and the performance boost must be exceptional for that to happen.
Only GCN GPUs can use Mantle, so imagine a Mantle-only game... only newer cards will run it. What will happen when AMD changes their architecture in the future... no backwards compatibility? Like the PS4 not running PS3 titles?
Guys, I read somewhere that Mantle will be open source... I think that has the possibility to solve the different-GPU-architectures problem. And don't MMOs make very many small draw calls? I think that a low-level API could dramatically improve MMO graphics, couldn't it?
Only for those in development (think Elder Scrolls Online, especially with the console connection). WoW might be able to afford the client development, but I doubt any other mature MMO would consider it. My understanding is that WoW simply doesn't need all the power that a GCN (7770 or better) GPU can bring to the table, so I doubt they need to consider Mantle.
It's gonna be pretty hard to use this technology since I buy ONLY nvidia based graphics cards. The performance gains can't possibly be as impressive as everyone here is dreaming about. And then there's execution - I can't tell you how many APIs have died from shoddy execution. It's best to temper your excitement until more information is available.
Higher abstraction, easier to use libraries and even high level language bindings could and I expect will be developed on top. Some may keep true to the architectural differences and others may try to bridge them to offer a unified interface. The diversity of solutions will lead to much better tools for developers and more efficient use of hardware than what OpenGL and Direct3D offer currently.
Maybe most of you don't remember, but a long time ago, when S3 was still a player in the market, along with their Savage line of GPUs there was a proprietary low-level API called MeTaL, which was kinda buggy and all, like the cards themselves. I recall that only Unreal had support for it, and it was like 50% more powerful using that API than in OpenGL, and used better textures due to S3TC, which became a standard afterwards. Maybe something like this is what we should expect, but with real interest in implementing it, not that one-game, buggy thing?
What I dislike about Mantle is that it would give PC users who own AMD hardware a huge advantage, and Nvidia would not be able to compete. Having both Mantle and DirectX would fuck up the GPU market. I own a GCN GPU, but I don't want competition to go away. Nvidia could not compete with this because they have no standing in the new generation of consoles. So like everyone else is saying, if developers decide to use Mantle, they would effectively need to develop two different ports for PC to accommodate both Mantle-compatible GPUs (GCN) and Nvidia hardware. Mantle is supposed to make development more efficient, but it actually makes it harder by making the process of porting to PC more complicated.
It could allow games to run natively on Steam OS. That is if Valve decides to work with AMD to implement Mantle into Steam OS. Valve then would benefit from the porting process because developers might decide to port to Steam OS while they are in the process of porting to Windows.
For starters, I want to reiterate that I actually like, even prefer, Nvidia drivers, but hate their CEO. Second, it was massively stupid, regardless of current positioning, to let AMD win both the Xbox and PlayStation contracts, and I hope they suffer the maximum damage. I simply have to support the smarter party.
"Consequently while Mantle is good for AMD users, is Mantle good for NVIDIA and Intel users? Do developers start splitting their limited resources between Mantle and Direct3D, spending less time and resources on their Direct3D rendering paths as a result?"
That was a very unintelligible and unneeded statement. Mantle exists so that PlayStation and Xbox game developers can keep their Mantle code while they do as they normally do: program against the Direct3D API. Anand, you even said this yourself earlier. And the whole point is to give AMD an advantage, and gamers and power users can't complain. You failed to absorb the "What's New This Time Around with the Low Level API" abstract. The difference is AMD owns the biggest two console players, and developers know they need to keep developing in Direct3D but can keep their Mantle code.
I for one support mantle and validate its eternal existence.
No, Mantle is developed for the GCN architecture only (found on the 7000 series and above). This is clearly bad news for Nvidia and they should be worried. It's gonna become a whole lot easier for developers to port their games from next-gen consoles to Mantle than to other APIs.
I, for one, miss Glide---yeah, you were locked into one chip manufacturer, but man, you were pretty much guaranteed that the game you ran on that API would run properly, and as the developer intended---simply because the driver the developers wrote to could be highly optimized. While D3D/OGL allows choice in chips, it also increases the little bugs and such and not optimized performance across competing chipsets (usually AMD vs Nvidia, although it has been the reverse at times).
Still, even if 3dfx had not died, they too were moving towards D3D optimization before Nvidia ended their existence, and Glide would have died out anyways.
Wouldn't one expect Gabe Newell and Valve to have their mouths watering over this? I mean, their Steam Machines are Linux-based, and couldn't Mantle be the software to outdo DX11, with their Linux build being much more gamer-friendly than Windows ever was?
Question is of course, what Sony and Microsoft would think of AMD helping Valve to be direct competition for the consoles. Microsoft sure as hell won't like it if Steammachines end up with Linux and Mantle :)
The impression I get is that AMD is very much not committed to SteamOS... Android... and just about anything L-related. Microsoft shouldn't have a hard time keeping AMD in the fold.
It'll probably be the same low-level API provided by Sony and MS. GCN is GCN, pretty much like a 68000 was a 68000 on an Amiga, ST or Mac. I used to take 68000 assembler done on an Amiga over to a Mac at uni. If it resolves draw call performance on PCs and allows games like the GTA series to really use our GPUs, kudos to AMD. I believe where low-level code is being used on the consoles, it's being ported across for AMD GCN video cards. We still need MS to address draw call performance at the high level, though.
Sounds like a horrible 90s idea. A stupid, horrible idea. This is vendor lock-in (and good luck with drivers) and lock-in to the current architecture.
Should NVidia get a significantly better arch with Maxwell or even Volta, AMD won't be able to react.
And PCs are not consoles; keep console crap out of it and forget this ever existed. (It didn't even prove its own reason to exist against current solutions like DX11.)
In the Windows world Direct3D may be a high-level API, but on GNU/Linux OpenGL isn't a high-level API, more so when you have things like LLVM to compile and optimize shaders for the GPU architecture. After all, with Mantle you are still using an API. Nobody would program in GPU assembler in a large project, with so many registers and long instructions.
Huh? OpenGL is a very high level API. In fact, it is a general purpose graphics API that is completely abstracted from the hardware. Yes, Mantle is an API, but its functions have a 1:1 correspondence with a hardware function, in the same manner that the _InterlockedExchange() intrinsic function maps to the xchg instruction on x86 hardware. It is mostly like intrinsics, although for GCN hardware instead of x86. Register assignment, etc. is handled by the compiler (or the API function itself is coded in assembler).
2. Due to point one, game devs will need to better optimize their code for the hardware.
3. Game development is focused on reaching as many people's devices as possible. Consoles, PCs, smartphones... More platforms = more money. PC exclusives will become less and less common; this is not a bad thing. Even for die-hard PC gamers like myself, I can see the benefit of more hardware-optimized code. Note: I own a GTX 670.
4. -IF- mantle is in both the PS4 and XB1 then game developers that work on AAA titles will take advantage of it. We've already heard that Frostbite 3 engine titles support Mantle.
Frostbite 3 games in development:
Command & Conquer
Battlefield 4
Need for Speed: Rivals
Plants vs. Zombies: Garden Warfare
Dragon Age: Inquisition
Mirror's Edge 2
Star Wars: Battlefront
Untitled Mass Effect game - Bioware
5. Even if Mantle is a total failure, pretty much all future console games will likely be on PC because porting is ridiculously easy now thanks to x86. If you look at the launch titles for the XB1 and PS4... even the "exclusive" titles are on PC too.
Anyone who thinks AMD has "won" is crazy and a complete fanboy. No one has "won" anything. I've flip flopped between AMD and Nvidia as each one takes the upper hand, and I will continue to do so. Any gains AMD makes here, Nvidia will make up in some other area. Then AMD will catch up to that, and on and on we go. The amount of religious zealotry and fanboyism in this thread is disturbing: Video cards are not a religion. They are a tool. You use the best one available to you at the price point you can afford, regardless of the name badge on it.
Where is the open-source part? I also don't understand why OpenGL/DX can't include something like Mantle and the NVidia/Intel competitors under its umbrella with some porting tools. Glide had several wrappers...
How can you negate the negative parts of things like Mantle?
Why are so many against this? AMD took the initiative to develop this from a request by PC game developers. Even if 3/4 can't use this technology, it's an indication of how big or small the benefits are. I read somewhere that MS has given it full acceptance.
Don't ask how I got to this conclusion. But it seems that Microsoft is going to buy nVidia (starting with) in the near future and start building/designing their own chips.
Why would Microsoft do that? People purchasing and developing on their ecosystem rely on the x86 architecture, and Nvidia doesn't have an x86 license. On top of that, it would probably alienate Intel, and you don't want to alienate Intel if most of your business is based on x86. Such a purchase would most likely start a push for Linux among OEMs and software developers.
They don't need to push Linux; doesn't Apple do so?? They now own Nokia, and they started making their own tablets. Windows Phone and Windows RT should be enough to soak up all of nVidia's Tegra production. Tegra might as well be proprietary, since it really failed on Android.
Microsoft is going to leave x86 alone (for both Intel and AMD). It's a win-win strategy for them. But Windows on ARM (RT/Phone) needs to be really tightly integrated with hardware (and more cheaply so) to compete effectively with "flagship" Android tablets/smartphones, the iPad/iPhone, and most importantly, cheap Androids that are powered by CHEAP and relatively powerful chinese SoCs. Microsoft now owns Nokia (as I said above), which means that they're going for most of the mobile business. If Microsoft wants to further pursue a "profitable" hardware side of business, it's essential (and common sense) they buy the ARM chip division of a chip maker, or at least start designing their own SoCs (like Apple does). We all know that the most (or more appropriately, the ONLY) profitable mobile OEMs are those who design/fab their own SoCs (Apple and Samsung). And with how things are going, this is how I look at it:
- Qualcomm is the indisputable king of ARM SoCs. nVidia simply cannot compete at this time.
- Intel is the indisputable king of x86/64 CPUs; AMD is second best (and the only competitor). nVidia doesn't and probably won't ever compete in that market.
- AMD is already giving nVidia a hard time in the GPU business. AMD is likely to gain even more ground with ^that low level API, which will make things that much harder on nVidia in their "only" successful line of business, especially now that they've lost the console war.
This is all bad news for nVidia. They now only "appeal" to the very few enthusiasts who look for marginal gains, while the majority of the market finds AMD GPUs more appealing. Look how sharply they lowered the prices of their GPUs in the past couple of years. Their profits aren't nearly what they used to be. They knew that already, and that's why they went forward with their ARM division, which, like I said, isn't doing so well anymore.
I believe it's all a prelude to lowering the market share of Tegra chips, weakening their discrete GPU market, and ultimately buying them out (or at least their ARM division). Typical business/buyout practice. We've seen this already with Nokia. And Icera's tech is appealing for LTE, you know.
In 2-3 years, Nokia Windows Phones, and Surfaces are only going to be powered by nVidia SoCs with Icera LTE modems, and are all a part of Microsoft. That's the only way Microsoft might be profitable in a business where a comprehensive hardware, software, and services ecosystem is essential for success.
Despite all the valid caveats, I'm positively giddy right now.
By Occam's razor and just general bloody common sense the console connection is absolutely inevitable. Ever since they announced the new console specs, I've been preaching left and right how awesome they are. Not because I'm a fanboy mind you(if anything you could put me in the hater camp, never owned one, probably never will), but because of the impact they will have on PC gaming and the industry in general. Case in point right here.
Maybe even, just maybe with that very same connection the industry will sit down on their collective asses and standardize some of the stuff. One can always dream...
Still, back to the real world. Wait and see we must. I hope it turns out the silver bullet we all want it to be.
Silly to say AMD won't confirm this and pretend you are cleverly analyzing all of this when in fact it is all said at the damn AMD conference in the DICE presentation...
I find it deliciously amusing that all these AMD fanboys are sitting here all quivering in their little space boots, when if they were smart enough to see the forest for the friggin trees they would realize that having one player controlling the entire game IS NOT GOOD FOR THE CONSUMER. In the very same breath they will slam MS or Nvidia for monopolistic practices, and then talk about how great it would be for AMD to smash nvidia and MS into the ground.
There is no reason you couldn't take advantage of both the low-level performance and high-level convenience. All you would need to do is compile your game down to some kind of intermediate code representation and have the Nvidia/AMD compiler translate that IR to native code on the machine it is run on.
Yes, you would need to keep most of the program in IR since many optimizations will probably require knowledge of the program making the calls.
itchytimes - Thursday, September 26, 2013 - link
Amazing move AMD. They are going for gold. Game over nvidia...
nathanddrews - Thursday, September 26, 2013 - link
I, for one, welcome our new AMD overlords.
I've got to give AMD credit here. They've been struggling in the desktop CPU space (Intel), aren't competitive in the mobile CPU space (Intel), didn't even show up to the tablet/smartphone space (Samsung, Qualcomm, NVIDIA, Intel), in addition to losing a number of design wins against NVIDIA for GPU superiority. I mean, if their new flagship GPU can't consistently beat Titan (rumored)... ouch. Clearly, being the nice guy on the block doesn't pay off anymore:
http://www.ngohq.com/news/16519-amd-senior-manager...
This move to unite their multi-platform GCN architecture is a good move. I have a mix of NVIDIA and AMD GPUs under my roof (mostly NVIDIA), so I'm not completely excited about it if it means Mantle-enabled games don't play well on my non-GCN hardware. However, I assume the measurable, real-world performance will be a lot more like TWIMTBP and less like Glide for the first couple years.
phoenix_rizzen - Thursday, September 26, 2013 - link
"didn't even show up to the tablet/smartphone space (Samsung, Qualcomm, NVIDIA, Intel),"
Note: Adreno GPU technology in Qualcomm SoC's originally came from ATi. They sold the division to Qualcomm. IOW, some of the best mobile GPU tech originated with ATi/AMD. :)
toyotabedzrock - Thursday, September 26, 2013 - link
Notice they mentioned 1 watt applications yesterday. Which means they are going after mobile again soon.
Brutalizer - Thursday, September 26, 2013 - link
A question: could the DirectX driver be rewritten to use Mantle instead? And the OpenGL driver be rewritten to use Mantle?
inighthawki - Thursday, September 26, 2013 - link
DX and OpenGL are too high level to expose some of the features that are likely exposed by Mantle, but that doesn't mean they cannot benefit from some of them. Someone below mentioned how Mantle is probably just a "scene graph". Very unlikely. More likely, the enhanced draw calls they are referring to are the same thing Nvidia presented a few years ago via OpenGL extensions - bindless graphics. With enough work, the API can then also start treating GPU memory as pointers instead of data bound by a set of registers, which is where OpenGL and DX mostly start breaking down and cannot go today. Direct memory access can have huge benefits.
Brutalizer - Friday, September 27, 2013 - link
Ok, thanx. Another question: does not Mantle kill off all game consoles? Given the same hardware, the console will always be faster, because there is less bloat on the console; you program directly to the metal there. With Mantle, you program directly to the metal on the PS4 and Xbone. But when Mantle is ported to PCs, you also program directly to the metal on the PC, which means the consoles will have no advantage anymore. Thus, Mantle will equalize PCs and consoles with respect to bloat, and consoles will fall behind even more. And according to rumours, both Sony and MS have said this console generation will be the last.
An AMD rep recently said in an interview, "our new graphics card will ridicule the NVIDIA Titan in BF4 when running Mantle, it will be much faster". As the new AMD graphics card will target the GTX 780 but run faster than a Titan with Mantle, AMD will give very good value for the money. This is a very interesting move by AMD. As Mantle is open, anyone can join.
bji - Friday, September 27, 2013 - link
Mantle does not kill consoles because PCs generally don't compete with consoles. Consoles go in the living room, PCs generally do not.
gustavo221b - Friday, March 14, 2014 - link
Steam Machines (I'm from the future :P)
inighthawki - Friday, September 27, 2013 - link
Consoles will continue to take advantage of a handful of specific hardware optimizations that software cannot replicate. Other things, like having a unified system and video memory pool, are an incredible value. Today, video memory is special, and memory has to be paged in and out of a system-memory-backed section by the OS. In most simple cases this is not an issue, as there is a one-time cost to upload the data and make it resident on the GPU, but it makes it incredibly expensive to read back results from the GPU, one of the reasons that makes compute work on the GPU still pretty difficult today. Games that use compute typically have to either ensure that the results of the calculations are kept on the GPU in a format that can be used by other stages of their processing, or that the mass of the work is done in bulk and read back in a single operation.
For the most part, though, this is where Mantle is moving: low-level access down to the hardware. It will make it far more difficult for the average developer to use. Consider the difficulty some people already have today with DX and OpenGL, and realize that on Windows the DX kernel does tons and tons of stuff for you behind the scenes - complicated operations that even some experienced graphics developers don't know exist.
Flunk - Monday, September 30, 2013 - link
Unless gaming PCs come down to <$400, consoles are going to be safe, at least for a little while.
squngy - Saturday, January 11, 2014 - link
The console will still be cheaper (for the same hardware).
Jaybus - Friday, September 27, 2013 - link
Wait. I think the question is whether DX and OpenGL drivers could be written to use Mantle rather than direct hardware access. The answer is yes. Mantle is providing a virtual GPU that is a direct 1:1 mapping to hardware functions. It might not be quite as fast, but it should be pretty close, because it wouldn't need to make any more mode switches than it would for direct hardware access. The benefit would be in driver development. Once a particular OpenGL or D3D version was implemented using Mantle, it would work for any AMD GCN hardware well into the future. It would give AMD a set of stable, optimized D3D and OpenGL drivers that would work with any future GCN devices for quite some time and require far less code maintenance.
Flunk - Monday, September 30, 2013 - link
Because we don't have the inside information, it's fully possible that this is how their drivers have always worked; they're just exposing the middle layer now.
Samus - Thursday, September 26, 2013 - link
A wrapper to convert a low-level API is too slow. Back in the day, using Glide wrappers on nVidia/ATI hardware was hella terrible.
wumpus - Friday, September 27, 2013 - link
The problem with Glide wrappers was mostly that 3dfx hardware was that much better than anything else. Of course, by the time nvidia bought 3dfx, they really should have shipped the wrapper (previously banned by 3dfx), as by then anything shipping could outrun a Voodoo. The other big problem was the general kludge of DirectX (1-4) specifically and MS software in general.

It also depends on the coding and the wrapper. QuakeGL was pretty much a high-level wrapper around low-level calls, but it was coded to the wrapper. Using a dumb wrapper (and any complete wrapper has to act dumb to be correct in general use) with software that was written with low-level hardware in mind is a bad idea.
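The wrapper idea being debated can be made concrete with a toy sketch (illustrative Python; the device and command names are invented, not any real API). A wrapper translates each high-level call into a fixed low-level command sequence; when the two interfaces map closely, as QuakeGL's restricted subset did onto Glide, the translation stays cheap, and when they don't, the wrapper has to emit extra emulation work.

```python
# Toy sketch of a high-level-to-low-level wrapper. "LowLevelDevice" stands in
# for a thin hardware-like interface; command names are invented for illustration.
class LowLevelDevice:
    def __init__(self):
        self.commands = []

    def submit(self, cmd):
        self.commands.append(cmd)

class GLLikeWrapper:
    """Translates each high-level call into a fixed low-level sequence."""
    def __init__(self, device):
        self.device = device

    def draw_triangles(self, vertex_count):
        # A close 1:1 mapping adds little overhead; a mismatched mapping would
        # need extra emulation commands here, which is where wrappers get slow.
        self.device.submit(("BIND_STATE", "triangles"))
        self.device.submit(("DRAW", vertex_count))

dev = LowLevelDevice()
gl = GLLikeWrapper(dev)
gl.draw_triangles(3)
print(dev.commands)  # two low-level commands for one high-level call
```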
Gigaplex - Friday, September 27, 2013 - link
The DirectX driver currently talks directly to the hardware; there's no point putting Mantle in as middleware.
bobbozzo - Friday, September 27, 2013 - link
I think it's: the DirectX API layer (MS) talks to the video driver (ATI, Intel, NV), which then talks to the hardware.
inighthawki - Friday, September 27, 2013 - link
The call order is: D3D > User Mode Driver > D3D > DX Kernel > Kernel Mode Driver > Hardware
Klimax - Friday, September 27, 2013 - link
http://msdn.microsoft.com/en-us/library/windows/ha...
As documented by Microsoft. The reason for the split is that there is a barrier between user and kernel mode, and crossing it is often the most expensive part. (And that's one thing AMD won't be able to avoid.)
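That barrier is a fixed per-submission overhead, which is why drivers batch commands before crossing it. A toy model (illustrative Python; the costs are invented, not measured) shows the amortization:

```python
import math

# Toy cost model of the user/kernel-mode barrier: every kernel submission pays
# a fixed transition cost, so batching commands amortizes it. Made-up numbers.
TRANSITION_COST = 50    # hypothetical cost of one user->kernel switch
PER_COMMAND_COST = 1    # hypothetical cost of processing one command

def cost_unbatched(n_commands):
    # one kernel transition per command
    return n_commands * (TRANSITION_COST + PER_COMMAND_COST)

def cost_batched(n_commands, batch_size):
    # one kernel transition per batch of commands
    batches = math.ceil(n_commands / batch_size)
    return batches * TRANSITION_COST + n_commands * PER_COMMAND_COST

print(cost_unbatched(1000))     # 51000: a thousand transitions
print(cost_batched(1000, 100))  # 1500: ten transitions instead of a thousand
```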
wumpus - Friday, September 27, 2013 - link
No. They are almost certainly completely different APIs that do different things. On the other hand, if you wanted to write your own OpenGL (or Wayland) drivers using Mantle, it should certainly be far easier (but the results will be slower than what AMD could do by writing directly to the hardware) than writing them for nvidia (without any API or other information). Linux users are salivating at the thought.

On second thought, I suspect that AMD will do just that (write DirectX and OpenGL in Mantle). Hopefully it will make debugging a little easier. Just make sure that you look at both Mantle and non-Mantle benchmarks when comparing cards (because DirectX will suffer slightly if they do this).
inighthawki - Friday, September 27, 2013 - link
Writing OpenGL on top of Mantle will not give a performance advantage; that makes no sense, otherwise OpenGL would just have been implemented that way to begin with. OpenGL and DX drivers must obey a specific set of interfacing rules. DX, for example, has separate user-mode and kernel-mode drivers that must interact in a specific way. Bypassing these is not an option. Same goes for OpenGL, which is a standardized interface on top of the OS. The best you could do is expose a few extensions to call into Mantle.
ddriver - Thursday, September 26, 2013 - link
There is nothing really preventing nvidia from doing the same. You can already customize the pipeline significantly, since it is pretty much entirely programmable; even if you lack some features implemented by the driver, you can still write your own code and get hardware-accelerated custom render features.

I suspect Mantle is nothing more than a "scenegraph" with out-of-the-box optimizations for the drawing calls to minimize the need for context switching and data transfer, plus a bunch of pre-compiled, ready-to-use compute kernels.
This alone is more than enough to make up the numbers AMD claims: the difference between drawing arbitrary stuff in sequential order and drawing selectively, in a special order, employing instancing and offloading everything possible to the compute shaders, will easily result in 9x if not significantly more rendering throughput. The problem is that if you want efficient drawing, you have to implement it yourself for your engine, while AMD offers to have it done for you, but only if you use an API that works solely with their products. I don't think AMD even went for a hardware implementation of the scenegraph, which would have yielded more improvement, so most likely it is just a bit of code that could just as well be hooked to OpenGL and work for code that runs on every platform. So while a good effort, Mantle comes with a less-than-optimally-positive intent. It does way more good for AMD than for developers, and even does some harm, because besides competition it is also fragmentation, forcing extra work on developers.

What is good for people are open and portable standards, so if AMD really wanted to make it good for users, it would simply make a good graphics API with optimizations for OpenGL that still markets AMD but works with non-AMD products too. This would give OpenGL the healthy boost it needs, since it is capable and low-level enough and perfectly extendable with hardware-accelerated features, and unlike Direct3D it works everywhere, not only on M$$$ platforms. It was a heck of a b1tch move from M$ to disable OpenGL in Windows Phone, even though all the chips come with hardware support for it, once again creating more work for developers just for the sake of a platform-vendor-limited API, which is BAD if it doesn't conform to an open standard.
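The draw-call reordering described above (drawing "selectively, in a special order" instead of sequentially) can be illustrated with a toy state-change counter; the materials and the cost model are invented for the example, but the principle is the standard one of sorting draws to collapse redundant state switches.

```python
# Sketch of draw-call reordering: submitting draws in arbitrary order forces a
# state change whenever the material differs from the previous draw, while
# sorting by material first collapses most of the switches.
def count_state_changes(draws):
    """Count material (state) switches needed to submit draws in given order."""
    changes, current = 0, None
    for material, _mesh in draws:
        if material != current:
            changes += 1
            current = material
    return changes

# (material, mesh_id) pairs, interleaved as a naive engine might submit them
draws = [("metal", 1), ("wood", 2), ("metal", 3), ("wood", 4), ("metal", 5)]
print(count_state_changes(draws))          # 5: a switch on every draw
print(count_state_changes(sorted(draws)))  # 2: one run per material
```

Real engines sort on a full state key (shader, textures, blend state), but the payoff is the same: fewer expensive switches for identical output.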
itchytimes - Thursday, September 26, 2013 - link
'There is nothing really preventing nvidia from doing the same'

Except the fact that nvidia are in none of the new consoles, while AMD hardware is powering the PS4 and Xbox One. Both of which will have the ability to use AMD's Mantle, with game developers already making use of it when creating games for the consoles. Making it a no-brainer to use it for carrying the game over to PC with the Mantle API available there as well.
It's ingenious.
chizow - Thursday, September 26, 2013 - link
Nvidia already has a close-to-metal API, it's called NVAPI and they've had it for years. It literally spits out basic assembly code (ADD, MUL, MAD, MOV) to run on their GPUs. If they really wanted to do something like Mantle, I'm sure they could dress up CUDA to be the front end instead of DX, but I'm not really sure they'd want to do so. I'm really not sure the PC market wants that (2 vendor-specific gaming APIs?).
itchytimes - Thursday, September 26, 2013 - link
Yes, but it is not nvidia or AMD coding the games, it's the game developers. They'll already be using Mantle for developing console games, so using the implementation for the PC port is work that is already done. There is zero reason to take on the additional work and expense of a PC-only nvidia low-level API.
Nvidia is dead in the water with any attempt at their own version of Mantle.
chizow - Thursday, September 26, 2013 - link
Sorry, I doubt that is the case. The developer will need to compile a new runtime that uses this new API, but AMD will most likely still need to provide some high-level compiler or tools that take OpenGL, C, or whatever high-level language they are using for source code and convert it to AMD's close-to-metal Mantle machine code.

There is no way devs are going to take on the additional burden of hand-coding machine code for Mantle, that's a fact.
PhoenixEnigma - Thursday, September 26, 2013 - link
The idea is that instead of writing something in OpenGL, developers write it in Mantle, which is harder but gets better performance. They're already doing this for the consoles, where they need to eke out every last drop of performance, and AMD is already making developer tools to support this there. AMD is now giving them a way to take these more hand-tuned routines and port them to PC, bypassing OpenGL/DX. There's certainly some work involved, but it's not all new code: it lets devs reuse high-quality code they've already written on a new platform, where before they were limited to DX and a performance hit.

For a PC-only title it's a much harder sell, granted, since there still needs to be a DX or OpenGL codepath, but for a console port it's at least an interesting option.
chizow - Thursday, September 26, 2013 - link
Not sure where you all get the idea of developers writing code in Mantle, since we already know devs are writing code in OpenGL (PS4) or DirectX (XB1) or both (PC) as the source code. The only thing Mantle does is allow the source code to be compiled and executed as machine code at runtime, thus stripping out the middle API (DX, OpenGL).

So instead of: Source code (DX, OGL) > Compiler > Platform-specific HLSL (DX, XNA, OGL) > Hardware Driver/Compiler > Machine Code > GPU
You get: Source code > Mantle Compiler > Machine Code > GPU
The 4th slide that talks about "simplified porting" touches on this.
If devs actually had to code a low-level API... you're going to have a lot of devs who are going to need to brush up on their matrix math. Drawing a simple diagonal line in 3D space, for example, goes from 1 instruction in HLSL to at least 3-4 in machine code. There's simply no way any dev is going to undertake that task; maybe Carmack, but he takes that kind of challenge on for fun more than efficiency.
mfergus - Friday, September 27, 2013 - link
Except for maybe indie games, almost all games are low level on consoles. OpenGL will not be the common API used on the PS4. The PS3 also sorta supports OpenGL, but nobody uses it. The 360 also sorta supports DirectX, but it allows you to go much lower than PC DirectX. Low-level APIs have been and will always be common on consoles.
chizow - Saturday, September 28, 2013 - link
Sorry, again highly unlikely. The big AAA multi-platform titles are more likely to be written in OGL/DX for the simple reason of portability. They may be compiled to run at low level on the consoles, but they are, without a doubt, created in HLSL source code and then ported/compiled to each target platform.

Even massive AAA platform exclusives like "The Last of Us" are fully developed on the PC:
http://www.itsartmag.com/features/making-of-the-la...
mfergus - Tuesday, October 1, 2013 - link
Naughty Dog games aren't a very good example of portability. They've done a lot of low-level work on the PS3. They made and use the low-level Edge engine. They develop on PCs, but they do about as many PS3-specific optimizations as possible. The common multi-platform engines like Unreal still have plenty of console-specific bits.
chizow - Wednesday, October 2, 2013 - link
Did you not look at the video and pictures? The Last of Us is clearly being developed on the PC, written in C++ and OpenGL most likely, just like most every major game; then it is ported to whatever PS3-specific code they use.
mfergus - Thursday, October 3, 2013 - link
I never disputed that they develop on PC, not sure why you had to repeat it. They have a version that works fine on PC hardware, but they definitely do a lot of work porting it to get good performance out of the PS3. They made Edge and were the first to do deferred SPU lighting.

I don't think most devs will do much low-level work on the newer consoles, but Naughty Dog will be one of those who do, and they'll make low-level PS4-specific tools.
mfergus - Thursday, October 3, 2013 - link
I agree with basically everything you're saying though; devs will not write in Mantle first and foremost.
mikato - Monday, September 30, 2013 - link
I don't know, I agree with itchy. What about game engine developers? They might take that on, especially since they know they have a couple of consoles to work on that will be around a long time.
Samus - Thursday, September 26, 2013 - link
PhysX is a low-level API, too.
blanarahul - Thursday, September 26, 2013 - link
"Used instead of DX11 on compatible Radeon GPUs"My jaw dropped when I read this statement!
toyotabedzrock - Thursday, September 26, 2013 - link
CUDA...
Klimax - Friday, September 27, 2013 - link
+NVAPI
ddriver - Thursday, September 26, 2013 - link
I mean, even a good old vertex or fragment shader allows for explicit access to data and instructions on what to do with it, and it all runs in hardware. Any lower access would mean assembly code. Generic OpenGL allows for low-enough-level access to anything the hardware is capable of computationally; in fact, enough to simply implement an optimized graphics API on top of OpenGL and still get maximized rendering throughput, making Mantle entirely possible as a mere software implementation.

I suspect AMD are marketing it as some low-level magic feature in a hopeless attempt to misguide the competition and send them into doing something crazy, when in reality what AMD are doing is giving a software shoulder to developers who commit to developing against something that only runs on AMD hardware. It is just AMD, having lost its fabs and being even less competitive with Intel, trying to make it up using this apparently new thing to them called "software". Too bad the competition can do that too, so AMD will only have a brief window to exploit that advantage until others begin to "pollute" that niche, until hopefully someone releases something open and portable that doesn't cost 1000$ per target device platform. That is the big problem of OpenGL: no big players pour money into it like MS does into their APIs, which aren't really any better; it is just that companies like intel, m$ or nvidia are only willing to "lend" you help in the form of a vendor loyalty pledge, but not give any effort to anything that can run everywhere. That is why Direct3D is pretty much dominating gaming: M$ paid quite a lot to make using their API easier, and at the same time moved to make OpenGL development harder, and at this point even disabled on platforms that come with it. So technically, D3D is better directly because of m$ being greedy evil monopolists.
And since linux fails at anything but running the most critical and large-scale supercomputers of the world, most PCs came with Windows, which enabled m$ to erect (also figuratively) the pillar of D3D when another open API was already present and in need of extra development. If M$ were at least a tiny bit decent, they'd have poured all the effort of creating and promoting D3D into OpenGL; it would be a much easier-to-use API today. I don't say better, because it already comes close enough to the hardware, but it could have come earlier if M$ had allowed developers to breach the barrier that is developing exclusively for Micro$oft.
Some people say we should thank companies like MS, Intel, AMD and Nvidia, but in reality it is only those companies that should thank us for our money. Each of them has done it, and the bigger the fist, the more shameless the exploits: helping developers exclusively in ways that egoistically benefit only its own product range, while making it harder, and for a long time almost impossible, for developers to be vendor independent, which is a basic consumer right. The little they do to make developing exclusively for their products easier is tiny compared to the damage and delay they caused to the IT revolution, the result of which obviously works in favor of the industry: ridiculously powerful devices infest people's lives, not as the powerful tools for creativity and productivity they are, but as useless toys, good only for milking people out of their money and privacy. This is where we get pioneers like the inventor of the rounded corner, Apple, a company which struggled to make a profit on high-quality HPC workstations but made it to the top selling ego-boosting trinkets at ridiculous profit margins.
So, while some people see AMD doing good to some people, it is effectively just doing the usual kind of bad companies like that do all the time. Hurray for fragmentation and proprietary standards.
purerice - Thursday, September 26, 2013 - link
At the end you actually spelled MS without $. What happened, did the chip temporarily fall off your shoulder or something?

Take a chill pill dude. AMD and Nvidia are neck and neck. All this will do is potentially help developers better optimize for AMD... IF THEY CHOOSE TO DO SO! It will still be CHEAPER and FASTER for PC companies to write as they have been. This only HELPS console developers who don't already port to PC.
Man, you have to be a real pessimist to find something that negative about this story.
ddriver - Thursday, September 26, 2013 - link
I decided to leave one in to avoid ambiguity. And this has nothing to do with pessimism; it is more like you see all butterflies and candy where in reality lie sneaky corporate extortion tactics. Sorry for having higher standards - that's what you want to hear, right? ;)
bunsHole - Friday, September 27, 2013 - link
You seem like such a tool. Wowwwww...
Eled. - Saturday, November 23, 2013 - link
Yeah, the only thing I could see upon reading this article is: "And yet another proprietary, platform-locked system."

At a time when OpenGL and OpenCL seem to make slow progress in their respective domains, here comes another new proprietary standard that might, again, postpone the development and wider adoption of open and multi-platform technologies. I'm not sure anyone should feel rejoiced reading that.
Scabies - Thursday, September 26, 2013 - link
Playing the "I hate Microsoft" card brings another important piece into the picture: the new SteamOS and Steam Machines.boyluck04 - Thursday, September 26, 2013 - link
Yeah, i like it.
toyotabedzrock - Thursday, September 26, 2013 - link
It will work on all of one card if the 280 and below are not the new GCN.
Tams80 - Thursday, September 26, 2013 - link
It certainly explains why AMD were prepared to go for apparently such low margins on the consoles.

I wonder if Nvidia saw this coming?
krumme - Thursday, September 26, 2013 - link
You have to remember who said it was low margins: Nvidia.
Did they see this coming? Probably yes. But they didn't have an x86 CPU.
Did AMD know NV didn't have an x86 CPU? Well, yes.
Is AMD in a situation where they can do anything other than earn money for new products?
No. They don't have any options.
It's good profit for consoles. It's good profit for GPUs.
It did take some years to get here.
mikato - Monday, September 30, 2013 - link
Hey, good common sense :)
wumpus - Friday, September 27, 2013 - link
Did you notice that even according to their propaganda slides, this is only going to improve things by about 10%?

I think Microsoft has more to worry about than nvidia. The basic assumption (a huge problem: if it is so good for marketing, then why didn't they come out and say it?) is that you need only code for:
Windows (with Mantle)
Xbone (near-exact Mantle interface)
PS4 (near-exact Mantle interface: hey, coding to the metal is going to look similar with the same metal.)
SteamBox (soon to arrive mantle interface, at least if it ships with AMD)
Yes, while this isn't good for nvidia (you get a port, while AMD users get the native code that developers sweated for xbone and PS4 consoles) it certainly takes a lot of MS's control of the desktop & living room away from them.
The flip side of this is that I'm much less sure that anyone other than AAAA-game developers will be using this. This means that anything pre-Mantle and remotely indie will be using the directX drivers. Do you really think that AMD (especially considering its financial situation) will be really sweating the non-mantle drivers like they need to?
ingwe - Thursday, September 26, 2013 - link
This is really fascinating. I am not sure this will work out for AMD. If it does though, they should make up a lot of ground in the market (I would think).
DanNeely - Thursday, September 26, 2013 - link
The main limitation they'll face here is that it's apparently coupled closely with GCN, and GPU cores change design much more frequently than consoles do. GCN's already been around for a while; under normal circumstances I'd expect it to be replaced in another year or two, but if successful, Mantle would highly limit their ability to do so for the remainder of the decade. This gives nVidia and even Intel plenty of time to push something that needs a major change and torpedo AMD.
gdansk - Thursday, September 26, 2013 - link
They'd likely have to re-implement parts of OpenGL and Direct3D on any sufficiently different new architecture anyway, right? If they go about implementing OpenGL and Direct3D on top of Mantle, it could create less overall work: they'd only have to implement Mantle for each radically different new architecture.
klagermkii - Thursday, September 26, 2013 - link
What the hell, Microsoft? Have you lost so much control of the direction of your platform that an alternate 3D API like Mantle can even be considered? You brought all the GPU makers in line with DirectX, cementing that as a primary development target, and now the situation has deteriorated to this. AMD powering PS4, Xbox One and PCs... give them a common AMD-owned API and THEY become a platform target instead. So much time spent chasing the iPad and Android and all this Metro/Modern nonsense, and not defending their desktop foundation.

Five years ago any hope for Steam OS and the idea of Glide 2013 Edition would have sounded insane. Now someone could write games for Mantle that would run faster and could potentially be easily ported over to alternate operating systems like Linux if/when that gets Mantle.
I'm not even a huge fan of Microsoft, but dammit, get up and fight for your standardised platform!
TwiSparkle - Thursday, September 26, 2013 - link
There are many advantages to DirectX, portability being a major one. The thing is that, as stated in the article, DirectX comes with a lot of overhead. You can get more performance out of a lower-level API. There is only so much that Microsoft can do to limit overhead while still allowing portability of code.
DanNeely - Thursday, September 26, 2013 - link
I'm not sure what you expect them to do? MS banning AMD from offering anything beyond D3D in its drivers would be a huge anti-trust mess.
wumpus - Friday, September 27, 2013 - link
And when did that ever stop Bill Gates (although I think that he had to decide between retiring and eventually losing to the Justice Department, so he retired with MS intact)?
mikato - Monday, September 30, 2013 - link
And I figured he was just required to keep Ballmer.
B3an - Thursday, September 26, 2013 - link
Err, what do you expect MS to do here? DirectX is a high-level API, and it needs to be in order to work with all the different GPUs for PC. MS couldn't make something to compete with Mantle, as it is a low-level API specifically designed for AMD's GCN architecture.

Neither DX nor OpenGL could ever compete with a low-level graphics API when it comes to performance and getting the most out of a GPU. Even if games use Mantle, they will also HAVE to continue having DX support as well, in order for them to work on Nvidia and Intel GPUs.
MS already made improvements in Win8 to the desktop, including new DX versions in both 8 and 8.1, with 8.1 having the same DX 11.2 version as the Xbox One. So don't mention that typical uninformed BS about MS focusing just on Metro, because Win8 has more desktop improvements over Win7 than 7 had over Vista.
klagermkii - Thursday, September 26, 2013 - link
There have been some improvements to the desktop in Windows 8, but it's clearly being treated as a legacy environment and Microsoft sees the future as being Modern/Metro. That's why we have Surface, why all of their platforms are taking their interface cues from Modern/Metro (X1/WP/W8). That's where the focus is. The traditional PC desktop is no longer seen as a growth market.

"Neither DX or OpenGL could ever compete with a low level graphics API when it comes to performance and getting the most out of a GPU."
Sure, and that's always been the case. So how did interoperability win out before, when performance would have been more important when it was older, weaker hardware?
"Even if games use Mantle, they will also HAVE to continue having DX support as well, in order for them to work on Nvidia and Intel GPU's."
Yeah, so you have Mantle and DX, one of those two ends up being the primary platform it gets developed towards and it gets ported to the other. If you played Glide and DirectX games you remember how rubbish the DX versions were. I remember Wing Commander: Prophecy when I sold my Voodoo 1 to get a TNT2, and it looked terrible afterwards. Part of that is how immature DX was, but also the developer is only going to go to put in so much work to support a less popular platform.
"Err what do you expect MS to do here?"
What I expect is for them to have developers for the Xbox One write to DirectX rather than writing to Mantle just like they have for all their previous consoles. It allows for easier ports to standard Windows and gives DirectX traction over Mantle. They certainly have control over that as part of their certification process, but from the tone of this article it sounds like they're making Mantle a core API on the console itself.
You say developers would "HAVE to continue having DX support", but that DirectX support could get pretty poor. Sure, of the total game-playing market on PC, AMD and NVIDIA may be 30% of the market each, with Intel being the other 40% (thumb-suck numbers), and based on that the developer would split their resources between DX and Mantle. But if they're targeting Xbox One, PS4 and PC, the non-AMD portion of that starts looking very small, and the amount of work they do on the DX path could correspondingly suffer.
mccarneypt - Thursday, September 26, 2013 - link
"have developers for the Xbox One write to DirectX rather than writing to Mantle"Any other issues aside, I doubt Microsoft would want to hand Sony a performance advantage like that. If Sony happily allowed usage of a lower level API with potential performance boosts on their similar hardware, while Microsoft denied it, Sony's exclusive titles would likely begin to outshine Xbone's fairly quickly I would imagine, which would hurt Xbone's perception quite a bit.
Th-z - Thursday, September 26, 2013 - link
Both consoles will have their own low-level APIs, just like current gen, with or without the introduction of Mantle.
wumpus - Friday, September 27, 2013 - link
Except that since "the metal" on both consoles looks astonishingly similar to each other and to shipping AMD GPUs, any API that "writes to the metal" will be astonishingly close.

This part isn't the same as current consoles.
Trefugl - Thursday, September 26, 2013 - link
I get the impression that they DIDN'T write all of the code in DX on previous consoles. They always had to write some in low-level APIs (similar to Mantle) in order to get the most performance out of a fixed set of hardware. How else would games developed at the end of the console's life look better? The difference is, this time it's given a name and talked about with the public, and of course it (or something similar) will be applied to desktop PCs.

PS - I would have quoted you, but the website's Po.st made it so I couldn't copy your text on my phone...
dustructo77 - Thursday, September 26, 2013 - link
I AGREE 100% with you. Anyone who says Win8 or 8.1 isn't a good OS basically doesn't understand the meaning of stability and/or efficiency/utilization.

Win8.1 is what Windows 8 should have been from the get-go. DX11.2 tiled resources....
AMD HUMA....
Look at history and watch as MS champions OS performance again and remains the leading API champion.
wumpus - Friday, September 27, 2013 - link
What is wrong with Windows 8 and 8.1: http://penny-arcade.com/comic/2013/06/28
Well, other than being windows ;)
ShieTar - Thursday, September 26, 2013 - link
Who says Microsoft needs to compete? They will probably implement a way to use Mantle as an alternative to Direct3D, while still being able to use all the other DirectX components (Sound, Input, Network, Fonts, 2D, etc.).
JDG1980 - Thursday, September 26, 2013 - link
I thought that DirectSound, DirectInput, and DirectPlay were deprecated some time ago. Since Vista, you no longer get direct access to the hardware with DirectSound; you now need to use WASAPI to do that.
Flunk - Monday, September 30, 2013 - link
They were deprecated because they were replaced with XInput, XAudio and XACT. That is, except DirectPlay, which has no equivalent (mostly because one was not needed). These APIs were directly ported from the Xbox 360.
HisDivineOrder - Thursday, September 26, 2013 - link
Games could use OpenGL and be easily ported to Linux, too. That's the most ridiculous thing. OpenGL is used for Android, OpenGL is used for Linux, OpenGL is used for the PS4 and PS Vita. Hell, I think OpenGL is used by Nintendo's consoles, too.

OpenGL is already out there for people who want to port between multiple OSes. This API has nothing to do with that. This API has to do with AMD trying to make HSA work without getting cooperation from Intel or nVidia in creating a new API that includes it.
Neither nVidia or Intel has an interest in helping AMD recover any lost ground, so they're pointing a nuke at the entirety of PC gaming and are saying, "Now. We're asking again. Join us... or we'll take the whole thing down with us."
cbrownx88 - Thursday, September 26, 2013 - link
Very interesting metaphor there at the end of your comment :)
BrutalPigeon - Thursday, September 26, 2013 - link
If I were in the Nvidia camp, I would be very worried...
Da W - Thursday, September 26, 2013 - link
This is a WAR.

Windows vs Android vs iOS vs Steam OS.
Direct 3D vs OpenGL (iOS Android and SteamOS) vs Mantle
x86 vs ARM
Console+PC games (AMD) vs mobile+PC games (Nvidia, and not even that strong in mobile)
And don't forget sound!
Either *some* standard will emerge on top, which may not be the classic x86-Windows-Direct3D-Nvidia we used to know, or in this new age of multi-platform there will be a place for everybody. The burden then will be on developers.
I wild guess:
-AMD will capture more PC-GPU market share in the next 5 years than it used to.
Kevin G - Thursday, September 26, 2013 - link
Mantle is cross-platform. I strongly suspect that we'll be seeing it on Linux (and hence Steam OS) as well as OS X. It wouldn't surprise me if Mantle also appeared on the Xbox One and PS4. The only platform that would be left out would be iOS, due to the lack of AMD hardware. Similarly, the x86 version of Android *might* get Mantle, but without AMD graphics in the ARM space, there is little motivation to do an Android port.

While AMD is clearly aiming Mantle at their own hardware (i.e. the Glide comparisons), I haven't read where they'd block out other vendors. (And if they have, please post a link!)
Despoiler - Thursday, September 26, 2013 - link
Mantle is on the XB1 and PS4. That is why AMD is bringing it to PC.
Kevin G - Thursday, September 26, 2013 - link
Did they explicitly state that it was from the PS4 or Xbox One? From the live stream, all I heard is that this is a console-*like* API due to how thin the software layer is. Having said that, DirectX exists on both the PC and Xbox One, but the Direct3D and driver layers are noticeably smaller due to only having one hardware configuration to account for.
Impulses - Thursday, September 26, 2013 - link
Gabe's been talking up the open nature of Steam OS and has even stated they worked with NV on some things (the recent Linux driver announcement might've been spurred by that)... Somehow Mantle doesn't seem like the kind of approach he'd wanna get mixed up in.
wumpus - Friday, September 27, 2013 - link
It might not have been his plan, but if AMD is laying out a widely-used (at least for AAA and higher games) API on a silver platter, I'm sure he will take it.

I suspect that Gabe's take on this (next big thing or just more marketing) will likely be the most accurate one you will hear, especially in the early days.
wumpus - Friday, September 27, 2013 - link
What part of "low-level API" don't you get? Unless you want to design a GCN device, you might as well use DirectX or OpenGL. Using this with any other hardware (unless it is a really careful/lucky fit*) is a bad idea.

* I seem to recall that a large part of the reason that QuakeGL happened was that Glide was extremely similar (for sufficiently low level calls) to OpenGL. Restrict yourself to those calls and it is easy to write a "restricted OpenGL" driver for Glide. Even if you went this way, you would need devs to restrict themselves to only the calls that could be efficiently mapped to your hardware.
wumpus - Friday, September 27, 2013 - link
This will certainly be in AMD's favor. They are likely to get a nice bonus in speed essentially "free" (assuming that it is paid by the sweat of devs writing the game on a console). The other side is to expect SteamBoxes (and boxes bought with SteamOS in mind) to be largely (or entirely) AMD. I might be writing this on a Linux box, but I really don't expect this to be big any time soon (when tablets are up to SteamOS and GCN, expect things to get interesting. Don't know if Moore's Law will scale that far).
LevelUser - Thursday, September 26, 2013 - link
Linux port of Mantle straight into Steam boxes! We might be near not needing Windows?
andrewaggb - Thursday, September 26, 2013 - link
It is a bit exciting, isn't it? I'll probably keep running Windows for a long time yet, but if you can get substantially better fps out of Steam OS I might dual boot.
Gloomy - Thursday, September 26, 2013 - link
Better get rid of your Nvidia.
AMDisDead - Thursday, September 26, 2013 - link
AMD just admitted that they have lost the OpenGL and DirectX war. I mean they can't win against Nvidia in these two, so they are making their own. :D AMD is too small; I think only Battlefield 4 will use Mantle, just like only Tomb Raider used TressFX.
iniudan - Thursday, September 26, 2013 - link
Actually, all Frostbite 3 engine games use Mantle. So basically everything EA publishes that isn't EA Sports or Crytek.
Finally - Friday, September 27, 2013 - link
Star Citizen will support TressFX and TrueAudio. SC is based on CryEngine 3.
iniudan - Friday, September 27, 2013 - link
TressFX and TrueAudio have nothing to do with Mantle, other than being from AMD.
Despoiler - Thursday, September 26, 2013 - link
Umm what? I think you missed where AMD has beaten Nvidia out of the gate since the 5000 series. It's only this current gen that Nvidia finally got their refresh out before AMD.
AMDisDead - Thursday, September 26, 2013 - link
lol it doesn't matter who's first. :D
piroroadkill - Thursday, September 26, 2013 - link
Good stuff. I want to see some benchmarks between the two APIs on the same game, same machine. Can't wait for more juicy info. It needs a Linux port as soon as possible, though, because I imagine PS4 will be running Linux.
If AMD have their own, fast as hell API that runs between Linux and Windows, things could be hotting up, fast.
After all, whatever NVIDIA does now, it can't replace the sheer momentum of having their GPUs in both next gen consoles.
Novan Leon - Thursday, September 26, 2013 - link
I believe PS4 is running a FreeBSD (i.e. UNIX) based OS.
Hulk - Thursday, September 26, 2013 - link
I appreciate AMD for moving to a lower level language to improve performance. Think about it. We are starting to stall when it comes to increases in compute, so the next place to look is in the software for more performance. Now let's see applications start being written in Assembly for some serious performance improvements.
wumpus - Friday, September 27, 2013 - link
Um, no. On the other hand, writing in OpenCL (and wishing you had CUDA :) is another story. I suspect it is even slightly easier than assembly (I wrote a fair share of assembler back in the day. Haven't gotten much further than "hello world" in CUDA).
Mudpie - Thursday, September 26, 2013 - link
This is very interesting. I hope it's not going to become exclusive, meaning a game is made with/for Mantle and can't be played otherwise. That would be a huge disappointment :/
piroroadkill - Thursday, September 26, 2013 - link
I highly doubt any games will be Mantle exclusive. They'll just have a DirectX or OpenGL renderer as an option alongside Mantle. Just like many old games did.
nismotigerwvu - Thursday, September 26, 2013 - link
Good read, but I caught a minor typo. In the first sentence under Mantle 1 you've got an extra i on "is" there.
coder543 - Thursday, September 26, 2013 - link
Interesting piece, but all of the emphasis on the Xbox One and Direct3D was a little annoying. Considering that there is a large emphasis on creating cross platform games these days, Direct3D is becoming less and less of the Goliath that it used to be. "Why write in Direct3D when OpenGL can cover Windows *and* everything else?" That's the question many game studios are focusing on today. With the Steambox running Linux, the ever-growing market share of Mac OS X, and the complete dominance of OpenGL-based tablets and phones, Direct3D Just Doesn't Matter. Not like it used to, at least. I know Anandtech loves Microsoft, but this was a little ridiculous. Do you really think AMD is going to create a low level API for the Xbox One and another one for the PS4? No. They don't have that kind of software team. Their software teams have always been on the smaller side. Mantle is more relatable to OpenGL than Direct3D -- regardless of Microsoft announcing a coincidental version -- unless someone at AMD has just gone crazy. Adding support for another shader language is insignificant.
coder543 - Thursday, September 26, 2013 - link
I'm actually most curious about compute. Mantle seems like a great opportunity for AMD to work in some GPGPU functions to accelerate certain tasks even more... I wonder what they will do in regards to this.
hawler - Thursday, September 26, 2013 - link
One interesting thing worth noting is that EA games for the next gen (consoles and new PC games) are to be built on one of two engines, either Frostbite 3 or the EA Sports one (Ignite, I think). If Frostbite 3 supports this, that means most of EA's games will benefit from this for the foreseeable future. Say what you want about EA, but they have a major role in deciding what will happen in the industry as a whole.
ddriver - Thursday, September 26, 2013 - link
So, now Nvidia will have to make their own proprietary low level API, so developers are even more fragmented. I guess they just aim to bite from (ironic, I know) M$'s big apple, which is the product of the ever more enforced Direct3D. Maybe this fragmentation is a good thing by being bad: right now most games target D3D with the exception of mobile games, which are mostly OpenGL; throw in an additional 2-3 low level APIs from major GPU vendors and it will get even harder to port games to different hardware. In the end, this may help people realize that OpenGL suffices, and with the introduction of compute shaders, you can pretty much generate render features on demand, still executed in hardware.
As a developer I feel safest putting my eggs in the OpenGL basket, since I am not in a position to learn 5 different ways of doing the same thing just for the sake of the monopolistic aspirations of corporations. A low level API will be more than welcome and useful for implementing good OpenGL-compliant drivers by the community; this applies to other potential APIs as well. So while there is a niche where this extra low level API will be good and won't cause fragmentation, it should be utilized the same way as assembly and kept away from application developers. A public low level API means more control, which is good on its own. It is good for vendors to expose their hardware capabilities, but they should come up with an abstraction layer to unify and standardize an API, so that it is more portable and works with different device, OS and driver vendors.
Gigaplex - Thursday, September 26, 2013 - link
What would NVIDIA do with a low level API without any consoles to go with it?
ArthurG - Thursday, September 26, 2013 - link
To not lose on benchmarks?
medi02 - Friday, September 27, 2013 - link
Devs will likely have to use Mantle if they develop for consoles. It should be easy to port that code to PC.
Doing it again just for nVidia's share of PC market might not be something that most devs would do.
Especially if Mantle brings about 10%-ish boost.
Tams80 - Thursday, September 26, 2013 - link
Developers should be free to choose between using higher abstraction with more coverage, but worse performance, or lower abstraction with less coverage, but better performance. Using Mantle won't reduce coverage that much though, as two of the three consoles will definitely be able to use it and even the Wii U may see some benefit. That's a large portion of 'proper' gaming. This will probably remain the case for several years, as I doubt any of the console makers will be in any rush to get to the next 'gen'.
mikato - Monday, September 30, 2013 - link
Great comment. I imagine AMD could have already envisioned that as one of the possible ultimate outcomes, one in which they still come out ahead.
noblehouse - Thursday, September 26, 2013 - link
AMD rocks!
Kepe - Thursday, September 26, 2013 - link
This is some really fascinating stuff. I've been buying both AMD and Nvidia cards in the past, the last two being Nvidia parts. I think it's time to move to the AMD camp the next time I upgrade my PC. I really hope this helps AMD gain more market share in the PC world, because we need proper competition to keep prices at a sane level.
a176 - Thursday, September 26, 2013 - link
We already know the XB1 API is DirectX based. We know the XB1 OS is based off Windows technologies. The XB1 API can be specifically targeted and refined for its own hardware, but does that make it any higher or lower level than what we already have in Windows, based on the prior facts? Of course I'm not a game developer, but on paper this doesn't sound like it's actually anything new and groundbreaking.
Kepe - Thursday, September 26, 2013 - link
But the point is that game devs can also use the lower level APIs on XBOX One and PS4 to get more performance instead of the higher level APIs offered. And they will use them because consoles have rather limited performance compared to desktop PC hardware. So now that Mantle is here, devs can use (almost exactly) the same code they use in consoles for AMD GPUs on PCs as well.
Shadowmaster625 - Thursday, September 26, 2013 - link
This is probably the best move AMD has made in the last 5 years or even longer. Leverage the console API to bring in a performance monopoly on PC gaming. If they can indeed extract significantly more performance using a low level API then everyone is obviously going to buy their GPU. Now if only they had the foresight to add a special block of microcode to their x86 cores to allow them to accelerate these API calls, it would be that much faster. Unfortunately, knowing history, the implementation could end up being so bad that even with this advantage, games might end up running faster on Intel+Nvidia hardware.
B0GiE-uk- - Thursday, September 26, 2013 - link
The writer believes that the Mantle API is being brought over from the XBOX One; how does he know it's not from the PS4, which supposedly has 2x the graphics grunt of the XBOX One? I think Mantle will be great, I can't quite understand why they just didn't go for OpenGL though as it is in a resurgent state?
Also, AMD is having to resort to this because Microsoft has not been interested in PC gaming for many years now, with DirectX and GFWL massively neglected.
Good on AMD! If this brings anywhere between a 2x - 10x frame rate increase it will spur Nvidia and Microsoft to get off their asses! Great for the consumer!
cbgoding - Thursday, September 26, 2013 - link
2x-10x? Doubtful. If this is primarily an overhead and latency reducer, we'd probably be lucky to get 20% improvements. We'll see.
andrewaggb - Thursday, September 26, 2013 - link
Yeah, I was thinking maybe 10%. If 20% or more had been sitting on the table I think Nvidia or AMD would have released a low level API years ago.
Kevin G - Thursday, September 26, 2013 - link
I would have thought that Mantle would have stemmed from the PS4 as well. The main reason is that MS wants to standardize development across both the Xbox One and PC, which means DirectX. The PS4 is its own beast running a BSD variant and its own Sony-developed graphics API, reportedly based upon OpenGL.
Th-z - Thursday, September 26, 2013 - link
"I think Mantle will be great, I can't quite understand why they just didn't go for OpenGL though as it is in a resurgent state?"
OpenGL is a high level API just like DirectX; what they want is a low level one.
Novan Leon - Thursday, September 26, 2013 - link
I don't think the PS4 is quite 100% more powerful than the XBO. Based on all reports, the PS4 is between 30-50% more powerful than the XBO. Relevant:
http://www.neogaf.com/forum/showthread.php?t=68366...
http://ps4versusxbone.wordpress.com/2013/09/24/dev...
Tams80 - Thursday, September 26, 2013 - link
Yeah, I found it strange the PS4 was barely mentioned. I'm guessing they haven't gone with OpenGL because it is at a higher level of abstraction.
B0GiE-uk- - Thursday, September 26, 2013 - link
I'm sure I read somewhere they are claiming up to 10x! Agreed, wait and see benchmarks on BF4 in December. Exciting though; imagine you get double the performance, that would still be incredible. Microsoft should be very worried about this and start optimising and enhancing DirectX as a matter of urgency.
klagermkii - Thursday, September 26, 2013 - link
Two years ago AMD said that they were working on this and they were throwing around the 10x number then. 10x for specific draw calls though; who knows how that translates into real world performance. http://www.bit-tech.net/hardware/graphics/2011/03/...
---
'It's funny,' says AMD's worldwide developer relations manager of its GPU division, Richard Huddy. 'We often have at least ten times as much horsepower as an Xbox 360 or a PS3 in a high-end graphics card, yet it's very clear that the games don't look ten times as good. To a significant extent, that's because, one way or another, for good reasons and bad - mostly good, DirectX is getting in the way.' Huddy says that one of the most common requests he gets from game developers is: 'Make the API go away.'
remosito - Thursday, September 26, 2013 - link
@Bogie UK: You sure you don't refer to the 9x the amount of draw calls?
Draw calls are so far removed from resulting frame rates nobody can even guess how much it would improve them...
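To make the draw-call overhead argument above concrete, here is a rough back-of-envelope model in Python. All costs are invented illustrative numbers, not measurements of any real driver or GPU; the point is only the shape of the math: each draw call pays a fixed CPU cost regardless of how much geometry it submits, so many tiny calls are dominated by overhead while a few large batches are not.

```python
# Toy model of draw-call overhead. The per-call and per-triangle costs
# below are made up for illustration; real numbers vary by driver/GPU.

DRIVER_OVERHEAD_US = 10.0   # hypothetical CPU cost per draw call (microseconds)
PER_TRIANGLE_US = 0.001     # hypothetical cost per triangle (microseconds)

def frame_cost_us(draw_calls, triangles_per_call):
    """Total frame cost: fixed per-call overhead plus per-triangle work."""
    overhead = draw_calls * DRIVER_OVERHEAD_US
    geometry = draw_calls * triangles_per_call * PER_TRIANGLE_US
    return overhead + geometry

# Same one million triangles, batched two different ways:
many_small = frame_cost_us(10_000, 100)   # 10,000 tiny draw calls
few_large = frame_cost_us(100, 10_000)    # 100 big draw calls
```

Under these made-up numbers the heavily batched frame is far cheaper even though it draws identical geometry, which is why a "9x the draw calls" claim doesn't translate directly into 9x the frame rate: only the overhead term shrinks, not the geometry work.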
Dribble - Thursday, September 26, 2013 - link
Not a lot. It's a console problem - fast GPU, really weak CPU but lots of cores. You need a new way of splitting the draw calls across the cores or your graphics performance gets bottlenecked. PCs don't have the same bottleneck, as the CPU cores are much faster. Fundamentally Mantle is a console API, and its use is designed around removing bottlenecks for the completely fixed console architectures. PCs have a huge variety of configurations and consequently completely different bottlenecks. If Mantle really is a low level API, then to really be useful you'd need to write different code for each different specification of PC to address its particular constraints. That obviously won't happen - so what we'll get is the normal high level DirectX implementation and a few quick code paths written for the consoles that, with minimal tweaking, might add a little to some subset of the PCs out there. E.g. the draw calls - if you just happen to have a very fast GPU and a very slow CPU with lots of cores, this could help a bit.
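The "splitting draw submission across the cores" idea described above can be sketched in plain Python (a hypothetical model with invented names, not any real graphics API): worker threads record command lists for their slice of the scene in parallel, and a single thread then submits those lists in a fixed order, so the GPU still sees one deterministic command stream. This is roughly the deferred-command-list pattern that low-level console-style APIs expose.

```python
# Sketch: multi-threaded command recording with serialized submission.
# Recording is the CPU-heavy part and parallelizes across cores;
# submission order stays fixed so results are deterministic.
from concurrent.futures import ThreadPoolExecutor

def record_commands(object_ids):
    # Each worker builds a command list for its slice of the scene.
    return [("draw", oid) for oid in object_ids]

def build_frame(scene_objects, workers=4):
    # Split the scene into contiguous chunks, one per worker thread.
    n = len(scene_objects)
    size = (n + workers - 1) // workers
    chunks = [scene_objects[i:i + size] for i in range(0, n, size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        command_lists = list(pool.map(record_commands, chunks))
    # Serialized "submit": concatenate the lists in chunk order.
    frame = []
    for command_list in command_lists:
        frame.extend(command_list)
    return frame
```

The design choice worth noticing is that only recording is parallel; the submit step stays on one thread, which is exactly why a weak-but-wide console CPU benefits more from this than a PC with a few fast cores.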
chizow - Thursday, September 26, 2013 - link
Mantle seems to place a lot of responsibility and expectations on AMD and specifically their driver team. I'm not optimistic, to say the least.
krumme - Thursday, September 26, 2013 - link
Well then you can buy a slower NV card, as you can buy a slower AMD CPU if you don't want 100% Intel compiler speed.
chizow - Thursday, September 26, 2013 - link
Or you can just buy the Nvidia products that are faster out of the box and not bother with the vendor specific optimizations used in a handful of titles that choose to implement Mantle.
iniudan - Thursday, September 26, 2013 - link
From what I know the API is actually open sourced (no idea what the license is, though), so Nvidia could always license (I am sure AMD would like PhysX and/or a pile of money) and build its own driver to be compatible with the Mantle API, if Mantle really causes a huge difference in performance and takes hold too much for Nvidia to simply weather the storm. But being a system/network administrator, I admit what little development skill I have is limited to server-side script APIs at best, so all this is a bit too low level for me; mostly just wild speculation on my part.
iniudan - Thursday, September 26, 2013 - link
Oops, pressed reply on the first comment of the page instead of post a comment, sorry.
chizow - Thursday, September 26, 2013 - link
I doubt the API is open sourced, or even available for license, but we will see.
iniudan - Thursday, September 26, 2013 - link
It was already mentioned to people who were at the presentation that it was open, they just didn't mention the licensing. http://www.techspot.com/news/54134-amd-unveils-rev...
JDG1980 - Thursday, September 26, 2013 - link
Writing low-level graphics drivers should be easier and less error-prone than writing high-level graphics drivers.
chizow - Thursday, September 26, 2013 - link
How so? You have 3-4 lines of machine code replacing 1 line of HLSL, which requires much sharper matrix math skills to boot.
Wreckage - Thursday, September 26, 2013 - link
So they are bringing back GLIDE? What's next, are they going to bring back dial-up internet? Maybe some AMD branded floppy disk drives.
Remember what happened to the company that tried to push GLIDE?
Besides AMD fans have gone on record saying they hate proprietary solutions. So they will all reject this.
Creig - Thursday, September 26, 2013 - link
AMD's Mantle initiative is a fantastic idea to bring low-level cross platform development to a huge user base. And I don't think anybody saw it coming, least of all Nvidia. Because if they had, they would never have shrugged off AMD getting both the XBox One and PS4 contracts. Back when Glide was used, it was only on the PC. And even then it was a force to be reckoned with. But today, with AMD hardware not only in PCs but also in BOTH of the most popular consoles that will most likely be in production for the next 6-10 years, it's a brilliant move. Developers of both PC and console games will have low level access across both platforms (PC and console). So now they will be able to develop games with increased performance that will also be easier to port. Whoever came up with this strategy at AMD deserves a raise!
Nvidia has every right to be very nervous by this unexpected move from AMD. Because now any game designed for a console will automatically have increased low-level performance and graphics enhancements built in that will work with any GCN equipped (AMD) PC.
dookiebot - Thursday, September 26, 2013 - link
The company that pushed Glide works for NVIDIA now, and NVIDIA has its own proprietary solutions as well. So I expect NVIDIA fans will embrace Mantle.
erple2 - Friday, September 27, 2013 - link
The company that pushed Glide was very successful until they fancied themselves as retailers and not virtual chip makers. Their plant for mass producing chips was not in a well established area and had many quality control problems. Then there was that whole "making your customers your competitors, who then flocked to Nvidia" thing. That was ultimately their downfall. Developers really liked working in Glide from what I can remember. It was far easier to use than the DirectX of the time (and to a lesser extent OpenGL).
wumpus - Friday, September 27, 2013 - link
There was also the issue that they never made a new core after Voodoo 1 (everything else was shrinks and multi-core copies of those chips). Not sure if the guys who designed it weren't interested in another design because they liked their rewards or they felt ripped off, but I suspect that other startups have the same issue of lightning not striking twice.
This move really shouldn't be that much of a surprise. If you look at how things are rated in the PC gaming world, where a few extra FPS or a few extra ms of frame latency can make or break a game as well as drive sales of hardware, it doesn't shock me in the least that AMD would look to leverage their work with the game consoles to provide this type of functionality. From the hardware side of things, GPUs are big, powerful, fast chips, and soon they're going to be running into the same issues that CPUs are, in that dropping down process nodes in their construction (28nm -> 22nm -> 14(?)nm) will follow the same path - eventually, you'll hit a limit as to how fast/hot these chips can be pushed. Eventually, they'll run into what I like to call 'The Haswell Limit', and the chips will become exponentially more complicated, and thus more expensive to develop and produce.
What I see AMD doing here is seeing how this has already played out and recognizing that if they want to increase their performance, they need to do things -differently-. By eliminating some of the overhead (or a lot of the overhead - we're not sure yet), it's unlocking 'free' performance. If they could (theoretically) eliminate 15% of unnecessary overhead on the GPU, that's a 15% increase in performance. That's being able to lower thermals, extract more performance, drop power usage, however they want to utilize it, and all it requires (again) is software.
Comparatively speaking, software is cheap. Hardware is not. AMD is recognizing this and asking themselves a very logical question: Is it better to optimize our hardware/software interface, or continue in a GPU arms race with nVidia?
Apparently, they've decided on option A, and it makes perfect sense to me.
Kevin G - Thursday, September 26, 2013 - link
There are two limiting factors in GPU hardware design today: power consumption and die size. Modern GPUs are very programmable, and while they could improve performance per clock, they can extract greater benefits from increased parallelism. Thus they won't approach the 'Haswell Limit' until after they've reached the limits of useful parallelism in their designs. Adding more parallelism is mainly a copy/paste job of a shader cluster, ROP, TMU etc. and a bit of work connecting them all together. With each die shrink, they'll be able to add more units into the same area. nVidia is willing to approach die size limitations around 550 mm^2**. Beyond that figure, chips greatly suffer from yield issues. AMD had a small die strategy in place with the Radeon 4000 and 5000 series. They loosened that up a bit with the high end Radeon 6900 and 7900 series. It is formally gone now with the R9 and the estimated 425 mm^2 of the new chip.
The other limitation has been power. Since the Radeon 2900 introduced the supplementary 8 pin power connector, the upper limit to the PCI-E spec has been 300W. Modern cards like the Radeon 7970 and nVidia's Titan are capable of drawing that much power with modest overclock. Only recently have we seen AMD and nVidia flirt with cards consuming more power and they are strictly dual GPU solutions. Make no mistake, both AMD and nVidia could design a GPU to consume 375W or 450W with ease (mainly increase voltage a notch and clock speeds on current designs). The problem is cooling such a chip. They are already using rather exotic vapor chambers to quickly move thermal energy from the die to the heat sink. Either moving to liquid cooling or ditching the PCI-e card format to a motherboard socket is likely necessary to permit further increases in power consumption.
**For reference, the lithography die limit is a bit over 700 mm^2. The largest die I've heard of is the Tukwila-based Itanium, which weighed in at 699 mm^2.
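The 300W ceiling mentioned above falls out of simple addition of the PCI-E spec's per-connector limits (75W from the x16 slot, 75W per 6-pin auxiliary connector, 150W per 8-pin). A quick sanity check of that arithmetic (helper name is mine, not from any spec):

```python
# PCI-E board power budget: slot limit plus auxiliary connector limits.
SLOT_W = 75        # PCIe x16 slot
SIX_PIN_W = 75     # 6-pin auxiliary power connector
EIGHT_PIN_W = 150  # 8-pin auxiliary power connector

def board_power_limit_w(six_pins=0, eight_pins=0):
    """Maximum in-spec board power for a given connector loadout."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

# One 6-pin + one 8-pin (Radeon 7970 / Titan class) hits the 300W ceiling.
high_end = board_power_limit_w(six_pins=1, eight_pins=1)
```

So the dual-GPU cards flirting with more power than this have to go beyond the one-6-pin-plus-one-8-pin loadout (and out of strict spec territory).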
ssiu - Thursday, September 26, 2013 - link
This Mantle applies to current GCN cards (HD77xx - HD79xx) too, not just upcoming cards, correct?
Eddytion - Friday, September 27, 2013 - link
Yup, from the 7000 series and up. This is the best move AMD has made in the last 5 years. If this works like they say it will, it would be truly revolutionary for PC gaming.
ssiu - Thursday, September 26, 2013 - link
Are the PS4 and Xbox One low level graphics APIs the same? If not, developers still have to do 2 different things, and I don't follow the reasoning behind "this is a port of the Xbox One API to PC" (i.e. why don't you guess "this is a port of the PS4 API to PC" instead?).
Eddytion - Friday, September 27, 2013 - link
The API is not the same, but the PS4 and Xbox One use the exact same GCN architecture; that's why they brought that architecture to the PC with the 7000 series. Developers still have to do 2 different things, but the PC API is very easy to develop for since it targets the same architecture as the X1 and PS4 (console level language). Now what they're trying to do is make a somewhat similar low-level API for multiple AMD GPUs that is a lot easier to support and optimize.
PanzerEagle - Thursday, September 26, 2013 - link
Love the article. I think you are exactly correct on this being an Xbox One API port. I think what is missing is the implication for Microsoft. MS had to agree to AMD's licensing terms for the CPU/GPU in the X1, not just the other way around. They knew full well this was coming; the question is why? AMD does not, at this point, have a mobile play. They have, over the last year, strategically set themselves up for mobile, server, and desktop success, in part by bringing back old talent from the likes of NVIDIA, Apple, and Qualcomm. The other part is owning both x86 and ARM licenses. They know better than anyone that graphics is driving the mobile and PC business, the CPU has been commoditized, and the GPU landscape is wide open and fragmented. In comes MS with their devices and services company, the X1 and Xbox Live as the centerpiece to be leveraged in the consumer space. The X1 is just a virtualized mesh of 3 OS's that runs on common PC hardware - I am looking at you, Steam. Who says they won't release an Xbox OS for PC gamers? Not too hard to think that they could just give you a download disk that would virtualize your Win8.1 and then install the Xbox OS in another VM alongside. Hardcore gamers could update their hardware as they wanted, and play against all console users at once via Live. As a side note, what better way to extend a console's life - you could update the hardware anytime you wanted if all or most developers use Mantle. Wasn't that the original idea behind DirectX? The X1 name: the ONLY One you need?
AMD's ARM license gives them access to tablets and smartphones. Qualcomm's Adreno was once AMD/ATI's. MS has stuck with Qualcomm in their phones; could it be because the hardware is still very similar, i.e. easy to port software? The X1 has low power cores; AMD could easily add GCN to an ARM core - X1 on the tablet or phone? An AMD x86 X1 Surface tablet, maybe?
This last part is, I think, the ultimate play for MS. The Xbox is clearly the consumer facing brand for MS, and the engine for unification across all platforms. Platforms being the key word. If AMD can leverage Mantle and get the backing of most of the software developers, I see MS using the X1 as a strategic partnership with AMD. Think Nokia, and you get my meaning. Intel CPUs are clearly better, but the CPU is commoditized, and their Iris Pro is the same thing MS did with the 360 and now the X1. There is clearly a benefit to adding local DRAM on the die - sorry, Sony PS4. Intel's GPU is getting better but is nowhere near AMD's, and as Apple has shown, CPU power is just not that important. I would not be surprised at all if this was a dry run for MS to become more vertically integrated. AMD has the license, the patents, the talent, the roadmap, and aligns very well with MS's vision. It's no coincidence that AMD let this loose; this is MS's vision as well, cross platform everything. It might not be today, or this year, but I would not doubt MS will pick up AMD for a penny, make their own hardware, and still license out all their software for everyone else. At the end of the day, software is the one thing that isn't commoditized; it's just unifying the hardware underneath that has kept MS from realizing their dream, and Mr. Gates' dream.
wumpus - Friday, September 27, 2013 - link
I remember a different Bill Gates, one who simply assumed that the goal was 100% market share. Letting AMD peddle Mantle to Steamboxes simply doesn't seem like any goal likely to fly in Redmond.
JDG1980 - Thursday, September 26, 2013 - link
Funny, but I don't remember all these concerns about "fragmentation" being brought up when Nvidia came up with CUDA and PhysX. If Mantle is available on Linux and therefore allows easy porting of games to Steambox, then in a way it is more "cross-platform" than DirectX. Sure, you lose the ability to choose an Nvidia card, but you gain the ability to choose a different OS.
chizow - Thursday, September 26, 2013 - link
I don't think there was as much concern because Nvidia has a much better track record when it comes to driver support and feature integration, not to mention they have the resources (ie. money) to expand their scope of services. If you read what is being promised in this article, the increased burden on AMD is TREMENDOUS. I just don't think they can do it (well), I'm sorry.
And I'm sure you are going to challenge me on my 1st comment, but in the meantime, we got NO updates whatsoever on CrossFire framepacing, EyeFinity support (even though they were teased), and in the last few years we have seen AMD do nothing but cut support for graphics families (HD4000 and older no longer supported in every release, moved to legacy). And the latest WHQL driver release, no more Win8 or Vista support? All that on top of the big news lately with the broken CF drivers, framepacing issues.
And we really expect AMD to now create and implement a new API ecosystem for PC, XB1, PS4, Linux and also work with each dev that wants to implement their API with timely driver updates? Is anyone who owns either AMD and Nvidia cards truly satisfied with their drivers to the point they want all that added to the mix? Fragmentation concerns, indeed.
Creig - Thursday, September 26, 2013 - link
You're working pretty hard to poo-poo this, chizow. I see Wreckage is out and about on damage control as well. You Nvidia fans seem pretty nervous about Mantle. You should be.
shinkueagle - Thursday, September 26, 2013 - link
Lol!! I just love how elegantly you put those two fanboys in their place!!! I TOTALLY ROFLed at your comment! =)
chizow - Thursday, September 26, 2013 - link
Nervous, lol. If I were an AMD user I would be nervous. AMD's driver teams need 10 more checkboxes to fulfill like I need a hole in my head. Did you even bother to keep track of how many additional driver engineering positions alone AMD is planning to undertake? Figure 2-3 heads x $100K per seat across 3-4 new platforms... that's a lot to ask for a company that just laid off a lot of people in the last year or so. But yes Creig, it is amazingly ironic coming from an AMD fan like yourself that spent the last few years poo-poo'ing other "proprietary" APIs and standards like CUDA and PhysX, suddenly doing an about-face to welcome Mantle as the greatest thing ever invented for GPUs. :)
Jigar2speed - Friday, September 27, 2013 - link
Chizow - Jigar here, you might want to stop before you crap your pants again, mate.
Eddytion - Friday, September 27, 2013 - link
AND STILL with the latest drivers, my 7950 Boost is faster than the GTX 670... It's a shame that you had to pay $80 more.
Kevin G - Thursday, September 26, 2013 - link
Actually, we did get CrossFire frame pacing support, but only for single monitor setups and resolutions up to 2560x1600. AMD has said they're working on frame pacing improvements for CrossFire + Eyefinity and CrossFire + 4K but haven't disclosed a time frame for release. They haven't given up; the problem is just more complex when driving multiple displays at the same time.
Dropping support for the Radeon 4000 series and older shouldn't be that concerning now that the hardware is going on five years old. At some point, support has to end for any device.
Dropping Windows 8 support isn't going to be that big of a deal since Windows 8.1 is a free upgrade and will be supported. Dropping Vista shouldn't be too alarming considering its market share: most users have moved on to either Windows 7 or 8.
chizow - Thursday, September 26, 2013 - link
The expectation was that there would be big news about CF and Eyefinity, and there was none. I am aware CF frame pacing was addressed in a limited capacity in the 13.8 beta, but that support has regressed and is curiously missing in subsequent WHQL drivers. Also, as mentioned, there are still no CF fixes for DX9 or Eyefinity, and it's been nearly 3 months since the initial driver launched.
Also, I misspoke about Win8 support: only 7 and 8 are supported; Vista and XP support was dropped. You might think it's OK to drop Vista, but it still has higher market share than 8 currently, and XP still has more market share than Vista and 8 combined.
Same deal for the HD4000 series; idk, I think that generation is still completely relevant and capable of running current games. I know for a fact the equivalent Nvidia cards from that generation (GTX 260/280) are perfectly viable gaming cards today, so I would certainly be disappointed if support were suddenly dropped.
Again, you might not think any of these things is a big deal, but taken as a whole, you can clearly see there should be healthy concern and skepticism about AMD taking on MORE driver responsibilities (audio now, on top of everything else) given their rather poor track record of supporting and sustaining new and existing features.
Kevin G - Thursday, September 26, 2013 - link
The Frame Pacing + Eyefinity drivers should be out this fall: http://techreport.com/news/25428/driver-fix-for-cr...
WHQL drivers will likely follow suit. Remember that the current beta drivers are essentially testing a new feature that doesn't work in all scenarios yet.
Despite its market share, MS is dropping support for XP themselves, so why continue to support an OS whose maker doesn't even support it? As great as XP was back in 2001 when it launched, it is becoming increasingly legacy. Similarly, Vista came out in 2006 and mainstream support for it ended in 2012. Dropping support for Vista is to be expected.
As far as piling on additional driver work goes, AMD is essentially doing the same thing in the console world with Mantle drivers. And AMD has been developing audio drivers ever since the Radeon 4000 series could output audio over HDMI. I see AMD's audio work as more of a precursor to integrating audio into their SoC line than as an additional GPU feature. They're clearly going to use audio as a differentiating feature down the road in the SoC space.
chizow - Thursday, September 26, 2013 - link
Oh well, I look forward to your addendum explaining away why AMD has dropped support for Mantle in the near future. I'm sure it will be an interesting read.
Kevin G - Friday, September 27, 2013 - link
The only reason to drop support for it would be if both the Xbox One and the PS4 have relatively short life spans. Otherwise, it'll hang around for years just on the merit of its console connection. Long term, I also see the console connection as the means of Mantle's eventual demise, as the PC/console divide widens again over time.
With DICE committed to adding Mantle to the Frostbite engine, that alone will carry Mantle forward as EA uses the engine across several projects in the short term.
wumpus - Friday, September 27, 2013 - link
They presumably built the costs of building the API for the XB1 and PS4 into their bids and are being paid to do so. Also, it isn't nearly such an issue when targeting completely frozen hardware. The other API costs should be pretty minimal, as you might expect from a low-level API (i.e. the code should be little more than stubs; initialization will be as fun as always, but should be the same regardless of the API).
The issue isn't fragmentation (GPU families change pretty slowly; I expect GCN will be around a while), but success. With AMD getting all the major console ports to use Mantle, will they rest on their "commitment" to DirectX? I suspect that future DirectX drivers will use Mantle more and more, and thus gain that minimal overhead. Also, AMD could easily become less concerned with those less-used drivers (especially if you stop seeing them in the latest benchmarks) and expect the bugs to creep in.
Frankly, if I were in AMD's shoes, I wouldn't worry too much about trading a few bugs in older and lower-profile games for high performance and lower driver costs.
Zan Lynx - Tuesday, October 1, 2013 - link
The Windows 8.1 update is free and better in many ways. Why should anyone still support Windows 8? Why would anyone still be running Windows 8?
Were drivers and software still supporting the original WinXP after XP SP2 came out? I sure hope not. I wouldn't have. If someone asked me to support that in my software, I'd just mail them an SP2 install disc.
Mugur - Thursday, September 26, 2013 - link
Hopefully the performance gain will be more than 10% on average... otherwise it's just marketing, or much ado about nothing.
Death666Angel - Thursday, September 26, 2013 - link
On the one hand, I hate anything that introduces new divisions among hardware and negates standards, as Mantle seems to do. On the other hand, if it is already being widely used on XBone hardware, it could very well offer a lot of nice performance advantages (AMD 7970 owner here), which I wouldn't mind picking up. I'm interested to see how this eventually plays out.
Gunbuster - Thursday, September 26, 2013 - link
Oh boy, let's code our PC game specifically for just 33% of the user base... http://store.steampowered.com/hwsurvey
Kepe - Thursday, September 26, 2013 - link
The idea is that if you make a console game (PS4, Xbox One, or both), you can use the same API on PC hardware as well. Of course, you still need to make a DirectX or OpenGL implementation for the PCs that don't have a GCN GPU. But the point is that you can, at very low cost, make the game work extremely well on some PCs as well.
dookiebot - Thursday, September 26, 2013 - link
NVIDIA could have done this years ago when they were in ATI's position. But the biggest thing they could dream up was CUDA, PhysX, and overcharging customers for their cards.
Sivar - Thursday, September 26, 2013 - link
First page: "The best place to start with Mantle *iis* a high level overview."I think you've been working with Windows servers too much. :)
sviola - Thursday, September 26, 2013 - link
If Mantle is a copy of the Xbox One API, what does it mean for the Xbox GPU? Is there anything that hasn't been disclosed? Is it a derivative of the R9 290?
Kevin G - Thursday, September 26, 2013 - link
The Xbox One GPU is based upon the GCN architecture with HSA extensions. Essentially, the additions allow the Jaguar CPU cores and the GPU to share a common memory address space. To quickly transfer data between the CPU and GPU, a simple pointer just needs to be passed between the two units. This ability to share the same memory address space is a nice efficiency boost for CPU-GPU communications and has further implications for GPGPU workloads.
sviola - Thursday, September 26, 2013 - link
Ok. But if Mantle, which is an API for the new R9 290X, is based on the Xbox's API, couldn't we infer that the GPU in the Xbox has more in common with the R9 290X than with the 7700 series that was speculated?
Kevin G - Thursday, September 26, 2013 - link
Nope. The Xbox One's GPU specs are indeed on par with the Radeon 7700 series in terms of computational throughput. The Xbox One's GPU does have some HSA features which the Radeon 7700 series does not. This will help the Xbox One's efficiency, but it is still bound to the same upper limit the Radeon 7700 series shares.
Eddytion - Friday, September 27, 2013 - link
Mantle is an API for the 7000 series and up. The 7000 series also uses the GCN architecture.
codylee - Thursday, September 26, 2013 - link
I think the timing of this is fantastic. AMD has been refining and emphasizing the "APU" space, and they are ready to move GCN into APUs. As is, APU graphics are getting closer to mainstream viability for gaming... maybe this was AMD's plan all along: to give the GCN core in APUs the boost it needs to make it big and provide a better gaming experience sans dedicated graphics.
At the same time, we're hearing about SteamOS and Steam boxes, which are looking to compete with console gaming, where Mantle would allow console games to be ported to SteamOS quickly, and AMD seems fully on board with that. With SteamOS, we expect less overhead from Linux than from Windows, and now even less overhead with Mantle than with the other APIs. The Steam box is looking even more desirable for gamers, as they would get the benefits of a console with the ability to upgrade the hardware.
I don't see Mantle being a problem in the PC space. Dual-API writing won't be *as big of an issue* as it was with Glide, since the games will already be somewhat written for Mantle, having come from the consoles. And for games not coming from consoles, the developer can simply skip Mantle.
Either way, it seems like the stage is being set for a bridging of PC and console gaming one way or the other.
amo_ergo_sum - Thursday, September 26, 2013 - link
Nvidia's response to Mantle is a built-in ARM chip. Not only does it have dedicated hardware doing the draw call management and physics; moreover, the ARM development environment is already mature, with plenty of experienced developers who don't need to learn a new API syntax.
Game is not over... oh no... far from it in fact.
I for one bought a GTX 690 because I don't like where all this is going. Sure there's nothing innovative with the GTX 690, but it's the culmination of all of the tried and true standards refined to the brink of performance perfection. PC gaming was just about to have a renaissance but now all these intricately tied suppliers are going for the win causing a total incompatibility cluster bonk.
shinkueagle - Thursday, September 26, 2013 - link
Cool story... Whatever floats your boat...
dookiebot - Thursday, September 26, 2013 - link
After spending $1,000 on a video card, I would be defending my position as well. Carry on, my good man.
Death666Angel - Thursday, September 26, 2013 - link
Yeah, CUDA and PhysX on GPU are so standardized, right? Right?
tiko257 - Thursday, September 26, 2013 - link
- x86 architecture
- AMD GPUs
- Mantle
PS4/XB1 emulation does not sound too crazy, in my opinion. XD
Eddytion - Friday, September 27, 2013 - link
LOL yeah, I totally forgot about the emulator possibilities! Now it's a lot easier!
Wolfpup - Thursday, September 26, 2013 - link
Huh, very interesting. Makes sense for AMD, I guess. It may not actually be a ton of work for developers to support, as they can presumably use Mantle on the PlayStation 4, Xbox One... and AMD-based PCs. DirectX 11 or OpenGL might end up being the port in some cases!
My only negative thought about it in terms of AMD: don't they have enough trouble getting their drivers to work? My avoidance of notebooks with AMD GPUs has actually increased this past year rather than decreased. I really WANT them to get their act together, but I've literally been waiting on that since 1998.
ArthurG - Thursday, September 26, 2013 - link
Hey Ryan, why not ask the good chaps at AMD about this Roy Taylor quote from not long ago: "I think CUDA is doomed. Our industry doesn't like proprietary standards. PhysX is an utter failure because it's proprietary. Nobody wants it. You don't want it, I don't want it, gamers don't want it. Analysts don't want it."
Then comes Mantle. LOL.
tiko257 - Thursday, September 26, 2013 - link
but i think mantle will be used on both consoles and pc ... chacha
iniudan - Thursday, September 26, 2013 - link
Mantle is an open standard, not proprietary; we just have no idea what the licensing is yet. All Nvidia and Intel have to do is develop their own architecture and driver so they can interface with the API.
But yes, for now AMD is the only one who will be able to run Mantle. But they control two important hardware platforms, so they can permit themselves to be the one who sets the standard and gives themselves a short/medium-term advantage, like they did back in the early x86_64 days.
jwcalla - Thursday, September 26, 2013 - link
That doesn't jibe with the concept of a low-level API. When they say that the Mantle API is open, they just mean the interface that game engines use to call into Mantle. Otherwise, if this is just another abstract API where NVIDIA and Intel can plug in their own stuff, how would it be any different from Direct3D and OpenGL?
The article indicates that Mantle is explicitly tied to the GCN architecture.
iniudan - Thursday, September 26, 2013 - link
Current Nvidia and Intel hardware most likely cannot plug into the API, but nothing is preventing them from modifying their architectures and drivers to be compatible with it, as the standard is open. All this does is give AMD a short-term advantage, as they are the first ones with compatible hardware on the market (the HD 7700, HD 7800, and HD 7900 series are GCN), and a long-term advantage in the ability to draw FRAND licensing revenue from implementations of the standard, if it catches on.
jwcalla - Thursday, September 26, 2013 - link
I don't understand how that makes sense. A low-level API that is completely tied to a graphics architecture does not translate into Intel and NVIDIA having the ability to change their drivers to be "compatible" with the API. And there is no "standard" here. It's an API into the GCN architecture. That's the opposite of the word "standard". Where are you getting this from?
iniudan - Thursday, September 26, 2013 - link
No, it is an API that connects into the GCN architecture's driver, not the GCN architecture itself. It's low level in that the API feeds directly into the kernel-layer driver (unless we are speaking of a microkernel); that's still not the metal, you know.
wumpus - Friday, September 27, 2013 - link
Nothing is stopping Nvidia and Intel from making a Mantle driver. Nothing but the fact that it will be slower than using DirectX unless you are going straight to GCN hardware. Possibly much slower, depending on how different your hardware is (gods help you if you are using a tiled mobile architecture).
HisDivineOrder - Thursday, September 26, 2013 - link
Why are you guys so convinced it's the Xbox One low-level API instead of, say, the PS4's? I thought that most of the API work done for the Xbone was done by MS, and most of the API work done for the PS4 was done by AMD and Sony.
That's what I've read when I've read about such things. So why act like the Xbone is the only next-gen console with GCN hardware coming out? Hell, last I checked, the Xbone doesn't even seem to have a fully HSA-spec'ed part, unlike the PS4's fully HSA-capable hardware.
MS had their own version that made it so they didn't require HSA. That's what I was reading a few weeks ago...
So what information is making you lock onto just the Xbone's low level API instead of including PS4 in there?
I'm also curious how you aren't seeing the real problem with this. It isn't that AMD is doing it while everyone else stays with Direct3D; okay, that's fine. The real problem is that this is going to redefine how the TWIMTBP vs. Gaming Evolved battle plays out. It's especially going to affect any title that Intel decides to throw its rather HUGE budget toward "owning."
Because if AMD can do Mantle, then nVidia can do something they'll probably tie into CUDA, and Intel will gladly do something to lock out AMD and nVidia. Imagine having to disable your discrete GPU to run your integrated GPU just to play the new Angry Birds. Why aren't you seeing the obvious here?
If AMD who is as cash strapped as any of these three companies can afford to buy a few developers, what do you think nVidia whose entire existence is reliant upon being perceived as the high end (and who has a long history of practically co-developing games, re: Batman AA) is going to do? What do you think Intel who just needs a way to "compel" gamers to use their GPU's is going to do with all that money they have just sitting around in money vaults?
Both of the latter have shown a renewed focus on gaming, especially Intel. They aren't going to sit by and let Mantle take away the high performance crown. They're going to either work together to make their own variant or they're going to individually split the industry in a threeway. How long then before games are ignoring the DirectX codepath except as a vaguely baseline "safemode?" How long before it's dropped entirely if Intel wants to REALLY own that game with a lot of money?
Do you really think the very fragile PC gaming market can handle a sudden and abrupt threeway fragmentation battle over the API's? There's just no way this is a good thing. No way. It might seem like a good thing for AMD users for a brief period of time, but then what happens when the hardware changes in the future and going down to "the metal" isn't the same in future products as it is in current ones? Let's hope no games go exclusively Mantle or we're going to be back to hoping someone "ports" a PC title so it runs in the future on more generalized GPU's that forego "specialized functions" again.
I really don't get why you're so upbeat about this. This is more than just shock and awe. You should be horrified. You should be telling them how horrible an idea this is, how badly this is going to go if it were to gain any traction, and how you pray to the gaming gods that no developer actually uses this.
Because look at what nVidia did with PhysX and imagine what they'll do with the idea of owning an API. Look at the sheer amount of money that both nVidia and Intel have to throw around. AMD is going to lose this war even if they land an opening salvo, but all gamers will lose because there will be a day when all this crap will splash out and hit us. One game or another.
And creating uncertainty in PC gaming right now is snatching defeat out of the jaws of victory.
iniudan - Thursday, September 26, 2013 - link
I actually think you're wrong, as Mantle is an open standard. Worst case scenario, there is FRAND licensing attached to implementation, which is normal for standards in the electronics and computer hardware industries.
Basically, by controlling the Xbox One and PS4, while also having interests in the ARM and x86 markets, AMD is trying to establish a low-level GPU API standard that would give them a short-term advantage in other markets and possibly, if it's not a royalty-free standard, long-term FRAND licensing revenue from those licensing the implementation.
Basically, AMD is possibly trying to create the GPU API equivalent of Wi-Fi in wireless networking.
wumpus - Friday, September 27, 2013 - link
I suspect that a great interviewing test for marketing shills would be to get them to explain how a system that connects only to AMD parts can somehow be sold as an "open system".
I can only hope that this means SteamOS will soon have Mantle drivers available, and better yet, Mantle drivers in the Linux kernel (which would make OpenGL and Wayland development take off).
iniudan - Friday, September 27, 2013 - link
It doesn't connect to the hardware; it connects to the kernel-mode driver.
Eddytion - Friday, September 27, 2013 - link
If what AMD said about Mantle is totally true, it's a fucking revolution in PC gaming. Cheer up and hope for the best. The competition will get a lot more critical for Nvidia.
wumpus - Friday, September 27, 2013 - link
What uncertainty? That if you buy an AMD (GCN) system, you will be able to play console ports without overhead? Nvidia (and Intel) can make all the APIs they want; the hard part is convincing developers to code for them. AMD *knows* that there are plenty of developers who have already coded games "to the metal" on GCN-based systems, and is giving them the chance to reuse that code with Mantle. All Nvidia (and Intel) have to do is convince those developers to do the same work all over again, without all the benefits of DirectX, for a 10% (or less) benefit.
Sounds like the only "uncertainty" is how much this will benefit AMD. There is also the issue of Steam boxes; I fail to see how selling Steam boxes would harm "PC gaming" (the idea of a port to a Steam box without a PC port seems unlikely).
Jackyle - Thursday, September 26, 2013 - link
Touché, AMD. If Mantle were available for the Xbox One and PS4, that would be even better.
iniudan - Thursday, September 26, 2013 - link
It is; that's actually the main reason it exists. As AMD controls the hardware of those two important platforms, they can permit themselves to develop an open standard that gives them a short-term advantage in the PC market while the competition fumbles at implementing compatibility with the standard in their own architectures, if the standard takes hold among developers and the advantages are obvious.
Eddytion - Friday, September 27, 2013 - link
"Mantle" is a new level of optimization; the PS4 and Xbone use the same GCN architecture found in the 7000 series and above. The only difference is the API (Mantle), which is not a whole lot different. That's why AMD brought the GCN architecture to the PC: they were already producing it for the PS4 and Xbone.
WaltC - Thursday, September 26, 2013 - link
Good write-up, with only one mistake that I saw. I clearly heard in the presentation that Mantle is an open API; that is where the comparison with Glide breaks down. AMD is not going to sit on it as proprietary. Nvidia will be free to use it, and so will anyone else.
Eddytion - Friday, September 27, 2013 - link
LOL, false. Nvidia doesn't have the rights to make the same exact architecture as AMD. Mantle is for GCN chips (found in the 7000 series and above).
mwasilew83@gmail.com - Friday, October 11, 2013 - link
Noob question: is it in the realm of possibility that Nvidia could be forced to license the GCN architecture from AMD, the same way Intel licensed 64-bit tech? Perhaps that is the master plan from the red team. If so, it's brilliant, but only if Mantle is adopted by the majority of developers, I suppose, and the performance boost would have to be exceptional for that to happen.
tiko257 - Thursday, September 26, 2013 - link
Only GCN GPUs can use Mantle, so imagine a Mantle-only game... only newer cards would run it. What happens when AMD changes their architecture in the future? No backward compatibility? Like the PS4 not running PS3 titles?
toyotabedzrock - Thursday, September 26, 2013 - link
They should build a special setup processor to take over from the CPU for creating the draw calls.
Azyx_Lima - Thursday, September 26, 2013 - link
Guys, I read somewhere that Mantle will be open source... I think that has the possibility to solve the different-GPU-architectures problem.
And don't MMOs issue lots of very small draw calls? I think a low-level API could dramatically improve MMO graphics, couldn't it?
wumpus - Friday, September 27, 2013 - link
Only for those in development (think Elder Scrolls Online, especially with the console connection). WoW might be able to afford the client development, but I doubt any other mature MMO would consider it. My understanding is that WoW simply doesn't need all the power a GCN (7770 or better) GPU can bring to the table, so I doubt they'd need to consider Mantle.
TEAMSWITCHER - Thursday, September 26, 2013 - link
It's gonna be pretty hard for me to use this technology since I buy ONLY Nvidia-based graphics cards. The performance gains can't possibly be as impressive as everyone here is dreaming. And then there's execution: I can't tell you how many APIs have died from shoddy execution. It's best to temper your excitement until more information is available.
et20 - Thursday, September 26, 2013 - link
This is great. Nvidia should make their own version for developers to target, and both OpenGL and Direct3D can be largely put to rest.
iniudan - Thursday, September 26, 2013 - link
They will not be put to rest; most applications don't need that low a level of abstraction to perform well. But for professional graphical applications, I can easily see vendors offering products for it, if the standard catches on on the consumer side.
et20 - Thursday, September 26, 2013 - link
Higher abstractions, easier-to-use libraries, and even high-level language bindings could, and I expect will, be developed on top. Some may stay true to the architectural differences, and others may try to bridge them to offer a unified interface.
The diversity of solutions will lead to much better tools for developers and more efficient use of hardware than what OpenGL and Direct3D offer currently.
Leosquizz - Thursday, September 26, 2013 - link
Maybe most of you don't remember, but a long time ago, when S3 was still a player in the market, their Savage line of GPUs came with a proprietary low-level API called MeTaL. It was kinda buggy, like the cards themselves, but I recall that Unreal supported it and was something like 50% faster using that API than in OpenGL, with better textures thanks to S3TC, which became a standard afterwards. Maybe that's the sort of thing we should expect, but with real interest in implementing it, not one buggy game?
TrantaLocked - Thursday, September 26, 2013 - link
What I dislike about Mantle is that it would give PC users who own AMD hardware a huge advantage, and Nvidia would not be able to compete. Having both Mantle and DirectX would fuck up the GPU market. I own a GCN GPU, but I don't want the competition to go away. Nvidia could not compete with this because they have no standing in the new generation of consoles. So, like everyone else is saying, if developers decide to use Mantle, they would effectively need to develop two different ports for the PC to accommodate both Mantle-compatible GPUs (GCN) and Nvidia hardware. Mantle is supposed to make development more efficient, but it actually makes the process of porting to PC more complicated.
Spaceman000 - Thursday, September 26, 2013 - link
What you people are reading from AMD is just the DX10 and DX11 benefits over DX9. It's finally here, after so many years, ROFL...
wumpus - Friday, September 27, 2013 - link
It's what you get for letting Microsoft monopolize your API.
awvz - Thursday, September 26, 2013 - link
What could this new API potentially mean for Valve and the Steam Box?
OverclockedCeleron - Thursday, September 26, 2013 - link
It could allow games to run natively on SteamOS. That is, if Valve decides to work with AMD to implement Mantle in SteamOS. Valve would then benefit from the porting process, because developers might decide to port to SteamOS while they are already in the process of porting to Windows.
ericore - Thursday, September 26, 2013 - link
Welcome our new AMD overlords INDEED :)
For starters, I want to state up front that I actually like, even prefer, Nvidia drivers, but hate their CEO.
Second, it was massively stupid regardless of current positioning to let AMD win both the Xbox and Playstation contract, and I hope they suffer the maximum damage. I simply have to support the smarter party.
"Consequently while Mantle is good for AMD users, is Mantle good for NVIDIA and Intel users? Do developers start splitting their limited resources between Mantle and Direct3D, spending less time and resources on their Direct3D rendering paths as a result?"
That was a very unintelligible and unneeded statement.
Mantle exists so that PlayStation and Xbox game developers can keep their Mantle code while they do what they normally do: program against the Direct3D API. Anand, you even said this yourself earlier. And the whole point is to give AMD an advantage, and gamers and power users can't complain. You failed to apply your own "What's New This Time Around with Low Level APIs" point. The difference is that AMD owns the two biggest console players, and developers know they need to keep developing in Direct3D, but can keep their Mantle code.
I for one support mantle and validate its eternal existence.
Eddytion - Friday, September 27, 2013 - link
No, Mantle is developed for the GCN architecture only (found in the 7000 series and above). This is clearly bad news for Nvidia, and they should be worried. It's gonna become a whole lot easier for developers to port their games from next-gen consoles to Mantle than to other APIs.
blppt - Thursday, September 26, 2013 - link
I, for one, miss Glide. Yeah, you were locked into one chip manufacturer, but man, you were pretty much guaranteed that a game using that API would run properly, and as the developer intended, simply because the driver the developers wrote to could be highly optimized. While D3D/OGL allows choice in chips, it also increases the little bugs and the un-optimized performance across competing chipsets (usually AMD vs. Nvidia, although it has been the reverse at times).
Still, even if 3dfx had not died, they too were moving towards D3D optimization before Nvidia ended their existence, and Glide would have died out anyway.
fishfishfish - Thursday, September 26, 2013 - link
Must know more! Can you tell us when the NDA is to be lifted?
iniudan - Friday, September 27, 2013 - link
There will be more announcements in November.
fishfishfish - Friday, September 27, 2013 - link
Bah. Humbug. Hopefully Anand can give us a bit more insight before this announcement date.
Felix_Ram - Friday, September 27, 2013 - link
Wouldn't one expect Gabe Newell and Valve to have their mouths watering over this? I mean, their Steam machines are Linux based; couldn't Mantle be the software to outdo DX11, with their Linux build being much more gamer-friendly than Windows ever was?
The question, of course, is what Sony and Microsoft would think of AMD helping Valve become direct competition for the consoles. Microsoft sure as hell won't like it if Steam machines end up with Linux and Mantle. :)
jwcalla - Friday, September 27, 2013 - link
The impression I get is that AMD is very much not committed to SteamOS... Android... and just about anything Linux-related. Microsoft shouldn't have a hard time keeping AMD in the fold.
JNo - Friday, September 27, 2013 - link
First slide from AMD: "Leverage optimization work from next-gen game consoles to PCs."
Article: "Instead we have to talk about what is not said, not even hinted at ... the console connection."
But they openly talked about leveraging that connection in the slide no?
::confused::
Eddytion - Friday, September 27, 2013 - link
Does this mean that we get to keep our graphics cards for longer periods (for example, 4 years) and still get good performance?
Haider - Friday, September 27, 2013 - link
It'll probably be the same low-level API provided by Sony and MS. GCN is GCN, pretty much like a 68000 was a 68000 on an Amiga, ST, or Mac. I used to take 68000 assembler written on an Amiga over to a Mac at uni. If it resolves the draw-call performance problem on PCs and allows games like the GTA series to really use our GPUs, kudos to AMD. I believe the low-level code being used on the consoles is being ported across for AMD GCN video cards. Still need MS to address draw-call performance at the high level, though.
Klimax - Friday, September 27, 2013 - link
Sounds like a horrible 90s idea. A stupid, horrible idea. This is vendor lock-in (and good luck with drivers) and lock-in to the current architecture. Should Nvidia get a significantly better arch with Maxwell or even Volta, AMD won't be able to react.
And PCs are not consoles; keep console crap out of it and forget this ever existed. (It hasn't even proven its reason to exist against current solutions like DX11.)
Filiprino - Friday, September 27, 2013 - link
In the Windows world Direct3D may be a high-level API, but on GNU/Linux OpenGL isn't a high-level API, more so when you have things like LLVM to compile and optimize shaders for the GPU architecture. After all, with Mantle you are still using an API. Nobody would program in GPU assembler in a large project, with so many registers and long instructions.
Jaybus - Friday, September 27, 2013 - link
Huh? OpenGL is a very high-level API. In fact, it is a general-purpose graphics API that is completely abstracted from the hardware. Yes, Mantle is an API, but its functions have a 1:1 correspondence with a hardware function, in the same manner that the _InterlockedExchange() intrinsic maps to the xchg opcode on x86 hardware. It is mostly like intrinsics, although for GCN hardware instead of x86. Register assignment, etc. is handled by the compiler (or the API function itself is coded in assembler).
gamoniac - Friday, September 27, 2013 - link
Great article. Can't wait for the next one. I wonder how the newly announced SteamOS will come into play with all that's going on in the gaming world.
Pastuch - Friday, September 27, 2013 - link
1. Graphics performance increases due to die shrinks are slowing down dramatically. See reference document from the article above:
http://www.bit-tech.net/hardware/graphics/2011/03/...
TSMC is nowhere near ready for 20nm chips; we'll be lucky to have them by the end of 2014.
http://videocardz.com/45403/nvidia-to-launch-more-...
2. Due to point one, Game devs will need to better optimize their code for hardware.
3. Game development is focused on reaching as many people's devices as possible. Consoles, PCs, smartphones... More platforms = more money. PC exclusives will become less and less common, and this is not a bad thing. Even for die-hard PC gamers like myself, I can see the benefit of more hardware-optimized code. Note: I own a GTX 670.
4. -IF- Mantle is in both the PS4 and XB1, then game developers that work on AAA titles will take advantage of it. We've already heard that Frostbite 3 engine titles support Mantle.
Frostbite 3 games in development:
Command & Conquer
Battlefield 4
Need for Speed: Rivals
Plants vs. Zombies: Garden Warfare
Dragon Age: Inquisition
Mirror's Edge 2
Star Wars: Battlefront
Untitled Mass Effect game - Bioware
5. Even if Mantle is a total failure, pretty much all future console games will likely be on PC anyway, because porting is ridiculously easy now thanks to x86. If you look at the launch titles for the XB1 and PS4, even the "exclusive" titles are on PC too.
Dman23 - Friday, September 27, 2013 - link
VERY INTERESTING. I think this will be a Net / Net win for cross-platform gameplay!
magnusmundus - Friday, September 27, 2013 - link
I'll be interested to see how the performance benefits of Mantle compare with the performance benefits of the ARM CPU in Maxwell.
Brainling - Saturday, September 28, 2013 - link
Anyone who thinks AMD has "won" is crazy and a complete fanboy. No one has "won" anything. I've flip-flopped between AMD and NVIDIA as each one takes the upper hand, and I will continue to do so. Any gains AMD makes here, NVIDIA will make up in some other area. Then AMD will catch up to that, and on and on we go. The amount of religious zealotry and fanboyism in this thread is disturbing: video cards are not a religion. They are a tool. You use the best one available to you at the price point you can afford, regardless of the name badge on it.
Arnulf - Saturday, September 28, 2013 - link
"Direct3D took over as the reining king"
I don't think "reining" means what you think it means.
junky77 - Saturday, September 28, 2013 - link
Where is the open-source part? I also don't understand why OpenGL/DX can't include something like Mantle and the NVIDIA/Intel competitors under its umbrella, with some porting tools. Glide has several wrappers...
How can you mitigate the negative parts of things like Mantle?
dahippo - Saturday, September 28, 2013 - link
Why are so many against this? AMD took an initiative to develop this from a request from PC game developers. Even if 3/4 can't use this technology, it's an indication of how big or small the benefits are. I read somewhere that MS has given it full acceptance.
lilmoe - Saturday, September 28, 2013 - link
Don't ask how I got to this conclusion, but it seems that Microsoft is going to buy nVidia (to start with) in the near future and start building/designing their own chips.
iniudan - Sunday, September 29, 2013 - link
Why would Microsoft do that? People purchasing and developing on their ecosystem rely on the x86 architecture, and Nvidia doesn't have an x86 license. On top of that, it would probably alienate Intel, and you don't want to alienate Intel if most of your business is based on x86. Such a purchase would most likely get them to start a push for Linux to OEMs and software developers.
lilmoe - Sunday, September 29, 2013 - link
They don't need to push Linux, doesn't Apple do so?? They now own Nokia, and they started making their own tablets. Windows Phone and Windows RT should be enough to saturate all of nVidia's Tegra production. Tegra might as well be proprietary, since it really failed on Android. Microsoft is going to leave x86 alone (for both Intel and AMD). It's a win-win strategy for them. But Windows on ARM (RT/Phone) needs to be really tightly integrated with hardware (and more cheaply so) to compete effectively with "flagship" Android tablets/smartphones, the iPad/iPhone, and most importantly, cheap Androids that are powered by CHEAP and relatively powerful Chinese SoCs. Microsoft now owns Nokia (as I said above), which means they're going for most of the mobile business. If Microsoft wants to further pursue a "profitable" hardware side of the business, it's essential (and common sense) that they buy the ARM chip division of a chip maker, or at least start designing their own SoCs (like Apple does). We all know that the most (or more appropriately, the ONLY) profitable mobile OEMs are those who design/fab their own SoCs (Apple and Samsung). And with how things are going, this is how I look at it:
- Qualcomm is the indisputable king of ARM SoCs. nVidia simply cannot compete at this time.
- Intel is the indisputable king of x86/64 CPUs, AMD is second best (and the only competitor). nVidia doesn't and probably won't ever compete in that market.
- AMD is already giving nVidia a hard time in the GPU business. AMD is likely to gain even more ground with ^that low-level API, which will make things that much harder on nVidia in their "only" successful line of business, especially since they lost the console war.
This is all bad news for nVidia. They now only "appeal" to a very few enthusiasts who look for marginal gains, while the majority of the market finds AMD GPUs more appealing. Look how sharply they lowered the prices of their GPUs in the past couple of years. Their profits aren't nearly what they used to be. They knew that already, and that's why they went forward with their ARM division, which, like I said, isn't doing so well anymore.
I believe it's all a prelude to lowering the market share of Tegra chips, weakening their discrete GPU market, and ultimately buying them out (or at least their ARM division). Typical business/buyout practice. We've seen this already with Nokia. Icera's tech is also appealing for LTE, you know.
In 2-3 years, Nokia Windows Phones, and Surfaces are only going to be powered by nVidia SoCs with Icera LTE modems, and are all a part of Microsoft. That's the only way Microsoft might be profitable in a business where a comprehensive hardware, software, and services ecosystem is essential for success.
lilmoe - Sunday, September 29, 2013 - link
Does Apple 'do' so***
martixy - Sunday, September 29, 2013 - link
Despite all the valid caveats, I'm positively giddy right now. By Occam's razor and just general bloody common sense, the console connection is absolutely inevitable.
Ever since they announced the new console specs, I've been preaching left and right about how awesome they are. Not because I'm a fanboy, mind you (if anything you could put me in the hater camp; never owned one, probably never will), but because of the impact they will have on PC gaming and the industry in general. Case in point right here.
Maybe even, just maybe with that very same connection the industry will sit down on their collective asses and standardize some of the stuff. One can always dream...
Still, back to the real world. Wait and see we must. I hope it turns out the silver bullet we all want it to be.
Wwhat - Sunday, September 29, 2013 - link
Silly to say AMD won't confirm this and pretend you are cleverly analyzing all of this, when in fact it was all said at the damn AMD conference in the DICE presentation...
Kutark - Sunday, September 29, 2013 - link
I find it deliciously amusing that all these AMD fanboys are sitting here quivering in their little space boots, when if they were smart enough to see the forest for the friggin' trees they would realize that having one player controlling the entire game IS NOT GOOD FOR THE CONSUMER. In the very same breath they will slam MS or Nvidia for monopolistic practices, and then talk about how great it would be for AMD to smash Nvidia and MS into the ground. Morons.
Midwayman - Tuesday, October 1, 2013 - link
And people laughed at me when I said AMD having the gpu in both the xb1 and ps4 would have repercussions in the PC gaming space.
TruePath - Friday, October 4, 2013 - link
There is no reason you couldn't take advantage of both the low-level performance and the high-level convenience. All you would need to do is compile your game down to some kind of intermediate code representation and have the Nvidia/AMD compiler translate that IR to native code on the machine it is run on. Yes, you would need to keep most of the program in IR, since many optimizations will probably require knowledge of the program making the calls.