What are the chances this shows up in the Galaxy S7? Because let's be honest, a large part of the reason for Qualcomm's drop in revenue is that Samsung is sticking to their own chips now. But the 820 looks interesting indeed.
Probably better than the chance that this will show up in the iPhone 7, haha. But in all seriousness, I think everyone would be very surprised to see Sammy switch back. Any specific guesses on my part would be pure BS.
It will probably use multiple sourcing like other models have. Samsung may sell enough phones to need to source SoCs from Qualcomm as well. Also, Samsung sells their Exynos SoCs to other vendors, so the entire production is not dedicated to internal use.
There are a lot of complaints about network issues caused by the S6's Shannon modem, especially in NA, on both the operator and client sides; a return to the previous arrangement of Qualcomm for US/Canada and Exynos for everyone else seems likely for that reason. Especially since the S820 seems quite decent.
The rumors were that Qualcomm gets the CDMA carriers while everything else, including the international unlocked version, is Exynos. Samsung would save a lot of money with their own SoC, so they would use something else only if there was a very good reason to. The perf here doesn't seem like a good enough reason.
I remember back in the early days of Exynos, the USA galaxy phones would get snapdragon simply because Sammy couldn't produce enough of their own to cover global demand. They don't have that problem anymore. And like you said, there really aren't any compelling reasons for them to go back to Snapdragon that would justify the cost.
In fairness the CPU perf is great. Sure, it doesn't seem like it will crush the A72, but compared to the SD801 the perf is fantastic - the SD810 was terrible, so it can be ignored. In Geekbench integer and FP it's over 2x the SD801 in single core, and that's insane in a phone. That's why a comparison with older desktop cores would be nice; CPU perf is clearly hitting good enough in phones. The memory perf is stupid good here. 2016 is exciting; which SoC wins doesn't really matter, we the users certainly win.
How can you say performance is great?! It is at best as good as the A72 performance reported so far, and not even on the same node size. So what was the point of all that research and development if you build something only as good as ARM's offering? No wonder other companies feel safe using ARM for their SoCs. I don't think Samsung will go for this if they can manufacture their own SoC in time.
@mmrezaie: "It is actually at best as good as A72 performances reported so far, and not even on a same node size. So what was all the reason behind research and development when you build something as good as ARM's offering."
To be fair, Qualcomm used ARM cores for the 810 and that turned out, let's say, less than stellar. I can understand why they might want to go back to their own IP. Even if performance is the same, that doesn't mean power consumption, heat density, etc. are the same. Samsung uses ARM cores for their Exynos chips and IIRC they had to fix a problem with their cores in the past as well to get the cores up to the speed they wanted. Also, designing it yourself can be cheaper than licensing the core from ARM if you are proficient enough.
What happened, really, was that the Exynos chips weren't as good as the Snapdragon, and would be used in regions where cost was more important than ultimate performance. The 810 wasn't used by them because it wasn't a very good chip, and it's only been recently, with the later tape-outs, that some of the heating problems have been mitigated.
I'm not sure when this false perception started, but that's just the fantasy of a handful of Snapdragon apologists. Exynos SoCs have ALWAYS been better than Snapdragons since their inception with the Hummingbird (GS1). Samsung has only stumbled with the Exynos 5410 (where they didn't have a "working" CCI), and less so on the 5420, and that was more like ARM's fault rather than Samsung's. Samsung later "fixed" ARM's design and moved forward with better chips than Qualcomm's offerings starting with the Exynos 5422 (except for the integrated modem part, up until recently that is).
They sometimes ran hotter than Snapdragons, but Samsung always fixed things up with updates. Apps and games have always run better and smoother on Exynos variants (even the 5410), and they've always aged better, *especially* after patches and software updates. I can attest to that through experience, and so can many reviewers.
Snapdragon apologists have always argued "custom ROM support". That's a very small percentage of users. Like REALLY small, to the point of irrelevance.
The main reason I do prefer devices with snapdragon SoCs is because of AOSP-based ROMs. You're right, it's a very small minority, but it's still nice to have. Samsung makes A+ hardware but their software is kind of meh to me. Personal preference and all.
But Exynos hasn't historically been a very good chipset. Sure, it beat the 810 this year and did well with heat, but there have been shortcomings on Exynos in the past.
I remember back in the days of GS2, Exynos didn't have anything on their chipset to support notification LEDs (and even if they did, no one knew how to use them), while the same was available on Qualcomm's chips. Also, up to at least the Galaxy S4 (maybe even S5), Samsung's Exynos devices had a very seriously annoying lag between the time the home button was pressed and the screen turned on. Someone said it was because of the S-voice shortcut but it persisted on custom ROMs where S-Voice wasn't even there. It was some hardware thing.
TL;DR: being a benchmark destroyer isn't everything.
The Exynos 5410 wasn't really a failure, if only because its GPU is FAR ahead of the Adreno 320 used in the SD600. I was just surprised when I saw how badly the Adreno 320 runs real 3D games.
That's nowhere near true, I'm afraid. The PowerVR SGX 544 MP3 was beaten in almost every metric by the Adreno 320 rev. 2. CPU-wise the 5410 was better due to its higher-IPC A15 cores, but GPU performance was better on the Snapdragon 600, in real life as well as in most benchmarks.
The Adreno 320 revision 2 is still relevant at around 85 GFLOPS, performing better than midrangers like the Adreno 405 and Mali T760 MP2. It's almost identical in performance to the Rogue G6200 used in MediaTek's MT6595 and MT6795.
Sorry, but the Adreno 320 v2 doesn't exist at all. Stop believing the delusional lies from a Chinese blog.
As I said, I was just surprised how bad the Adreno 320 really is. I don't trust benchmarks, because it seems like Qualcomm bought most of the benchmark makers. I only trust real performance, in real games and apps.
Actually Samsung probably wouldn't save any money by using an Exynos SOC.
The two divisions are independent of each other, which means that Samsung the SOC vendor charges Samsung the device vendor the same prices they charge everyone else.
I doubt Apple would let them manufacture their CPUs if they weren't separate divisions with firewalls between them.
@V900: "Actually Samsung probably wouldn't save any money by using an Exynos SOC."
They'd most likely save some. Just not enough to forgo a better chip if available.
@V900: "I doubt Apple would let them manufacture their CPUs if they weren't seperate divisions and had firewalls between them."
The "firewall" would exist around the fabrication facilities only. R&D and architecture design have no bearing on Apple products. If they are sufficiently proficient at design and the cost of the ARM IP doesn't eat the savings, then they could save some here.
@V900: "The two divisions are independent of each other, which means that Samsung the SOC vendor charges Samsung the device vendor the same prices they charge everyone else."
Current fabrication facilities (TSMC, GloFo, et al.) don't charge the same price to every customer. They will give discounts for volume, customer loyalty, just to keep the fabs busy, etc. Samsung could charge themselves preferred pricing, but it certainly wouldn't be free. How much they could save here depends on what they charge vs. their competitors (i.e. TSMC) and whether there is any margin for preferred pricing. Sometimes they will give their competitors very low-margin pricing just to keep the fab busy until their next push. Samsung has generally been short on supply, so this hasn't happened much, but given their new expansion, it may be a consideration in the future.
I suppose if they're gonna use Qualcomm one last time, it would be for the S7 and Note 6. Chances are pretty good, to accommodate those who are 'stubborn' about Qualcomm's stuff.
After that, they are going to use their M1 and its derivatives for everything else; better margins in a saturated market are their goal in the first place.
Well, it won't be long until Google releases their own processor design to standardize Android's madness.
Performance improvements are nice and all, and I'm more excited about the extra features such as Zeroth, Sense ID, and Smart Protect, but Qualcomm must under no circumstances blow it again on the heating/power consumption front. Whatever compromises they need to make for that not to happen again, they must make them.
The Snapdragon 810 overheating issue was very much real, even with the latest versions where they claimed to have "fixed" the issue. Play any game on a 810 device for 10 minutes, and you'll see what I mean. The device get unnaturally hot. That's completely unacceptable and should never again be decided as a "compromise" in order to beat Apple in performance or whatever. Never again!
Now, I hope Qualcomm will focus even more on hardware-enabled security features. It also makes no sense for them to support SHA1 anymore, but I guess that was a decision taken years ago. Next version should drop support for it. What I'd like to see is ChaCha20 acceleration as soon as possible, as it will be part of TLS 1.3 and will be included in OpenSSL 1.1.
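For reference, ChaCha20's core primitive (the thing hardware acceleration would speed up) is just 32-bit adds, XORs, and rotates. A minimal Python sketch of the quarter-round from RFC 7539, checked against the spec's own test vector; this is an illustration of the operation, not a production cipher implementation:

```python
# ChaCha20 quarter-round (RFC 7539, section 2.1) -- the primitive a
# hardware implementation would accelerate. Illustration only.

MASK32 = 0xFFFFFFFF

def rotl32(x, n):
    """Rotate a 32-bit word left by n bits."""
    return ((x << n) | (x >> (32 - n))) & MASK32

def quarter_round(a, b, c, d):
    """One ChaCha20 quarter-round over four 32-bit words."""
    a = (a + b) & MASK32; d = rotl32(d ^ a, 16)
    c = (c + d) & MASK32; b = rotl32(b ^ c, 12)
    a = (a + b) & MASK32; d = rotl32(d ^ a, 8)
    c = (c + d) & MASK32; b = rotl32(b ^ c, 7)
    return a, b, c, d

# Test vector from RFC 7539, section 2.1.1
out = quarter_round(0x11111111, 0x01020304, 0x9B8D6F43, 0x01234567)
print(tuple(hex(w) for w in out))
# -> ('0xea2a92f4', '0xcb1cf8ce', '0x4581472e', '0x5881c4bb')
```

The full cipher runs 20 rounds of this over a 16-word state, which is why it maps so well to simple ALU hardware.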
I also wish Qualcomm would open source more parts of its security-related firmware, and would also open source its baseband firmware (I know, a hard thing to ask but only way we can be sure there's no backdoor in there). Otherwise, at the very least they should try to completely isolate the baseband firmware from most OS functions, so even if the baseband is "owned" they can't take control of the device, other than perhaps listen to phone calls.
Security is only going to become a more and more important feature in future chips, not just for smartphones and PCs, but also for IoT, which direly needs strong security by default, because we all know most IoT OEMs will never update those devices again after people buy them, or will only do it for a short while.
I keep seeing people complaining about the heat of the 810. I've got an HTC One M9, and I've played games on it. I'd characterize the experience as, well, warm. Ish. Posts like yours indicate people are experiencing heat that's an order of magnitude greater than I am.
Can you give me a sample workload that might allow me to experience this for myself? Barring that, can you give me an objective number in Celsius that's too high for you to bear?
It's not about the phone heating up, it's about the chip heating up and having to slow down. The problem is not the heat to your hand, the problem is that the chip slows down hard and you lose perf. So if you want to see it overheating, track clocks and load.
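A rough sketch of what "track clocks" means in practice: on Linux/Android the current clock of each core is exposed at /sys/devices/system/cpu/cpu&lt;N&gt;/cpufreq/scaling_cur_freq (in kHz). The helper below just summarizes an already-collected trace; the sample frequencies are hypothetical:

```python
# Sketch: spot thermal throttling from a trace of CPU clock samples
# taken under load. On Linux/Android you'd poll
# /sys/devices/system/cpu/cpu<N>/cpufreq/scaling_cur_freq (kHz) once a
# second while gaming; here we just analyze a pre-collected list.

def throttle_report(samples_khz, max_khz):
    """Fraction of samples spent below 90% of the max clock."""
    below = sum(1 for f in samples_khz if f < 0.9 * max_khz)
    return below / len(samples_khz)

# Hypothetical trace: clocks sag as the SoC heats up during a game.
trace = [1958400, 1958400, 1728000, 1555200, 1248000, 1248000, 960000, 960000]
print("throttled %.0f%% of the time" % (100 * throttle_report(trace, 1958400)))
# -> throttled 75% of the time
```

If that fraction climbs the longer you play, the chip is shedding performance to stay within its thermal envelope, exactly the complaint being made about the 810.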
Thanks @tipoo and @jjj. It's just that I see people posting stuff like, "The device get [sic] unnaturally hot." I parse this complaint to mean, "Ow, my hand!", not, "Dang, why so slow?"
I think yours is the valid complaint, but what I keep reading are comments like @Krysto's here. And I still think @Krysto and others are actually complaining about the device becoming uncomfortable to hold.
My meaning is this: "Quantify your complaint, @Krysto." If it's *really* the heat (really?), how hot is too hot? If it's the performance, maybe say so.
Mixed feels. A move away from the frankly bad Cortex A57 cores? Cool. But you could pretty well tell from the very first page that this was set to punch below the A9, couldn't you? Longer latencies on float and int, 1/3rd the L2, no L3 at all. And as a big.LITTLE design in particular, that means less sharing between the high-power/low-power core clusters.
And unlike the S810, there are 2, not 4, "big" cores here to at least make up for the lackluster IPC the 810 had compared to the A9.
Just think their aim wasn't high enough...Again...
Oi, don't slate the A57! Those cores run beautifully in Samsung's Exynos 7 without overheating or throttling, and performance of that SoC is broadly comparable to the A8.
Integration and implementation are just as important as the initial selection of IP blocks for an SoC.
You're forgetting that Samsung had a rather significant process advantage. Take that away, and they'd need to lower their clocks to SD808 levels, I'd imagine.
He's not talking about clock speeds. He's talking about the whole package. Samsung has lots more experience with big.LITTLE and their implementations are far superior than competing chips on the SAME process node. Both the Exynos 5433 and the Snapdragon 808 are built on 20nm, yet the Exynos performs AND sustains its performance better than the Snapdragon.
On Samsungs 14nm process, yeah they ran ok. But it effectively cancelled out a generation's worth of fabrication process advantage, just to be able to run the things without throttle hell.
This should be interesting. Phones delivering this chip will be seen, mostly, during the April-May period. That leaves them about 4 months, on average, before the iPhone 7 with the new A10 comes out. With this behind the A9 in many areas, that doesn't give them much leeway in performance or time.
So for most of the year Apple's chips are left basically unchallenged. It seems to me that shipping schedules for flagship Android phones need to shift, along with the release of high-end SoCs, to more closely match Apple's release dates, or there will always be this disparity.
While it's often said that Android phone manufacturers are competing against one another more than they are competing against Apple, that's only true because they have a hard time competing against Apple at the higher end. Having phones that better compete in performance on the same release schedule would help somewhat.
This chip really needed to come out last August, not next spring.
I disagree, they have no problem competing with Apple at the high end. They won me easily.
The A9 is a nice chip, but running iOS it's like having a Camaro SS with the limiter set at 75 mph.
I'm sorry, I just can't and won't consider the two ecosystems in any way similar. People buy the OS first and the device second. Like the iPhone, but want Android as the OS? Someone has an iPhone clone out right now just for you.
Wow. The voice of one dot speaking against reality. Apple's SoC designs and implementations are only expanding their leads on the competition. That ecosystem they also dominate in is building ever greater loyalty: they deliver and the software matches the hardware.
Interesting that they seem to be going with a small cache and the memory score is rather nuts for just 2x32bit.
"And though one could have a spirited argument about whether single-threaded or multi-threaded performance is more important, I’m firmly on the side of ST for most use cases." Do note that the SD820 has 2 cores clocked lower; it's not just 4 vs 8, it's 4+4 vs 2+2. Everybody in the dumb press will be tempted to forget that 2 of the cores are clocked lower here. As for ST perf, the thing is that at this perf level ST is more than enough, so it loses relevance. Would be nice if you guys would compare ST perf with Nehalem and newer desktop cores.
Anyway, it blows that you insist on using the same empty synthetic benchmarks that have no relevance at all. SPECint2000 and Geekbench are fine, but all else is irrelevant.
"Where the 820 MDP/S makes up for it is in the photo editing score, which is through the roof. Here Qualcomm’s development device holds a 34% performance lead over the next-fastest device, the 810/A57 based Mi Note Pro." So is it using the GPU or the DSP? If so, is that cheating, or will (all) actual apps use the GPU/DSP too, as they should? And how do all the other SoCs behave? Long live synthetic! It's like begging them to cheat....
"Apple’s commanding lead in ARM CPU performance." How is that, exactly? Have you actually done any math, at the very least at equal per-core power? In die area Apple is far behind, but you don't like that being mentioned. In Geekbench Apple does 2.5k in each of the 3 segments; Kryo does about 2.1k in FP and integer and well over 3k in memory. So 20% higher clocks could enable Kryo to match Apple's core in FP and integer. It's not impossible that in a dual-core config Kryo could clock 20% higher. Same for the A72. In the end, if MTK can clock 2x A72 at 2.5GHz on 20nm, they could do much better on 16FF+. In theory 16FF+ can provide up to 40% higher speed over 20nm, but only some 30% is needed. Ofc the A72 is also much, much smaller than Apple's core, and you can actually make a cheap SoC with it for $150 phones.
Vs the A72, it's hard to assume things. If the A72 goes to 2.5GHz in a quad config and matches the SD820 in power, then it's somewhat even, and not really. In Geekbench, Kryo at 2.15GHz vs the A72 at 2.5GHz should be about even in integer, with Kryo having some 10% lead in FP, but Kryo would be at higher per-core power. You've got cores 3 and 4 at likely half the power (or even less) at max load, so total power is like having 3 cores at max clocks. Folks could do that with the A72 too. Ofc it remains to be seen if the A72 can reach 2.5GHz or even more with fewer cores, and how everybody does on power.
It will be very interesting to see Kryo in servers, assuming it will be a slightly tuned Kryo and not something very different. The A72 does enable others to provide a multitude of configs in different price ranges, and that could be interesting. Just today a Xiaomi device showed up in Geekbench with an SD618 and just 2GB of RAM. 2GB of RAM would be too little for anything above the Redmi 3, and the Redmi 3 couldn't be priced above 699 CNY ($109). Sure, it would be dual A72 at low clocks on 28nm, but it's a start.
You're assuming it will happily clock 20% higher with no disproportionate power draw increases. This is what Qualcomm provided, so it only makes sense for the reviewer to test it as they got it, rather than speculating on what it would be while higher clocked.
I don't see how Apples die area matters to an end user. The cost is spread through the entirety of the product, they are premium products, but really all that matters in the end to a user is performance and battery life.
I will remind you that here you've got about the equivalent of 3 cores. So with 2 cores you would need 20% extra perf inside a 50% increase in per-core power; I was factoring in a certain amount of increase in power. Whether a 20% increase in clocks within a 50% increase in power is doable remains to be seen; we don't have enough data. That statement was about the core, not about the device or the SoC, and the core metrics are power, perf and area. The convos about the SoC and the device are different topics.
You ignore the fact that Apple has been shipping Kryo-class HW since 2014 and Kryo isn't going to ship until 2016. A two-year lead is commanding by any definition.
You talk as if only Qualcomm has access to 20% higher clocks, without acknowledging that Apple already ships a very similar design, the A9X, which clocks 30% higher, has no L3, and is approximately 70% faster than Kryo if we assume Kryo performs similarly to the A8 at 1.4GHz.
You don't make any sense at all. Apple's old-gen core wasn't all that fast, and the iPad Pro had higher clocks because the form factor allows it. We are talking about a phone form factor. It is true that Apple's per-core power consumption is a bit of an unknown, so certain things are assumed. Ryan is a fanboy and he tries to argue that fewer cores are better even though he knows very well that more cores provide more computing power in the same TDP. AT worked hard to convince everybody that more cores are better when Intel did it, and now they are working hard to convince everybody that fewer cores are better when Apple does it, because their OS is stuck in the past.
>Apple's old gen core wasn't all that fast while the ipad Pro had higher clocks because the form factor allows it. We are talking in a phone form factor.
The 6s???? That's a phone.
>that more cores provide more computing power in the same TDP.
Oh yeah checking my facebook requires GPU like parallelism
Your argument has a few problems. And the 6s runs very cool compared to every phone with a high-end Qualcomm I have had, which includes the S4, SD800 (x2), and SD810.
I'm quite confident the iPhone 6s could clock the CPU higher at the expense of not being able to keep it at max clockspeed in the smaller variant of the phone.
jjj, your comments on this topic are off-base. For starters, there is nothing stuck in the past about iOS. It handles symmetric multiprocessing as well as any device. Realistically, far more workloads are optimized for one processor. Every task will feel faster on a device with fewer but faster cores. There are very few workloads that truly benefit from multiple processors.
It's not using the GPU or DSP. That doesn't just happen automagically... the app needs to be specifically coded to do that. The reason it gets a high photo editing score is because it has really, really great FP performance. Note the Geekbench FP scores -- it is able to beat the 810 in MT in all but ONE test, with half as many cores and those 2 cores running much slower.
> That doesnt just happen automagically ... the app needs to be specifically coded to do that.
The app doesn't have to be coded for it. The photo processing part is done in RenderScript by stock Android APIs and that can use the GPU or fall back to the CPU if it fails to do so. Almost all recent Android devices use the GPU. The only time I saw something explicitly fall back to the CPU was on the G4 and it was just a small portion of the test.
Who cares about die size? Sure, it can affect yield and pricing, but it obviously isn't affecting them much, and, in the only evaluation that matters, Apple is able to have a CPU that obliterates every other mobile chip AND is able to sustain peak performance. BTW, do you have a link to a Kryo device that scores 2.1k on int? All the ones I've seen are around 1.8k, with their totals brought up by massive memory and FP scores (totals of around 2.5k).
The S810 MDP/T was a tablet, and the S810 suffered badly on 20nm planar. If you want to see how it compares, check out our HTC One M9 review, but it wouldn't tell us anything useful about what MDP/S -> retail will look like.
This doesn't inspire confidence. 820 looks much better than 810 at ST tasks and worse or equal in MT tasks, which seems like a good trade-off to me. However, seeing the 820 beaten in almost every benchmark by the A9 is troubling. You can't launch a flagship product six months after a competitor that performs worse than that competitor's product. Well, I guess you can, but it doesn't seem like a good move.
Even if you rule out the A9, since other manufacturers can't buy those, it's not that much better than Samsung's almost year-old (by the time the 820 is in a product) SoC in the GS6.
I'm curious what will be in the GS7. Another Exynos in the NA market? If they keep the momentum from the last one, they could perhaps punch above the 820 just like they did the 810. Not sure if their custom cores will be ready in time though?
To be frank, the same thing has been happening since the A7 (even the A6). Problem is, back then they used to cheat on benchmarks or only show us BS benchmarks like AnTuTu that only care about the number of cores, so the numbers lied back then; they do now too, just not as outrageously as before...
Dude, it has half as many cores. It's a 2+2 design vs a 4+4 design. The fact that it is able to stay close in integer and actually win in all but one test in FP is actually really good.
Although it kinda makes me wish they did a 3+3 design, but oh well, maybe they will in the future.
What I'm gathering from these charts is that every 2016 Android smartphone will have lesser performance than the 2015 iPhone, unless Samsung pulls some magic out of their hat with the next version of their chipset.
Yes, but so what? Apple has had better CPUs for several years now. Nothing new here. The more interesting thing is that, despite that, Android flagships are still able to keep up with the "highly optimized," "vertically integrated" iOS when it comes to everyday tasks.
That has never been true for iOS either. I've been doing that with my respective iPhones for years by now.
There are multiple options for that, but I personally use GoodReader as my generic file system manager on my iPhone and on my iPad. I can expose my files as a mountable network drive and I can also mount remote drives as well. I can also access files on DropBox and similar services from apps directly if I want.
This is not a real issue – nowadays it's mostly just a myth propagated among people who don't know better.
I just wish Qualcomm would admit the SD810 was terrible. Everyone knows it, and they spend time and money lying about it. Just look at their blog post: https://www.qualcomm.com/news/snapdragon/2015/02/1... That alone makes me not want to trust them ever again. People/companies make mistakes. But they need to admit to it and move on. Otherwise, people will just assume they're lying about SD820 as well!
Yep. Pretty embarrassing when your unreleased SoC can't beat one that came out months ago. Still, Qualcomm has had a disaster year with the SD810, so I wouldn't expect them to be in top form in 2016. If they can at least break even with Exynos in 2016, they could be in better shape in 2017.
In the interest of transparency, 6 comments have been removed. This isn't the place to attack other posters, and using "gay" as a pejorative isn't something that belongs here.
So let's be honest here. It looks like we are looking at a CPU that can trade blows with the Apple A8 (from 2014) but with a GPU that appears to be at least competitive with and probably has a slight edge over the Apple A9. I was hoping for a little more oomph on the CPU side. Maybe the Samsung's custom M1 cores in the the Exynos 8890 will be more impressive...
Your concept of "blowing out of the water" appears to be skewed. I didn't know 10% (or so) faster would qualify to make such a statement. bodonnell is more accurate using the term "slightly".
What do a few percent of extra GPU power matter when all the flagships have 2K/4K resolutions? All that matters is on-screen performance, and sadly it doesn't deliver (and if you count the performance drop after the 5-minute mark, it doesn't even come close to the AX chips).
That's true, my comment was purely academic in that other things being equal the GPU in the SD 820 appears to be slightly more powerful. It's true that in real world usage the A9 only has to drive up to a 1080p display, whereas 2016 flagships are likely to mostly have 1440p (or higher) displays.
Well yeah actually they do. This is non-shipping hardware using carefully selected parts in a large form factor. We don't yet know how the average part out of mass production will perform in actual phones, whether it can deliver sustained performance or throttle quickly, etc.
And despite the carefully selected parts and a demo platform designed to make the SoC look its best, it is beaten across the board by the A9.
In quite a few examples it also comes in behind the A8, which will be two years old by the time this SoC hits the street; don't forget about that!
In all fairness, Qualcomm's development devices, like the mdp820, are rarely tuned for performance, and many of the drivers may still have some rough edges around them.
But they're also nowhere near as demanding in terms of battery size and thickness as the production models that vendors will release sometime next year.
The MDP820 is 11mm thick and has a battery of over 3000mAh, which means it's hard NOT to provide ample cooling and plenty of battery life.
That may prove to be a lot harder in a cellphone with a sub-9mm case and a 2500mAh battery.
And let's not forget, that when Anandtech tested the 810MDP, there wasn't a trace of overheating to be found.
You do realize that what you're saying is that Android has been built to be svelte? This is actually somewhat true, given the Android One initiative. In practice it would mean that, far from being bloated (a really common criticism that folks like to throw at... pretty much any software they are having issues with), it is very carefully built for low hardware requirements. IOW, it would be extraordinarily fast on high-end hardware. All of this is to say that you're mostly wrong.
Being built to run on lowest common denominator hardware isn't necessarily the same as doing it well.
Just look at how fast and smooth WP 7/8/8.1 or iOS runs on phones with just 512 or even 256MB of RAM, and compare it with the asthmatic performance you'd usually see from an Android handset with twice as much RAM.
Android has always been bloated and slow compared to its competition (aside from Symbian and BBOS), and part of the explanation is probably that it's developed with the lowest common denominator in mind, with the focus placed on delivering acceptable performance on a handful of SoCs instead of delivering outstanding performance on one or two SoCs.
You haven't explained why I'm wrong except to say I'm wrong. Aiming for acceptable performance on low-end devices implies much better performance on much better hardware (all else being equal... which is the case here). Also, keep in mind that I didn't agree with the premise that Android is built with the lowest common denominator in mind.
Your argument makes no sense whatsoever. If Android is designed for low-end "lowest common denominator" hardware, then shouldn't it run circles around everything on high-end hardware?
Anyway, I have heard your argument before, and I heard it many times when the Apple fanboys explain why Apple gives you so little memory in its flagship phones. Well, guess what: Android doesn't need much memory either. You can do just fine with 1 or 2GB of memory. But these days memory is getting dirt cheap, so Android phone vendors often throw in a bit more memory as a bonus. On the other hand, Apple has always been an expert at charging the most money for the least hardware. Hence the "apology" from Apple and the Apple fanboys that Apple gives you so little memory because iOS can run just fine with only 1GB but Android cannot. This argument is utterly and stupidly wrong.
Actually, Android devices need a lot more RAM to keep the permanent stuttering from garbage collection halfway under control (but still can't eliminate it because it is fundamentally inherent).
iOS apps only need to push out other unused apps initially (which can be noticeable, but which is required on Android as well), but after they've gained enough RAM they can run completely stutter-free indefinitely, since iOS uses deterministic memory management without garbage collection.
As a consequence iOS devices can deliver completely smooth gaming performance, for instance, even with a lot less RAM and without the associated battery power draw, something which Android is fundamentally incapable of due to the choice of garbage collection.
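The GC-pause point can be illustrated in any garbage-collected runtime. Python's cycle collector is not Android's ART/Dalvik GC, but the principle, that a collection pass must trace the live heap and so its pause grows with object count, is the same. A rough sketch:

```python
# Rough illustration of garbage-collection pause cost: a collection
# pass must trace live objects, so the pause grows with heap size.
# Python's cycle collector differs from Android's ART GC, but the
# scaling behavior -- more tracked objects, longer pause -- is shared.
import gc
import time

def gc_pause_ms(n_objects):
    """Build n reference cycles, then time one full collection pass."""
    junk = []
    for _ in range(n_objects):
        a, b = [], []
        a.append(b)
        b.append(a)    # a <-> b cycle: refcounting alone can't free it
        junk.append(a)
    t0 = time.perf_counter()
    gc.collect()       # full pass traces everything, including junk
    return (time.perf_counter() - t0) * 1000

small = gc_pause_ms(10_000)
large = gc_pause_ms(200_000)
print("pause grows with heap: %.2f ms -> %.2f ms" % (small, large))
```

With 20x the live objects, the timed pause grows accordingly; on a phone a pause like that landing mid-frame is exactly the stutter being described.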
I have a lumia 635 with 512MB of ram. runs like @$$. Slow, laggy, slow loading times, crashing programs. Moto g with 1GB runs flawlessly by comparison.
A Helio X10 SoC found on phones <$200 is already massively overkill for the general user, Android inefficiencies or not.
Last time I checked, $600+ devices had a laughably tiny 5% share of total Android sales last quarter, and that of course is something Qualcomm would not mention in their marketing. The truth is hardly anybody cares about high-end SoCs outside the synthetic-benchmark whores that roam tech review sites.
What you're describing together with the fact of Apple's double-digit total market share is actually a symptom of the fact that Android has almost completely lost the high end of the market to Apple (which is reflected by Apple raking in almost all the profits in the industry).
Which of course is a major problem for Qualcomm: Android devices are not selling at prices comparable to Apple's, meaning that even the high-end Android devices are usually sold at steep discounts, which in turn puts major price pressure on the device makers to fight for low component prices. That effectively caps Qualcomm's pricing range well below what they probably need to finance their expensive CPU development while they're still trying to catch up to Apple.
I don't think it's fair to blame the processes. Apple ran on the same processes, and didn't have any problems. The problem was in the design of the chip. Let's just get that out there, instead of throwing blame elsewhere.
Last page (closing Thoughts), second paragraph, last line. "not a full-size tablet has was the case in the past couple of generations" I think you meant to say "as" not "has".
Don't forget that the S810 MDP was a tablet form factor, with all the extra cooling and battery and what-not that comes with it, while all the phones using the S810 are in the sub-6" category. Big difference!
The S820 MDP is a phablet form factor, so it should be closer to the reality of using an S820 SoC in a sub-6" phone.
All in all the preliminary results show some impressive performance gains over the older generation of SoCs in Android phones. Especially the memory bandwidth and GPU performance seem much better. I don't see the A9 as a direct competitor as it is on a whole different OS, and I doubt anyone ever jumps over to a different OS as a result of CPU benchmark scores. I expect the Samsung M1 SoC to step it up even a bit more, especially in multithreaded benchmarks, because of the extra cores. All in all it seems 2016 will finally again be a decent year for Android phones in terms of high end SoCs. I just hope all this extra power and efficiency won't be wasted on useless gimmicks such as QHD or 4K screens (VR applications aside).
Does it use all 4 cores at the same time? Two of these cores would be enough and it would not throttle at all. Why would you need 4 cores on a phone when even all the MacBooks except the 15" only have 2 cores? They used a big reference phone so the heat can dissipate.
It will be interesting to see if Microsoft goes with an 820 for their next phone and continues to be hamstrung by a single supplier option, or if they get off their duff and compile Win 10 Mobile for x86 Atom in the fabled "Surface Phone".
Based on the rumors I've been reading, sounds like the Atom X3 Surface Phone has been canceled in favor of a newer, upcoming Intel x86 mobile CPU architecture. Personally, I'm still hanging on to my old Lumia 822 until the Surface Phone is released, and I can't be happier if this rumor turns out to be true.
The X3 is old school architecture compared to what 2016 ARM CPU's will have, plus it would have totally sucked on the graphics side of things. Braswell arch ups the graphics capabilities, but is lacking on the CPU side, especially in single threaded. I'm hoping for a brand new, previously unannounced Intel x86 for 2016 that will make it into the Surface Phone.
Microsoft can't afford to cripple their premium line with subpar performance, even if does fill the niche of running x86 apps.
In 2017 Qualcomm will still be in a difficult spot. Apple will use the A10, and as far as I know Samsung's Exynos M2 architecture is a big IPC upgrade over the Exynos M1, while the S830 changes little in architecture compared to the S820 apart from Samsung's 10nm LPE process. In fact, the biggest reason Samsung uses Qualcomm processors is CDMA: Qualcomm holds the vast majority of CDMA patents, almost a monopoly, and CDMA only exists in China and the US, so Samsung has to use the S820 in those two regions. Otherwise, if the Exynos 8890 needed an external CDMA baseband, that would add significant cost.
This consumes 30% less power than the Snapdragon 810. The 810 throttles 50%, so the 820 will still throttle 20%. Better to use only 2 cores like Apple so it won't throttle at all and consumes less power. Two of these cores is enough.
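The 50% − 30% = 20% subtraction above isn't really how throttling composes; sustained clocks depend on how power scales with frequency. A rough sketch under an assumed cubic power-vs-clock model (wattage figures are hypothetical, not Qualcomm's):

```python
# Back-of-envelope model of thermal throttling. Assumptions (mine, purely
# illustrative): dynamic power scales with f * V^2 and V scales with f,
# so power ~ f^3; the phone chassis can dissipate a fixed power budget.

def sustained_clock_fraction(peak_power, budget):
    """Fraction of peak clock sustainable if power scales with clock^3."""
    return min(1.0, (budget / peak_power) ** (1 / 3))

budget = 3.0                   # watts a phone chassis might sustain (guess)
sd810_peak = 6.0               # hypothetical SD810 peak draw
sd820_peak = sd810_peak * 0.7  # "30% less power" per the comment

for name, peak in [("SD810", sd810_peak), ("SD820", sd820_peak)]:
    f = sustained_clock_fraction(peak, budget)
    print(f"{name}: sustains ~{f:.0%} of peak clock (throttles ~{1 - f:.0%})")
```

The point of the sketch is that throttling percentages don't subtract linearly: a cubic power curve means a 30% power reduction buys back much more sustained clock than "50% minus 30%" would suggest.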
I find that my SD800-based LG G2 runs smoothly most of the time, while barely getting warm and only sipping battery. So it will be impressive, and more than sufficient, if SD820-based smartphones have an SoC that's for many purposes 200% faster than the old SD800/SD801 parts without overheating or using too much power. I think we are close to entering the smartphone era when, just like with PCs, the CPU is fast enough for most ordinary users, and what now matters is the sum of all parts: the screen, the camera, the battery, the design, and the build quality. In this sense, in my opinion, the OnePlus X is probably one of the best phones of the year because it delivers all of this in a $250 package.
Haha, why is Qualcomm spending money on silly Cat 12 when we won't even properly use Cat 6? Maybe not even that; LTE Cat 4 isn't properly used, mainly in the USA where 4G is unknown to most people.
AnandTech only sees high performance and only praises whatever is fastest... blindly appraising the high end but never taking a look at battery life and how these faster and faster smartphones last just half a day. So pathetic. They will NEVER review the Mate 2, which is 4G and the best on battery life. Where are these review sites aiming to go nowadays?
Fast forward to Q4 2016 with a much better benchmark (GB4) and Qualcomm's Kryo looks even more embarrassing: it got completely destroyed by the A10, and even their own A72-based SD650 is matching the SD820 in ST performance, with better IPC to boot. With the A73 around the corner there is no good reason for Qualcomm to stick with a custom ARM core.
kspirit - Thursday, December 10, 2015 - link
What are the chances this shows up in the Galaxy S7? Because let's be honest, a large reason for Qualcomm's drop in revenue is that Samsung is sticking to their own chips now. But the 820 looks interesting indeed.
Clayevans - Thursday, December 10, 2015 - link
Probably better than the chance that this will show up in the iPhone 7, haha. But in all seriousness, I think everyone would be very surprised to see Sammy switch back. Any specific guesses on my part would be pure BS.
close - Friday, December 11, 2015 - link
It will probably use multiple sourcing like other models do. So Samsung may sell enough phones to have the need to also source SoCs from Qualcomm. Also, Samsung sells their Exynos SoCs to other vendors, so the entire production is not dedicated to internal use.
frostyfiredude - Sunday, December 13, 2015 - link
There are a lot of complaints of network issues because of the S6's Shannon modem, in NA especially on both the operator and client sides; a return to the previous method of doing US/Canada with Qualcomm and everyone else Exynos seems likely for that reason. Especially since S820 seems quite decent.
jjj - Thursday, December 10, 2015 - link
The rumors were that Qualcomm gets the CDMA carriers while everything else, including the international unlocked version, is Exynos. Samsung would save a lot of money with their own SoC so they would use something else only if there was a very good reason to. The perf here doesn't seem like a good enough reason.
Clayevans - Thursday, December 10, 2015 - link
I remember back in the early days of Exynos, the USA Galaxy phones would get Snapdragon simply because Sammy couldn't produce enough of their own to cover global demand. They don't have that problem anymore. And like you said, there really aren't any compelling reasons for them to go back to Snapdragon that would justify the cost.
jjj - Thursday, December 10, 2015 - link
In fairness the CPU perf is great, sure it doesn't seem like it will crush A72 but compared to SD801 the perf is fantastic - SD810 was terrible so can be ignored. In Geekbench in integer and FP it's over 2x the SD801 in single core and that's insane in a phone. That's why a comparison with older desktop cores would be nice. CPU perf is clearly hitting good enough in phones.
The memory perf is stupid good here.
2016 is exciting, what SoC wins doesn't really matter, we the users certainly win.
mmrezaie - Thursday, December 10, 2015 - link
How can you say performance is great?! It is actually at best as good as the A72 performance reported so far, and not even on the same node size. So what was the point of all the research and development when you build something only as good as ARM's offering? No wonder other companies feel safe using ARM for their SoCs. I don't think Samsung will go for this if they can manufacture their own SoC in time.
ddriver - Thursday, December 10, 2015 - link
Judging from the name, it will take Kryogenic cooling to really shine ;)
BurntMyBacon - Monday, December 14, 2015 - link
@ddriver: "Judging from the name, it will take Kryogenic cooling to really shine ;)"
Oh. I thought it meant that kryotic was used in the manufacturing process. ;')
Thanks for clearing that up.
BurntMyBacon - Monday, December 14, 2015 - link
@mmrezaie: "It is actually at best as good as A72 performances reported so far, and not even on a same node size. So what was all the reason behind research and development when you build something as good as ARM's offering."
To be fair, Qualcomm used ARM cores for the 810 and that turned out, let's say, less than stellar. I can understand why they might want to go back to their own IP. Even if performance is the same, that doesn't mean power consumption, heat density, etc. are the same. Samsung uses ARM cores for their Exynos chips and IIRC they had to fix a problem with their cores in the past as well to get the cores up to the speed they wanted. Also, designing it yourself can be cheaper than buying the core from ARM if you are proficient enough.
melgross - Thursday, December 10, 2015 - link
What happened, really, was that the Exynos chips weren't as good as the Snapdragon, and would be used in areas of the world where cost was more important than ultimate performance. The 810 wasn't used by them because it wasn't a very good chip, and it's only been recently, with the later tape-outs, that some of the heating problems have been mitigated.
lilmoe - Friday, December 11, 2015 - link
"Exynos chips weren't as good as the Snapdragon"
I'm not sure when this false perception started, but that's just the fantasy of a handful of Snapdragon apologists. Exynos SoCs have ALWAYS been better than Snapdragons since their inception with the Hummingbird (GS1). Samsung has only stumbled with the Exynos 5410 (where they didn't have a "working" CCI), and less so on the 5420, and that was more like ARM's fault rather than Samsung's. Samsung later "fixed" ARM's design and moved forward with better chips than Qualcomm's offerings starting with the Exynos 5422 (except for the integrated modem part, up until recently that is).
They sometimes ran hotter than Snapdragons but Samsung always fixes things up with updates. Apps and Games have always run better and smoother on Exynos variants (even the 5410), and they've always aged better, *especially* after patches and software updates. I can attest to that through experience, and so can many reviewers.
Snapdragon apologists have always argued "custom ROM support". That's a very small percentage of users. Like REALLY small, to the point of irrelevance.
kspirit - Friday, December 11, 2015 - link
The main reason I do prefer devices with Snapdragon SoCs is because of AOSP-based ROMs. You're right, it's a very small minority, but it's still nice to have. Samsung makes A+ hardware but their software is kind of meh to me. Personal preference and all.
But Exynos hasn't historically been a very good chipset. Sure, it beat the 810 this year and did well with heat, but there have been shortcomings on Exynos in the past.
I remember back in the days of GS2, Exynos didn't have anything on their chipset to support notification LEDs (and even if they did, no one knew how to use them), while the same was available on Qualcomm's chips. Also, up to at least the Galaxy S4 (maybe even S5), Samsung's Exynos devices had a very seriously annoying lag between the time the home button was pressed and the screen turned on. Someone said it was because of the S-voice shortcut but it persisted on custom ROMs where S-Voice wasn't even there. It was some hardware thing.
TL;DR: being a benchmark destroyer isn't everything.
Andrei Frumusanu - Saturday, December 12, 2015 - link
Notification LEDs have absolutely nothing to do with the SoC...
alex3run - Saturday, December 12, 2015 - link
Exynos 5410 wasn't really a failure just because its GPU is FAR ahead of the Adreno 320 used in SD600. I was just surprised when I saw how badly the Adreno 320 runs real 3D games.
LiverpoolFC5903 - Monday, December 14, 2015 - link
That's nowhere near true, I'm afraid. The PowerVR SGX 544MP3 was beaten in almost every metric by the Adreno 320 rev. 2. CPU wise the 5410 was better due to the use of higher IPC A15 cores, but GPU performance was better in the Snapdragon 600, in real life as well as in most benchmarks.
The Adreno 320 revision 2 is still relevant at around 85 GFLOPS, performing better than midrangers like the Adreno 405 and Mali T760 MP2. It's almost identical in performance to the Rogue G6200 used in Mediatek's MT6595 and MT6795.
alex3run - Monday, December 21, 2015 - link
Sorry but Adreno 320 v2 doesn't exist at all. Stop believing the delusional lies from a Chinese blog.
As I said, I was just surprised how bad the Adreno 320 really is. I don't trust benchmarks because it seems like Qualcomm bought most benchmark makers. I only trust real performance, in real games and apps.
tuxRoller - Monday, December 21, 2015 - link
How do you determine that the PVR SGX 544 is FAR ahead of the Adreno 320? Anything reproducible?
V900 - Friday, December 11, 2015 - link
Actually Samsung probably wouldn't save any money by using an Exynos SoC.
The two divisions are independent of each other, which means that Samsung the SoC vendor charges Samsung the device vendor the same prices they charge everyone else.
I doubt Apple would let them manufacture their CPUs if they weren't separate divisions with firewalls between them.
BurntMyBacon - Monday, December 14, 2015 - link
@V900: "Actually Samsung probably wouldn't save any money by using an Exynos SOC."
They'd most likely save some. Just not enough to forgo a better chip if available.
@V900: "I doubt Apple would let them manufacture their CPUs if they weren't seperate divisions and had firewalls between them."
The "firewall" would exist around the fabrication facilities only. R&D and architecture design have no bearing on Apple products. If they are sufficiently proficient at design and the cost of the ARM IP doesn't eat the savings, then they could save some here.
@V900: "The two divisions are independent of each other, which means that Samsung the SOC vendor charges Samsung the device vendor the same prices they charge everyone else."
Current fabrication facilities (TSMC, GloFlo, et al) don't charge the same price per customer. They will give discounts for volume, customer loyalty, just to keep the fabs busy, etc.. Samsung could charge themselves preferred pricing, but it certainly wouldn't be free. How much they could save here is dependent on what they charge vs their competitors (I.E. TSMC) and if there is any margin for preferred pricing. Sometimes they will give their competitors very low margin pricing just to keep the fab busy until they have their next push. Samsung has generally been short on supply, so this hasn't happened much, but given their new expansion, it may be a consideration in the future.
zeeBomb - Thursday, December 10, 2015 - link
Damn it, late!
WorldWithoutMadness - Thursday, December 10, 2015 - link
I suppose if they're gonna use Qualcomm one last time, it would be for the S7 and Note 6. Chances are pretty good, to accommodate those who are 'stubborn' about Qualcomm's stuff.
After that, they are going to use their M1 and its derivatives for everything else; better margin in a saturated market is their goal in the first place.
Well, it won't be long until Google releases their own processor design to standardize Android's madness.
zeeBomb - Friday, December 11, 2015 - link
So the summary is... the Kryo CPU is giving Apple's A9 some major competition, but the GPU is great, beating the A9 in many of the tests. Also... the Kryo Snapdragon 820 attained a high 131648 and the Kirin 950 95280. Thoughts?
http://www.gizmochina.com/2015/12/11/snapdragon-82...
gg555 - Sunday, December 20, 2015 - link
It has already been heavily leaked that the S7 will use the 820 in some markets.
yeeeeman - Friday, March 13, 2020 - link
I can tell you from the future that Samsung will use both Exynos and Snapdragon for the GS7. The Exynos chip with custom Mongoose cores is better.
Krysto - Thursday, December 10, 2015 - link
Performance improvements are nice and all, and I'm more excited about the extra features such as Zeroth, Sense ID, and Smart Protect, but Qualcomm must under no circumstance blow it again on the heating/power consumption front. Whatever compromises they need to make for that to not happen again, they must make.
The Snapdragon 810 overheating issue was very much real, even with the latest versions where they claimed to have "fixed" the issue. Play any game on an 810 device for 10 minutes, and you'll see what I mean. The device gets unnaturally hot. That's completely unacceptable and should never again be accepted as a "compromise" in order to beat Apple in performance or whatever. Never again!
Now, I hope Qualcomm will focus even more on hardware-enabled security features. It also makes no sense for them to support SHA1 anymore, but I guess that was a decision taken years ago. Next version should drop support for it. What I'd like to see is ChaCha20 acceleration as soon as possible, as it will be part of TLS 1.3 and will be included in OpenSSL 1.1.
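For reference on the SHA-1 point: the SHA-1 instructions are part of the ARMv8 crypto extensions (as another commenter notes below the article), but the algorithm is no longer considered collision-resistant, which is why newer protocols prefer SHA-256 (and ChaCha20-Poly1305 in TLS 1.3, which needs a third-party library in Python, so it is left out here). A quick hashlib comparison using the standard "abc" test vectors:

```python
# Compare SHA-1 and SHA-256 digests of the FIPS 180 "abc" test message.
# SHA-1 is fast and often hardware-accelerated, but deprecated for
# collision resistance; SHA-256 is the modern default.
import hashlib

msg = b"abc"
print("SHA-1  :", hashlib.sha1(msg).hexdigest())
print("SHA-256:", hashlib.sha256(msg).hexdigest())
```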
I also wish Qualcomm would open source more parts of its security-related firmware, and would also open source its baseband firmware (I know, a hard thing to ask but only way we can be sure there's no backdoor in there). Otherwise, at the very least they should try to completely isolate the baseband firmware from most OS functions, so even if the baseband is "owned" they can't take control of the device, other than perhaps listen to phone calls.
Security is only going to become a more and more important feature in future chips, not just for smartphones and PCs, but also for IoT, which direly needs strong security by default, because we all know most IoT OEMs will never update those devices again after people buy them, or will only do it for a short while.
ganz - Thursday, December 10, 2015 - link
I keep seeing people complaining about the heat of the 810. I've got an HTC One M9, and I've played games on it. I'd characterize the experience as, well, warm. Ish. Posts like yours indicate people are experiencing heat that's an order of magnitude greater than I am.
Can you give me a sample workload that might allow me to experience this for myself? Barring that, can you give me an objective number in Celsius that's too high for you to bear?
tipoo - Thursday, December 10, 2015 - link
iirc, the M9 had a patch for the overheating issue, but that just ended up throttling performance earlier to never get so hot.
jjj - Thursday, December 10, 2015 - link
It's not about the phone heating up, it's about the chip heating up and having to slow down. The problem is not the heat to your hand, the problem is that the chip slows down hard and you lose perf. So if you want to see it overheating, track clocks and load.
lucam - Friday, December 11, 2015 - link
So if the chip is not heating up it's not a problem; we need to wait and see if the phone heats up too... very weird point of view.
Impulses - Friday, December 11, 2015 - link
Your backwards interpretation of what he said is the only weird thing.
ganz - Friday, December 11, 2015 - link
Thanks @tipoo and @jjj. It's just that I see people posting stuff like, "The device get [sic] unnaturally hot." I parse this complaint to mean, "Ow, my hand!", not, "Dang, why so slow?"
I think yours is the valid complaint, but what I keep reading are comments like @Krysto's here. And I still think @Krysto and others are actually complaining about the device becoming uncomfortable to hold.
My meaning is this: "Quantify your complaint, @Krysto" If it's *really* the heat (really?), how hot is too hot? If it's the performance, maybe say so.
extide - Thursday, December 10, 2015 - link
Supporting the SHA1 instructions is part of being ARMv8 compliant -- it's not really their choice...
lilmoe - Thursday, December 10, 2015 - link
On the first chart, the memory controller stack of the 820 should be 64bit dual channel.
tipoo - Thursday, December 10, 2015 - link
Mixed feels. A move away from the frankly bad Cortex A57 cores, cool. But you could pretty well tell from the very first page that this was set to punch below the A9, couldn't you? Longer latencies on float, int, 1/3rd the L2, no L3 at all. And particularly as a big.LITTLE design that means less sharing between high power/low power core clusters.
And unlike the S810, there's 2, not 4, "big" cores to at least make up for the lackluster IPC the 810 had compared to the A9.
Just think their aim wasn't high enough...Again...
r3loaded - Thursday, December 10, 2015 - link
Oi, don't slate the A57! Those cores run beautifully in Samsung's Exynos 7 without overheating or throttling, and performance of that SoC is broadly comparable to the A8.
Integration and implementation are just as important as the initial selection of IP blocks for an SoC.
tuxRoller - Thursday, December 10, 2015 - link
You're forgetting that Samsung had a rather significant process advantage. Take that away, and they'd need to lower their clocks to SD808 levels, I'd imagine.
Andrei Frumusanu - Thursday, December 10, 2015 - link
The Exynos 5433 runs quite a bit better than the Snapdragon 808.
tuxRoller - Thursday, December 10, 2015 - link
According to Wikipedia it's clocked at 1.9GHz, which is right around the Snapdragon 808.
lilmoe - Friday, December 11, 2015 - link
He's not talking about clock speeds. He's talking about the whole package. Samsung has lots more experience with big.LITTLE and their implementations are far superior to competing chips on the SAME process node. Both the Exynos 5433 and the Snapdragon 808 are built on 20nm, yet the Exynos performs AND sustains its performance better than the Snapdragon.
testbug00 - Sunday, December 13, 2015 - link
yes, Qualcomm's memory controller was busted. Doesn't make the A57 core any better.
It's a pretty bad core compared to just about everything else ARM offers currently. A7, A9, A12/17, A53, A72. All far superior to the A57 overall.
tipoo - Thursday, December 10, 2015 - link
On Samsung's 14nm process, yeah they ran OK. But it effectively cancelled out a generation's worth of fabrication process advantage, just to be able to run the things without throttle hell.
melgross - Thursday, December 10, 2015 - link
This should be interesting. Phones delivering this chip will be seen, mostly, during the April-May period. That leaves them about 4 months, on average, before the iPhone 7 with the new A10 comes out. With this behind the A9 in many areas, that doesn't give them much leeway in performance or time.
So most of the year leaves Apple's chips basically unchallenged. It seems to me that shipping schedules for flagship Android phones need to shift, along with the release of high end SoCs, to more closely match Apple's release dates, or there will always be this disparity.
While it's often said that Android phone manufacturers are competing against one another more than they are competing against Apple, that's only true because they have a hard time competing against Apple at the higher end. Having phones that better compete in performance on the same release schedule would help somewhat.
This chip really needed to come out last August, not next spring.
Refuge - Friday, December 11, 2015 - link
I disagree, they have no problem competing with Apple at the high end. They won me easily.
The A9 is a nice chip, but running iOS it's like having a Camaro SS with a limiter set at 75mph.
I'm sorry, I just can't and won't consider the two ecosystems in any way similar. People buy the OS first and the device second. Like the iPhone, but want Android? Someone has an iPhone clone out right now just for you.
mdriftmeyer - Friday, December 11, 2015 - link
Wow. The voice of one dot speaking against reality. Apple's SoC designs and implementations are only expanding their leads on the competition. That ecosystem they also dominate in is building ever greater loyalty: they deliver, and the software matches the hardware.
Move along and hope for the future.
mdriftmeyer - Friday, December 11, 2015 - link
Above comment should have been embedded below Refuge.
Nice preview, but, as it happens lately, what matters more is sustained performance, not some burst numbers during a single benchmark run.
jjj - Thursday, December 10, 2015 - link
Interesting that they seem to be going with a small cache and the memory score is rather nuts for just 2x32bit.
"And though one could have a spirited argument about whether single-threaded or multi-threaded performance is more important, I’m firmly on the side of ST for most use cases."
Do note that the SD820 has 2 cores clocked lower; it's not just 4 vs 8, it's 4+4 vs 2+2. Everybody in the dumb press will be tempted to forget that 2 cores are clocked lower here.
As for ST perf , the thing is that at this perf level ST is more than enough so it loses relevance. Would be nice if you guys would compare ST perf with Nehalem and newer desktop cores.
Anyway, it blows that you insist on using the same empty synthetic benchmarks that have no relevance at all. SPECint2000 and Geekbench are fine but all else is irrelevant.
"Where the 820 MDP/S makes up for it is in the photo editing score, which is through the roof. Here Qualcomm’s development device holds a 34% performance lead over the next-fastest device, the 810/A57 based Mi Note Pro."
So using the GPU or DSP? If so, is it cheating, or will (all) actual apps use the GPU/DSP too, as they should? How about the behaviour of all other SoCs? Long live synthetic! It's like begging them to cheat....
"Apple’s commanding lead in ARM CPU performance."
How is that exactly? Have you actually done any math, at the very least at equal per core power? In die area Apple is far behind but you don't like that being mentioned.
In Geekbench Apple does 2.5k in each of the 3 segments, Kryo does about 2.1k in FP and integer and well over 3k in memory. So 20% higher clocks could enable Kryo to match Apple's core in FP and integer. It's not impossible that in a dual core config Kryo could clock 20% higher. Same for A72. In the end if MTK can clock 2xA72 at 2.5GHz on 20nm, they could do much better on 16ff+. In theory 16ff+ can provide up to 40% higher speed over 20nm but only some 30% is needed. Ofc A72 is also much much smaller than Apple's core and you can actually make a cheap SoC with it for $150 phones.
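The clock-scaling claim above is easy to check: assuming Geekbench scores scale roughly linearly with clock (a big assumption, since memory latency does not scale with core clock), matching the A9's 2.5k from Kryo's 2.1k at its 2.15GHz big-core clock works out to about a 19% clock bump:

```python
# Sanity-check the comment's arithmetic. Scores are the comment's own
# round numbers; the linear-scaling assumption is a rough simplification.

a9_score = 2500    # "2.5k in each of the 3 segments"
kryo_score = 2100  # "about 2.1k in FP and integer"
kryo_clock = 2.15  # GHz, SD820 big-core clock

needed = a9_score / kryo_score  # required relative clock under linear scaling
print(f"Kryo needs ~{needed - 1:.0%} higher clock "
      f"(~{kryo_clock * needed:.2f} GHz) to match, assuming linear scaling")
```

So "20% higher clocks" is consistent with the quoted scores, though whether Kryo could actually sustain ~2.56GHz in a phone is exactly the open power question the rest of the thread argues about.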
Vs A72, it's hard to assume things. If A72 goes to 2.5GHz in quad config and matches the SD820 in power, then it's somewhat even, and not really.
In Geekbench Kryo at 2.15 vs A72 at 2.5Ghz should be about even in integer with Kryo having some 10% lead in FP but Kryo would be at higher per core power.
You got core 3 and 4 at likely half the power (or even less) at max load, so total power is like having 3 cores at max clocks. Folks could do that with A72 too.
Ofc remains to be seen if A72 can reach 2.5GHz or even more with fewer cores and how everybody does in power.
Will be very interesting to see Kryo in server. Assuming it will be a slightly tuned Kryo and not something very different.
A72 does enable others to provide a multitude of configs in different price ranges and that could be interesting. Just today a Xiaomi device showed up in Geekbench with SD618 and just 2GB of RAM. 2GB of RAM would be too little for anything above Redmi 3 and Redmi 3 couldn't be priced above 699CNY (109$). Sure it would be dual A72 at low clocks on 28nm but it's a start.
tipoo - Thursday, December 10, 2015 - link
You're assuming it will happily clock 20% higher with no disproportionate power draw increases. This is what Qualcomm provided, so it only makes sense for the reviewer to test it as they got it, rather than speculating on what it would be while higher clocked.
I don't see how Apple's die area matters to an end user. The cost is spread through the entirety of the product, they are premium products, but really all that matters in the end to a user is performance and battery life.
jjj - Thursday, December 10, 2015 - link
I will remind you that here you got about the equivalent of 3 cores. So with 2 cores you would need 20% extra perf inside a 50% increase in per core power. So I was factoring in a certain amount of increase in power. Whether a 20% increase in clocks with a 50% increase in power is doable remains to be seen; we don't have enough data.
That statement was about the core, not about the device or the SoC, and the core metrics are power, perf, and area. The convos about the SoC and the device are different topics.
lucam - Friday, December 11, 2015 - link
The problem with your dogma is that you are comparing a prototype tablet against a phone. That's why you are already mistaken...
michael2k - Thursday, December 10, 2015 - link
michael2k - Thursday, December 10, 2015 - link
You ignore the fact that Apple has been shipping Kryo-class HW since 2014 and Kryo isn't going to ship until 2016. A two year lead is commanding by any kind of definition.
You talk as if only Qualcomm has access to 20% higher clocks, without also acknowledging that Apple already ships a very similar design, the A9X, that clocks 30% higher, has no L3, and is approximately 70% faster than the Kryo if we assume the Kryo performs similarly to the A8 at 1.4GHz.
jjj - Thursday, December 10, 2015 - link
You don't make any sense at all. Apple's old gen core wasn't all that fast, while the iPad Pro had higher clocks because the form factor allows it. We are talking in a phone form factor. It is true that Apple's per core power consumption is a bit of an unknown, so certain things are assumed.
Ryan is a fanboy and he tries to argue that fewer cores are better even though he knows very well that more cores provide more computing power in the same TDP. AT worked hard to convince everybody that more cores are better when Intel did it and now they are working hard to convince everybody that fewer cores are better when Apple does it because their OS is stuck in the past.
Pissedoffyouth - Thursday, December 10, 2015 - link
>Apple's old gen core wasn't all that fast while the ipad Pro had higher clocks because the form factor allows it. We are talking in a phone form factor.
The 6s???????????????????????? That's a phone
>that more cores provide more computing power in the same TDP.
Oh yeah checking my facebook requires GPU like parallelism
lucam - Friday, December 11, 2015 - link
He will understand that next year at the time of 825 and A10....
michael2k - Thursday, December 10, 2015 - link
http://www.anandtech.com/show/8554/the-iphone-6-re...The only part faster was the Tegra K1 in a tablet form factor.
Honor 6 and Galaxy S6 were close but still overall slower.
testbug00 - Monday, December 14, 2015 - link
You have a few problems. And, the 6s runs very cool compared to every phone with a high end Qualcomm I have had. Which includes S4, SD800x2, SD810.
I'm quite confident the iPhone 6s could clock the CPU higher at the expense of not being able to keep it at max clockspeed in the smaller variant of the phone.
techconc - Wednesday, December 16, 2015 - link
jjj, your comments on this topic are off-base. For starters, there is nothing stuck in the past about iOS. It handles symmetric multiprocessing as well as any device. Realistically, there are far more workloads that are optimized for one processor. Every task will feel faster on a device with fewer but faster cores. There are very few workloads that truly benefit from multiple processors.
extide - Thursday, December 10, 2015 - link
It's not using the GPU or DSP. That doesn't just happen automagically... the app needs to be specifically coded to do that. The reason it gets a high photo editing score is because it has really, really great FP performance. Note the Geekbench FP scores -- it is able to beat the 810 in MT in all but ONE test with half as many cores, and those 2 cores running much slower.
Andrei Frumusanu - Thursday, December 10, 2015 - link
> That doesn't just happen automagically... the app needs to be specifically coded to do that.
The app doesn't have to be coded for it. The photo processing part is done in RenderScript by stock Android APIs, which can use the GPU or fall back to the CPU if GPU execution fails. Almost all recent Android devices use the GPU. The only time I saw something explicitly fall back to the CPU was on the G4, and it was just a small portion of the test.
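Andrei's point is about a dispatch pattern rather than anything the app author writes: the API layer picks an accelerator when one is available and silently falls back to the CPU when it isn't. A toy sketch of that shape (plain Python, not Android's actual RenderScript machinery; every name here is made up for illustration):

```python
# A toy sketch of the dispatch Andrei describes: the framework tries the
# accelerator and silently falls back to the CPU, so the app never has to
# code for either path. All names here are invented for illustration.
def run_filter(pixels, gpu_available):
    """Apply a trivial 'brightness' filter via whichever backend works."""
    def gpu_backend(data):
        if not gpu_available:
            raise RuntimeError("no GPU driver")  # e.g. the G4 case above
        return [min(255, p + 10) for p in data]

    def cpu_backend(data):
        return [min(255, p + 10) for p in data]  # same result, just slower

    try:
        return gpu_backend(pixels), "gpu"
    except RuntimeError:
        return cpu_backend(pixels), "cpu"

print(run_filter([0, 250], gpu_available=True))   # ([10, 255], 'gpu')
print(run_filter([0, 250], gpu_available=False))  # ([10, 255], 'cpu')
```

Either way the caller gets identical output, which is why a benchmark's photo test can land on the GPU on one device and the CPU on another without the app changing at all.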
jospoortvliet - Saturday, December 12, 2015 - link
This is interesting, perhaps add such information in the article next time?
tuxRoller - Thursday, December 10, 2015 - link
Who cares about die size? Sure, it can affect yield and pricing, but it obviously isn't affecting them much, and in the only evaluation that matters, Apple is able to have a CPU that obliterates every other mobile chip AND is able to sustain peak performance.
BTW, do you have a link to a Kryo device that scores 2.1k on int? All of the ones I've seen are around 1.8k, with their totals brought up by their massive memory and FP scores (totals of around 2.5k).
Wilco1 - Saturday, December 12, 2015 - link
820 vs A72 results: http://browser.primatelabs.com/geekbench3/4159755
Shadowmaster625 - Thursday, December 10, 2015 - link
I would have liked to have seen a comparison of the S810 MDP vs final hardware so we can get some kind of idea of the amount of optimization to expect.
Ryan Smith - Thursday, December 10, 2015 - link
The S810 MDP/T was a tablet, and the S810 suffered badly under 20nm planar. If you want to see how it compares, check out our HTC One M9 review, but it wouldn't tell us anything useful about what MDP/S -> retail will look like.
cfenton - Thursday, December 10, 2015 - link
This doesn't inspire confidence. The 820 looks much better than the 810 at ST tasks and worse or equal in MT tasks, which seems like a good trade-off to me. However, seeing the 820 beaten in almost every benchmark by the A9 is troubling. You can't launch a flagship product six months after a competitor that performs worse than that competitor's product. Well, I guess you can, but it doesn't seem like a good move. Even if you rule out the A9, since other manufacturers can't buy those, it's not that much better than Samsung's almost year-old (by the time the 820 is in a product) SoC in the GS6.
tipoo - Thursday, December 10, 2015 - link
I'm curious what will be in the GS7. Another Exynos in the NA market? If they keep the momentum from the last one, they could perhaps punch above the 820 just like they did with the 810. Not sure if their custom cores will be ready in time, though.
Araa - Thursday, December 10, 2015 - link
To be frank, the same thing has been happening since the A7 (even the A6). The problem is that back then they used to cheat on benchmarks or only show us BS benchmarks like AnTuTu that only care about the number of cores, so the numbers lied back then; they do now too, just not as outrageously as before...
extide - Thursday, December 10, 2015 - link
Dude, it has half as many cores. It's a 2+2 design vs a 4+4 design. The fact that it is able to stay close in integer and actually win in all but one test in FP is actually really good. Although it kinda makes me wish they did a 3+3 design, but oh well, maybe they will in the future.
saratoga4 - Thursday, December 10, 2015 - link
Got to save something for the Snapdragon 825 a year later.
alex3run - Friday, December 11, 2015 - link
The A9 still has worse CPU performance than the Exynos 7420.
techconc - Wednesday, December 16, 2015 - link
How so?
Thermogenic - Thursday, December 10, 2015 - link
What I'm gathering from these charts is that every 2016 Android smartphone will have lesser performance than the 2015 iPhone, unless Samsung pulls some magic out of their hat with the next version of their chipset.
tuxRoller - Thursday, December 10, 2015 - link
Yes, but so what? Apple has had better CPUs for several years now. Nothing new here. The more interesting thing is that despite that, Android flagships are still able to keep up with the "highly optimized", "vertically integrated" iOS when it comes to everyday tasks.
MykeM - Thursday, December 10, 2015 - link
http://www.xda-developers.com/marshmallow-reduces-...
tuxRoller - Friday, December 11, 2015 - link
....and? I'm aware of Android's latency issues (their touch latency is... not good either).
jasonelmore - Friday, December 11, 2015 - link
What good is the performance if it's a sandboxed phone? You can't carry files around on the phone or use it as a portable computer with a flash drive option.
Constructor - Saturday, December 12, 2015 - link
That has never been true for iOS either. I've been doing that with my respective iPhones for years by now. There are multiple options for that, but I personally use GoodReader as my generic file system manager on my iPhone and on my iPad. I can expose my files as a mountable network drive and I can also mount remote drives. I can also access files on Dropbox and similar services from apps directly if I want.
This is not a real issue – nowadays it's mostly just a myth propagated among people who don't know better.
Constructor - Saturday, December 12, 2015 - link
And, of course, that's just on top of a much more secure OS running on stronger, more efficient processors to begin with. There are legitimate reasons to choose Android, but yours is not among them.
syxbit - Thursday, December 10, 2015 - link
I just wish Qualcomm would admit the SD810 was terrible. Everyone knows it, and they spend time and money lying about it. Just look at their blog post:
https://www.qualcomm.com/news/snapdragon/2015/02/1...
That alone makes me not want to trust them ever again. People/companies make mistakes. But they need to admit to it and move on. Otherwise, people will just assume they're lying about SD820 as well!
sritacco - Thursday, December 10, 2015 - link
sooo... the 820 comes out next year and the iphone6 already kicks its ass? Nice tech leap.
syxbit - Thursday, December 10, 2015 - link
Yep. Pretty embarrassing when your unreleased SoC can't beat one that came out months ago. Still, Qualcomm has had a disaster year with the SD810, so I wouldn't expect them to be in top form in 2016. If they can at least break even with Exynos in 2016, they could be in better shape in 2017.
Rixxos - Friday, December 11, 2015 - link
You mean the iPhone 6s
SydneyBlue120d - Thursday, December 10, 2015 - link
Is unlimited 1080p60 or 2160p60 HEVC encoding with both IOS and HDR enabled supported by the device? Thanks a lot.
Ryan Smith - Thursday, December 10, 2015 - link
In the interest of transparency, 6 comments have been removed. This isn't the place to attack other posters, and using "gay" as a pejorative isn't something that belongs here.
tipoo - Thursday, December 10, 2015 - link
I vote for a script that automatically removes any first comments that just state "first" :P
jospoortvliet - Saturday, December 12, 2015 - link
Thanks for keeping the comments clean and for being transparent about it.
bodonnell - Thursday, December 10, 2015 - link
So let's be honest here. It looks like we're looking at a CPU that can trade blows with the Apple A8 (from 2014) but with a GPU that appears to be at least competitive with, and probably has a slight edge over, the Apple A9. I was hoping for a little more oomph on the CPU side. Maybe Samsung's custom M1 cores in the Exynos 8890 will be more impressive...
jasonelmore - Thursday, December 10, 2015 - link
The GPU cores are blowing the A9 out of the water.. it's not a slight edge..
ciderrules - Thursday, December 10, 2015 - link
Your concept of "blowing out of the water" appears to be skewed. I didn't know 10% (or so) faster would qualify for such a statement. bodonnell is more accurate using the term "slightly".
jasonelmore - Friday, December 11, 2015 - link
GFX Bench Texturing: 20%; GFX Bench ALU: 20%; GFX Bench Physics: 18%; all the offscreen benchmarks: 12%. And that's using pre-production chips, with pre-production drivers and software. Imagine when this thing ships and the software has been optimized.
3dmark is the outlier, and other sites are reporting this is a software driver problem.
bodonnell - Thursday, December 10, 2015 - link
Are we looking at the same benchmarks? That's sad if you consider that blowing it out of the water.
Araa - Thursday, December 10, 2015 - link
What do a few percent of extra GPU power matter when all the flagships have 2K/4K resolutions? All that matters is on-screen performance, and sadly it doesn't deliver (and if you count the performance drop after the 5-minute mark, it doesn't even come close to the AX chips).
bodonnell - Thursday, December 10, 2015 - link
That's true, my comment was purely academic in that, other things being equal, the GPU in the SD 820 appears to be slightly more powerful. It's true that in real-world usage the A9 only has to drive up to a 1080p display, whereas 2016 flagships are likely to mostly have 1440p (or higher) displays.
bodonnell - Thursday, December 10, 2015 - link
It also remains to be seen how the SD 820 will throttle in actual devices...
jasonelmore - Thursday, December 10, 2015 - link
and now the Apple chip gods don't look so untouchable....
ws3 - Thursday, December 10, 2015 - link
Well yeah, actually they do. This is non-shipping hardware using carefully selected parts in a large form factor. We don't yet know how the average part out of mass production will perform in actual phones, whether it can deliver sustained performance or will throttle quickly, etc.
And despite the carefully selected parts and a demo platform designed to make the SoC look its best, it is beaten across the board by the A9.
V900 - Friday, December 11, 2015 - link
In quite a few examples it also comes in behind the A8, which will be two years older than this SoC when it hits the street; don't forget about that! In all fairness, Qualcomm's development devices, like the MDP820, are rarely tuned for performance, and many of the drivers may still have some rough edges.
But they're also nowhere near as demanding in terms of battery size and thickness as the production models that vendors will release sometime next year.
The MDP820 is 11mm thick and has a battery of over 3000mAh, which means it's hard NOT to provide ample cooling and plenty of battery life.
That may prove to be a lot harder in a cellphone with a sub-9mm case and a 2500mAh battery.
And let's not forget that when AnandTech tested the 810 MDP, there wasn't a trace of overheating to be found.
http://slatedroid.info/2015/02/anandtech’s-snapdragon-810-preview-no-overheating-issues-spotted/
StrangerGuy - Thursday, December 10, 2015 - link
If you ask me, Qualcomm's main problem is not the chip, but rather that Android software is overwhelmingly built to run on lowest-common-denominator hardware.
tuxRoller - Friday, December 11, 2015 - link
You do realize that what you're saying is that Android has been built to be svelte? This is actually somewhat true, given their Android One initiative. In practice it would mean that, far from being bloated (a really common criticism that folks like to throw at... pretty much any software they are having issues with), it is very carefully built to run with low hardware requirements. IOW, it would be extraordinarily fast on high-end hardware. All of this is to say that you're mostly wrong.
V900 - Friday, December 11, 2015 - link
V900 - Friday, December 11, 2015 - link
Ehm, no. Actually it would be you who is wrong. Being built to run on lowest-common-denominator hardware isn't necessarily the same as doing it well.
Just look at how fast and smooth WP 7/8/8.1 or iOS runs on phones with just 512 or even 256MB of RAM, and compare that with the asthmatic performance you'd usually see from an Android handset with twice as much RAM.
Android has always been bloated and slow compared to its competition (aside from Symbian and BBOS), and part of the explanation is probably that it's developed with the lowest common denominator in mind, with the focus placed on delivering acceptable performance on a handful of SoCs instead of delivering outstanding performance on one or two SoCs.
tuxRoller - Friday, December 11, 2015 - link
You haven't explained why I'm wrong except to say I'm wrong. Aiming for acceptable performance on low-end devices implies much better performance on much better hardware (all else being equal... which is the case here).
Also, keep in mind that I didn't agree with the premise that Android is built with the lowest common denominator in mind.
UtilityMax - Saturday, December 12, 2015 - link
iOS runs like crap on those devices with 256-512MB of RAM. I used my iPhone 4 recently.
UtilityMax - Saturday, December 12, 2015 - link
Your argument makes no sense whatsoever. If Android is designed for low-end "lowest common denominator" hardware, then shouldn't it run circles around high-end hardware? Anyway, I have heard your argument before, and I heard it many times when Apple fanboys explain why Apple gives you so little memory in its flagship phones. Well, guess what, Android doesn't need much memory either. You can do just fine with 1 or 2GB of memory. But these days memory is getting dirt cheap, so Android phone vendors often throw in a bit of extra memory as a bonus. On the other hand, Apple has always been an expert at charging the most money for the least hardware. Hence the "apology" from Apple and the Apple fanboys that Apple gives you so little memory because Apple can run just fine with only 1GB but Android cannot. This argument is utterly and stupidly wrong.
Mondozai - Saturday, December 12, 2015 - link
Calling people fanboys in mobile tech discussions is our equivalent of Godwin's law. You are just showing the limits of your intellect. Fact is, Android is more bloated because it has far more targets to hit than iOS. But it's still miles ahead of where it used to be.
Constructor - Saturday, December 12, 2015 - link
Actually, Android devices need a lot more RAM to keep the permanent stuttering from garbage collection halfway under control (but still can't eliminate it, because it is fundamentally inherent). iOS apps only need to push out other unused apps initially (which can be noticeable, but which is required on Android as well), but after they've gained enough RAM they can run completely stutter-free indefinitely, since iOS uses deterministic memory management without garbage collection.
As a consequence iOS devices can deliver completely smooth gaming performance, for instance, even with a lot less RAM and without the associated battery power draw, something which Android is fundamentally incapable of due to the choice of garbage collection.
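The mechanism Constructor is describing can be sketched in miniature (using CPython here purely as a stand-in — this is neither ART nor Objective-C, just the general concept): reference counting reclaims acyclic objects deterministically at the instant the last reference dies, while cyclic garbage lingers until a collector pass, which in an interactive app shows up as an unpredictable pause.

```python
import gc

class Node:
    """A simple object that can form reference cycles."""
    def __init__(self):
        self.other = None

# Deterministic case: an acyclic object is freed immediately
# when its last reference disappears (reference counting).
freed = []

class Tracked:
    def __del__(self):
        freed.append("freed")

t = Tracked()
del t                      # refcount hits zero -> __del__ runs right here
print(freed)               # ['freed'] -- no collector pass involved

# Nondeterministic case: two objects referencing each other keep
# their refcounts above zero, so only a garbage-collection pass
# (an unpredictable pause in a real app) can reclaim them.
a, b = Node(), Node()
a.other, b.other = b, a    # create a cycle
del a, b                   # refcounts never reach zero
collected = gc.collect()   # explicit collector pass reclaims the cycle
print(collected >= 2)      # True -- at least the two Node objects
```

Whether this difference is actually what causes Android jank is exactly what the thread is arguing about; the sketch only shows why GC pauses are nondeterministic while refcount-style reclamation is not.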
TheinsanegamerN - Sunday, December 13, 2015 - link
I have a Lumia 635 with 512MB of RAM. It runs like @$$. Slow, laggy, slow loading times, crashing programs. A Moto G with 1GB runs flawlessly by comparison.
StrangerGuy - Friday, December 11, 2015 - link
A Helio X10 SoC found in phones under $200 is already massive overkill for the general user, Android inefficiencies or not. Last time I checked, $600+ devices had a laughably tiny 5% share of total Android sales last quarter, and that of course is something Qualcomm would not mention in their marketing. The truth is hardly anybody cares about high-end SoCs outside the synthetic benchmark whores that roam tech review sites.
Mondozai - Saturday, December 12, 2015 - link
Mondozai - Saturday, December 12, 2015 - link
Apple has 13-14% market share and they only do premium devices. You add the premium Android space and you get close to 20% of TAM. You're ignorant.
Constructor - Saturday, December 12, 2015 - link
Constructor - Saturday, December 12, 2015 - link
What you're describing, together with the fact of Apple's double-digit total market share, is actually a symptom of the fact that Android has almost completely lost the high end of the market to Apple (which is reflected by Apple raking in almost all the profits in the industry). Which of course is a major problem for Qualcomm: Android devices are not selling at prices comparable to Apple's, meaning that even high-end Android devices are usually sold at steep discounts, which in turn puts major price pressure on the device makers to fight for low component prices, which then effectively caps Qualcomm's pricing range much lower than they probably need to finance their expensive CPU development while they're still trying to catch up to Apple.
babadivad - Thursday, December 10, 2015 - link
Sounds like a 64-bit Krait CPU
Lochheart - Friday, December 11, 2015 - link
Don't forget that Apple already has 3 generations of 64-bit SoCs. This is the first from Qualcomm (in terms of custom ARM cores).
melgross - Thursday, December 10, 2015 - link
I don't think it's fair to blame the processes. Apple ran on the same processes and didn't have any problems. The problem was in the design of the chip. Let's just get that out there, instead of throwing blame elsewhere.
xboxfanj - Friday, December 11, 2015 - link
Ryan, there is a compiler that accepts Kryo as a target (Qualcomm's LLVM/Clang fork):
https://developer.qualcomm.com/download/snapdragon...
Babar Javied - Friday, December 11, 2015 - link
Last page (Closing Thoughts), second paragraph, last line: "not a full-size tablet has was the case in the past couple of generations". I think you meant to say "as", not "has".
V900 - Friday, December 11, 2015 - link
It's hard to be wildly optimistic about how the 820 will perform out in the wild next year. Let's not forget that everything looked swell with the 810 when AnandTech tested the MDP for that SoC.
http://slatedroid.info/2015/02/anandtech’s-snapdragon-810-preview-no-overheating-issues-spotted/
phoenix_rizzen - Monday, December 21, 2015 - link
Don't forget that the S810 MDP was a tablet form factor, with all the extra cooling and battery and what-not that comes with it, while all the phones using the S810 are in the sub-6" category. Big difference! The S820 MDP is a phablet form factor, so it should be closer to the reality of using an S820 SoC in a sub-6" phone.
Rixxos - Friday, December 11, 2015 - link
Rixxos - Friday, December 11, 2015 - link
All in all, the preliminary results show some impressive performance gains over the older generation of SoCs in Android phones. Especially the memory bandwidth and GPU performance seem much better. I don't see the A9 as a direct competitor, as it is on a whole different OS, and I doubt anyone ever jumps over to a different OS as a result of CPU benchmark scores. I expect the Samsung M1 SoC to step it up even a bit more, especially in multithreaded benchmarks, because of the extra cores. All in all, it seems 2016 will finally again be a decent year for Android phones in terms of high-end SoCs. I just hope all this extra power and efficiency won't be wasted on useless gimmicks such as QHD or 4K screens (VR applications aside).
bushgreen - Friday, December 11, 2015 - link
Does it use all 4 cores at the same time? 2 of these cores are enough and it will not throttle at all. Why would you need 4 cores on a phone when even all the MacBooks except the 15" only have 2 cores? They used a big reference phone so the heat can dissipate.
bushgreen - Friday, December 11, 2015 - link
Does it use all 4 cores at the same time?
tipoo - Friday, December 11, 2015 - link
In benchmarks it does. The question is how much the governor will do that in real apps.
milli - Friday, December 11, 2015 - link
Went ahead and made a comparison to the A9 in Geekbench:
http://i.imgur.com/Qq4OHys.png
Mondozai - Saturday, December 12, 2015 - link
Thanks! Useful stuff. So slower in ST, the most important metric. Disappointing but expected. This is why Google is launching their SoC initiative.
Gunbuster - Friday, December 11, 2015 - link
It will be interesting to see if Microsoft goes with an 820 for their next phone and continues to be hamstrung by a single supplier option, or if they get off their duff and compile Win 10 Mobile for x86 Atom in the fabled "Surface Phone".
SpartyOn - Friday, December 11, 2015 - link
Based on the rumors I've been reading, it sounds like the Atom X3 Surface Phone has been canceled in favor of a newer, upcoming Intel x86 mobile CPU architecture. Personally, I'm still hanging on to my old Lumia 822 until the Surface Phone is released, and I couldn't be happier if this rumor turns out to be true. The X3 is old-school architecture compared to what 2016 ARM CPUs will have, plus it would have totally sucked on the graphics side of things. The Braswell arch ups the graphics capabilities but is lacking on the CPU side, especially in single-threaded. I'm hoping for a brand-new, previously unannounced Intel x86 part for 2016 that will make it into the Surface Phone.
Microsoft can't afford to cripple their premium line with subpar performance, even if it does fill the niche of running x86 apps.
Mondozai - Saturday, December 12, 2015 - link
Still using an L1520 and amazingly happy. WP is so goddamn fluid. Personally not missing any apps.
ws3 - Saturday, December 12, 2015 - link
I know someone who still uses an original iPhone from 2007. Like your outdated Lumia, it also is very fluid and lacks apps.
toyotabedzrock - Friday, December 11, 2015 - link
Please tell me they are really not expecting that 3GB will be enough.
Exynostein - Friday, December 11, 2015 - link
Qualcomm will still have a very difficult 2017: Apple will use the A10, and as far as I know Samsung's Exynos M2 architecture is a big IPC upgrade over the Exynos M1, while the S830 changes little in architecture compared to the S820 beyond Samsung's 10nm LPE process. In fact, the biggest reason Samsung uses Qualcomm processors is CDMA: Qualcomm holds the vast majority of CDMA patents, almost a monopoly, and only China and the US have a CDMA presence, so Samsung has to use the S820 in those two regions; otherwise, if the Exynos 8890 needed an external CDMA baseband, that would add significant cost.
bushgreen - Saturday, December 12, 2015 - link
This consumes 30% less power than the Snapdragon 810. The 810 throttles 50%, so the 820 will still throttle around 20%. Better to use only 2 cores like Apple so it won't throttle at all and consumes less power. 2 of these cores are enough.
UtilityMax - Saturday, December 12, 2015 - link
I find that my SD800-based LG G2 runs smoothly most of the time, while barely getting hot or sipping battery. So it will be impressive and more than sufficient if SD820-based smartphones have an SoC that's for many purposes 200% faster than the old SD800/SD801 parts without overheating or using too much power. I think we are close to entering the smartphone era when, just like with PCs, the CPU is fast enough for most ordinary users, and what now matters is the sum of all parts: the screen, the camera, the battery, the design, and the build quality. In this sense, in my opinion, the OnePlus X is probably one of the best phones of the year because it delivers all of this in a $250 package.
Mondozai - Saturday, December 12, 2015 - link
I'm using an SD800 in my 1520. It's already fast enough, although I wonder how much of that is due to the smoothness of WP.
TelstarTOS - Saturday, December 12, 2015 - link
Good, but not good enough.
Rixxos - Sunday, December 13, 2015 - link
Not good enough for what?
jruhe - Sunday, December 13, 2015 - link
A72 (Amazon Fire TV 2) vs. Kryo @ Geekbench:
https://browser.primatelabs.com/geekbench3/compare...
milli - Monday, December 14, 2015 - link
The Kryo scores you used are very early results. I made a comparison between the Amazon Fire TV 2 and the 820, using Anand's numbers:
http://i.imgur.com/O3L6bQM.png
Wilco1 - Monday, December 14, 2015 - link
This is a more realistic comparison - same frequency, phone form factor: http://browser.primatelabs.com/geekbench3/compare/...
The 820 looks like an FP monster (no doubt helped by the high bandwidth), but the A72 wins ST integer.
Ethos Evoss - Tuesday, December 15, 2015 - link
Ethos Evoss - Tuesday, December 15, 2015 - link
haha, why is Qualcomm spending money on silly Cat 12 when we don't even use Cat 6 properly? Maybe not even that... LTE Cat 4 is not properly used either, mainly in the USA where 4G is unknown to most people.
Ethos Evoss - Tuesday, December 15, 2015 - link
AnandTech only sees high performance and only gives praise to whatever is highest... blindly praising only the high end but never taking a look at battery life and how these faster and faster smartphones last just half a day... so pathetic. They will NEVER review the Mate 2, which is 4G and which is the best on battery life. Where are these review sites heading nowadays?
jakoh - Monday, May 30, 2016 - link
jakoh - Monday, May 30, 2016 - link
Do the Kryo cores have an L3 cache?
StrangerGuy - Saturday, October 1, 2016 - link
Fast forward to Q4 2016 with a much better benchmark (GB4) and Qualcomm's Kryo looks even more embarrassing: completely destroyed by the A10, and even their own A72-based SD650 is matching the SD820 in ST performance with better IPC to boot. With the A73 around the corner there is no good reason for Qualcomm to stick with a custom ARM core.