NVIDIA Teases “GeForce Gaming Celebration” For August 20th At Gamescom 2018
by Ryan Smith on July 30, 2018 8:45 PM EST
This afternoon NVIDIA announced their plans for a public “GeForce Gaming Celebration” later this month, taking place amidst the Gamescom expo in Cologne, Germany. Promising talk of games and “spectacular surprises,” this marks the first real GeForce-branded event that NVIDIA has held this year, and the first time in just over two years that they’ve held such a large event opposite a major gaming expo.
The world’s biggest gaming expo, Gamescom 2018, runs August 21-25 in Cologne, Germany. And GeForce will loom large there -- at our Gamescom booth, Hall 10.1, Booth E-072; at our partners’ booths, powering the latest PC games; and at our own off-site GeForce Gaming Celebration that starts the day before [August 20th].
The event will be loaded with new, exclusive, hands-on demos of the hottest upcoming games, stage presentations from the world’s biggest game developers, and some spectacular surprises.
The timing of the event, along with the vague description of what it’s about, is sure to drive speculation about what exactly NVIDIA will have to show off, especially as we're approaching the end of NVIDIA's usual 24-30 month consumer product cycle. Their 2016 event opposite DreamHack was of course the reveal of the GeForce 10 Series. And the date of this year’s event – August 20th – happens to be the same day as the now-canceled NVIDIA Hot Chips presentation about “NVIDIA’s Next Generation Mainstream GPU.”
For what it’s worth, NVIDIA’s 2016 teaser didn’t say anything about the event in advance – it was merely an invitation to an unnamed event – whereas this teaser specifically mentions games and surprises. So make of that what you will.
Meanwhile, as noted previously, this is a public event, so NVIDIA says that there is a limited amount of space for Gamescom attendees and other locals to register and catch it in person. Otherwise, like most other NVIDIA events, it will be live streamed for the rest of the world, kicking off at 6pm CET.
Source: NVIDIA
41 Comments
Stochastic - Monday, July 30, 2018 - link
It would be a huge PR blunder to tease this and NOT announce 11-series GPUs. This is as close to a confirmation of what we all know is coming as we're going to get.
imaheadcase - Monday, July 30, 2018 - link
With all the leaks out about its specs, they might as well.
Yojimbo - Monday, July 30, 2018 - link
I dunno, I doubt it would be a PR blunder. Look at Nintendo and the Switch. It took forever for them to say when it would be released; they kept announcing and holding events without ever talking about the Switch (which hadn't even been given that name yet). Then the Switch was actually announced, later than everyone was hoping, and it seemed to sell just fine. I haven't noticed any public backlash over it.
edzieba - Tuesday, July 31, 2018 - link
Bingo. The only people this would be a 'PR disaster' for are those who take rumours as Cast Iron Double Confirmed Absolute Fact, then complain when they turn out not to be. That's an unfortunately large population in terms of internet whinging volume, but very small in terms of actual buyers.
Samus - Tuesday, July 31, 2018 - link
Yep, one of the many reasons people hate Apple is actually something that’s out of Apple's control: rumors.
haukionkannel - Tuesday, July 31, 2018 - link
It could be another game-related announcement, like a new web-based game streaming service, etc.
CaedenV - Tuesday, July 31, 2018 - link
But they are just now announcing new GTX 10-series chips... I am not sure the 11-series is really in the cards until next spring at the earliest. Maybe an announcement and a soft launch, but I'm not expecting any real products for a bit yet.
PeachNCream - Tuesday, July 31, 2018 - link
Eh, whatever. Discrete graphics cards have remained above MSRP for pretty much the entire duration of the current generation's retail lifespan. In addition to that, the top end has grown significantly in price as the number of competitors has decreased over the last couple of decades. Factor in the dramatic increase in TDP that makes double slot coolers with multiple fans or system blowers plus independent, dedicated power delivery from the PSU a necessity and the graphics picture gets even less rosy. Technological advancement in graphics has included so much brute force that isn't replicated in other major components that it's just a bleak and depressing element of the computing industry. NVIDIA could announce they were selling gold foil wrapped, chocolate chicken butts and it would matter just as little to me as a new generation of graphics processors.
webdoctors - Tuesday, July 31, 2018 - link
There's a lot to complain about, but this part is COMPLETELY WRONG: "Factor in the dramatic increase in TDP that makes double slot coolers with multiple fans or system blowers plus independent, dedicated power delivery from the PSU a necessity and the graphics picture gets even less rosy. Technological advancement in graphics has included so much brute force that isn't replicated in other major components"
The scaling of performance in GPUs relative to TDP or silicon area dwarfs what you've seen in CPUs, RAM, power supplies, or case fans. What other components are you comparing to?
How much faster are CPUs today compared to Sandy Bridge from 2011? 50%? GPUs are several times faster....
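(For a rough sense of the gap being argued about here, the sketch below compares theoretical peak FP32 throughput growth for a desktop CPU versus a GPU over roughly the same period. The part choices, clocks, and FLOPS-per-clock figures are approximate published specs used as illustrative assumptions; peak FLOPS are not benchmarks or measured gaming performance.)

```python
# Back-of-the-envelope peak FP32 throughput comparison, CPU vs. GPU.
# All figures are approximate published specs, not measured results.

def peak_gflops(cores: int, clock_ghz: float, flops_per_clock: int) -> float:
    """Theoretical peak single-precision GFLOPS: cores x clock x FP32 ops per clock."""
    return cores * clock_ghz * flops_per_clock

# CPU: Core i7-2600K (Sandy Bridge, 2011) vs. Core i7-8700K (2017).
# Sandy Bridge: one 8-wide AVX add + one 8-wide AVX mul per cycle = 16 FP32 ops/clock.
# Coffee Lake: two 8-wide FMAs per cycle = 32 FP32 ops/clock; ~4.3 GHz assumed all-core clock.
cpu_2011 = peak_gflops(cores=4, clock_ghz=3.4, flops_per_clock=16)      # ~218 GFLOPS
cpu_2017 = peak_gflops(cores=6, clock_ghz=4.3, flops_per_clock=32)      # ~826 GFLOPS

# GPU: GTX 580 (2010, 1544 MHz shader clock) vs. GTX 1080 Ti (2017, ~1582 MHz boost).
# Each CUDA core does one FMA per clock = 2 FP32 ops/clock.
gpu_2010 = peak_gflops(cores=512, clock_ghz=1.544, flops_per_clock=2)   # ~1.6 TFLOPS
gpu_2017 = peak_gflops(cores=3584, clock_ghz=1.582, flops_per_clock=2)  # ~11.3 TFLOPS

print(f"CPU peak FP32 growth: {cpu_2017 / cpu_2011:.1f}x")  # roughly 3.8x
print(f"GPU peak FP32 growth: {gpu_2017 / gpu_2010:.1f}x")  # roughly 7.2x
```

Peak numbers flatter both sides relative to real-world gains, but the direction of the comparison matches the point being made: per generation, GPU throughput has scaled considerably faster than CPU throughput.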
PeachNCream - Tuesday, July 31, 2018 - link
Most AGP and 32-bit PCI graphics cards were powered entirely by the slot in which they resided, and many didn't even require heatsinks, like the S3 ViRGE DX or the Tseng Labs ET6100. I think the first video card I owned that even had a heatsink on the graphics chip was a Diamond Stealth V550, and that, unsurprisingly, was powered by an NVIDIA chip (the RIVA TNT). So yes, graphics cards are now an obscenity when it comes to heat, noise, and power if you want anything high-end. At the same time, AMD and Intel have both kept the TDP of their processors down to reasonable levels (around 35-65W for mainstream desktop hardware), so I see no reason why a GPU has to be such a power hog.