Graphics

NVIDIA's Ray Tracing Tech Will Soon Run On Older GTX Cards (engadget.com) 98

NVIDIA's older GeForce GTX 10-series cards will be getting the company's new ray-tracing tech in April. The technology, which is currently only available on its new RTX cards, "will work on GPUs from the 1060 and up, albeit with some serious caveats," reports Engadget. "Some games like Battlefield V will run just fine and deliver better visuals, but other games, like the freshly released Metro Exodus, will run at just 18 fps at 1440p -- obviously an unplayable frame-rate." From the report: What games you'll be able to play with ray-tracing tech (also known as DXR) on NVIDIA GTX cards depends entirely on how it's implemented. In Battlefield V, for instance, the tech is only used for things like reflections. On top of that, you can dial down the strength of the effect so that it consumes less computing horsepower. Metro Exodus, on the other hand, uses ray tracing to create highly realistic "global illumination" effects, simulating lighting from the real world. It's the first game that really showed the potential of RTX cards and actually generated some excitement about the tech. However, because it's so computationally intensive, GTX cards (which lack the RTX cards' dedicated RT and tensor cores) will effectively be too slow to run it.

NVIDIA explained that when it was first developing the next-gen RTX tech, it found chips using Pascal tech would be "monster" sized and consume up to 650 watts. That's because the older cards lack both the integer cores and tensor cores found on the RTX cards. They get particularly stuck on ray tracing, running about four times slower than the RTX cards on Metro Exodus. Since Metro Exodus is so heavily ray-traced, the RTX cards run it three times quicker than older GTX 10-series cards. However, that falls to two times for Shadow of the Tomb Raider, and 1.6 times for Battlefield V, because both of those games use ray tracing less. The latest GTX 1660 and 1660 Ti GPUs, which don't have RT cores but do have integer cores, will run ray-traced games moderately better than last-gen 10-series GPUs.
NVIDIA also announced that Unity and Unreal Engine now support ray-tracing, allowing developers to implement the tech into their games. Developers can use NVIDIA's new set of tools called GameWorks RTX to achieve this.

"It includes the RTX Denoiser SDK that enables real-time ray-tracing through techniques that reduce the required ray count and number of samples per pixel," adds Engadget. "It will support ray-traced effects like area light shadows, glossy reflections, ambient occlusion and diffuse global illumination (the latter is used in Metro Exodus). Suffice to say, all of those things will make game look a lot prettier."
Displays

VR Company Co-Founder Spends an Entire Week in a VR Headset (pcgamer.com) 39

An anonymous reader quotes PC Gamer: Not too long into a 168-hour VR marathon session, Jak Wilmot admits the monotony got to him. Wilmot, who is the co-founder of Disrupt VR, also says this experiment is "quite possibly the dumbest thing" he's ever done. So, why do it? For science, of course. I can't imagine immersing myself in a virtual world for a full week, nonstop night and day. Wilmot did it, though, for the most part -- he allowed himself 30 seconds to switch VR headsets when needed, and 30 seconds without a headset on to eat, if required. Other than those small breaks, he spent every other moment in VR...

There doesn't seem to be some big takeaway from this experiment (aside from, perhaps, don't drink coffee while playing VR), though one thing I found interesting was his reintegration into the real world when the experiment was over. "I have never appreciated the smell of outside air so much. One thing we cannot replicate is nature. We can do it visually and auditorily, but there is something about the energy of outside that is amazing," Wilmot observed.

PC Gamer calls it "probably at least partially a publicity stunt. But it's still interesting to see how donning a VR headset for an extended period of time and essentially living in virtual worlds can mess with the mind." Wilmot wore VR gear while working -- and even while showering (with the VR gear protected by plastic), blacking out his windows so he couldn't tell day from night, calling it "a week in the future..."

"I almost feel like I'm in my own 500-suare-foot spaceship," he says at one point, "and I'm really missing earth, and I'm missing nature." Early on he also reported some mild claustrophobia.

You can watch the moment when, after seven days, he removes the headset and returns to conventional reality, joking "Oh my gosh, the graphics are so good." He reports a slight disorientation as his eyes catch up with real life, and says it changed his perspective on people in the real world, seeing them as "individuals in one collection, one environment -- as avatars."
Patents

Apple Dealt Legal Blow as Jury Awards Qualcomm $31 Million (cnet.com) 47

Apple violated three Qualcomm patents and should pay the chipmaker $31 million for infringing on its technology, a jury decided Thursday, giving Qualcomm momentum as it heads into another legal skirmish with the iPhone maker next month. From a report: Qualcomm, which filed the suit in July 2017, alleged Apple had used its technology without permission in some versions of its popular iPhone. The jury awarded Qualcomm the full amount it requested at the start of the two-week trial, which was held in San Diego. One disputed Qualcomm patent covers technology that lets a smartphone quickly connect to the internet once the device is turned on. Another deals with graphics processing and battery life. The third addresses technology that shifts traffic between a phone's apps processor and modem. The $31 million in damages -- or $1.41 per infringing iPhone -- is a drop in the bucket for Apple, which briefly became a $1 trillion company last year. But it marks an important victory for Qualcomm, burnishing its reputation as a mobile components innovator. The win also lends credibility to the notion that much of the company's innovation is reflected in iPhones.
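The per-unit figure implies the size of the infringing installed base; a back-of-envelope division (our arithmetic, not a number from the filing):

```python
total_damages = 31_000_000  # jury award, USD
per_iphone = 1.41           # damages per infringing iPhone, USD

units = total_damages / per_iphone
print(f"~{units / 1e6:.0f} million infringing iPhones implied")  # ~22 million
```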
Graphics

NVIDIA Launches New $219 Turing-Powered GeForce GTX 1660 (hothardware.com) 101

MojoKid writes: NVIDIA took the wraps off yet another lower cost Turing-based graphics card today, dubbed the GeForce GTX 1660. For a $219 MSRP, the card offers a cut-down NVIDIA TU116 GPU comprising 1,408 CUDA cores with a 1785MHz boost clock and 6GB of GDDR5 RAM with 192GB/s of bandwidth. Generally speaking, the new GeForce GTX 1660 is 15% to 30% faster than NVIDIA's previous generation GeForce GTX 1060 but doesn't support the ray tracing and DLSS features that NVIDIA's RTX-series Turing cards offer. Performance-wise, the GeForce GTX 1660 is generally faster than an AMD Radeon RX 590. Boards from various OEM partners should be in the channel for purchase this week.
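The quoted bandwidth follows directly from bus width and per-pin data rate. A quick sketch (assuming the 192-bit buses and the per-pin rates from the respective launch specs):

```python
def memory_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak bandwidth = (bus width in bits / 8) * per-pin data rate in Gb/s."""
    return bus_width_bits / 8 * data_rate_gbps

print(memory_bandwidth_gbs(192, 8))   # GTX 1660 (GDDR5 @ 8Gbps):     192.0 GB/s
print(memory_bandwidth_gbs(192, 12))  # GTX 1660 Ti (GDDR6 @ 12Gbps): 288.0 GB/s
```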
Businesses

Amazon Lobbied More Government Entities Than Any Other Public US Company Last Year (fortune.com) 41

Amazon lobbied more government entities last year than any other public U.S. company, covering issues like healthcare, transportation, defense, and labor regulation. "Across 2018, Amazon contacted 40 different federal entities on 21 different general issue areas," reports Fortune, citing a report from Axios. "The only tech giant to lobby on more issues than Amazon was Google's Alphabet." From the report: In terms of money spent, Amazon's $14.4 million is topped only by Alphabet's $21 million, says Bloomberg. While the tech industry overall spent less than half of the $280 million from pharmaceutical and healthcare products companies in Washington, Amazon has increased spending 460% since 2012, growing quickly within its trade. According to Axios, Amazon lobbied on self-driving car and drone issues, hinting at new methods of delivery. It supported a law allowing pharmacists to tell patients when using their insurance is actually more expensive, aiding Amazon's new investment in PillPack. It also covered the labeling of bioengineered food and a pilot program allowing online shoppers to use the Supplemental Nutrition Assistance Program -- signs of Amazon's emerging grocery business.
Graphics

Microsoft Brings DirectX 12 To Windows 7 (anandtech.com) 119

Microsoft has announced a form of DirectX 12 that will support Windows 7. "Now before you get too excited, this is currently only enabled for World of Warcraft; and indeed it's not slated to be a general-purpose solution like DX12 on Win10," reports AnandTech. "Instead, Microsoft has stated that they are working with a few other developers to bring their DX12 games/backends to Windows 7 as well. As a consumer it's great to see them supporting their product ten years after it launched, but with the entire OS being put out to pasture in nine months, it seems like an odd time to be dedicating resources to bringing it new features." From the report: For some background, Microsoft's latest DirectX API was created to remove some of the CPU bottlenecks for gaming by allowing developers to use low-level programming conventions to shift some of the pressure points away from the CPU. This was a response to single-threaded CPU performance plateauing, making complex graphical workloads increasingly CPU-bound. There are many advantages to using this API over traditional DX11, especially for threading and draw calls. But Microsoft made the decision long ago to only support DirectX 12 on Windows 10, with its WDDM 2.0 driver stack.
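The draw-call advantage is easiest to see with a toy cost model (hypothetical numbers, not the actual D3D12 API): a DX11-style driver pays a validation cost on every draw, every frame, while a DX12-style engine records command lists once and replays them cheaply.

```python
# Toy model of why low-level APIs relieve the CPU. The costs are made up;
# the point is that per-draw driver overhead multiplies across thousands
# of draws per frame, while recorded command lists amortize it.
DRAWS_PER_FRAME = 5000
VALIDATE_COST_US = 10  # hypothetical per-draw validation cost (DX11-style)
REPLAY_COST_US = 1     # hypothetical per-draw replay cost (DX12-style)

dx11_style_us = DRAWS_PER_FRAME * VALIDATE_COST_US  # paid every frame
dx12_style_us = DRAWS_PER_FRAME * REPLAY_COST_US    # validation paid once, at record time

print(f"DX11-style CPU time per frame: {dx11_style_us / 1000:.1f} ms")
print(f"DX12-style CPU time per frame: {dx12_style_us / 1000:.1f} ms")
```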

Today's announcement is a pretty big surprise on a number of levels. If Microsoft had wanted to back-port DX12 to Windows 7, you would have thought they'd have done it before Windows 7 entered its long-term servicing state. As it is, even free security patches for Windows 7 are set to end on January 14, 2020, which is well under a year away, and the company is actively trying to migrate users to Windows 10 to avoid having a huge swath of machines sitting in an unpatched state. In fact, they are about to add a pop-up notification to Windows 7 to let users know that they are running out of support very soon. So adding a big feature like DX12 now not only risks undermining Microsoft's own migration efforts, it also means delivering a major new feature well after Windows 7 entered long-term servicing. It's just bizarre.

First Person Shooters (Games)

Study Shows Gamers At High FPS Have Better Kill-To-Death Ratios In Battle Royale Games (hothardware.com) 149

MojoKid writes: Gaming enthusiasts and pro gamers have long believed that playing on high-refresh-rate displays with high frame rates offers a competitive edge in fast-action games like PUBG, Fortnite and Apex Legends. The premise is that the faster the display can update the action for you, the more every millisecond saved counts when it comes to tracking targets and reaction times. This sounds logical, but there has never been specific data tabulated to back the theory up. NVIDIA, however, just took it upon itself to use its GeForce Experience tool to compile anonymous data on gamers by hours played per week, panel refresh rate and graphics card type. Though this data obviously speaks only to NVIDIA GPU users, the numbers speak for themselves.

Generally speaking, the more powerful the GPU and the higher the frame rate and panel refresh rate, the higher the kill-to-death ratio (K/D) of the gamers that were profiled. In fact, it really didn't matter how many hours per week were played: casual gamers and heavy-duty daily players alike saw anywhere from roughly a 50 to 150 percent increase in K/D ratio. It should be underscored that the specific GPU doesn't matter; gamers with AMD graphics cards that can push high frame rates at 1080p or similar should see similar K/D gains. The performance sweet spot seems to be 144Hz/144FPS: the closer your system can get to it, the better off you'll be, and beyond that, higher frame rates and refresh rates help further.
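The milliseconds at stake are simple to quantify. A minimal sketch of frame time per refresh rate, plus the average half-frame wait before a new event can even appear on screen:

```python
def frame_time_ms(refresh_hz):
    """Time between display refreshes, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 144, 240):
    ft = frame_time_ms(hz)
    # A new event lands mid-refresh on average, so it waits ~half a frame.
    print(f"{hz:3d} Hz: {ft:5.2f} ms/frame, ~{ft / 2:.2f} ms average added display lag")
```

Going from 60Hz to 144Hz alone trims roughly 5ms of average display latency before human reaction time even starts, consistent with the 144Hz/144FPS sweet spot above.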

Open Source

Linux 5.0 Released (phoronix.com) 107

An anonymous reader writes: Linus Torvalds has released Linux 5.0, kicking off the kernel's 28th year of development. Linux 5.0 features include AMD FreeSync support, open-source NVIDIA Turing GPU support, Intel Icelake graphics, Intel VT-d scalable mode, Spectre variant 2 mitigations for NXP PowerPC processors, and countless other additions. eWeek adds: Among the new features that have landed in Linux 5.0 is support for the Adiantum encryption system, developed by Google for low-power devices. Google's Android mobile operating system and ChromeOS desktop operating system both rely on the Linux kernel. "Storage encryption protects your data if your phone falls into someone else's hands," Paul Crowley and Eric Biggers of Google's Android Security and Privacy Team wrote in a blog post. "Adiantum is an innovation in cryptography designed to make storage encryption more efficient for devices without cryptographic acceleration, to ensure that all devices can be encrypted." Memory management in Linux also gets a boost in the 5.0 kernel with a series of improvements designed to help prevent memory fragmentation, which can reduce performance.
Games

The New 'Red Dead Redemption' Reveals the Biggest Problem With Marquee Games Today: They're Boring as Hell. (theoutline.com) 211

An anonymous reader shares a column: Everything about "Red Dead Redemption 2" is big. The latest open-world western, released in October by Rockstar Games, constantly reminds you of this. It takes roughly 15 minutes for its bland everycowboy star, Arthur Morgan, to gallop across the 29-square-mile map. It has 200 species of animals, including grizzly bears, alligators, and a surprising number of birds. It takes about 45.5 hours to play through the main quest, and 150-plus hours to reach 100 percent completion. There are more than 50 weapons to choose from, such as a double-barreled shotgun and a rusty hatchet. It's big, big, big.

[...] On top of all the bigness, "Red Dead Redemption 2" is also incredibly dull. I've been playing it off and on since it was released, and I'm still waiting for it to get fun. I'm not alone in thinking so -- Mark Brown of Game Maker's Toolkit called it "quite boring" and Mashable said it's a "monumental disappointment." There is a glut of Reddit posts from people complaining about how slow the game feels, usually with a tone of extreme self-consciousness. Unless you're a real a**hole, it's not exactly fun to stray from popular consensus. Perhaps the general hesitancy to criticize the game stems from the fact that it's not technically bad. Its graphics and scale really are impressive. It is designed to please.

And yet "RDR2" seems to exemplify a certain kind of hollowness that's now standard among Triple-A titles. It's very big, with only tedium inside. Call it a Real World Game. The main problem with "RDR2" is that it's comprised almost entirely of tedious, mandatory chores. It always feels like it's stalling for time, trying to juke the number of hours it takes to complete it.

The Military

Microsoft CEO Defends Pentagon Contract Following Employee Outcry (theverge.com) 221

Microsoft CEO Satya Nadella is defending the company's $479 million contract with the Pentagon to supply augmented reality headsets to the U.S. military. "We made a principled decision that we're not going to withhold technology from institutions that we have elected in democracies to protect the freedoms we enjoy," he told CNN Business at Mobile World Congress. "We were very transparent about that decision and we'll continue to have that dialogue [with employees]," he added during the exclusive interview. From the report: Microsoft was awarded the contract to supply "Integrated Visual Augmentation System" prototypes to the U.S. military in November. The company could eventually deliver over 100,000 headsets under the contract. Microsoft's HoloLens augmented reality technology allows users to see the world around them, but with virtual graphics overlaid. The Israeli military, which has taken delivery of some HoloLens headsets, says the technology can be used to help commanders visualize the battlefield and to let field medics consult doctors. According to procurement documents, the U.S. military is seeking a single platform that provides its forces with "increased lethality, mobility and situational awareness" in combat. Microsoft employees have recently circulated a letter addressed to Nadella and Brad Smith, the company's president and chief legal officer, arguing that the company should not supply its HoloLens technology to the U.S. military. "It's not about taking arbitrary action by a single company, it's not about 50 people or 100 people or even 100,000 people in a company," he said. "It's really about being a responsible corporate citizen in a democracy."
Graphics

NVIDIA Turing-Based GeForce GTX 1660 Ti Launched At $279 (hothardware.com) 94

MojoKid writes: NVIDIA has launched yet another graphics card today based on the company's new Turing GPU. This latest GPU, however, doesn't support NVIDIA's RTX ray-tracing technology or its DLSS (Deep Learning Super Sampling) image quality tech. The new GeForce GTX 1660 Ti does, however, bring with it all of the other GPU architecture improvements NVIDIA Turing offers. The new TU116 GPU on board the GeForce GTX 1660 Ti supports concurrent integer and floating point instructions (rather than serializing integer and FP instructions), and it also has a redesigned cache structure with double the amount of L2 cache versus its predecessor, while its L1 cache has been outfitted with a wider memory bus that ultimately doubles the bandwidth. NVIDIA's TU116 has 1,536 active CUDA cores, which is a decent uptick from the GTX 1060, but less than the current-gen RTX 2060. Cards will also come equipped with 6GB of GDDR6 memory at 12Gbps for 288GB/s of bandwidth. Performance-wise, the new GeForce GTX 1660 Ti is typically slightly faster than a previous-gen GeForce GTX 1070, and much faster than a GTX 1060. Cards should be available at retail in the next few days, starting at $279.
Intel

Intel Starts Publishing Open-Source Linux Driver Code For Discrete GPUs (phoronix.com) 43

fstack writes: Intel is still a year out from releasing its first discrete graphics processors, but the company has begun publishing its open-source Linux GPU driver code. This week it began by publishing patches on top of the existing Intel Linux driver that add support for device-local memory (dedicated video memory), part of a restructuring effort to support discrete graphics cards. Intel later confirmed this is the start of its open-source driver support for discrete graphics solutions. It has also begun working on Linux driver support for Adaptive-Sync and better reset recovery.
Hardware

Nvidia CEO Foresees a Great Year for PC Gaming Laptops (venturebeat.com) 36

Nvidia has predicted that the year ahead will be a good one for the company, with demand for laptop gaming gear remaining strong. From a report: Looking forward, CEO Jensen Huang said it would be a big year for gaming laptops, as Nvidia knows that more than 40 Turing-based gaming laptops (based on the GeForce RTX 2060) are poised to launch during the year. Those laptops use mid-range, battery-efficient RTX cards built on Nvidia's new Turing architecture -- the GeForce RTX GPUs that can do real-time ray tracing.

Huang acknowledged that visibility is limited. I asked him if cloud gaming would be a disruptive force during the year. But he noted that Nvidia had been providing its own cloud gaming solution, GeForce Now, with relatively little impact on the market for three years. So he said it remains to be seen if cloud gaming and the "Netflix of games" would make an impact on the market. In the meantime, he said that gaming laptops would launch.

Graphics

AMD Radeon VII Graphics Card Launched, Benchmarks Versus NVIDIA GeForce RTX (hothardware.com) 73

MojoKid writes: AMD officially launched its new Radeon VII flagship graphics card today, based on the company's 7nm second-generation Vega architecture. In addition to core GPU optimizations, Radeon VII provides 2X the graphics memory at 16GB and 2.1X the memory bandwidth at a full 1TB/s, compared to AMD's previous generation Radeon RX Vega 64. The move to 7nm allowed AMD to shrink the Vega 20 GPU die down to 331 square millimeters. This shrink and the resulting savings in silicon die area are what allowed AMD to add two additional stacks of HBM2 memory and increase the high-bandwidth cache (frame buffer) capacity to 16GB. The GPU on board the Radeon VII has 60 CUs and a total of 3,840 active stream processors with a board power TDP of 300 watts. As you might expect, it's a beast in the benchmarks that's able to pull ahead of NVIDIA's GeForce RTX 2080 in spots but ultimately lands somewhere in between the performance of an RTX 2070 and 2080 overall. AMD Radeon VII cards will be available in a matter of days at an MSRP of $699, with custom boards from third-party partners showing up shortly as well.
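The 1TB/s figure falls out of the widened HBM2 interface. A quick sketch, assuming four 1024-bit stacks and a 2.0Gbps per-pin rate (the rate is our assumption, chosen to be consistent with the quoted total):

```python
STACKS = 4             # Radeon VII doubles Vega 64's two HBM2 stacks
BITS_PER_STACK = 1024  # HBM2 interface width per stack
PIN_RATE_GBPS = 2.0    # assumed per-pin data rate

bus_width_bits = STACKS * BITS_PER_STACK  # 4096-bit aggregate bus
bandwidth_gbs = bus_width_bits / 8 * PIN_RATE_GBPS
print(f"{bus_width_bits}-bit bus -> {bandwidth_gbs:.0f} GB/s (~1 TB/s)")
```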
AI

The World's Fastest Supercomputer Breaks an AI Record (wired.com) 66

Along America's west coast, the world's most valuable companies are racing to make artificial intelligence smarter. Google and Facebook have boasted of experiments using billions of photos and thousands of high-powered processors. But late last year, a project in eastern Tennessee quietly exceeded the scale of any corporate AI lab. It was run by the US government. From a report: The record-setting project involved the world's most powerful supercomputer, Summit, at Oak Ridge National Lab. The machine captured that crown in June last year, reclaiming the title for the US after five years of China topping the list. As part of a climate research project, the giant computer booted up a machine-learning experiment that ran faster than any before. Summit, which occupies an area equivalent to two tennis courts, used more than 27,000 powerful graphics processors in the project. It tapped their power to train deep-learning algorithms, the technology driving AI's frontier, chewing through the exercise at a rate of a billion billion operations per second, a pace known in supercomputing circles as an exaflop.

"Deep learning has never been scaled to such levels of performance before," says Prabhat, who leads a research group at the National Energy Research Scientific Computing Center at Lawrence Berkeley National Lab. His group collaborated with researchers at Summit's home base, Oak Ridge National Lab. Fittingly, the world's most powerful computer's AI workout was focused on one of the world's largest problems: climate change. Tech companies train algorithms to recognize faces or road signs; the government scientists trained theirs to detect weather patterns like cyclones in the copious output from climate simulations that spool out a century's worth of three-hour forecasts for Earth's atmosphere.

Ubuntu

System76 Unveils 'Darter Pro' Linux Laptop With Choice of Ubuntu or Pop!_OS (betanews.com) 86

An anonymous reader writes: Today, System76 unveiled its latest laptop -- the 15.6-inch (full-HD) "Darter Pro." It is thin, but not overly so -- it still has USB-A ports (thankfully). The computer is quite modern, however, as it also has a USB-C/Thunderbolt 3 port. It supports the Pop!_OS 18.04 LTS (64-bit), Pop!_OS 18.10 (64-bit), or Ubuntu 18.04 LTS (64-bit) operating systems. It comes in two variants, with the following processor options: an 8th Gen Intel Core i5-8265U (1.6GHz up to 3.9GHz, 6MB cache, 4 cores, 8 threads) or an 8th Gen Intel Core i7-8565U (1.8GHz up to 4.6GHz, 8MB cache, 4 cores, 8 threads), either coupled with an Intel UHD Graphics 620 GPU, up to 32GB of dual-channel DDR4 at 2400MHz, and M.2 SATA or PCIe NVMe SSD storage. As for ports, there are a USB 3.1 Type-C port with Thunderbolt 3, two USB 3.0 Type-A ports, one USB 2.0 port, and an SD card reader. The company says it will announce pricing at a later date.
AI

DeepMind AI AlphaStar Wins 10-1 Against 'StarCraft II' Pros (newscientist.com) 103

In a series of matches streamed on YouTube and Twitch, DeepMind AI AlphaStar defeated two top-ranked professionals 10-1 at real-time strategy game StarCraft II. "This is of course an exciting moment for us," said David Silver at DeepMind in a live stream watched by more than 55,000 people. "For the first time we saw an AI that was able to defeat a professional player." New Scientist reports: DeepMind created five versions of their AI, called AlphaStar, and trained them on footage of human games. The different AIs then played against each other in a league, with the leading AI accumulating the equivalent of 200 years of game experience. With this, AlphaStar beat professional players Dario Wunsch and Grzegorz Komincz -- ranked 44th and 13th in the world respectively. AlphaStar's success came with some caveats: the AI played only on a single map, and using a single kind of player (there are three in the game). The professionals also had to contend with playing different versions of AlphaStar from match to match. While AlphaStar was playing on a single graphics processing unit, a computer chip found in many gaming computers, it was trained on 16 tensor processing units hosted in the Google cloud -- processing power beyond the realms of many.
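The league is the interesting mechanism: a population of agents plays head-to-head and the strongest versions surface. Below is a heavily simplified, hypothetical sketch of that idea, with fixed hidden skills and Elo ratings standing in for the networks the real league actually trains:

```python
import random

def elo_expected(r_a, r_b):
    """Expected score for A against B under the Elo model."""
    return 1.0 / (1.0 + 10 ** ((r_b - r_a) / 400))

# Five agent versions with hidden "true" strengths the league must discover.
true_skill = {f"AlphaStar-{i}": 1000 + 100 * i for i in range(5)}
ratings = {name: 1200.0 for name in true_skill}

K = 16
for _ in range(20_000):
    a, b = random.sample(list(true_skill), 2)
    # Stand-in for playing an actual StarCraft II match between two agents.
    score_a = 1.0 if random.random() < elo_expected(true_skill[a], true_skill[b]) else 0.0
    expected_a = elo_expected(ratings[a], ratings[b])
    ratings[a] += K * (score_a - expected_a)
    ratings[b] += K * ((1.0 - score_a) - (1.0 - expected_a))

for name, rating in sorted(ratings.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {rating:.0f}")  # recovers the hidden strength ordering
```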
Wine

Wine 4.0 Released With Vulkan Support, Initial Direct3D 12 and Better HiDPI (phoronix.com) 73

Michael Larabel writes via Phoronix: Wine 4.0 is now officially available as the new annual stable release of Wine for running Windows programs and games on Linux and other operating systems. Following seven weekly release candidates, Wine 4.0 was ready to ship today as judged by Wine founder Alexandre Julliard. Wine 4.0 is a big release, bringing initial Vulkan graphics API support, Direct3D CSMT enabled by default, early Direct3D 12 support via VKD3D, continued HiDPI work, various OpenGL improvements, multi-sample D3D texture support, 64-bit improvements, continued Android support, and much more. The release announcement and notes can be read via WineHQ.org. The source can be downloaded here.
Businesses

Battlefield 5's Poor Sales Numbers Have Become a Disaster For Electronic Arts (seekingalpha.com) 715

dryriver writes: Electronic Arts has mismanaged the Battlefield franchise in the past -- BF3 and BF4 were not great from a gameplay perspective -- but with Battlefield 5, Electronic Arts is facing a real disaster that has sent its stock plummeting on the stock exchanges. First came the fierce cultural backlash online from gamers to the Battlefield 5 reveal trailer -- EA tried to inject so much 21st-century gender diversity and Hollywood action-movie style fighting into what was supposed to be a reasonably historically accurate WWII shooter trailer that many gamers felt the game would be "a seriously inauthentic portrayal of what WW2 warfare really was like." Then the game sold very poorly after a delayed launch date -- far worse than the mildly successful WWI shooter Battlefield 1, for example -- and is currently discounted by 33% to 50% at all major game retailers in a desperate attempt to push sales numbers up. This was also a disaster for Nvidia, as Battlefield 5 was the tentpole title that was supposed to entice gamers into buying expensive new real-time ray-tracing Nvidia RTX 2080 GPUs.

Electronic Arts had to revise its earnings estimates for 2019, some hedge funds sold off their EA stock, fearing low sales and stiff competition from popular battle royale games like Fortnite and PUBG, and EA stock is currently down 45% from its peak value in July 2018. EA had already become seriously unpopular with gamers because of annoying Battlefield franchise in-game mechanisms, such as having to buy decent-accuracy weapons with additional cash and having to constantly pay for additional DLC content and game maps, and because of the very poor multiplayer gameplay of its two Star Wars: Battlefront titles (essentially Battlefield with laser blasters set in the Star Wars universe). It seems that with Battlefield 5, EA -- not a company known for listening to its customers -- finally hit a brick wall, in the form of many Battlefield fans simply not buying or playing Battlefield 5.

AMD

Nvidia CEO Trashes AMD's New GPU: 'The Performance Is Lousy' (gizmodo.com) 115

An anonymous reader shares a report: Yesterday, AMD announced a new graphics card, the $700 Radeon VII, based on its second-generation Vega architecture. The GPU is the first one available to consumers based on the 7nm process. Smaller processes tend to be faster and more energy efficient, which means the Radeon VII could theoretically be faster than GPUs built on larger processes, like the first-generation Vega GPU (14nm) or Nvidia's RTX 20-series (12nm). I say "could" because so far Nvidia's RTX 20-series has been speedy in our benchmarks, and all of its cards, from the $1,000+ 2080 Ti down to the $350 2060 announced Sunday, support ray tracing. This complex technology allows you to trace a point of light from a source to a surface in a digital environment. What it means in practice is video games with hyperrealistic reflections and shadows.

It's impressive technology, and Nvidia has touted it as the primary reason to upgrade from previous-generation GPUs. AMD's GPUs, notably, do not support it. And at a round table Gizmodo attended, Nvidia CEO Jensen Huang jokingly dismissed AMD's Tuesday announcement, claiming the announcement itself was "underwhelming" and that his company's 2080 would "crush" the Radeon VII in benchmarks. "The performance is lousy," he said of the rival product. When asked to comment on these slights, AMD CEO Lisa Su told a collection of reporters, "I would probably suggest he hasn't seen it." When pressed about his comments, especially his touting of ray tracing, she said, "I'm not gonna get into it tit for tat; that's just not my style."
