RTX 2080, 2070 and 2060 to get more CUDA cores, faster memory
According to fresh leaks, it appears that Nvidia will launch a full refresh of its RTX series in mid-July, giving all cards a slight bump in CUDA core count and, in some cases, more or faster 16Gbps memory.
GeForce GTX 1650, GeForce GTX 1660, and GeForce GTX 1660 Ti
Nvidia, named after a Roman vengeance demon, has launched a new family of more budget-friendly Turing graphics chips for gaming laptops.
Including Pascal and non-RTX Turing GPUs
Nvidia has released its latest GeForce 425.31 Game Ready driver, which not only includes fixes and optimizations for Ubisoft's Anno 1800 but also enables DXR support for Pascal, Titan V, and non-RTX Turing graphics cards, from the GTX 1060 6GB and up.
AMD Zen+ Picasso APU with Turing-based GTX 1660 Ti
According to leaks spotted in the 3DMark database, it appears that Asus is looking to pair AMD's Picasso APUs with Nvidia's Turing-based GTX 1660 Ti GPU in a couple of its laptops.
No GDDR6 for GTX 1660
The latest leak, showing a couple of upcoming GTX 1660 graphics cards from various partners, has all but confirmed the March 14th launch date, as well as the fact that the GTX 1660 will ship with GDDR5 memory.
More details revealed
The Nvidia GeForce GTX 1660 Ti has been spotted at Russian retailers and e-tailers, revealing a few more details about its actual specifications as well as a general idea of pricing.
Should launch in February
It appears that Nvidia is gearing up for the launch of its first non-RTX Turing-based graphics card, as the GTX 1660 Ti has been spotted in the Ashes of the Singularity benchmark.
Turing meets the GTX series
According to rumors floating around the net, Nvidia is preparing a real successor to the GTX 1060, the Turing-based GTX 1660 Ti.
Coming later this month for a cool $2,499
Just as teased earlier, Nvidia has officially announced the new Titan RTX graphics card, delivering an impressive 130 Tensor TFLOPs of performance and packing 24GB of GDDR6 memory.
Great that they can do Ray Tracing
Graphics experts close to the matter have shared an interesting thesis that helps explain how Nvidia justified the transistors spent on Turing's RT cores: it is all about AI and next-generation data workloads.