Conclusion

"Watch_Dogs 2" is a significant improvement in gameplay over the original Watch_Dogs. You play a hacktivist in San Francisco who has to team up with his buddies in DedSec to stop whatever ctOS 2.0 is doing with the data it's collecting from all citizens in the city.

The new Disrupt 2.0 engine is a revamped version of the engine that powered the original. Unfortunately, it still uses DirectX 11, but given how poor the track record of DirectX 12 games has been so far, this might not be a bad thing. Graphics-wise, the game looks great, with good visual detail in all scenes. Only flat areas like streets look a bit too flat, lacking some geometric detail. An optional high-res texture pack is available as a free download, and the game settings provide plenty of dials to adjust performance. There is no frame-rate (FPS) cap, and the field of view can be adjusted up to 110°, both of which are certainly welcome in the PC-gaming arena.

Performance on AMD Radeon graphics cards is a little lower than what we would expect, but this is no surprise given that Watch_Dogs 2 bears an NVIDIA "The Way It's Meant To Be Played" badge, which suggests NVIDIA helped Ubisoft in the development of this game. We used the GeForce 376.09 WHQL and Radeon Software Crimson Edition 16.11.5 drivers in this article. Both companies delivered game-optimized drivers yesterday, which, especially in AMD's case, shows that the company is on track with providing timely driver updates to gamers for new titles.

Overall performance is decent but could be better optimized: Watch_Dogs 2 certainly doesn't look as good as Battlefield 1, for example, which runs at higher FPS across the board. VRAM usage is very reasonable, especially without the optional high-res texture pack. With its details maxed out, Watch_Dogs 2 runs comfortably at 1080p on most graphics cards above $200. The GTX 1070 Ti falls into the perfect gap between the GTX 1070 and the GTX 1080.
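The 110° figure above is, like most in-game FOV sliders, presumably a horizontal field of view. A minimal sketch of what that translates to vertically on a standard 16:9 display, using the usual tangent relation (the assumption that the slider is horizontal and the aspect ratio is 16:9 is ours, not stated by the game):

```python
import math

def vertical_fov(h_fov_deg: float, aspect_w: float = 16, aspect_h: float = 9) -> float:
    """Convert a horizontal FOV in degrees to the matching vertical FOV
    for the given display aspect ratio."""
    h = math.radians(h_fov_deg)
    v = 2 * math.atan(math.tan(h / 2) * (aspect_h / aspect_w))
    return math.degrees(v)

# The game's 110° maximum works out to roughly 77.5° vertically on 16:9.
print(f"{vertical_fov(110):.1f}")
```

The same function shows why ultrawide (21:9) players push for even higher horizontal caps: the vertical FOV is what stays constant across aspect ratios in most engines.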
The basic specification difference between the two beasts, as shown on Nvidia's website: both graphics cards run on the Pascal architecture, have 8 gigabytes of GDDR5 memory, and share a boost clock of 1683 MHz. While everything else is about the same, the GTX 1070 Ti has 512 more CUDA cores than its predecessor.

While specifications are fun and all, they may not be as telling as they seem. So, to further answer the question, we jump to the benchmarks.

GTX 1070 vs GTX 1070 Ti Performance Difference:

According to the benchmarks, the 1070 Ti easily overtakes its predecessor by an impressive 15%. The average benchmarks of absolutely random people on the internet (that you should obviously trust) show that the 1070 Ti beats the 1070 in absolutely every task, except the few where the 1070 matches its newer counterpart on its luckiest days. You can check the exact benchmark differences here.

GTX 1070 vs GTX 1070 Ti Price Difference:

The prices of the cards in question are quite something for an average user to afford. The original MSRPs are shown below: the GTX 1070 Founders Edition is quoted at $399, and the GTX 1070 Ti Founders Edition at $449.

A lot of you might be wondering: $50 is not a huge price difference between the two cards. Comparing it with the performance difference only confirms that impression; Nvidia's marketing department has priced the cards comparably. The only thing that limits you is the depth of your pockets. The GTX 1070 is quite easily available at most stores by now, while the GTX 1070 Ti is still at the pre-order stage and will ship in December.

The Final Verdict:

Both the GTX 1070 and its Ti counterpart are great cards. What remains is what you're willing to spend and the kind of work you need out of them. If you're going to game at 1080p, you would not feel much of a difference, as both easily deliver 60+ FPS in even the most demanding games, now and in the near future. But if you're gaming at 1440p or 2160p, I would recommend the 1070 Ti (or even a 1080 Ti, if your pockets are as deep as the Grand Canyon) for more future-proofing and the performance difference at the higher resolutions.
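The numbers scattered through the comparison (512 extra CUDA cores, a ~15% benchmark lead, and the $399 vs $449 MSRPs) can be pulled together into a quick back-of-the-envelope value check. A sketch, taking the GTX 1070's 1920 CUDA cores from NVIDIA's published specs and the 15% figure as the article's benchmark average:

```python
# Back-of-the-envelope GTX 1070 vs GTX 1070 Ti comparison.
cuda_1070 = 1920                            # NVIDIA's published spec
cuda_1070ti = cuda_1070 + 512               # article: 512 more CUDA cores
msrp_1070, msrp_1070ti = 399, 449           # Founders Edition MSRPs

core_uplift = cuda_1070ti / cuda_1070 - 1   # ~26.7% more shader cores
measured_uplift = 0.15                      # ~15% average benchmark gain
price_uplift = msrp_1070ti / msrp_1070 - 1  # ~12.5% price premium

# Performance per dollar at MSRP, normalised so the GTX 1070 = 1.0.
value_1070ti = (1 + measured_uplift) / (1 + price_uplift)

print(f"{core_uplift:.1%} more cores, {price_uplift:.1%} higher price")
print(f"1070 Ti perf-per-dollar vs 1070: {value_1070ti:.3f}x")
```

Two things fall out: the real-world gain (~15%) is well short of the ~27% shader-count increase, since clocks and memory bandwidth are essentially unchanged; and at MSRP the two cards land within a couple of percent of each other in performance per dollar, which matches the article's point that Nvidia priced them comparably.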