Ian
Administrator
If you've been looking at buying a new monitor or graphics card, you're probably aware of the raging FreeSync vs G-Sync battle. NVIDIA and AMD each have their own method of avoiding graphics "tearing" during high-FPS games, but you'll need to buy matching, compatible hardware for either to work. Techspot takes a look at the current state of things in 2017:
Adaptive sync display technologies from Nvidia and AMD have been on the market for a few years now, but only recently have they gone more mainstream, with gamers taking the plunge thanks to a generous selection of monitors across a wide range of budgets. Initially, Nvidia’s G-Sync and AMD’s FreeSync differed significantly in their implementation and user experience, but now that both technologies and their ecosystems have matured, it’s a good opportunity to revisit them and see where the differences lie in mid-2017.
For those who haven’t been keeping up with adaptive sync, here’s a quick refresher on what it brings to the table. Traditional monitors (without adaptive sync) have a fixed refresh rate: the display updates its image at the same interval regardless of what your PC is doing. For a 60 Hz monitor, this means the image is always updated every 1/60th of a second, roughly every 16.7 ms.
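To make the timing mismatch concrete, here's a minimal Python sketch, not from the article and purely illustrative, using made-up frame render times. It simulates a fixed 60 Hz display sampling whichever frame is newest at each refresh tick: when the GPU hasn't finished a new frame in time, the previous image repeats (visible stutter); an adaptive sync display instead refreshes exactly when each frame completes, so every refresh shows a new, whole frame within the panel's supported range.

```python
# Illustrative only: compares a fixed 60 Hz refresh against adaptive sync
# for a hypothetical stream of variable frame render times.

REFRESH_HZ = 60
REFRESH_INTERVAL = 1.0 / REFRESH_HZ  # ~16.7 ms between fixed refresh ticks

# Hypothetical per-frame render times in seconds (GPU output varies).
frame_times = [0.012, 0.020, 0.017, 0.030, 0.014, 0.016, 0.022, 0.015]

# When does each frame finish rendering?
completion_times = []
t = 0.0
for ft in frame_times:
    t += ft
    completion_times.append(t)

# Fixed refresh: the display ticks on its own schedule, regardless of the GPU.
ticks = []
tick = REFRESH_INTERVAL
while tick <= completion_times[-1]:
    ticks.append(tick)
    tick += REFRESH_INTERVAL

repeats = 0
last_shown = -1
for tick in ticks:
    # Newest frame that finished at or before this refresh tick.
    ready = [i for i, ct in enumerate(completion_times) if ct <= tick]
    newest = ready[-1] if ready else -1
    if newest == last_shown:
        repeats += 1  # no new frame ready: the old image repeats (stutter)
    last_shown = newest

print(f"Fixed {REFRESH_HZ} Hz: {len(ticks)} refreshes, {repeats} repeated frames")

# Adaptive sync: the display waits for each frame, so refreshes line up
# one-to-one with completed frames and nothing repeats or tears.
print(f"Adaptive sync: {len(frame_times)} refreshes, 0 repeated frames")
```

The same mismatch is what causes tearing when vsync is off: a frame completes partway through a scanout, so the display shows parts of two different frames at once. Adaptive sync removes the fixed tick entirely, which is why both G-Sync and FreeSync eliminate tearing and stutter in one stroke.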
Read the rest here:
https://www.techspot.com/article/1454-gsync-vs-freesync/