>>1557875
Jokes aside, there's really no reason to go to 4k. Most of the content out right now uses 1k resolution textures and then a separate upscaling pass to fake being 4k. Many if not all of the "4k mods" that fans make for games use the same process. Games are still mostly made with console play in mind, and the consoles struggle to reach 4k, so there isn't much point.
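To be clear about what that upscaling pass actually does, here's a minimal sketch in Python: nearest-neighbor scaling of a tiny 2x2 "texture" to 4x4. Real games and mods use fancier filters (bicubic, AI upscalers), but the principle is the same, and the key point holds: no new detail is created, the source pixels just get spread over more screen pixels.

```python
def upscale_nearest(texture, factor):
    """Scale a 2D grid of pixel values by an integer factor.

    Nearest-neighbor: each output pixel just copies the closest
    source pixel, so a 1k texture shown at 4k carries no extra detail.
    """
    return [
        [texture[y // factor][x // factor]
         for x in range(len(texture[0]) * factor)]
        for y in range(len(texture) * factor)
    ]

tex_small = [[1, 2],
             [3, 4]]
tex_big = upscale_nearest(tex_small, 2)
# Each source pixel is duplicated into a 2x2 block:
# [[1, 1, 2, 2],
#  [1, 1, 2, 2],
#  [3, 3, 4, 4],
#  [3, 3, 4, 4]]
```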
Now if you're doing illustrator/photoshop or other general art/printing work as a hobby or for work, you might want to consider a build for 4k.
Right now the RX 9070 (that's AMD's card, not an RTX) is considered the top of the line mostly because it comes with such a high amount of VRAM, which is used for shaders and higher-resolution texture loading. If you intend to play a lot of VR games or want to do Blender/3D modeling, that VRAM is almost mandatory. You can kind of think of it as RAM for your GPU. I personally don't recommend switching from Nvidia to AMD unless you plan on doing a full fresh install of Windows and a clean driver removal, but that's probably your best bet for pure future proofing.
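To illustrate why VRAM capacity matters so much, here's a toy sketch that treats VRAM as a fixed-size cache of textures: when it fills up, something gets evicted and has to be re-streamed later, which is what you feel as a stutter. The capacities and texture sizes below are made up purely for illustration.

```python
from collections import OrderedDict

class VramCache:
    """Toy model of VRAM as an LRU cache of textures (sizes in MB)."""

    def __init__(self, capacity_mb):
        self.capacity = capacity_mb
        self.used = 0
        self.textures = OrderedDict()  # name -> size, oldest first
        self.evictions = 0

    def load(self, name, size_mb):
        if name in self.textures:
            self.textures.move_to_end(name)  # already resident, cheap
            return
        # Not resident: evict least-recently-used textures until it fits.
        while self.used + size_mb > self.capacity and self.textures:
            _, freed = self.textures.popitem(last=False)
            self.used -= freed
            self.evictions += 1  # each eviction = a re-stream later
        self.textures[name] = size_mb
        self.used += size_mb

small = VramCache(12_000)  # hypothetical 12 GB card
big = VramCache(16_000)    # hypothetical 16 GB card
for frame in range(3):
    for i in range(70):    # ~14 GB working set of 200 MB textures
        small.load(f"tex{i}", 200)
        big.load(f"tex{i}", 200)
# The bigger card holds the whole set and never evicts; the smaller
# one thrashes, re-streaming textures every single frame.
```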
Alternatively, the RTX 5070 only comes with 12GB of VRAM for the same price, and its main benefit is that it'll use AI to keep your frame rate high even when your actual hardware begins to struggle. This leads to a problem that I'm going to refer to as "fake frames." It's especially noticeable when an object is moving fast: since the AI doesn't know where the object will definitely be next frame, it creates a halo of potential future positions around the object, which essentially makes fast objects appear to glow. It's especially annoying when the AI predicts a still object will remain still, inserting generated frames where nothing changes, which feels to the player like input lag. There have also been rumors that Nvidia will eventually lock the AI frames behind a monthly subscription, so I don't recommend this card.
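The "fake frames" problem boils down to this: a generated frame has to guess where a moving object will be. Here's a toy sketch of the simplest possible guess, linear extrapolation (just assume the object keeps its current velocity). This is not how Nvidia's actual frame generation works internally, just an illustration of why any predictor is fine during steady motion but wrong the instant the object changes direction, producing the ghosting described above.

```python
def predict_next(prev_pos, curr_pos):
    """Extrapolate a position assuming the object keeps its velocity."""
    velocity = curr_pos - prev_pos
    return curr_pos + velocity

# Steady motion: the prediction matches reality, so the generated
# frame looks correct.
positions = [0, 10, 20, 30]
guess = predict_next(positions[1], positions[2])
steady_error = abs(guess - positions[3])  # prediction is spot on

# Sudden reversal: the real object bounces back, but the generated
# frame still shows it flying forward -- the halo/ghost effect.
bounce = [0, 10, 20, 10]
bad_guess = predict_next(bounce[1], bounce[2])
bounce_error = abs(bad_guess - bounce[3])  # prediction is way off
```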
If you just want to play the latest games on 1080p ultra high, you don't need either of these cards.