NVIDIA's GPUs don't have 4:2:2 decode, but the iGPU in Intel CPUs does. It's the iGPU (a separate graphics controller built into the non-F-series CPUs) that does the decoding/encoding, not the CPU itself. This doesn't use the regular CPU cores but silicon set aside specifically for these operations, and in some cases it is as quick as, or quicker than, the encoders found on NVIDIA cards.

Eugenia Loli wrote:
Thank you for the replies, but I'm not a Resolve newbie. I need GPU acceleration; that's why I made this post in the first place. CPU decoding is way too slow for 4K video like this, and that's before adding any plugins. Telling me "the CPU can do it" is not helpful to me or to anyone else. I'm specifically looking for hardware-accelerated 4:2:2 10-bit 4K HEVC decoding, not offloading it to the CPU.

So, to answer my own question: the Intel ARC 770 is not that good with Resolve. While it does support 4:2:2 10-bit HEVC decoding in hardware (and its encoding/decoding speed is on par with NVIDIA's), it lacks other general GPU-compute capabilities that are extremely important when working in Resolve. The moment you color grade, use Fusion, or run certain other functions and plugins, it performs up to 2.5x WORSE than my seven-year-old NVIDIA GPU. As long as you only want video encoding/decoding and 3D gaming, you're fine with an Intel ARC GPU; the same goes for Stable Diffusion AI generation and 3D apps like Blender and Maya. But when it comes to general computation, the menial workhorse stuff, Intel's architecture has gaping implementation holes.

So I have decided to go with an Apple Mac Studio, loaded with 64 GB of RAM. It will do everything I need at good-enough speeds, with 4:2:2 hardware decoding.
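If you want to sanity-check whether your own machine really hardware-decodes this kind of footage before buying anything, one rough way to test outside Resolve is to time ffmpeg decoding the same clip with and without a hwaccel. Below is a minimal sketch, not anything from the thread itself: it assumes ffmpeg is on your PATH and was built with the relevant hwaccel (`qsv` on Intel, `videotoolbox` on macOS), and the clip filename is a placeholder for your own 4:2:2 10-bit HEVC file.

```python
"""Rough timing harness: software vs. hardware-assisted HEVC decode via ffmpeg."""
import subprocess
import time

CLIP = "sample_4k_422_10bit_hevc.mov"  # placeholder: point this at your own footage


def decode_seconds(hwaccel=None):
    """Decode CLIP to a null sink and return wall-clock seconds taken."""
    cmd = ["ffmpeg", "-v", "warning"]  # keep warnings: ffmpeg reports hwaccel fallbacks
    if hwaccel:
        cmd += ["-hwaccel", hwaccel]   # e.g. "qsv" (Intel) or "videotoolbox" (macOS)
    cmd += ["-i", CLIP, "-f", "null", "-"]  # decode only, discard the frames
    start = time.perf_counter()
    subprocess.run(cmd, check=True)
    return time.perf_counter() - start


if __name__ == "__main__":
    sw = decode_seconds()           # pure CPU decode as the baseline
    hw = decode_seconds("qsv")      # swap in "videotoolbox" on a Mac
    print(f"software: {sw:.1f}s   hardware: {hw:.1f}s")
```

One caveat consistent with the discussion above: if the decoder doesn't actually support 4:2:2 10-bit HEVC (as on NVIDIA cards), ffmpeg typically warns and falls back to software decoding, so the two timings will come out roughly the same rather than failing outright.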