Testing Edius X on AMD and Intel CPUs


    Having mentioned this a few times in different posts, I thought I’d start a very specific thread about one question: which CPU type is best for processing in Edius X?

    My intention with this post is to have Edius X users help each other to fill in the gaps with our understanding of what is actually best for pure video processing within Edius X.

    This post is very obviously weighted toward Edius X, for the simple reason that Edius X is what we will be moving forward with, and it has supposedly had architectural changes that make it different from its predecessors. That said, any testing suggested in this post is likely to be useful for other versions of Edius as well.

    To work out the differences between CPUs for this test with Edius X, we have to have a baseline, or starting reference, for comparison. To that end, my suggestion would be to use a codec, or any process, that does not require QuickSync. So basically, no use of the H.264 or H.265 codecs.

    Here’s why H.264/5 can’t be used. An Intel CPU that has QuickSync can decode H.264/5 without using the CPU cores themselves, as QuickSync is a separate co-processor to the actual CPU and is very specifically designed for various video tasks. Think of QuickSync as a dedicated video RISC processor. So, while QuickSync is a fantastic option for editing and really does help in any Edius setup that utilises it, H.264/5 on a QuickSync-enabled CPU can’t be used to gauge the true video processing ability of that CPU.

    To further explain that point: H.265 is a very demanding codec to work with, as it is very hard to decode using the CPU alone. So, when a QuickSync-enabled CPU is used with Edius X and H.265, the codec has already had a massive advantage: it has been decoded into Edius’ YUV working space (video buffer) without the CPU doing the work. At that point Edius X, or any other compatible version of Edius, basically has the entire CPU’s processing power available to manipulate its YUV frame buffer, because the very difficult job of decoding the H.265 stream into the YUV video buffer was done by QuickSync.

    Now, if we try to do the same with any AMD CPU or any Intel CPU that does not have QuickSync, the initial decoding of the H.265 codec and conversion into Edius’ YUV frame buffer will take up a large amount of CPU processing. In other words, software decoding. In some cases just this one single task of decoding a single H.265 stream may actually consume the entire CPU’s processing ability. So as we can see, if the CPU is doing the decoding of H.265 and using a lot of its processing ability to do it, the very same CPU will have a lot less CPU processing power available to it for actual video editing.
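    The distinction above can be illustrated in miniature: work done by a co-processor shows up as elapsed (wall) time but barely registers as CPU time, whereas software decoding consumes both in roughly equal measure. The Python sketch below is my own stand-in illustration, not actual video decoding: a pure-CPU loop plays the role of software decode, and a sleep plays the role of work offloaded to hardware.

```python
import time

def cpu_bound(n):
    # Stand-in for software decoding: pure CPU work.
    s = 0
    for i in range(n):
        s += i * i
    return s

def offloaded(seconds):
    # Stand-in for hardware-accelerated decoding: the CPU mostly waits.
    time.sleep(seconds)

def cpu_share(fn, *args):
    """Fraction of elapsed time the task actually spent on the CPU."""
    w0, c0 = time.perf_counter(), time.process_time()
    fn(*args)
    w1, c1 = time.perf_counter(), time.process_time()
    return (c1 - c0) / (w1 - w0)

# The CPU-bound task's share is close to 1.0; the offloaded task's is near 0.
print(f"software-style task CPU share:  {cpu_share(cpu_bound, 2_000_000):.2f}")
print(f"offloaded-style task CPU share: {cpu_share(offloaded, 0.5):.2f}")
```

    This is exactly why a QuickSync decode leaves nearly the whole CPU free for the frame-buffer work, while a software decode does not.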

    With that simplified version of how Edius and QuickSync work, we can immediately see why a QuickSync enabled Intel CPU may initially appear to be a lot more powerful than an AMD CPU. And that’s because the AMD CPU has to decode the H.265 stream as well as edit it, all within CPU.

    Now, at this point I have to be completely honest about my understanding here, as I can’t find the information from GV. If we take QuickSync out of the equation, are AMD and Intel CPUs seen as basically the same thing by Edius X? By that I mean: does Edius X simply see any architecturally compatible CPU as just a CPU, i.e. just a piece of computer technology that’s available to it for the internal processing of its YUV video buffer?

    Now let’s assume the answer is yes: Edius X just sees any compatible CPU as a reservoir of processing power to use as it sees fit. Then all we need is a baseline we can use to test various CPUs, AMD or Intel, and see whether factors such as core/thread count, clock speed, single-thread speed, multi-CPU setups etc. make any difference to real-time editing/timeline processing abilities.
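    As a sketch of what each participant’s submission could record, the hypothetical Python harness below logs the CPU model and logical core count alongside the time for a fixed single-thread workload. The workload and field names are my own invention for illustration, not an agreed standard; the real test would of course be timing Edius itself.

```python
import os
import platform
import time

def single_thread_score(iterations=2_000_000):
    # Fixed integer workload; a lower time means faster single-thread speed.
    t0 = time.perf_counter()
    acc = 0
    for i in range(iterations):
        acc = (acc + i * 31) & 0xFFFFFFFF
    return time.perf_counter() - t0

def system_report():
    # One comparable record per tester's machine (hypothetical schema).
    return {
        "cpu": platform.processor() or platform.machine(),
        "logical_cores": os.cpu_count(),
        "single_thread_seconds": round(single_thread_score(), 3),
    }

print(system_report())
```

    The point is only that every tester reports the same fields, measured the same way, so results across AMD and Intel systems line up.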

    This is why the baseline or test codec can’t be H.264/5. What we need here is a codec that is decoded entirely in software, on the CPU. An uncompressed YUV source would be ideal, but that is probably too impractical if a number of people are willing to test their various systems. For such testing I’d be inclined to suggest HQX, as this codec is so light on the system that it would free up the CPU to be used for actual editing/video processing.

    Another reason for HQX, or any common software codec, is that any number of people who are prepared to test such things can all easily generate the exact same file. Let’s say, for instance, that the test file was a one-minute HQX file maxed out at 10-bit UHD 29.97. Then anyone can generate a project with those parameters, add any footage to it (it doesn’t matter if it’s not the same), then export the timeline. This way the exported file will have the same encode/file characteristics regardless of how it looks. With such a baseline file, comparisons across systems can be made.
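    One hypothetical way to keep everyone’s exported baseline honest would be to write the agreed parameters down once and check each participant’s file details against them. The field names, values and helper below are illustrative only; the actual parameters would be whatever the thread agrees on.

```python
# Hypothetical agreed baseline, taken from the suggestion in this thread:
# one minute of HQX, 10-bit UHD at 29.97.
AGREED = {
    "codec": "HQX",
    "width": 3840,
    "height": 2160,
    "fps": "29.97",
    "bit_depth": 10,
    "duration_s": 60,
}

def matches_baseline(report):
    """Return the list of parameters that differ from the agreed baseline."""
    return [k for k, v in AGREED.items() if report.get(k) != v]

# Example: a participant's (hypothetical) readout of their exported file.
mine = {"codec": "HQX", "width": 3840, "height": 2160,
        "fps": "29.97", "bit_depth": 10, "duration_s": 60}
print(matches_baseline(mine))  # an empty list means the parameters match
```

    A mismatch on any field (say, an 8-bit export) would flag that result as not comparable with the rest.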

    I’ll pause at that point. If there are enough other users out there interested in knowing the results of such tests and who would want to join in, please use this post to discuss the test method etc. I’ve only made my suggestions as suggestions; if there’s a better way to do this that’s easier, more efficient, or more beneficial to others, then I’d definitely be up for taking part.

    Enough people have complained about a number of things to do with Edius X, specs etc., me included. We have obviously not got the answers from where we should be getting them, so if we’re not going to help ourselves..........

    Cheers,
    Dave.


    "There's only one thing more powerful than knowledge. The free sharing of it"

    If you don't know the difference between Azimuth and Asimov, then either your tapes sound bad and your Robot is very dangerous. Kill all humans...... Or your tape deck won't harm a human, and your Robot's tracking and stereo imaging is spot on.

    Is your Robot three laws safe?

  • #2
    What do you want to do Dave? I have created a small 1 min file HQX of a red background with a Quick title on top. UHD 29.97 10bit. As expected it plays fine on the timeline full buffers immediately.
    Ron Evans

    Threadripper 1920 stock clock 3.7, Gigabyte Designare X399 MB, 32G G.Skill 3200CL14, 500G M.2 NVME OS, 500G EVO 850 temp. 1T EVO 850 render, 6T Source, 2 x 1T NVME, MSI 1080Ti 11G , EVGA 850 G2, LG Blu-ray Burner, BM IP4K, WIN10 Pro, Shuttle Pro2

    ASUS PB328 monitor, BenQ BL2711U 4K preview monitor, EDIUS X, 9.5 WG, Vegas 18, Resolve Studio 17


    Cameras: GH5S, GH5, FDR-AX100, FDR-AX53, DJI OSMO Pocket, Atomos Ninja V x 2



    • #3
      Originally posted by Ron Evans View Post
      What do you want to do Dave? I have created a small 1 min file HQX of a red background with a Quick title on top. UHD 29.97 10bit. As expected it plays fine on the timeline full buffers immediately.
      The idea is to see what the consensus is for such a test. I'm only suggesting HQX as it is CPU-bound and easy to create. Once there's an agreed codec, there'd have to be something meaningful agreed as to how to test the HQX file. For instance, there could be a number of Edius CPU filters to test on the HQX file, filters that everyone would have access to. Once there's a standard and a control element for such testing, it'd be a simple process of people logging their results for their CPU. Amongst those results a pattern will naturally occur that indicates what CPUs are doing what.

      With such information, and knowing the CPU type, core/thread count, single/multi-CPU configuration, core/thread utilisation etc., this will indicate the best CPUs purely for their absolute processing abilities within Edius and its YUV frame buffer.
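      Once results are logged, a simple aggregation would surface that pattern. The Python sketch below groups hypothetical results by CPU and ranks them by mean render time; the CPU names, counts and timings are invented placeholders, not real measurements.

```python
import statistics
from collections import defaultdict

# Hypothetical result log: (cpu_model, cores, threads, seconds_to_render).
# Real rows would come from testers posting their timings in the thread.
results = [
    ("CPU A", 8, 16, 41.2),
    ("CPU A", 8, 16, 40.8),
    ("CPU B", 12, 24, 33.5),
    ("CPU B", 12, 24, 34.1),
]

def summarise(rows):
    """Group timings by CPU and return (model, cores, threads, mean), fastest first."""
    by_cpu = defaultdict(list)
    for model, cores, threads, secs in rows:
        by_cpu[(model, cores, threads)].append(secs)
    return sorted(
        ((model, cores, threads, statistics.mean(times))
         for (model, cores, threads), times in by_cpu.items()),
        key=lambda r: r[3],
    )

for model, cores, threads, mean_secs in summarise(results):
    print(f"{model}: {cores}C/{threads}T, mean {mean_secs:.1f}s")
```

      With enough rows, sorting like this against core count and clock speed is where the "which CPU is doing what" pattern would show itself.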

      To be clear: such testing won't help anyone, right now, who is bound to H.264/5 as their sole source of video files, as such users will always be better off with QuickSync. Although, as I said earlier, if GV ever gives Edius X the ability to decode H.264/5 with a discrete GPU, which I do believe will happen, then the information from the collaborative test I'm suggesting will become very critical for all Edius users.

      I say that because right now it may actually be that AMD CPUs are the best option for pure YUV manipulation, but the vast majority of users won't be using AMD because of QuickSync and H.264/5. However, if discrete GPU decoding becomes an option for H.264/5 and is paired with AMD CPUs (assuming AMD is the better choice for YUV processing), then I'm sure many people would be prepared to see such a system as a better option than Intel and QuickSync.

      I'd imagine that this scenario is quite likely, and a lot more likely than us seeing a fully GPU-optimised Edius. Or at the very least, the scenario I'm suggesting, discrete GPU codec decoding paired with the most powerful CPU for YUV processing, is likely to happen first even if total GPU optimisation becomes a reality.

      Like I said, others may have a better solution for testing than what I've suggested, and I also think that the more input from different people, the better. I'm prepared to go with the flow on this; it's more important that there are enough people who are interested in trying this test out and who have an interest in having certain questions answered.

      Depending on what testing is done and how much is done, I'm sure there'll be a good few critical questions that can be answered.

      Let's face it: right now there are quite a few of us on this forum who know a massive amount about Edius and what it does, but I can guarantee you that there are a few questions about Edius X that none of us know the answers to.
