  • Why Intel is better than AMD for Edius.

    So as not to take a different post completely off topic, I’m answering and explaining a few differences here between Intel and AMD CPUs that some people may be unaware of, and also touching upon certain issues with discrete GPUs. I round off this post by mentioning something very positive which I think we will be seeing in the soon-to-be-released update for Edius. Also, all of my references to Edius are to Edius X, and the Edius workflows I’m talking about are those with H.264 and H.265 source media, as this was the basis of the previous workflow example and H.264 and H.265 are the most representative codecs used with Edius, in both professional and amateur use.

    Originally posted by noafilm View Post
    Or even a 10th gen i9, which you said will be more powerful than a 5950x. More powerful in what way? Can it handle more than 4 tracks of H.265 footage, and how many before it starts to buffer? How fast can it export to H.265? I think everyone wants to know more specifics, because right now it's just an opinion without actual facts. Having to use an Nvidia GPU seems logical because without one the 5950x won't start :) , sure you have a cost benefit here for not having to buy a GPU, but I was more interested in encoding and decoding performance. I have used an AMD card with my 1950X which showed worse performance compared to the Nvidia card I'm using now, so Nvidia cards are definitely the way to go with a Ryzen processor.
    The last Intel system I worked on that worked “properly” with Edius was a 10th Gen i9. That system was noticeably faster than your 5950x when using just straight clips or clips with the colour correction filter. Faster as in more clips in real-time. This was using H.264/5 4:2:0 25FPS sources at both 1080 and UHD.

    You have to remember that Quick Sync (QS) processing also helps with decoding; this is why even a 10th Gen Intel CPU will be better than your 5950x for Edius.
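
    If you want to sanity-check what QS decoding does on your own machine outside of Edius, one rough way is to time a Quick Sync hardware decode against a pure software decode of the same clip. The sketch below is only an illustration: it assumes an ffmpeg build with QSV support on the PATH and a hypothetical test file called clip_hevc.mp4, and it says nothing about how Edius itself drives QS.

```python
import subprocess
import time

CLIP = "clip_hevc.mp4"  # hypothetical H.265 test file

def time_decode(extra_args, label):
    """Decode the clip to a null sink and report elapsed wall-clock time."""
    cmd = ["ffmpeg", "-v", "error", *extra_args, "-i", CLIP, "-f", "null", "-"]
    start = time.time()
    subprocess.run(cmd, check=True)
    print(f"{label}: {time.time() - start:.1f}s")

# Pure software (CPU-only) decode.
time_decode([], "software decode")

# Quick Sync hardware decode; needs an ffmpeg build with QSV enabled
# and a working Intel iGPU driver.
time_decode(["-hwaccel", "qsv", "-c:v", "hevc_qsv"], "QSV decode")
```

    On a QS-capable Intel system the second run typically finishes far quicker while leaving the CPU cores mostly idle, which is the effect being described above.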

    This also brings us to another reason why the 5950x will always be a poor choice for Edius. Even if GV update Edius so that it takes account of the video decoding/encoding processing power of AMD APUs, this still won’t help you with the 5950x because it’s not an APU. In fact, in that instance there’s a chance that a much lower powered and cheaper AMD APU could be better than a 5950x in Edius.

    For instance, take a very powerful Xeon system and compare it to a less powerful and much cheaper Intel desktop system with QS, such as a 10850K. The cheaper, and technically less powerful, system will be a much better option for Edius. The same may end up being true of AMD CPUs vs AMD APUs if Edius ever takes advantage of the iGPU on AMD APUs as it does with QS on the iGPUs of compatible Intel CPUs.

    The bottom line is that a 5950x just isn’t a great choice for Edius and it will always be beaten on performance in Edius by much cheaper Intel CPUs.

    You’ve also mentioned things about the Nvidia GPUs. For a number of reasons, QS is better than NVENC, not least because earlier versions of NVENC have B-frame problems and aren’t as efficient as QS. There’s also the consideration of GV’s implementation of NVENC and how it differs from the way others use NVENC in their post applications.
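
    For anyone who wants to compare the two encoders directly rather than take either of our words for it, the simplest test is to push the same source through both hevc_qsv and hevc_nvenc and compare the time and file size (and then judge the pictures by eye). This is just a sketch, assuming an ffmpeg build that exposes both encoders and a hypothetical source.mp4; it does not reproduce GV's own NVENC implementation.

```python
import os
import subprocess
import time

SOURCE = "source.mp4"  # hypothetical test clip

def encode(encoder, outfile):
    """Encode the source with the given hardware encoder at a fixed bitrate."""
    cmd = ["ffmpeg", "-y", "-v", "error", "-i", SOURCE,
           "-c:v", encoder, "-b:v", "20M", "-an", outfile]
    start = time.time()
    subprocess.run(cmd, check=True)
    size_mb = os.path.getsize(outfile) / 1e6
    print(f"{encoder}: {time.time() - start:.1f}s, {size_mb:.1f} MB")

encode("hevc_qsv", "out_qsv.mp4")      # Intel Quick Sync encoder
encode("hevc_nvenc", "out_nvenc.mp4")  # Nvidia NVENC encoder
```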

    Don’t forget, you “have to” use an Nvidia GPU or you won’t be doing much at all with Edius and your 5950x. Given that you need at least some sort of GPU just to be able to switch on your PC, you are immediately at a disadvantage compared to most other Edius users. You’ve obviously got your reasons for choosing such a hardware setup, but I doubt you’ll find many Edius users who would knowingly and willingly buy something that simply isn’t as suited to Edius as a single Intel CPU/iGPU combination, which will also be a lot cheaper and more powerful.

    Given that the vast majority of Edius users won’t see any advantage whatsoever from using an Nvidia GPU, it makes sense not to have to buy something you don’t need. In which case a compatible Intel CPU, which is also the best performing CPU, coupled with QS, is going to be all that is needed for the vast majority of Edius use. This is also how Canopus/GV have historically designed Edius to run: as a CPU based post application which later took account of QS, and one could argue that QS is still a CPU asset anyway. Maybe this changes in the future, but even if it does it won’t benefit your 5950x.

    Let’s not forget just how cost prohibitive any discrete GPU is right now, let alone Nvidia GPUs, and the general industry consensus is that we are going to have to wait at least 2 years to see anything change with regard to the silicon shortages and before we get back to any “normality” with prices. Personally, I think the current position we are in with GPUs and many other processor based items is our future norm: prices will always be higher than we used to pay and, by comparison, shortages will always be a thing.

    So with that in mind, and bearing in mind that the least anyone would want from an Nvidia GPU is NVENC, and that to be up to date with NVENC you’d need at least Turing, you would have to pay what can only be described as a huge amount of money just to get H.265 encoding, not even decoding. There’s also the fact that Edius does not scale directly with Nvidia GPUs as far as its very limited set of compatible processing tasks is concerned. So buying into any AMD CPU right now, let alone the 5950x, is an extremely poor choice, not just for performance but also for your wallet.

    Now, talking about the future. Pat recently alluded to the fact that some Edius users are going to be happy with the next update. We can all hold our breath and hope this means full discrete GPU utilisation. However, even if that isn’t the case, I think it’s a solid bet that we now get full DG1 and DG2 compatibility. That being the case, we are likely to see 11th and 12th Gen Intel CPUs taking a hefty leap in QS processing performance with Edius. This will also force the 5950x further down the food chain, and given that 11th Gen CPUs have now become cheaper, they could become the sweet spot for Edius. And let’s not second guess what may be about to happen with Edius and 12th Gen/DG2 if this new update is what we think it’s going to be, but this could be the biggest boost in Edius performance that we’ve ever seen.

    In one of my many video examples of Edius, I made a video showing some very odd behaviour with DG1 on an 11th Gen NUC, basically Xe graphics. And I say odd in a very good way. Not only was the QS processing noticeably better than the UHD 750 etc. or anything on desktop, but there was also an effect on the colour processing. It was never explained why this oddity occurred; nonetheless, it demonstrated some very beneficial behaviour with Edius as far as processing with Intel graphics is concerned, and at least hinted that there are benefits beyond just decoding and encoding H.264/5 with QS.

    Taking that last point a step further, I’d guess that if GV can leverage any further performance enhancements for Edius from already available technology, it’s more likely to be Intel based as opposed to Nvidia or AMD. So, while we are all asking for GPU acceleration for decoding and timeline/FX processing with Nvidia etc., which I suspect will never happen, or at least not to the degree that we would all want, having seen such benefits in other post applications, I’ve got a feeling that we will see some very noticeable performance enhancements for Edius with DG1 and DG2.

    In fact, just to round off this post and to explain why Intel CPUs will always be a better bet for Edius compared to AMD, and why a 5950x is not only a poor choice but quite possibly the poorest choice, I’m going to well and truly nail my colours to the mast and say that the next Edius update is definitely going to give us GPU optimisation, which is likely to be Intel based for their iGPUs and the soon-to-be-released DG2 Xe-HPG based discrete GPUs. And, in the very unlikely event, maybe the same for Nvidia and/or AMD, which would mean that everyone is a winner.
    Last edited by Liverpool TV; 11-24-2021, 05:15 AM.

    "There's only one thing more powerful than knowledge. The free sharing of it"


    If you don't know the difference between Azimuth and Asimov, then either your tapes sound bad and your Robot is very dangerous. Kill all humans...... Or your tape deck won't harm a human, and your Robot's tracking and stereo imaging is spot on.

    Is your Robot three laws safe?

  • #2
    Originally posted by Liverpool TV View Post

    The last Intel system I worked on that worked “properly” with Edius was a 10th Gen i9. That system was noticeably faster than your 5950x when using just straight clips or clips with the colour correction filter. Faster as in more clips in real-time. This was using H.264/5 4:2:0 25FPS sources at both 1080 and UHD.
    You still haven't answered my question: how many tracks can that "10th Gen i9" handle in real time in a multicam setting, and what type of processor was it exactly? What time did it need to export? Please give more specifics, because just saying "mine is faster than yours" doesn't mean anything. Not saying you are wrong, but you need more specific test data to compare, and that will make it much clearer for anyone buying a new PC what to expect.



    • #3
      Originally posted by noafilm View Post
      You still haven't answered my question: how many tracks can that "10th Gen i9" handle in real time in a multicam setting, and what type of processor was it exactly? What time did it need to export? Please give more specifics, because just saying "mine is faster than yours" doesn't mean anything. Not saying you are wrong, but you need more specific test data to compare, and that will make it much clearer for anyone buying a new PC what to expect.
      I already answered your initial question: the 10850K is more powerful than your 5950x and is a much better CPU for typical editing scenarios in Edius. It really is that simple.

      On the point of being clear, it’s something you should try. There’s a lot of information missing in what you are talking about; you’re not mentioning specifics about the projects and timeline setups. Basically, what you are describing doesn’t really mean anything because of the lack of other information.

      If you want to do things properly, then make a video showing the project setup and properly demonstrate exactly what you are doing with whatever media and whatever filters etc. you are using. Just as you’ve said, I’m not saying you’re wrong, but you need to be more specific about your test data, and the best way to do that is to make a video of it. Let’s face it, you didn’t even mention what codec you were using until you were prompted, so who knows what other info you may have gotten wrong or been incomplete about?



      "There's only one thing more powerful than knowledge. The free sharing of it"


      If you don't know the difference between Azimuth and Asimov, then either your tapes sound bad and your Robot is very dangerous. Kill all humans...... Or your tape deck won't harm a human, and your Robot's tracking and stereo imagining is spot on.

      Is your Robot three laws safe?

      Comment


      • #4
        You have not mentioned any specifics; I have in previous posts, but I'll just leave it here since you are so easily triggered and I know from the past that these types of discussions with you lead nowhere.



        • #5
          Originally posted by noafilm View Post
          You have not mentioned any specifics; I have in previous posts, but I'll just leave it here since you are so easily triggered and I know from the past that these types of discussions with you lead nowhere.
          Like I’ve already said, you’ve not given anywhere near enough information to validate anything you’ve said, and given that you’re prone to missing out critical information, you really do need to show what it is you are trying to achieve. A detailed video would be a good way of doing this.


          "There's only one thing more powerful than knowledge. The free sharing of it"


          If you don't know the difference between Azimuth and Asimov, then either your tapes sound bad and your Robot is very dangerous. Kill all humans...... Or your tape deck won't harm a human, and your Robot's tracking and stereo imagining is spot on.

          Is your Robot three laws safe?

          Comment


          • #6
            The key question is whether there is a need to use the iGPU. EDIUS was originally developed around CPU decoding, which makes it difficult for users to choose a GPU. Remember, EDIUS is not the only NLE, and it is expensive; most functions are still where they were in EDIUS 6.
            Other NLEs utilise any GPU more efficiently than EDIUS.

            Yes, if the H.265 format is used and the Intel CPU contains an iGPU, the performance is much better. H.264 is not a problem for processors with AVX2.
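
            If you are not sure whether your CPU has AVX2 for the software H.264 path, one quick way to check from Python is the py-cpuinfo package; this is only a convenience sketch, and the same information is on the CPU maker's spec page.

```python
# pip install py-cpuinfo
import cpuinfo

info = cpuinfo.get_cpu_info()
flags = info.get("flags", [])

print("CPU:", info.get("brand_raw", "unknown"))
print("AVX2:", "yes" if "avx2" in flags else "no")
```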
            However, in any case in EDIUS, i9 9980X performance is significantly worse than that of the R9 5950X; both CPUs are in my house. If an AMD processor integrated an iGPU like Intel's, the AMD processor would be invincible.
            The iGPU will be independent next year. I'm waiting for the AMD 6950X, and I hope to find a girlfriend before EDIUS supports AMD and NVIDIA GPU decoding; I don't know if there is hope in this life.

            Intel launched a 6950X in 2016, while AMD will release a 6950X next year. That's an interesting name.

            China's Guangdong TV station also said that the performance of AMD CPUs is significantly better than Intel Xeon Gold, because they use MXF or ProRes formats, and there is no difference whether they have an iGPU or not. This is the data provided by ivideostar.
            Last edited by [email protected]; 11-24-2021, 12:08 PM.
            CPU:AMD R9 5950X GPU:GTX1050 MEM: Micron 4G DDR4 2400x4
            motherboard:ASrock X570 matx
            SSD:intel p3600 sata HDD: HGST 8Tx5 raid0
            Power:Great Wall EPS2000BL 2000W
            OS WIN10 20H2



            • #7
              Originally posted by [email protected] View Post
              The key question is whether there is a need to use the iGPU. EDIUS was originally developed around CPU decoding, which makes it difficult for users to choose a GPU. Remember, EDIUS is not the only NLE, and it is expensive; most functions are still where they were in EDIUS 6.
              Other NLEs utilise any GPU more efficiently than EDIUS.

              Yes, if the H.265 format is used and the Intel CPU contains an iGPU, the performance is much better. H.264 is not a problem for processors with AVX2.
              However, in any case in EDIUS, i9 9980X performance is significantly worse than that of the R9 5950X; both CPUs are in my house. If an AMD processor integrated an iGPU like Intel's, the AMD processor would be invincible.
              The iGPU will be independent next year. I'm waiting for the AMD 6950X, and I hope to find a girlfriend before EDIUS supports AMD and NVIDIA GPU decoding; I don't know if there is hope in this life.

              Intel launched a 6950X in 2016, while AMD will release a 6950X next year. That's an interesting name.

              China's Guangdong TV station also said that the performance of AMD CPUs is significantly better than Intel Xeon Gold, because they use MXF or ProRes formats, and there is no difference whether they have an iGPU or not. This is the data provided by ivideostar.
              Hi Bob.

              Don't forget, this post is very specifically for Edius being used with H.264 and H.265 source footage. This is where the Intel processors completely dominate the AMD processors and where the 10850K and 10900K are king.

              I expect we will see some really good news with the next update for Edius X. This should make your 12th Gen Intel system a serious Edius beast.

              Cheers,
              Dave.

              "There's only one thing more powerful than knowledge. The free sharing of it"


              If you don't know the difference between Azimuth and Asimov, then either your tapes sound bad and your Robot is very dangerous. Kill all humans...... Or your tape deck won't harm a human, and your Robot's tracking and stereo imagining is spot on.

              Is your Robot three laws safe?

              Comment


              • #8
                Originally posted by Liverpool TV View Post

                Hi Bob.

                Don't forget, this post is very specifically for Edius being used with H.264 and H.265 source footage. This is where the Intel processors completely dominate the AMD processors and where the 10850K and 10900K are king.

                I expect we will see some really good news with the next update for Edius X. This should make your 12th Gen Intel system a serious Edius beast.

                Cheers,
                Dave.
                I am not satisfied with the performance of the 12th generation Intel processors. The 12th Gen P-core area is much larger than AMD's, but the performance is hardly improved, even if I turn off the E-cores. I just hope that EDIUS can keep up with the times as soon as possible and be compatible with more GPUs. That way, regardless of the choice of CPU, the performance will be the best it can be. The fight between the two CPU companies benefits consumers.

                have a good time
                Bob.


                • #9
                  I think we need to focus on what the projects are, what equipment is used and the needed output, and then choose the computing hardware and software required to complete the task. It's nice if one is starting from nothing, but clearly if someone already has cameras, a PC and software, that will bias the outcome, at least until something changes in the chain that makes the current setup ineffective. Then one needs to look at an individual's bias for things: cameras, PC etc. If things were purely based on test facts one might come to completely different conclusions, especially if cost were not involved. One needs to be very specific about cameras used, formats used, storage restrictions in the editing system etc. There is no one size fits all. There are, however, a few things to consider. If H.265 is being used then I feel there are really only two choices that are really effective: the latest Intel and the Mac M1. From the tests I think the Mac M1 is the winner; both outperform AMD solutions. Encode depends on the NLE even when using NVENC, as Dave has said. It doesn't matter whether it is EDIUS or Resolve, I still export HQX and get TMPGEnc MW7 to do the encode! I then have all the choices for audio etc., including x264 or x265.
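
                  For anyone wanting to script that same "export HQX, encode elsewhere" step without TMPGEnc, it can be done with ffmpeg, assuming a build that includes the Canopus HQX decoder and libx265. A rough sketch only, using a hypothetical timeline_export.avi; MW7 obviously gives far more encoder control.

```python
import subprocess

# Hypothetical HQX master exported from the NLE.
HQX_MASTER = "timeline_export.avi"

# Re-encode the intermediate to H.265 with libx265 (software encode),
# copying the audio through untouched. CRF 20 is just a starting point.
subprocess.run([
    "ffmpeg", "-y", "-i", HQX_MASTER,
    "-c:v", "libx265", "-crf", "20", "-preset", "medium",
    "-c:a", "copy",
    "hqx_to_h265.mp4",
], check=True)
```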

                  As I have said before, if EDIUS is your only need, get an Intel with an iGPU. Interestingly, at the moment for Resolve the winner would be anything with an RTX 3090, since there is less use of the CPU compared to the GPU. The difference again is editing H.265, where Intel may be the best choice for a Windows system. With Resolve, M1 Macs will now be the alternative, having hardware for decode and encode that may be faster than Intel and is certainly faster than NVIDIA systems for H.265. I expect lots of Resolve Windows users will get an M1 Mac as well, since files can be shared. Tests show, though, that for long encodes the Windows workstation is still the choice. This raises the other issue for EDIUS, and in fact Vegas, since most of the other competing NLEs work on both Windows and Mac.

                  Lastly, if the system that you have does what you want, then that is great. It doesn't have to be the best at things you do not need. If there are things you want to do that your system does not do, open your mind up fully to what can be used.
                  Ron Evans

                  Threadripper 1920 stock clock 3.7, Gigabyte Designare X399 MB, 32G G.Skill 3200CL14, 500G M.2 NVME OS, 500G EVO 850 temp. 1T EVO 850 render, 16T Source, 2 x 1T NVME, MSI 1080Ti 11G , EVGA 850 G2, LG BLuray Burner, BM IP4K, WIN10 Pro, Shuttle Pro2

                  ASUS PB328 monitor, BenQ BL2711U 4K preview monitor, EDIUS X, 9.5 WG, Vegas 18, Resolve Studio 18


                  Cameras: GH5S, GH6, FDR-AX100, FDR-AX53, DJI OSMO Pocket, Atomos Ninja V x 2



                  • #10
                    Originally posted by Ron Evans View Post
                    I think we need to focus on what the projects are, what equipment is used and the needed output, and then choose the computing hardware and software required to complete the task. It's nice if one is starting from nothing, but clearly if someone already has cameras, a PC and software, that will bias the outcome, at least until something changes in the chain that makes the current setup ineffective. Then one needs to look at an individual's bias for things: cameras, PC etc. If things were purely based on test facts one might come to completely different conclusions, especially if cost were not involved. One needs to be very specific about cameras used, formats used, storage restrictions in the editing system etc. There is no one size fits all. There are, however, a few things to consider. If H.265 is being used then I feel there are really only two choices that are really effective: the latest Intel and the Mac M1. From the tests I think the Mac M1 is the winner; both outperform AMD solutions. Encode depends on the NLE even when using NVENC, as Dave has said. It doesn't matter whether it is EDIUS or Resolve, I still export HQX and get TMPGEnc MW7 to do the encode! I then have all the choices for audio etc., including x264 or x265.

                    As I have said before, if EDIUS is your only need, get an Intel with an iGPU. Interestingly, at the moment for Resolve the winner would be anything with an RTX 3090, since there is less use of the CPU compared to the GPU. The difference again is editing H.265, where Intel may be the best choice for a Windows system. With Resolve, M1 Macs will now be the alternative, having hardware for decode and encode that may be faster than Intel and is certainly faster than NVIDIA systems for H.265. I expect lots of Resolve Windows users will get an M1 Mac as well, since files can be shared. Tests show, though, that for long encodes the Windows workstation is still the choice. This raises the other issue for EDIUS, and in fact Vegas, since most of the other competing NLEs work on both Windows and Mac.

                    Lastly, if the system that you have does what you want, then that is great. It doesn't have to be the best at things you do not need. If there are things you want to do that your system does not do, open your mind up fully to what can be used.
                    Hi Ron.

                    You don't know how understated you are in what you've said about the Macs, and I suspect you were also erring on the side of caution. The M1 Max is simply shocking with H.265. Imagine editing HQX on a powerful Windows machine with Edius; that's basically how the M1 Max machines are dealing with H.265, and that's before you get to the ProRes accelerators, yes plural, because there are two on the M1 Max.

                    As I'm writing this response I've just had a new piece of kit delivered, a standalone H.264/5 HDMI recorder/encoder. While it is standalone it can also be used under Windows control, and I'm going to see if Edius will see it as a UVC device. But this does bring us back to the whole H.265 thing and how it's best dealt with. Unfortunately, even if the next Edius updates allow for better iGPU integration, or maybe even discrete GPU decoding/acceleration, Edius will always be subject to what's available on the Windows platform for accelerated decoding etc. Although most things on Windows will also be subject to the same limitation, and that's where the new Macs win. Even a 3090 based Windows x86 system can't beat the new Macs for H.265 processing. That said, and as you've already pointed out, any software that's cross platform will be able to take advantage of the new Mac stuff.
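
                    For anyone wanting to try the same UVC experiment, a quick way to see whether Windows exposes a box like that as a capture device at all is to list the DirectShow devices ffmpeg can see; if it appears there, an NLE at least has a chance of seeing it too. A sketch only, assuming ffmpeg is on the PATH of the Windows machine.

```python
import subprocess

# ffmpeg prints the DirectShow device list to stderr and then exits with
# an error because "dummy" isn't a real input, so don't use check=True.
result = subprocess.run(
    ["ffmpeg", "-hide_banner", "-list_devices", "true",
     "-f", "dshow", "-i", "dummy"],
    capture_output=True, text=True,
)
print(result.stderr)  # the video/audio capture devices Windows exposes
```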

                    Anyway, I'm absolutely convinced that GV are going to up their game with the next Edius update as far as DG1 and DG2 are concerned, and at a push maybe some cool stuff with discrete Nvidia and/or AMD GPUs. Regardless, as it stands right now with Edius, for editing and exporting H.265 and H.264, Intel definitely beats AMD.

                    Cheers,
                    Dave.

                    "There's only one thing more powerful than knowledge. The free sharing of it"


                    If you don't know the difference between Azimuth and Asimov, then either your tapes sound bad and your Robot is very dangerous. Kill all humans...... Or your tape deck won't harm a human, and your Robot's tracking and stereo imagining is spot on.

                    Is your Robot three laws safe?

                    Comment


                    • #11
                      Currently I do not use H.265 as a source, just H.264 and ProRes, so it's not a problem for me now. However, when it comes to the future I am more likely to have to decide how to upgrade: with a new GPU, or with an M1 Mac Mini. At the moment using EDIUS for my UHD GH5 files is a trial, and Resolve is much easier. So it will depend on the next release of EDIUS; if it uses GPU decode well, that would be great. Then it will depend on what is available and the cost: a faster GPU than my 1080Ti if that is not fast enough, or the Mac Mini. I expect I may prefer to get another camera rather than upgrade anything.
                      Ron Evans



                      • #12
                        Originally posted by Ron Evans View Post
                        Currently I do not use H.265 as a source, just H.264 and ProRes, so it's not a problem for me now. However, when it comes to the future I am more likely to have to decide how to upgrade: with a new GPU, or with an M1 Mac Mini. At the moment using EDIUS for my UHD GH5 files is a trial, and Resolve is much easier. So it will depend on the next release of EDIUS; if it uses GPU decode well, that would be great. Then it will depend on what is available and the cost: a faster GPU than my 1080Ti if that is not fast enough, or the Mac Mini. I expect I may prefer to get another camera rather than upgrade anything.
                        I’m also thinking of a new video camera, and that’s where H.265 is probably going to matter. Outside of mostly ProRes and BM, most new cameras shooting higher than 4K and HDR are probably going to be using H.265 as well. I’ve just tried some recordings from my new H.265 recorder. The files are UHD 59.94 and Edius can’t play them in real-time even at the lowest timeline resolution, while Windows plays them perfectly. Moving to the Mac, I’ve just tried two streams, picture in picture, with a filter on each, and it all played through perfectly.
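
                        When a file plays fine in Windows but chokes in an NLE, the first thing worth checking is exactly what the recorder wrote, because hardware H.265 decode support varies a lot with profile, bit depth and chroma subsampling. A hedged little helper, assuming ffprobe is installed and a hypothetical recorder_clip.mp4:

```python
import json
import subprocess

CLIP = "recorder_clip.mp4"  # hypothetical file from the HDMI recorder

# Ask ffprobe for the stream properties that decide hardware decode support.
out = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries",
     "stream=codec_name,profile,pix_fmt,width,height,avg_frame_rate,bit_rate",
     "-of", "json", CLIP],
    capture_output=True, text=True, check=True,
)
stream = json.loads(out.stdout)["streams"][0]
for key, value in stream.items():
    print(f"{key}: {value}")
```

                        If the clips turn out to be, say, 10-bit 4:2:2, that alone can explain why one decoder copes and another doesn't.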

                        It isn’t even FX processing etc. that I’m looking for; it’s just a solid cutting experience at UHD 59.94 or higher across multiple codecs, although likely to be a lot of H.265. This is where I’ve had nothing but grief with Edius since I built my 11th Gen system for Edius X.

                        As I’ve already said, I’m sure we’re going to get great news with this next update for Edius, so I’ll hopefully still have a Windows option as I get more used to my Mac setup. In fact, I’m betting on it now, as the first of my GPUs has just gone on eBay and the other two will be gone by the end of the night. So the only way I’ll be using Edius after today is with the Intel iGPU.

                        "There's only one thing more powerful than knowledge. The free sharing of it"


                        If you don't know the difference between Azimuth and Asimov, then either your tapes sound bad and your Robot is very dangerous. Kill all humans...... Or your tape deck won't harm a human, and your Robot's tracking and stereo imagining is spot on.

                        Is your Robot three laws safe?

                        Comment
