Cannot play any YouTube downloaded video in Edius X


  • GrassValley_PS
    replied
    Please take this general discussion to the Lounge. The OP asked if there was anything specific he might do with his setup. Please restrict your posts to that question.



  • Liverpool TV
    replied
    Originally posted by Ron Evans:
    Sorry Dave, I disagree: a codec is a codec, defined by a standard. How it is implemented is something for each company to decide. I did not suggest clocks and cores in the way you took it; read again. If you do not have QS or NVENC (which are fixed by hardware, I agree), then decode or encode will be governed by the software architecture and the speed of the CPU (or GPU, if it uses that). If the software is single-threaded it will want the fastest single-core speed it can get. In my simple example the CPU is placing one piece of the puzzle at a time, so the faster it can do that, the quicker it will finish. We know EDIUS likes fast clocks; I do not think this is up for discussion, and it is why Intel is the current preferred CPU regardless of QS, as traditionally Intel has had faster clocks and has been easier to overclock too. It is why Intel has been the leader for gaming PCs. As for NVENC, each series of NVIDIA cards has some changes to the implementation. They are not all the same, and performance is also governed by clock speed and memory allocation even within a series. The CUDA cores can also be used independently of the NVENC implementation by the software.
    Ron, there's nothing to disagree with; everything I've said about codecs is exactly how it is.

    Your understanding of NVENC is also wrong. NVENC doesn't always change with a series of cards: from the GTX 1660 through to the RTX 2080 Ti, these all use the same 6th-gen Turing encoder. NVENC performance within the same generation of encoder, Turing as an example, doesn't change either, regardless of the card's clock or memory.

    When you say the CUDA cores "can also" be used independently of the NVENC implementation by the software, there's no "can also" about it: they are two different things and are always independent.

    Anyway Ron, I was just correcting something you'd said that was wrong and that conflicted with something I'd said about codecs, which was right. I'm just trying to avoid misinformation. If you want to disagree with industry fact then that's up to you; just expect to be corrected when you're wrong.
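
    As an aside, the split between the fixed-function encoders and everything else is easy to see for yourself: NVENC and Quick Sync show up as separately named encoders in tools like ffmpeg. A rough sketch, assuming a recent ffmpeg build is on the PATH (this is just a convenient way to poke at the hardware, nothing to do with Edius internals):

    ```python
    # List the dedicated hardware encoders ffmpeg can see (NVENC, Quick Sync).
    # These are fixed-function blocks, exposed separately from the CUDA cores.
    import subprocess

    out = subprocess.run(
        ["ffmpeg", "-hide_banner", "-encoders"],
        capture_output=True, text=True, check=True,
    ).stdout

    for line in out.splitlines():
        if "nvenc" in line or "qsv" in line:
            print(line.strip())
    ```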



  • Ron Evans
    replied
    Sorry Dave, I disagree: a codec is a codec, defined by a standard. How it is implemented is something for each company to decide. I did not suggest clocks and cores in the way you took it; read again. If you do not have QS or NVENC (which are fixed by hardware, I agree), then decode or encode will be governed by the software architecture and the speed of the CPU (or GPU, if it uses that). If the software is single-threaded it will want the fastest single-core speed it can get. In my simple example the CPU is placing one piece of the puzzle at a time, so the faster it can do that, the quicker it will finish. We know EDIUS likes fast clocks; I do not think this is up for discussion, and it is why Intel is the current preferred CPU regardless of QS, as traditionally Intel has had faster clocks and has been easier to overclock too. It is why Intel has been the leader for gaming PCs. As for NVENC, each series of NVIDIA cards has some changes to the implementation. They are not all the same, and performance is also governed by clock speed and memory allocation even within a series. The CUDA cores can also be used independently of the NVENC implementation by the software.



  • Liverpool TV
    replied
    Originally posted by Ron Evans:
    To me the codec is fixed by a standard. How the file is created to meet the standard (encoded) or read (decoded) is up to the software that is using it. This is where all the NLEs may well be different, unless they use, for instance, dedicated embedded routines in hardware (QS or NVENC etc.). It is a bit like doing a jigsaw puzzle: when finished, all the participants have to have the same puzzle to meet the standard's requirements, but how they make the puzzle and how long they take will be different. The same goes for taking the puzzle apart. You could do one piece of the puzzle at a time until it is complete (single-threaded!) or get several sections done at the same time and then join them together (multithreaded!), etc. I know that is simplistic, but I am sure you see the point. You can draw your own conclusions as to why EDIUS likes fast clocks with QS and doesn't seem to take too much advantage of high core counts.
    The fact there are multiple ways to decode or encode a specific codec isn't fundamentally about clocks and cores. Clocks and cores are something completely different and shouldn't be confused with the ability to do the fundamental job of decoding and encoding. BTW, fast clocks don't have anything to do with QS: if it's present and being used, then within the same CPU generation, regardless of cores and speed, the speed of QS is the same. This is why higher core counts make no difference with QS. The same is true for NVENC; it'll be the same from a GTX 1660 right up to an RTX 2080 Ti, and is also unaffected by the core count and speed of the host CPU. As for the general lack of high-core-count utilisation within Edius for other tasks, I haven't a clue why that is.

    If you take a look at other software, there are options to choose what decodes a particular codec, be that hardware-assisted or done entirely in software.
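
    For example, with something like ffmpeg (purely as an illustration of the principle, not anything Edius does internally), the same clip can be decoded either way. A rough sketch, assuming an ffmpeg build with Quick Sync support and an H.264 file called input.mp4 (both are assumptions, not a recipe):

    ```python
    # Decode the same clip twice: once in pure software, once via Quick Sync,
    # discarding the output ("-f null -") so only decode speed is measured.
    import subprocess, time

    def time_decode(extra_args):
        t0 = time.perf_counter()
        subprocess.run(
            ["ffmpeg", "-hide_banner", "-loglevel", "error",
             *extra_args, "-i", "input.mp4", "-f", "null", "-"],
            check=True,
        )
        return time.perf_counter() - t0

    print(f"software decode:   {time_decode([]):.1f} s")
    print(f"quick sync decode: {time_decode(['-hwaccel', 'qsv']):.1f} s")
    ```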

    Like I was saying, the term codec isn't just a reference to the standard but also to what actually does the job.



  • Ron Evans
    replied
    To me the codec is fixed by a standard. How the file is created to meet the standard (encoded) or read (decoded) is up to the software that is using it. This is where all the NLEs may well be different, unless they use, for instance, dedicated embedded routines in hardware (QS or NVENC etc.). It is a bit like doing a jigsaw puzzle: when finished, all the participants have to have the same puzzle to meet the standard's requirements, but how they make the puzzle and how long they take will be different. The same goes for taking the puzzle apart. You could do one piece of the puzzle at a time until it is complete (single-threaded!) or get several sections done at the same time and then join them together (multithreaded!), etc. I know that is simplistic, but I am sure you see the point. You can draw your own conclusions as to why EDIUS likes fast clocks with QS and doesn't seem to take too much advantage of high core counts.
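
    Just to make the simple example concrete, here is a toy sketch of the same idea in Python. It has nothing to do with how any real codec is implemented; it only shows the one-piece-at-a-time versus several-sections-at-once scheduling point:

    ```python
    # Toy version of the jigsaw analogy: the same "puzzle" assembled one piece
    # at a time versus in sections across several CPU cores.
    import time
    from concurrent.futures import ProcessPoolExecutor

    def place_piece(piece):
        # Stand-in for decoding one slice of a frame: pure CPU work.
        return sum(i * i for i in range(200_000)) + piece

    def one_piece_at_a_time(pieces):
        return [place_piece(p) for p in pieces]

    def sections_in_parallel(pieces, workers=4):
        with ProcessPoolExecutor(max_workers=workers) as pool:
            return list(pool.map(place_piece, pieces))

    if __name__ == "__main__":
        pieces = list(range(64))
        t0 = time.perf_counter(); one_piece_at_a_time(pieces)
        t1 = time.perf_counter(); sections_in_parallel(pieces)
        t2 = time.perf_counter()
        print(f"single: {t1 - t0:.2f}s  parallel: {t2 - t1:.2f}s")
    ```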



  • Liverpool TV
    replied
    Originally posted by Ron Evans:
    The codecs are fixed by industry; having their own doesn't work if no one else uses them.
    Hi Ron. The term codec applies not just to the standard but also to the particular filter that encodes/decodes. In this respect, if Edius had its own software codec/filter, say for H.265, then it would be able to use that with a CPU and not have to rely on QS. For instance, with a powerful CPU such as yours, if a software codec were being used by Edius then it would be able to decode/edit H.265 effectively on the CPU. In the past Canopus/GV had their own codecs within Edius; this made life easier at the time, as Edius' future was always meant to be CPU-bound. If I'm not mistaken, Edius had its own MPEG-2 filter/codec, and this is why back in the day it was so good with certain source material or captures. This is why we are having so much trouble now with Edius X: it's basically built for CPU usage, and anything like QS or NVENC is just a bolt-on that the original architecture wasn't designed for. The lack of specific GV/Edius built-in codecs is what's causing so many issues, and why there's such a massive disparity between AMD CPUs or Intel CPUs without QS and an Intel CPU with QS.



  • Ron Evans
    replied
    I think it is a matter of EDIUS using the available resources (CPU cores and GPU hardware). The codecs are fixed by industry; having their own doesn't work if no one else uses them. HQX can be used by most of the competing software, but with the CPU and GPU resources available now there should be no need to transcode to an intermediate unless one is doing a large multicam-type project. It is just that at the moment QS is the only hardware resource used by EDIUS to accelerate timeline playback. The new architecture of EDIUS X promises to provide the opportunity to use other hardware resources; I just hope the wait isn't so long that people move to other NLEs that already do this now.



  • createmedia
    replied
    Quite so, I would love that to get fixed.

    Mike



  • Liverpool TV
    replied
    Again, limitations of Edius that haven’t been addressed in X.

    The only joy you’ll get with such things is to use QuickSync.

    If GV were to introduce their own software codecs for H.264, H.265 etc., as they used to do in the past with certain other codecs, then Edius would be able to leverage CPUs properly for decoding such footage.

    For now, it's really a case of QuickSync or nothing. That said, other GPUs could do this if GV were to give Edius that compatibility.

    What we are seeing is the direct result of GV relying on proprietary technology for core Edius functions, as in, using someone else's hardware codec.

    Edius really needs its own software codecs to become more autonomous, like it was in the early days. This reliance on proprietary technology basically puts Edius users over a barrel, as in, not all Edius users can use all the codecs that Edius supports unless they all use the same limited hardware.



  • Cannot play any YouTube downloaded video in Edius X

    This seems to be an issue unique to a Xeon multi-core environment (dual 18-core CPUs).

    Edius is not at all happy trying to play any YouTube MP4 clips on the timeline: it pauses, waits (and I can do nothing), then comes back to present the next frame under the cursor, and of course it won't play in real time. If we go up to UHD it almost crashes Edius trying to play the clips (white windows or window borders).

    Dropping the playback resolution down to 1/8 doesn't make any difference. It is almost as if the software is looking for something and not finding it.
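
    For reference, a quick way to see what one of these clips actually contains (assuming ffprobe from an ffmpeg install is on the PATH; clip.mp4 is just a placeholder name). Downloaded YouTube files can be H.264, but also VP9 or AV1 depending on how they were grabbed, which makes a big difference to how an NLE copes with them:

    ```python
    # Show the video stream details of a downloaded clip.
    import subprocess

    info = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=codec_name,profile,width,height,pix_fmt",
         "-of", "default=noprint_wrappers=1", "clip.mp4"],
        capture_output=True, text=True, check=True,
    ).stdout
    print(info)
    ```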

    It was the same with my previous hardware, an HP Z800 with two 6-core Xeons, going back to Edius 7 and then Edius 8.

    It seems PCs with i7s etc. can play these clips OK.

    Is there any improvement that can be sorted out for Xeon-based PCs (non Quick Sync) for this issue?
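
    For what it's worth, the only stopgap I can think of is transcoding the clips to an edit-friendly intermediate before importing them, so timeline playback doesn't depend on Quick Sync at all. A rough sketch of the idea, assuming ffmpeg is installed (the ProRes settings here are just an example, not a recommendation):

    ```python
    # Re-encode a downloaded clip to an intermediate codec that decodes easily
    # on any CPU. The output codec and profile are illustrative only.
    import subprocess

    subprocess.run(
        ["ffmpeg", "-hide_banner", "-i", "clip.mp4",
         "-c:v", "prores_ks", "-profile:v", "2",   # ProRes 422 (standard)
         "-c:a", "pcm_s16le",
         "clip_prores.mov"],
        check=True,
    )
    ```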

    Mike