
Go Back   Grass Valley Forums > Editors > EDIUS: Compatible Hardware and Accessories

Old 03-06-2017, 03:14 PM   #41
mark williams
Senior Member
 
Join Date: Jul 2009
Location: Deep South U.S.
Posts: 270
Default

Quote:
Originally Posted by AlbertGS View Post
...My planned config was i7-7700k, Asus Prime Z270-A, 32GB DDR4 3200, SAMSUNG 960 EVO 500 GB M.2 SSD (Read Speed up to 3200 MB/s) plus my existing HDDs.
Comments welcome :)
I am also using the Samsung 960 EVO in my new build, and I have to say it is blazing fast. On my motherboard it is placed near the front fan, so there are no cooling issues like on some boards where it sits near/under the GPU. I notice the M.2 location on your Asus Prime Z270-A board is the same as mine.
__________________
Mark Williams
natureflixs@hotmail.com
http://vimeo.com/channels/3523
http://www.pond5.com/artist/mark29
Edius 8 WG, i7 6700K, Asus 170-A mobo, 16 GB Ram, System 960 Evo M.2, Video multiple 850 Evo SSDs, Shadow Rock Slim cpu cooler, On board graphics, Corsair 400c case, HD Spark, Windows 10, Panasonic GH5

Last edited by mark williams; 03-06-2017 at 04:36 PM.
mark williams is offline   Reply With Quote
Old 03-06-2017, 03:51 PM   #42
GrassValley_PS
Moderator
 
Join Date: Oct 2009
Location: Cincinnati
Posts: 4,888
Default

Mark, have you done a benchmark speed test with Magician? If you look at my original post about my system, it has some amazing numbers.
__________________
Main System: i7 6950X OC to 4.5GHz, ASUS RAMPAGE V EDITION 10, Corsair Dominator Platinum 64GB DDR4 2800, SAMSUNG 950 PRO M.2 512GB , GeForce GTX 1080ti SC Black, Corsair AX1200i PS, Phanteks Luxe Case, 16TB RAID (4)HGST Deskstar NAS 4TB, Corsair H115i CPU Cooler, AJA Kona 4, BMD Intensity Pro 4K, Win 10
GrassValley_PS is offline   Reply With Quote
Old 03-06-2017, 11:14 PM   #43
mark williams
Senior Member
 
Join Date: Jul 2009
Location: Deep South U.S.
Posts: 270
Default

Quote:
Originally Posted by GrassValley_PS View Post
Mark, have you done a benchmark speed test with Magician? If you look at my original post about my system, it has some amazing numbers.
Yes. I was amazed.

Sequential MBs
Read: 2,954
Write: 1,477

Random IOPS
Read: 400,589
Write: 266,169
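For anyone wanting a rough sanity check without Magician, here is a minimal Python sketch of a sequential throughput test. It is only illustrative: it goes through the filesystem and the OS page cache (which inflates the read figure), so don't expect it to match Magician's raw-device, high-queue-depth numbers.

```python
import os
import tempfile
import time

def sequential_throughput(size_mb=64, block_kb=1024):
    """Write then read a temp file sequentially; return (write, read) MB/s.
    Rough sketch only: reads likely come from the OS page cache, and there
    is no queue depth, so real benchmark tools will report higher numbers."""
    block = os.urandom(block_kb * 1024)
    blocks = size_mb * 1024 // block_kb
    with tempfile.NamedTemporaryFile(delete=False) as f:
        path = f.name
        t0 = time.perf_counter()
        for _ in range(blocks):
            f.write(block)
        f.flush()
        os.fsync(f.fileno())  # force data to the drive before stopping the clock
        write_s = time.perf_counter() - t0
    t0 = time.perf_counter()
    with open(path, "rb") as f:
        while f.read(block_kb * 1024):
            pass
    read_s = time.perf_counter() - t0
    os.remove(path)
    return size_mb / write_s, size_mb / read_s

w, r = sequential_throughput()
print(f"write ~{w:.0f} MB/s, read ~{r:.0f} MB/s")
```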
__________________
Mark Williams
natureflixs@hotmail.com
http://vimeo.com/channels/3523
http://www.pond5.com/artist/mark29
Edius 8 WG, i7 6700K, Asus 170-A mobo, 16 GB Ram, System 960 Evo M.2, Video multiple 850 Evo SSDs, Shadow Rock Slim cpu cooler, On board graphics, Corsair 400c case, HD Spark, Windows 10, Panasonic GH5

Last edited by mark williams; 03-06-2017 at 11:23 PM.
mark williams is offline   Reply With Quote
Old 04-03-2017, 12:41 PM   #44
Rmhs
Junior Member
 
Join Date: Apr 2017
Location: Israel
Posts: 3
Default Searching for a strong 4k beast

Hi, I'm in a bit of a hurry; I've done a lot of homework and have got myself confused.

I'm looking for a powerful machine that can edit more than five layers of 4K in real time in EDIUS 7.5, 8 and above, with very fast render times. I also need After Effects and Photoshop installed on the same machine. I'm torn between Intel and the 8-core AMD Ryzen and everything around them.

Can someone please give me a full spec for the best machine, including motherboard, CPU, GPU, SSD/NVMe or other drives, etc.?

Thanks a lot for your attention and complete answers.
Rmhs is offline   Reply With Quote
Old 04-03-2017, 03:45 PM   #45
AlbertGS
Member
 
Join Date: Aug 2015
Location: South Africa
Posts: 39
Default

Quote:
Originally Posted by Rmhs View Post
Hi, I'm in a bit of a hurry; I've done a lot of homework and have got myself confused.

I'm looking for a powerful machine that can edit more than five layers of 4K in real time in EDIUS 7.5, 8 and above, with very fast render times. I also need After Effects and Photoshop installed on the same machine. I'm torn between Intel and the 8-core AMD Ryzen and everything around them.

Can someone please give me a full spec for the best machine, including motherboard, CPU, GPU, SSD/NVMe or other drives, etc.?

Thanks a lot for your attention and complete answers.
If your source is h.264 4K, five-plus layers may be wishful thinking on any PC I know of, unless you transcode to an edit-friendly intermediate codec.

Regarding Ryzen, you will obviously lose the advantage of Intel Graphics and QuickSync (QSV) in Edius.

Adobe is the only NLE I've seen tested Intel vs. Ryzen; the test results here are interesting:
https://www.pugetsystems.com/labs/ar...rformance-909/

My planned upgrade is an i7-7700K (overclocked to 5GHz), Asus Prime Z270-A motherboard, 32GB DDR4-3200, GTX 1070, SSD1 (Win10 and application software), SSD2 for current projects, and a WD 4TB HDD.

I chose the Kaby Lake 7700K because its GPU has h.265 10-bit hardware decoding, which I'm hoping to see supported in Edius. Here is more on the graphics hardware encode/decode capabilities:
http://www.anandtech.com/show/10959/...00k-i3-7350k/6

Please let us know what you decide on and how it performs.
Al

Last edited by AlbertGS; 04-03-2017 at 03:54 PM.
AlbertGS is offline   Reply With Quote
Old 04-05-2017, 02:37 AM   #46
dnavas
Senior Member
 
Join Date: Aug 2010
Location: San Jose
Posts: 178
Default

Quote:
Originally Posted by David Clarke View Post
If anyone wants some Gh5 clips to test you can go to this video and download some:
https://www.youtube.com/watch?v=T-qU...ature=youtu.be
Ryzen datapoints, for those interested:

1) I loaded the Waterfall scene (60p) into an 8bit UHD60p project.
Single layer is fine and uses about 60% of my CPU.
Two layers requires 1/2.
Three layers requires 1/4 (but was sometimes choppy *).

2) I loaded the Seattle scene (24p** 4:2:2, 10bit) into a 10bit UHD24p project.
Single layer is fine and uses something like 40% of my CPU (hard to tell, as CPU use was all over the map between 25% and 54%).
With two layers, I had to drop to Full 8bit.
Three layers needed 1/2.
Four layers was not smooth, although the video was running quite close to real-time at 1/4.

3) I loaded the Perpetua Dolly scene (30p 4:2:2, 10bit) into a 10bit UHD30p project.
Single layer is fine and CPU use was anywhere between 9% and 78% (I'm not kidding). I'm guessing an average of around 40% :shrug:
Two layers at full 8bit
Three layers required 1/4

4) I see Jerry dropped a PCC filter on his tests as well. For the 30p tests (which is the only example I tested), a single PCC was fine. Two required a drop down to 1/2 (and that left enough headroom for seven of them***).
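A crude way to read numbers like these: if the layers decode independently, a single layer's CPU share roughly caps how many layers fit at a given playback resolution. A hypothetical back-of-envelope model (my assumption, not how Edius actually schedules anything):

```python
def max_realtime_layers(single_layer_cpu_pct, headroom_pct=90):
    """Crude estimate: assume layers decode independently, so capacity is
    usable headroom divided by per-layer cost. Ignores thread-scheduling
    effects (the audio-dropout issue noted in the footnote) and shared
    memory bandwidth, both of which reduce the real figure."""
    return int(headroom_pct // single_layer_cpu_pct)

# Plugging in the rough measurements above:
print(max_realtime_layers(60))  # UHD60p 8-bit, ~60% per layer -> 1 layer at Full
print(max_realtime_layers(40))  # UHD24p 10-bit 4:2:2, ~40% per layer -> 2 layers
```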

I generally consider real-time UHD at 1/2 to be pretty reasonable (but I'm running 4k on a 27" monitor, and the difference between Full and 1/2 comes down to being a tad fuzzy). At 1/4, I get fairly serious aliasing. The real problem with the 1/2 setting is that it applies to all clips, and when I'm doing an HD inset (for example, a closeup of the keyboard during a piano performance), the HD video at 1/2 looks fairly terrible!

If anyone has a 69[50]0k to compare to, I'd definitely be interested in those data points.

At any rate, it might shortly (~six months?) be possible to do five layer 4:2:2 UHD video, so long as you're not expecting 60p, and as long as you're not actually doing something to the clips....

-Dave

* I consider video smooth if the audio is uninterrupted. In some cases, Windows seems to schedule the decode threads on the same physical CPU, and in that case, utilization falls and audio breaks up. This was the case for three layers in 4k60p. Strangely, it *only* affected the 4k60p footage.

** Yes, the clips are really 23.98 and 29.97 or some such -- I opened the matching project type....

*** The latest BIOS update allowed my 3200 memory OC to stabilize, but although measured bandwidth improved by about 50%, and latency dropped 25%, no difference was noted on the results of this test (and it's the only one I compared for before/after).

ed: fixed bandwidth improvement percentage
__________________
1800X @3.9G, 16GB Win7 / 2950X 32GB Win10, Gigabyte GTX 970 (IXOC), dual EA275UHD, QNAP 873 (w/10GbE). Video source typically HC-X1 or Panny TM700; Audio DR-701D and Rode NT4. Edius 8.5 WG w/ Vistitle & TmpGenC.

Last edited by dnavas; 04-05-2017 at 02:15 PM.
dnavas is offline   Reply With Quote
Old 04-05-2017, 09:03 AM   #47
AlbertGS
Member
 
Join Date: Aug 2015
Location: South Africa
Posts: 39
Default

4K is a universal problem for all NLEs due to PC hardware limitations, and an increasing challenge now that h.264 is the most popular 4K camera format and h.265 is coming. Affordable 4K/UHD consumer cameras are everywhere, as are affordable UHD TVs, and h.265 (HEVC) 4K streaming is now a reality.
Without going into codecs and how they work, the problem is that these compressed camera formats have to be decoded to individual frames for editing and playback.

As we moved from DV to HD over many years, PC power followed Moore's law, with CPU power often doubling in a single year at the same cost, so the transition was fairly painless for most of us. Moore's law is dead and Intel has abandoned its incremental Tick-Tock upgrade cadence, so we have seen little improvement in desktop CPUs for many years. The bad news is that silicon technology has "hit the wall" and we shouldn't expect much in the near future. Timeline editing is primarily CPU-bound; transcoding to "edit friendly" intermediate codecs helps, but it is time-consuming and can produce huge files, 10X the size and more.
The latest Intel 8- and 10-core i7s do help, but they are very expensive and quickly reach their limits with additional 4K layers on the timeline. Ryzen has reduced costs, but editing performance is disappointing so far. Even $10-20k workstations have their limitations, and they are beyond the reach of many of us non-pros.

So have we hit the wall with realtime 4K editing on an affordable PC?
The challenge is realtime timeline playback of compressed camera codecs; once editing is finished, most of us are less concerned about final render speed and are happy to take a coffee break or even leave the render to run overnight.
There may be another solution to the software timeline decoding that is choking our CPUs. Cameras use hardware encoders to compress video on the fly and save storage space.
Intel has put integrated GPUs on its CPUs for many years. They may not be great for gaming, but their on-board hardware encoders/decoders have great potential for editing, and integrated GPUs have several advantages over expensive discrete GPU cards.
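To put numbers on why software decode chokes the CPU, here is a back-of-envelope sketch of the raw data rate a decoder has to produce for UHD timeline playback. The arithmetic is mine, assuming 10-bit 4:2:2 (about two samples per pixel on average):

```python
def decoded_rate_mb_s(width=3840, height=2160, fps=30,
                      bits_per_sample=10, samples_per_px=2.0):
    """Raw data rate after decode, in MB/s. samples_per_px=2.0 models
    4:2:2 chroma subsampling: one luma plus (on average) one chroma
    sample per pixel."""
    bytes_per_frame = width * height * samples_per_px * bits_per_sample / 8
    return bytes_per_frame * fps / 1e6

per_layer = decoded_rate_mb_s()  # UHD30p 10-bit 4:2:2
print(f"one layer: {per_layer:.0f} MB/s, five layers: {5 * per_layer:.0f} MB/s")
```

That works out to roughly 622 MB/s per layer, so a five-layer timeline means the CPU must produce decoded frames at over 3 GB/s while also compositing them, which is why dedicated decode hardware is so attractive.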

MAGIX (who now own Sony Vegas) seem to have taken the lead, working with Intel on the latest consumer-priced Kaby Lake CPUs, and they are claiming "realtime 4K h.265 10-bit color editing" on an affordable consumer-level PC with no expensive discrete Nvidia/AMD GPU card.

Here is an Intel video:
How Magix Made 4K 360 Video Editing Swift and Easy
https://www.youtube.com/watch?v=2LrPUhToNAg


Simply put, they have reduced the CPU editing workload by using the Intel Graphics h.265 hardware decoder on the timeline. Intel claims the Kaby Lake GPU can "handle up to eight 4Kp30 AVC and HEVC decodes simultaneously"!
This may well be the future of 4K editing on an affordable desktop or even a laptop.

FYI, here is more info on the power of the Kaby Lake GPU:
The Kaby Lake-U/Y GPU - Media Capabilities
http://www.anandtech.com/show/10959/...00k-i3-7350k/6

And here is why CPU technology "hit the wall", in this three-part series:
The Future of Computers
http://www.edn.com/design/systems-de...he-Memory-Wall

To summarize, there are three walls limiting CPU technology:

"- The Power Wall means faster computers get really hot.
- The Memory Wall means 1000 pins on a CPU package is way too many.
- ILP Wall means a deeper instruction pipeline really means digging a deeper power hole. (ILP stands for instruction level parallelism.)"


"Power Wall + Memory Wall + ILP Wall = Brick Wall"

"Taken together, they mean that computers will stop getting faster. Furthermore, if an engineer optimizes one wall he aggravates the other two. That is exactly what Intel did."

So for those who, like me, are waiting to upgrade their PC for 4K editing: don't expect CPU technology to come to our rescue in the near future.
AlbertGS is offline   Reply With Quote
Old 04-05-2017, 01:49 PM   #48
Ron Evans
Senior Member
 
Join Date: May 2007
Location: Ottawa
Posts: 4,194
Default

I think it comes down to whether the NLE designers take advantage of all the available compute power. Most NLEs take some advantage of QuickSync, and some use any GPU for effects or rendering. The latest discrete GPUs have a lot more power available than the Intel integrated GPU, but I think only Adobe takes advantage of that for timeline editing at the moment. Storage isn't that much of a problem these days, so using an I-frame-only codec is not such a disadvantage. The problem with transcoding to HQX etc. is the time: almost like ingesting tape!!! Starting with ProRes may be more convenient. Since my goal is to crop/pan/zoom into the UHD image, I need a smooth experience, and transcoding may be the way I have to go to make the editing process easy on me. I have been doing this for over three years now with the FDR-AX1, but only for portions of the edit, relying on the other cameras for most of it.
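The disk-space side of that trade-off is simple arithmetic: file size is just bitrate times duration. The bitrates below are illustrative assumptions, not published figures for HQX or ProRes:

```python
def hours_to_gb(hours, mbit_per_s):
    """File size in GB for a given average bitrate. The bitrate figures
    used below are illustrative assumptions, not codec specifications."""
    return hours * 3600 * mbit_per_s / 8 / 1000

camera = hours_to_gb(1, 150)        # e.g. a 150 Mbps UHD camera file
intermediate = hours_to_gb(1, 600)  # hypothetical I-frame intermediate
print(f"{camera:.0f} GB camera vs {intermediate:.0f} GB intermediate per hour")
```

So an hour of 150 Mbps camera footage is about 67 GB, while a 600 Mbps I-frame intermediate of the same hour is about 270 GB: a lot of disk, but as Ron says, storage is cheap enough now that the real cost is the transcode time.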
__________________
Ron Evans

Threadripper 1920 stock clock 3.7, Gigabyte Designare X399 MB, 32G G.Skill 3200CL14, 500G M.2 SATA OS, 500G EVO 850 temp. 1T EVO 850 render, 6T Source, 2 x 1T NVME, MSI 1080Ti 11G , EVGA 850 G2, LG BLuray Burner, BM IP4K, WIN10 Pro

ASUS PB328 monitor, BenQ BL2711U 4K preview monitor, IP4K, EDIUS 9.5 WG, Vegas 17, Resolve Studio 16


Cameras: GH5S, GH5, FDR-AX100, FDR-AX53, DJI OSMO Pocket, Atomos Ninja V x 2
Ron Evans is offline   Reply With Quote
Old 04-05-2017, 02:21 PM   #49
Bassman
Senior Member
 
Join Date: May 2007
Location: Texas
Posts: 2,473
Default

Great write-up, Albert. If Intel would put a hardware decoder on an HEDT processor, we would be in business, but they seem stubborn on this topic.

Any chance a hardware decoder could be a motherboard component or even an add-in card? Or does it have to be tied to the CPU for enough speed?

The industry seems to be asleep here. Folks are not going to buy new systems if those systems don't improve their workflow.
__________________
Asus Prime X299-A - Intel i9 7900x all cores @4.3GHz 2 cores @4.5GHz - 32GB RAM - NVidia GTX1070 - Edius 9 WG - BM Intensity 4k - Boris RED - Vitascene 2 - Windows 10
Bassman is offline   Reply With Quote
Old 04-05-2017, 03:52 PM   #50
dnavas
Senior Member
 
Join Date: Aug 2010
Location: San Jose
Posts: 178
Default

Quote:
Originally Posted by Ron Evans View Post
I think it comes down to whether the NLE designers take advantage of all the available compute power.
Well, Edius does support PCC on the graphics card (though it leaks memory like a sieve in Win7). I haven't been impressed by the performance on my 970; it chokes on a couple of instances, while my 1800X runs many. But that's a promising start at least.

There are a couple of factors at work. First, hardware vendors have to commit to putting high-quality encode/decode logic on their hardware. It's too easy to skimp at 4:2:0 "because that's all a consumer will have"; NVDEC doesn't support 4:2:2, for example: https://developer.nvidia.com/nvidia-...#NVDECFeatures
Intel has kept its hardware pretty flexible, but it doesn't combine that hardware with its 8-core chips, which is frustrating. Maybe there are Xeons you could put in a dual-CPU setup? Then the software companies need to support that hardware, which means headaches for them as NVENC or QSV or whatever breaks from driver release to driver release. And finally, the software companies need to cooperate with the hardware vendors to make sure the latter understand the needs of the former. The lack of 4:2:2 decode hardware is going to hurt AMD, IMHO, and you'd think NV would be interested in 4:2:2 for their Quadro market. :weird:
__________________
1800X @3.9G, 16GB Win7 / 2950X 32GB Win10, Gigabyte GTX 970 (IXOC), dual EA275UHD, QNAP 873 (w/10GbE). Video source typically HC-X1 or Panny TM700; Audio DR-701D and Rode NT4. Edius 8.5 WG w/ Vistitle & TmpGenC.
dnavas is offline   Reply With Quote