Comments

john_dennis wrote on 11/5/2012, 11:25 PM
Since I use the Sony AVC codec and Quick Sync with similar results, I've had no burning desire to add hundreds of dollars and 250 Watts to my latest system build.
Guy S. wrote on 11/6/2012, 12:16 PM
If I were to delete the graphics card, my total cost for this system, including two SSDs, a 2TB media drive, a Blu-ray burner, etc., would be about $1k. That's a lot of editing power for the money.
drewU2 wrote on 11/6/2012, 2:33 PM
Wow, I just moved one of my monitors from my EVGA GTX 660 to the onboard Intel HD 4000 on my i7-3770 and ran the test - Intel Quick Sync is 50% faster than 1000 CUDA cores!

I wish I had known about this before I spent $230 on a video card. Oh dear, what are you going to do? I guess I'll sell the video card. There's no need to run a video card simply for rendering when the onboard renderer is much faster. Yikes!
megabit wrote on 11/6/2012, 3:06 PM
" Intel Quick Sync is 50% faster than 1000 Cuda cores!"

That's untrue, plain and simple. If Vegas Pro 12 makes anybody think so, it only indicates how poor its handling of CUDA currently is. The good news is that it shows how much potential there still is to take advantage of, should SCS care to optimize the code for the Kepler architecture and current nVidia drivers.

Piotr

AMD TR 2990WX CPU | MSI X399 CARBON AC | 64GB RAM@XMP2933  | 2x RTX 2080Ti GPU | 4x 3TB WD Black RAID0 media drive | 3x 1TB NVMe RAID0 cache drive | SSD SATA system drive | AX1600i PSU | Decklink 12G Extreme | Samsung UHD reference monitor (calibrated)

drewU2 wrote on 11/6/2012, 3:57 PM
Well I would love to see CUDA be faster since I have already made the investment in the card, but right now Quick Sync is what I will be using to render. Time will tell.

And what do you make of tests like this: http://www.tomshardware.com/reviews/sandy-bridge-core-i7-2600k-core-i5-2500k,2833-5.html
Former user wrote on 11/6/2012, 4:30 PM
I was going to test the render speed on my system using your settings, but I can't find a template for "Sony AVC CODEC (720p-24)" in my install of VP12.

I did render out to Mainconcept AVC/AAC - Apple TV 720p24 Video with GPU rendering turned on, and 30 seconds of timeline rendered in 7 seconds. My system specs are in my profile.
john_dennis wrote on 11/6/2012, 4:45 PM
@megabit

I think there is room for optimization with all the CPU and video processing options available today.

In this thread you can see my results did not saturate all four Intel cores (8 with Hyper-Threading) or use much of the available 16 GB of memory. I don't know whether the HD 4000 adapter was creating a bottleneck.

One of the main reasons I didn't buy a 3930 system was that I haven't seen all my available cores running consistently near 100% since I had a Core2Duo. I didn't want to pay for capacity that I would leave stranded. Further, I think that current software leaves a lot of hardware throughput unused, whether it be CUDA cores, their ATI equivalent, or the specialized on-die rendering hardware inside an Intel chip.
john_dennis wrote on 11/6/2012, 4:54 PM
"I can't find a template for "Sony AVC CODEC (720p-24)" in my install of VP12."

You can create your own template using the Customize option. 1280x720-59.94p is legal for Blu-ray. I use it all the time.
Former user wrote on 11/6/2012, 5:50 PM
I use custom templates all the time -- I just thought the OP had used some sort of pre-defined template that I didn't see listed, and I wanted to be sure to compare apples to apples (so to speak ;-)

Jim
diverG wrote on 11/7/2012, 6:36 AM
Just how do you set up Quick Sync for use within VP12? I get the message 'Intel Quick Sync video is not available'.

My pc will render using quick sync providing I use Edius6.52 as the NLE. Just cannot get it to fire up with SVP.

Machine spec as per system 1

Edit: Brain now in gear. I'm using an i7 2600K, which has different graphics from the i7 3770: HD 3000 instead of HD 4000.

Sys 1 Gig Z370-HD3, i7 8086K @ 5.0 Ghz 16gb ram, 250gb SSD, 2x2Tb hd,  GTX 4060 8Gb, BMIP4k video out. (PS 750W); Vegas 18 & 19 plus Edius 8WG DVResolve18 Studio. Win 10 Pro (22H2) Bld 19045.2311

Sys 2 Gig Z170-HD3, i7 6700K @ 3.8Ghz 16gb ram, 250gb SSD, 2x2Tb, hdd GTX 1060 6Gb, BMIP4k video out. (PS 650W) Vegas 18 plus Edius 8WG DVResolve18 Studio Win 10 Pro (22H2) Bld 19045.2311

Sys 3 Laptop 'Clevo' i7 6700K @ 3.0ghz, 16gb ram, 250gb SSd + 2Tb hdd,   nvidia 940 M graphics. VP17, Plus Edius 8WG Win 10 Pro (20H2) Resolve18

 

Guy S. wrote on 11/7/2012, 11:46 AM
<<Using i7 2660K which has different graphics to i7 3770 HD3000 instead of HD4000>>

Your MOBO needs to support Lucid Logix Virtu MVP software, which allows a discrete graphics card and HD 3000/4000 graphics to be used at the same time. Tim20 explains it well in this thread: http://www.sonycreativesoftware.com/forums/ShowMessage.asp?MessageID=837261&Replies=5

I purchased the bottom-rung Z77-based ASUS MOBO (P8Z77-V LX) and it included the Virtu software. Once I connected my monitors to the on-board video, QuickSync was active, and V12 now lists both the nVidia GPU and QuickSync when rendering (QS has two choices listed, Speed and Quality).
Arthur.S wrote on 11/7/2012, 12:49 PM
What do you make of this statement? "Quick Sync, like other hardware-accelerated video encoding technologies, gives lower quality results than CPU-only encoders. Speed is prioritized over quality." Taken from Wikipedia. Just curious.
john_dennis wrote on 11/7/2012, 2:21 PM
Anecdotal response follows:

Given video from Sony consumer AVCHD cameras and over-the-air ATSC broadcasts encoded with the Sony AVC codec at about the same bit rate as the original source, I can’t tell the difference on my playback equipment.

The quote from Wikipedia refers to a test of applications such as MediaConverter from ArcSoft, MediaEspresso from CyberLink and others, not the Sony AVC implementation.

If your results prove that encoding video with software running on an x86 processor gives better results, which is possible, you always have the option of checking the box to use “CPU Only”.

End Anecdotal response.
diverG wrote on 11/7/2012, 3:22 PM
Thanks Guy

Virtu was supplied with the mobo. I tried it with Edius, but an updated Intel graphics driver solved matters and Virtu was no longer required. I'll try it again with Vegas.

Guy S. wrote on 11/7/2012, 7:19 PM
<< I'll try it again with Vegas.>>

Good luck and let us know how it goes.
Guy S. wrote on 11/8/2012, 2:10 PM
Update: Last night I had a chance to compare my test renders from a few days ago and QuickSync (Speed) looked just as good as the others.

Please note that I viewed the clips on an HP 23" 1080p computer monitor, not a high-end broadcast monitor; there may very well be small differences that I didn't notice, but I was expecting a highly visible difference.
farss wrote on 11/8/2012, 2:38 PM
To compare the results, don't rely on your eyeballs.
Use Vegas: put one encode on track one and the other on track two, then set the composite mode to Difference or Difference Squared. Check the result with the preview monitor and waveform scope. Anything either shows is the difference.

Bob.
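This difference check can also be sketched outside Vegas. A minimal NumPy example, using tiny synthetic frames as stand-ins for real decoded renders (a real test would decode matching frames from each encode):

```python
import numpy as np

def frame_difference(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Absolute per-pixel difference, analogous to the Difference composite mode."""
    # Widen to int16 before subtracting so 8-bit values can't wrap around.
    return np.abs(a.astype(np.int16) - b.astype(np.int16)).astype(np.uint8)

# Synthetic stand-ins for one frame from each encode (4x4, 8-bit luma).
frame_a = np.full((4, 4), 120, dtype=np.uint8)
frame_b = frame_a.copy()
frame_b[0, 0] = 123  # pretend one pixel differs between the two encodes

diff = frame_difference(frame_a, frame_b)
print("max difference:", diff.max())    # any nonzero value means the encodes differ
print("mean difference:", diff.mean())
```

A fully black difference image means the two encodes are bit-for-bit identical in the picture; any nonzero pixels are where they diverge.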
john_dennis wrote on 11/9/2012, 1:27 AM
Per Bob's advice, I compared Quick Sync and CPU Only renders using difference composite mode. Per the scopes and monitor, there is a quantitative difference even if the subjective difference is imperceptible. This test doesn't expose which version might look best in a blind test. Graphic is here.

The 8-bit renders on this machine are described as follows:

Auto                   65 sec    102,158,542 bytes
Quick Sync Quality     94 sec     86,751,648 bytes
CPU Only               70 sec     63,787,685 bytes

On my machine there would be no render time penalty rendering this project with the CPU only.
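The table's figures can be turned into relative comparisons with a little arithmetic (this snippet just restates the numbers above; it measures nothing new):

```python
# (render time in seconds, output size in bytes) from the table above
renders = {
    "Auto": (65, 102_158_542),
    "Quick Sync Quality": (94, 86_751_648),
    "CPU Only": (70, 63_787_685),
}

base_time, base_size = renders["CPU Only"]
for name, (secs, size) in renders.items():
    print(f"{name}: {secs / base_time:.2f}x the CPU-only render time, "
          f"{size / base_size:.2f}x the CPU-only file size")
```

Notably, Quick Sync Quality took longer than CPU Only here while producing a larger file, which is what the conclusion about there being no render-time penalty is based on.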
diverG wrote on 11/9/2012, 7:36 AM
@ Guy
Timeline: 180 sec
Test QS off = 200 sec; test QS on = 52 sec
Rendering to MP4 / H.264

I still cannot get VP12 to use QS, so I ran the tests in Edius to satisfy myself that QS does work.
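Those timings work out to roughly a 3.8x speedup for Quick Sync. A trivial sanity check of the arithmetic (the helper function is only for illustration):

```python
def speedup(time_without: float, time_with: float) -> float:
    """Render-time speedup factor: how many times faster the accelerated run is."""
    return time_without / time_with

# The Edius numbers above: 200 s with Quick Sync off, 52 s with it on.
print(f"Quick Sync speedup: {speedup(200, 52):.2f}x")
```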

drewU2 wrote on 11/9/2012, 12:11 PM
Friends, I am more convinced than ever that I missed the boat with Intel Quick Sync. Seriously, I just uninstalled my new EVGA GTX 660 Superclocked 2 GB GDDR5 video card (with 960 CUDA cores) because Intel Quick Sync was significantly faster at rendering, with equal results.

My encouragement to Sony would be to support this strongly. I had issues rendering with CUDA, but with Intel Quick Sync, Vegas has been remarkably stable.

Here's the deal: of course I was hoping CUDA would be worth the $230 investment I made in the card, and to the extent that it sped up rendering by 50% over the CPU alone, it was worth it. But the reality is that Intel Quick Sync is 100% faster than the CPU alone, and I wish I had tested that before purchasing the video card, which is now up for sale on eBay.

Which raises the question: why hasn't Sony been more supportive, at least in its marketing, of Intel Quick Sync?
Guy S. wrote on 11/9/2012, 2:05 PM
<< Put one encode on track one, encode two on track two, set composite mode to Difference or Difference Squared>>

Great tip, thanks!

<<I just uninstalled my new EVGA GTX 660 Superclocked 2gb DDR5 video card (with 960 CUDA cores) because Intel Quick Sync was significantly faster at rendering with equal results.>>

Vegas uses OpenCL, so yes, the CUDA cores are used, but via OpenCL - that's why Sony's acceleration works with ATI and Intel GPU solutions that support OpenCL.

I'm holding on to my nVidia card for now because a) I'll never get my money out of it, and b) I suspect that Open CL optimization will increase as nVidia releases new drivers.

I should also point out that I've only rendered to one CODEC. Once I look at timeline performance and rendering to other GPU-accelerated formats, I may find that QS doesn't perform as well as a discrete card.
dxdy wrote on 11/11/2012, 12:51 PM
I have been comparing render results on my new 3770K (not overclocked) with my old i7-950. With a GTX 660 Ti, the 3770K looks to be about twice as fast as the 950 when debug frame serving to TMPGEnc for an SD MPEG-2 render. If you are interested, you can see the project I used for this comparo here...

Shmuel wrote on 11/12/2012, 4:14 AM
Did you try rendering longer projects, like 20 minutes? You will see that there is no sync between sound and video.

I tried it several times and the result is the same.
john_dennis wrote on 11/12/2012, 6:44 PM
I've rendered more than five forty-minute projects using "AUTO" and not noticed a loss of video-audio sync. I forced Quick Sync Quality for one project and didn't notice a loss of sync either. Most of my output goes to DVD Architect as separate (Peter Duke will be so proud of me) elementary streams.