4 hours to render 17 mins of video!

Former user wrote on 12/15/2018, 9:08 AM

I’m on V16.0 build 261.

Windows 10, Core i7 2.8GHz, 12GB RAM. The Vegas Pro V16 programme is on the C: SSD, with the source video coming from, and renders saving to, a different HD. The motherboard is an Asus P6X58DE. The processor runs at 95%; the rest of the devices are not stretched. The GPU is an NVIDIA GTX 650 Ti, which looks to be enabled as the GPU accelerator.

Source is a mix of iPhone X video, Lumix .MTS, and GoPro, all 1080p.

Target is 1080p, 128, 50fps, MP4 using Magix AVC/AAC. Pixel format is set to 32-bit. High quality, variable bit rate 20 Mbps average, single pass.

Anyone know if this speed is the best I can get?

Thanks in advance.

Comments

Musicvid wrote on 12/15/2018, 9:57 AM

Target is 1080p, 128, 50fps, MP4 using Magix AVC/AAC. Pixels are set at 32b. High quality. Variable bit rate 20 mbps average, single pass.

Well, your source is 8 bit, so 32 bit float environment is without basis, accomplishing nothing except to increase your render times dramatically. If you are using float as some kind of macabre leveling tool, don't do that. There are logistically appropriate filters for that. The answer to internet hype is directly below.

https://www.vegascreativesoftware.info/us/forum/10-bit-vs-8-bit-grading-the-musical--111748/

https://www.vegascreativesoftware.info/us/forum/faq-why-does-my-video-have-more-contrast-than-the-vegas-preview--104567/

Please, when you have a question, start with default template settings, and post them with MediaInfo! My crystal ball contract just went on subscription model, so I can no longer afford it. Pebcak.

https://www.vegascreativesoftware.info/us/forum/important-information-required-to-help-you--110457/

 

set wrote on 12/15/2018, 9:58 AM

If you haven't turned off Resampling, that can increase rendering time.

Also, are you editing in 32-bit (Full Range) pixel format? Yes... that will also increase the rendering time!

 

Setiawan Kartawidjaja
Bandung, West Java, Indonesia (UTC+7 Time Area)

Personal FB | Personal IG | Personal YT Channel
Chungs Video FB | Chungs Video IG | Chungs Video YT Channel
Personal Portfolios YouTube Playlist
Pond5 page: My Stock Footage of Bandung city

 

System 5-2021:
Processor: Intel(R) Core(TM) i7-10700 CPU @ 2.90GHz   2.90 GHz
Video Card1: Intel UHD Graphics 630 (Driver 31.0.101.2127 (Feb 1 2024 Release date))
Video Card2: NVIDIA GeForce RTX 3060 Ti 8GB GDDR6 (Driver Version 551.23 Studio Driver (Jan 24 2024 Release Date))
RAM: 32.0 GB
OS: Windows 10 Pro Version 22H2 OS Build 19045.3693
Drive OS: SSD 240GB
Drive Working: NVMe 1TB
Drive Storage: 4TB+2TB

 

System 2-2018:
ASUS ROG Strix Hero II GL504GM Gaming Laptop
Processor: Intel(R) Core(TM) i7 8750H CPU @2.20GHz 2.21 GHz
Video Card 1: Intel(R) UHD Graphics 630 (Driver 31.0.101.2111)
Video Card 2: NVIDIA GeForce GTX 1060 6GB GDDR5 VRAM (Driver Version 537.58)
RAM: 16GB
OS: Win11 Home 64-bit Version 22H2 OS Build 22621.2428
Storage: M.2 NVMe PCIe 256GB SSD & 2.5" 5400rpm 1TB SSHD

 

* I don't work for VEGAS Creative Software Team. I'm just Voluntary Moderator in this forum.

fr0sty wrote on 12/15/2018, 10:22 AM

I edit 32-bit projects all the time; it doesn't take that long to render a video that short, so something else is up here. There are other GPU rendering options if you have a compatible GPU. For Nvidia cards you have NVENC (NVENC render templates appear once the card is installed and the correct drivers are updated), or VCE if you use AMD cards. Then there's also Intel's QSV if you have a compatible Intel CPU.

The first thing I'd upgrade is that ancient GPU. Move to a 20xx-series GeForce card or any modern AMD card and you will instantly notice a huge difference in rendering and timeline editing.

Last changed by fr0sty on 12/15/2018, 10:22 AM, changed a total of 1 times.

Systems:

Desktop

AMD Ryzen 7 1800x 8 core 16 thread at stock speed

64GB 3000mhz DDR4

Geforce RTX 3090

Windows 10

Laptop:

ASUS Zenbook Pro Duo 32GB (9980HK CPU, RTX 2060 GPU, dual 4K touch screens, main one OLED HDR)

Musicvid wrote on 12/15/2018, 10:38 AM

fr0sty wrote on 12/15/2018, 9:22 AM

I edit 32 bit projects all the time, it doesn't take that long to render that short of a video, 

Since we don't have access to the OP's source media, project, or system, do you have some tests with numbers to support your impressions?

Also, why do something differently (8-32-8) if it doesn't do anything good? Warning: Learning Curves at risk!

I ask only because my own recent tests suggest something quite different, albeit on a more modest system. Thanks.

Former user wrote on 12/15/2018, 10:41 AM

To JV - thanks for the good-natured assistance!

 

Former user wrote on 12/15/2018, 10:43 AM

Target is 1080p, 128, 50fps, MP4 using Magix AVC/AAC. Pixels are set at 32b. High quality. Variable bit rate 20 mbps average, single pass.

Well, your source is 8 bit, so 32 bit float environment is without basis, accomplishing nothing except to increase your render times dramatically. If you are using float as some kind of macabre leveling tool, don't do that. There are logistically appropriate filters for that. The answer to internet hype is directly below.

https://www.vegascreativesoftware.info/us/forum/10-bit-vs-8-bit-grading-the-musical--111748/

https://www.vegascreativesoftware.info/us/forum/faq-why-does-my-video-have-more-contrast-than-the-vegas-preview--104567/

Please, when you have a question, start with default template settings, and post them with MediaInfo! My crystal ball contract just went on subscription model, so I can no longer afford it. Pebcak.

https://www.vegascreativesoftware.info/us/forum/important-information-required-to-help-you--110457/

 

Thanks. Sources appear to be 32 also....

Former user wrote on 12/15/2018, 11:09 AM


Yes, it's much faster when set to 8-bit pixels....

Q remains, am I getting an advantage on 32....

j-v wrote on 12/15/2018, 11:30 AM

@Former user

Sources appear to be 32 also....

Where can you see that?
And what do you mean by

Q remains

????

Met vriendelijke groet (with kind regards)
Marten

Camera : Pan X900, GoPro Hero7 Hero Black, DJI Osmo Pocket, Samsung Galaxy A8
Desktop :MB Gigabyte Z390M, W11 home version 24H2, i7 9700 4.7Ghz,16 DDR4 GB RAM, Gef. GTX 1660 Ti with driver
566.14 Studiodriver and Intel HD graphics 630 with driver 31.0.101.2130
Laptop  :Asus ROG Str G712L, W11 home version 23H2, CPU i7-10875H, 16 GB RAM, NVIDIA GeForce RTX 2070 with Studiodriver 576.02 and Intel UHD Graphics 630 with driver 31.0.101.2130
Vegas software: VP 10 to 22 and VMS(pl) 10,12 to 17.
TV      :LG 4K 55EG960V

My slogan is: BE OR BECOME A STEM CELL DONOR!!! (because it saved my life in 2016)

 

john_dennis wrote on 12/15/2018, 12:16 PM

@Former user

"core i7 2.8GHz"

You didn't state how many cores: 2, 4, 6, 8, 10, etc.

"12gb RAM"

The RAM configuration could be achieved by pairing an 8GB and a 4GB DRAM module. If that is the case (HP does this), the memory controller won't run in dual-channel mode. 12GB could also be two 4GB plus two 2GB DRAM modules, but it is extremely unlikely for a mainstream manufacturer to do that.

"Anyone know if this speed is the best I can get?"

It might be right on, but you've given us a yardstick with no numbers on it. If this is your system then you are due for an upgrade.

I would only change the video card as part of a system upgrade.

OldSmoke wrote on 12/15/2018, 12:44 PM

Actually, ASUS P4P boards do not support i7 CPUs but rather Celeron and Pentium chips with socket 478.

Proud owner of Sony Vegas Pro 7, 8, 9, 10, 11, 12 & 13 and now Magix VP15&16.

System Spec.:
Motherboard: ASUS X299 Prime-A

Ram: G.Skill 4x8GB DDR4 2666 XMP

CPU: i7-9800x @ 4.6GHz (custom water cooling system)
GPU: 1x AMD Vega Pro Frontier Edition (water cooled)
Hard drives: System Samsung 970Pro NVME, AV-Projects 1TB (4x Intel P7600 512GB VROC), 4x 2.5" Hotswap bays, 1x 3.5" Hotswap Bay, 1x LG BluRay Burner

PSU: Corsair 1200W
Monitor: 2x Dell Ultrasharp U2713HM (2560x1440)

Former user wrote on 12/15/2018, 12:58 PM

Actually, ASUS P4P boards do not support i7 CPUs but rather Celeron and Pentium chips with socket 478.

Whoops, you are correct. I read the wrong book; it's my Asus P6X58DE, Core i7.

Musicvid wrote on 12/15/2018, 12:59 PM


Yes, it's much faster when set to 8-bit pixels....

As you now know, 32 bit will take longer to do the same thing...

Q remains, am I getting an advantage on 32....

No, you will not get any advantage. I've already posted a link to the controlled tests designed to answer exactly that, which took many hours to complete, and another link showing how to report file properties here.

Please be clear that there is NO difference between 8 bits per channel, 24-bit RGB, and 32-bit RGBA, which are the ways it is sometimes reported. They are all exactly the same thing!

Last time ever: I think your source is 8-bit 4:2:0 (24-bit RGB). It will render much faster in 8-bit space because that's where it belongs; it won't get changed by the encoding pipeline and then changed back again by the encoder. Easy to understand?

Anyone know if this speed is the best I can get?

Nobody can know that. But now we know it's not the worst you can get!

Now go boldly forth and dare to make some mistakes, because there are no more to be found here!
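The round trip described above can be sketched in a few lines of plain Python. This is only an illustration of the principle, not Vegas's actual pipeline; the normalise/denormalise functions are hypothetical stand-ins for the 8-to-float-to-8 conversion.

```python
# A minimal sketch of the 8 -> 32-bit float -> 8 round trip: with no
# processing in between, every code value comes back unchanged, so the
# float detour buys nothing on its own.

def to_float(v8):
    """8-bit code value (0-255) -> normalised float (0.0-1.0)."""
    return v8 / 255.0

def to_8bit(f):
    """Normalised float -> nearest 8-bit code value."""
    return round(f * 255.0)

# Round-trip every possible 8-bit value.
roundtrip = [to_8bit(to_float(v)) for v in range(256)]
assert roundtrip == list(range(256))
print("8 -> float -> 8 is a no-op when nothing happens in between")
```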

 

Former user wrote on 12/15/2018, 1:05 PM

@Former user

Sources appear to be 32 also....

Where can you see that?
And what means

Q remains

????

In the Vegas file explorer area, at the bottom, e.g. file: 1920x1080x32. 32 means 32-bit colour, yes? The same item as the pixel format setting(?)….

Sorry, by 'Q' I meant 'question'.

klt wrote on 12/15/2018, 2:10 PM

1920x1080x32. 32 means 32 bit colour yes?

AFAIK it means 8-bit RGB (24 bits) plus an 8-bit padding or alpha channel. That sums to 32 bits.

I think real 32-bit colour would sum to 128 bits.
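That arithmetic can be spelled out directly. A small sketch, using the usual RGB/RGBA channel-count conventions (nothing here is read from Vegas itself):

```python
# The "x32" in 1920x1080x32 counts storage bits per pixel,
# not bits per colour channel.

def bits_per_pixel(channels, bits_per_channel):
    return channels * bits_per_channel

print(bits_per_pixel(3, 8))    # 24: plain 8-bit RGB
print(bits_per_pixel(4, 8))    # 32: 8-bit RGB plus 8-bit alpha/padding
print(bits_per_pixel(4, 32))   # 128: true 32-bit-float-per-channel RGBA
```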

j-v wrote on 12/15/2018, 2:24 PM

We still don't know exactly which Magix template you used, because the only available GPU is probably your old Nvidia. That card was able to use CUDA for rendering, but that (equally old) option is gone from Vegas except with the old MainConcept AVC codec, which is not available by default in VPro 16. So right now that GPU drives your screen and helps a little with accelerating some OFX effects. To use the NVidia NVENC option in the Magix codecs for much faster (6-8 times) render times, you must upgrade your NVidia GPU at least to

But if you, or someone else, can enable the Intel HD Graphics 630 in your BIOS so it shows up, you get a somewhat faster encoder option in the Magix codecs compared to NVENC, likewise rendering 6-8 times faster than without it. But take care: at lower bitrates the QSV render option sometimes makes little mistakes. Sometimes you have to connect your first monitor (the one the BIOS is visible on) directly to your mainboard for the QSV render options to show up in the Magix codecs.


 

Musicvid wrote on 12/15/2018, 2:32 PM

 

In the Vegas file explorer area, at the bottom, eg file: 1920x1080x32. 32 means 32 bit colour yes? Same item as pixel format setting (?)….

Sorry, by 'Q' I meant 'question'.

No, I'm sorry that you may have cross-posted and missed my preemptive edit above.

Please be clear that there is NO difference between 8 bits per channel, 24-bit RGB, and 32-bit RGBA, which are the ways it is sometimes reported. They are all exactly the same thing!

Feel free to mark one of our responses as a solution if you wish...

 

 

OldSmoke wrote on 12/15/2018, 3:12 PM

@j-v The last Nvidia card that supported CUDA with Mainconcept AVC was the GTX580, all later cards support NVENC.


j-v wrote on 12/15/2018, 3:53 PM

Of course I know that, but I was speaking about the Magix codecs when I mentioned it; sorry you probably didn't see that. And it's not only the AVC codec: there is also the Magix HEVC codec, for which the OP's Nvidia is not sufficient, because that only became available around the GTX 700+ series.


 

Musicvid wrote on 12/15/2018, 4:09 PM

[sigh] Now I know how johnmeyer must have felt [/]

j-v wrote on 12/15/2018, 4:23 PM

😀😃😇☺️ Tomorrow I'm going to eat self-caught Greenland halibut 😛👍👍👌

Last changed by j-v on 12/15/2018, 4:36 PM, changed a total of 1 times.


 

Chief24 wrote on 12/15/2018, 4:45 PM

A quick reminder about Peter's hardware: it is an X58 motherboard, and an i7 at 2.8 would likely be the i7-930. This platform used the "triple channel memory" configuration, so 12GB of RAM is also "correct", probably in a 6x2GB or (most likely at the time) 3x4GB setup.

He also states he's using a "different HD" for source files, but that platform was only SATA II at the time, so the best he could hope for is to swap that "spinning platter" for a cheap SSD, knowing that the SSD's performance will be "hampered" by the technology existing at the time. Based on Peter's comments, I doubt trying to "overclock" anything would benefit him, or even be possible (haven't had that platform in quite some years). Additionally, this platform rules out using CPU-integrated graphics, since there are none, leaving it up to a discrete graphics card.

Until he is prepared to purchase an updated system more in line with today's hardware, recommending high-end graphics card(s) is a potential waste of money at this time. I would recommend either an AMD RX 580 (8GB) or a GTX 1060 (6GB), as the other concern is that the PCI-e was only at version 2 then. At least either of those cards will carry over once he does update a system, and both are good buys currently, price-wise. Effectively, time to put some cash aside for a new system. Good luck!

Self Build: #1 MSI TRX40 Pro Wi-Fi w/3960X (be Quiet! Dark Rock Pro TR4) @ stock; 128GB Team Group 3200 MHz; OS/Apps - WDSN850X PCI-e 4.0x4 4TB, Documents/Extras - WDSN850X PCI-e 4.0x4 4TB; XFX AMD Radeon 7900XTX (24.12.1); Samsung 32 Inch UHD 3840x2160; Windows 11 Pro 64-Bit (24H2 26100.2894); (2) Inland Performance 2TB/(2) PNY 3040 4TB PCI-e on Asus Quad M.2x16; (2) WD RED 4TB; ProGrade USB CFExpress/SD card Reader; LG 16X Blu-Ray Burner; 32 inch Samsung UHD 3840x2160.

VEGAS Pro 20 Edit (411); VEGAS Pro 21 Suite (315); VEGAS Pro 22 Suite (239) & HOS (Happy Otter Scripts); DVD Architect 7.0 (100);

Sound Forge Audio Studio 15; ACID Music Studio 11; SonicFire Pro 6.6.9 (with Vegas Pro/Movie Studio Plug-in); DaVinci Resolve (Free) 19.1.3

#2: Gigabyte TRX50 Aero D w/7960x (Noctua NH-U14S TR5-SP6) @ stock; 128GB Kingston Fury Beast RDIMM @4800 MHz; OS/Apps - Seagate Firecuda 540 2TB PCI-e 5.0x4; Documents/Extras/Source/Transcodes - 4TB WDSN850X PCI-e 4.0x4; 4TB Inland Performance PCI-e 3.0x4; 2TB Inland Performance PCI-e 4.0x4; BlackMagic PCI-e Decklink 4K Mini-Recorder; ProGrade USB SD & Micro SD card readers; LG 32 Inch UHD 3840.x2160: PowerColor Hellhound RX Radeon 7900XT (24.12.1); Windows 11 Pro 64-Bit (24H2 26100.2894)

VEGAS Pro 20 Edit (411); VEGAS Pro 21 Suite (315); VEGAS Pro 22 Suite (239) & HOS; DVD Architect 7.0 (100); Sound Forge Audo Studio 15; Acid Music Studio 11

Canon EOS R6 MkII, Canon EOS R6, Canon EOS R7 (All three set for 4K 24/30/60 Cinema Gamut/CLog3); GoPro Hero 5+ & 6 Black & (2) 7 Black & 9 Black & 10 Black & 11 Black & 12 Black (All set at highest settings - 4K, 5K, & 5.3K mostly at 29.970); Sony FDR AX-53 HandyCam (4K 100Mbps XAVC-S 23.976/29.970)

karma17 wrote on 12/15/2018, 7:40 PM

I get confused about 8-bit video being rendered out at 32. I've heard more than one person with some knowledge say that doing so can reduce banding. Is that not accurate? I'm just trying to understand it, as I don't fully get it. http://www.moviestudiozen.com/doctor-zen-faq/105-what-is-the-difference-between-8-bit-and-32-bit-pixel-format-in-vegas-pro-settings

Musicvid wrote on 12/15/2018, 7:53 PM

@karma17

Great question, but the article leaves some important things out. Grading in 32 bit renders exactly the same results as 8 bit if the input, the output, or both are 8 bit. That is significant, and plain as day in the link to the tests I've posted above and below. Here is what "no difference" looks like in Vegas:

When using 8-bit input/output, the 32-bit floating point (video levels) setting can prevent banding from compositing that contains fades, feathered edges, or gradients.

The only case where 32-bit grading ever made a difference was 10-bit source with 10-bit output, so maybe for the next round of UHD HDR broadcast platforms?

Remember, anyone is welcome to post their own empirical test results in my thread, but my confidence in unsupported internet hype is zero.

https://www.vegascreativesoftware.info/us/forum/10-bit-vs-8-bit-grading-the-musical--111748/
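The compositing caveat quoted above can be illustrated with a toy example (plain Python, not Vegas's actual pipeline; the 10% dim-and-restore is a made-up operation chosen to show the mechanism): dim an 8-bit gradient and bring it back up. A float intermediate preserves every level; quantising the intermediate back to 8 bits collapses them, which is the kind of loss that shows up as banding.

```python
# Toy illustration of banding from an 8-bit intermediate result.

gradient = list(range(256))            # an 8-bit ramp: 256 distinct levels

# Float intermediate: dim to 10%, keep fractional values, restore,
# then quantise only once at the end.
float_path = [round(v * 0.1 * 10) for v in gradient]

# 8-bit intermediate: dim to 10%, quantise to integers right away,
# then restore.
int_path = [min(255, round(v * 0.1) * 10) for v in gradient]

print(len(set(float_path)))   # 256: every level survives
print(len(set(int_path)))     # 27: the ramp collapses into visible steps
```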

karma17 wrote on 12/16/2018, 1:37 AM

This is very helpful, and I appreciate you taking the time to explain it.

My point of confusion was misunderstanding where the 32-bit part comes in.

So, the way I understand it: even if I bring in 10-bit source, Vegas works in 32-bit internally, but the video itself is rendered out at 8-bit depth. Is that mostly correct?

When I was looking at the white paper on ProRes, it says ProRes can handle up to 12 bits. So if I bring in 10-bit source, edit in 32-bit, then render out to Magix Intermediate, which I understand is a ProRes equivalent, is that 8-bit or 10-bit?

Thanks again. I'm in over my head on much of this.

http://www.apple.com/final-cut-pro/docs/Apple_ProRes_White_Paper.pdf