VP14 build 244 BUG - 10-bit HEVC renders 1920x1080 to 1920x1088

NickHope wrote on 3/31/2017, 7:05 AM

I'm finding that the new 10-bit HEVC render from a 1920x1080 project comes out at 1920x1088, i.e. the wrong resolution. There are narrow black stripes at the top and bottom that extend almost the full width of the frame.

Tested with 23.976p and 59.94p projects. My system has an Intel i7-5960X CPU, which doesn't support Quick Sync.

Can others please test this on their systems? If you do, please state your CPU so we know whether Quick Sync was used or not.

I made a support request: [Ticket#2017033117007475]

Comments

GJeffrey wrote on 3/31/2017, 8:04 AM

Confirmed on 25p & 29.97p project.

I have an Intel i7-3930K CPU, so no Quick Sync (GPU preview on or off, same problem).

D7K wrote on 4/1/2017, 1:11 PM

Yep, and in Project Media it reports the color depth as 15 (HEVC) and the pixel dimensions as 1920 x 1088.

Intel(R) Core(TM) i7-7700K CPU @ 4.20GHz

Peter_P wrote on 4/2/2017, 9:15 AM

On my i7-6700k with only the internal Intel GPU, the dimensions seem to be OK with the default HEVC 1920x1080 29.97 fps template:

Video
ID                             : 1
Format                         : HEVC
Format/Info                    : High Efficiency Video Coding
Format profile                 : Main@L4@High
Codec ID                       : hvc1
Codec ID/Info                  : High Efficiency Video Coding
Duration                       : 16 s 750 ms
Bit rate                       : 12.1 Mb/s
Width                          : 1 920 pixels
Height                         : 1 080 pixels
Display aspect ratio           : 16:9
Frame rate mode                : Constant
Frame rate                     : 29.970 (30000/1001) FPS
Standard                       : Component
Color space                    : YUV
Chroma subsampling             : 4:2:0
Bit depth                      : 8 bits
Bits/(Pixel*Frame)             : 0.195
Stream size                    : 24.2 MiB (100%)


NickHope wrote on 4/2/2017, 12:25 PM

On my i7-6700k with only the internal Intel GPU, the dimensions seem to be OK with the default HEVC 1920x1080 29.97 fps template

But if you change the new Bits per pixel setting on the Video tab to 10?

balazer wrote on 4/2/2017, 2:29 PM

A bit of background on this problem:

MPEG encodes video using 16 x 16 blocks of pixels (or 16 x 32 for interlaced video). 1080 is not a multiple of 16, so the encoder is forced to encode 1080-line video using 1088 rows of encoded pixels. The bottom 8 rows of encoded pixels can be black, grey, or filled with other garbage. For a 1080 stream, the sequence headers should say the vertical size is 1080, meaning that the decoder will produce 1080 lines of output and ignore the bottom 8 encoded rows.
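
A minimal sketch of that padding arithmetic in Python, assuming a 16-pixel block size (the function name is purely illustrative):

def coded_height(display_height, block=16):
    # Round the frame height up to the next multiple of the block size.
    return -(-display_height // block) * block  # ceiling division

print(coded_height(1080))  # 1088 -> 8 padding rows the decoder should crop
print(coded_height(720))   # 720 -> already a multiple of 16, no padding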

There are a few possibilities here:

  1. The decoder is disregarding the vertical size field in the sequence headers and producing all of the encoded rows in its decoded output, 1088 instead of the 1080 it should be. That's a decoder bug.
  2. The decoder is decoding correctly, but simply misreporting the vertical size in the video metadata as the encoded size instead of the size indicated in the sequence headers. That's a decoder bug, but just a cosmetic bug with the metadata it reports.
  3. The encoder is incorrectly setting the vertical size in the sequence headers to 1088 instead of 1080. That's an encoder bug.

So to understand where the problem lies, the encoded video stream needs to be checked with different decoders. ffmpeg almost certainly reports the vertical size correctly, so I'd recommend checking with that. If ffmpeg says 1088, then the video was encoded incorrectly. If ffmpeg says 1080, then the other decoder was decoding incorrectly and/or reporting the vertical size incorrectly.
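
If it helps, here is a quick way to query that size with ffprobe (bundled with ffmpeg) from Python; the file name is a placeholder:

import subprocess

# Ask ffprobe for the first video stream's reported frame size.
result = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "stream=width,height", "-of", "csv=s=x:p=0",
     "render.mp4"],  # placeholder file name
    capture_output=True, text=True, check=True,
)
print(result.stdout.strip())  # "1920x1088" here would point to an encoder bug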

Nick, you didn't say how you were decoding the video. That's key here, since it could be an encoder bug or a decoder bug.

Cornico wrote on 4/2/2017, 2:36 PM

The moment you hit the render button, you can see at the bottom of the preview window that the project size is changed to the "wrong" size until the rendering is finished.

NickHope wrote on 4/3/2017, 12:37 AM

Nick, you didn't say how you were decoding the video. That's key here, since it could be an encoder bug or a decoder bug.

In Vegas Pro 14 build 244 and in MediaInfo. ffmpeg also reports it as 1920x1088, so it appears to be an encoding bug.

Peter_P wrote on 4/3/2017, 12:43 AM

On my i7-6700k with only the internal Intel GPU, the dimensions seem to be OK with the default HEVC 1920x1080 29.97 fps template

But if you change the new Bits per pixel setting on the Video tab to 10?


... I get a render error message.

Here is the template I changed to 10-bit:

and the error when starting to render:

Marco. wrote on 4/3/2017, 2:57 AM

I get the same error message when trying to render to 10-bit HEVC.

GJeffrey wrote on 4/3/2017, 4:25 AM

On my i7-6700k with only the internal Intel GPU, the dimensions seem to be OK with the default HEVC 1920x1080 29.97 fps template

But if you change the new Bits per pixel setting on the Video tab to 10?


... I get a render error message.

Here is the template I changed to 10-bit:

and the error when starting to render:


Unfortunately, that sounds normal. Quick Sync 10-bit encoding only works with 7xxx-series Intel CPUs.

NickHope wrote on 4/3/2017, 4:27 AM

Unfortunately, that sounds normal. Quick Sync 10-bit encoding only works with 7xxx-series Intel CPUs.

But I can render 10-bit HEVC and I don't have Quick Sync at all. So the encoder must fall back to regular CPU encoding if there's no Quick Sync.

GJeffrey wrote on 4/3/2017, 4:32 AM

But I can render 10-bit HEVC and I don't have Quick Sync at all

Are you sure? 😉

So the encoder must fall back to regular CPU encoding if there's no Quick Sync.

Looks like a bug then.

NickHope wrote on 4/3/2017, 4:37 AM

But I can render 10-bit HEVC and I don't have Quick Sync at all

Are you sure? 😉

I have an i7-5960X, as mentioned here. No integrated graphics and thus no Quick Sync Video capability.

So the encoder must fall back to regular CPU encoding if there's no Quick Sync.

Looks like a bug then.

Perhaps the lack of Quick Sync is what makes it work for me. For those who have problems, I wonder if they all have Quick Sync, and I wonder if they can disable that in the BIOS to see if HEVC works without it.

Marco. wrote on 4/3/2017, 4:38 AM

Yes, to me it looks like a bug. My system actually has an i7 CPU (and Intel HD Graphics 520).

NickHope wrote on 4/3/2017, 4:41 AM

Possibilities might be to disable the Intel integrated graphics in the BIOS, or to go to Control Panel > Display Adapters and disable it there. Obviously the dedicated graphics card would then have to handle the displays, if it isn't doing so already.

Marco. wrote on 4/3/2017, 5:03 AM

Neither of those seems to be possible here. There are no such options.

Peter_P wrote on 4/3/2017, 5:09 AM

So the encoder must fall back to regular CPU encoding if there's no Quick Sync.

Yes, and I cannot disable the internal GPU, because it is the only graphics I'm using and I definitely need it for QSV support in Handbrake to render smooth UHDp30 HEVC output ;)

Cornico wrote on 4/3/2017, 5:39 AM

On my laptop (see signature) I have the choice to run Vegas 14 with either the built-in Intel GPU or the Nvidia GPU.
When I change the setting at startup to Intel, I am able to trigger this bug and get the error Peter showed.
Changing the graphics device inside Vegas in Options/Preferences does not trigger this bug.

NickHope wrote on 4/3/2017, 6:24 AM

Confirmed here on my AMD laptop. HEVC 10-bit fails with that same error message if my Intel HD graphics are enabled but succeeds if they are disabled.

Can't report it as https://support2.magix.com/customer/en is under maintenance.

NickHope wrote on 4/14/2017, 12:46 PM

Regarding the original issue (10-bit HEVC renders 1920x1080 to 1920x1088), I received an email from Support that included the following:

"We are aware of this rendering issue for HEVC files and we are looking into resolving it in a future update for the software. The date of said update is currently unknown however."

phil-d wrote on 11/13/2017, 5:09 AM

Hi

Just to add that I'm finding this issue as well using Intel Quick Sync on build 216, where I can't render 10-bit files using Intel HEVC, and my processor is an Intel i7-7700, which does support 10-bit encoding. I can't disable Quick Sync, as it is my only graphics adapter.

Is this still not fixed after all this time? How do I render 10-bit?


NickHope wrote on 2/20/2018, 11:05 AM

The VEGAS Pro 15 build 311 release notes state "It is now possible to render 10-bit HEVC files using SkyLake or older processors with QSV". Could those with compatible CPUs please test this and confirm?

Marco. wrote on 2/20/2018, 12:24 PM

Yes, 10 bit HEVC rendering works fine on my Skylake based notebook.

Former user wrote on 2/20/2018, 3:59 PM

I rendered out to FHD from a sample 10-bit HEVC clip from a GH5, on a Haswell Core i7-4790K (Intel® HD Graphics 4600). It allows a QSV quality setting of at most 7 for this CPU, and the bitrate appears to max out at approximately 12,000 for QSV. The data rate is low, and NVENC on a GTX 1080 allows better, but still, it's nice to have.