Integrated Graphics vs PCI-e Video Card

burchis13 wrote on 8/8/2013, 4:15 PM
Does anyone use a NLE system having a motherboard with integrated graphics instead of an expansion PCI-e card?

I am building a new system and the motherboard (GIGABYTE GA-Z77X-UD5H) I'm using has built in integrated graphics.

Just wondering if I will be satisfied with onboard graphics. Should I wait and see or should I go ahead and purchase a video card?

Comments

videoITguy wrote on 8/8/2013, 4:42 PM
You need to study and really understand what the release notes for Build 670 of Vegas Pro 12 are telling you. This is new information that the user community has yet to digest.

Note this thread:
http://www.sonycreativesoftware.com/forums/ShowMessage.asp?MessageID=866348&Replies=22
john_dennis wrote on 8/8/2013, 6:01 PM
I have one of each and since I don't do it for a living, I am somewhat neutral on HD 4000 vs the nVidia card in my old system. My particular motherboard will allow both to be installed concurrently or I could just add a powerful nvidia card in the future. (There are PCI-E considerations but you seem to have chosen not to run X79 already.)

If I was upgrading today, I would look hard at the i7-4770 and the Z87 motherboard as there were improvements in the on-die video inside the 4770.

In the long-run, I would root for OpenCL over proprietary hardware/software accelerators.

Full Disclosure:
I've picked some winners and losers in the past as my retirement account reflects.
ushere wrote on 8/8/2013, 6:52 PM
have both - 550ti in pc / hd4000 in laptop.

can't run nb titler with hd4000 (prodad works fine), otherwise quick sync screams through renders.

i'd try on board, if it doesn't do what you want, throw in a video card of your choice....
wwaag wrote on 8/8/2013, 6:59 PM
See my new post. http://www.sonycreativesoftware.com/forums/ShowMessage.asp?MessageID=866636&Replies=0

I have the same motherboard you are contemplating.

wwaag

AKA the HappyOtter at https://tools4vegas.com/. System 1: Intel i7-8700k with HD 630 graphics plus an Nvidia RTX4070 graphics card. System 2: Intel i7-3770k with HD 4000 graphics plus an AMD RX550 graphics card. System 3: Laptop. Dell Inspiron Plus 16. Intel i7-11800H, Intel Graphics. Current cameras include Panasonic FZ2500, GoPro Hero11 and Hero8 Black plus a myriad of smartPhone, pocket cameras, video cameras and film cameras going back to the original Nikon S.

Pete Siamidis wrote on 8/8/2013, 7:09 PM
I'm using a Haswell 4770k cpu in my Vegas Pro 12 pc, and I tried it with both the integrated gpu and my NVidia 670. With the NVidia 670 I get a full 60fps at full quality on the timeline, even with multiple effects added like color corrector, sharpen, etc. With the Haswell 4770k's built-in gpu the timeline framerate is substantially choppier, to the point that it wasn't workable for me. So I decided to keep the NVidia 670 in my machine; having a full 60fps timeline at all times just makes editing much easier.
MSmart wrote on 8/8/2013, 10:22 PM
@Pete, what HD4600 driver version are you using?

http://www.intel.com/p/en_US/support/highlights/graphics/4cp-hd4600gfx
burchis13 wrote on 8/9/2013, 6:14 AM
Thanks to all for your input.

I still use Vegas Pro 9, because it works!

I will start with the Intel HD 4000 graphics and reserve the option to add a video card later if needed. The processor I have chosen is the Ivy Bridge I7-3770. Added 16gig of G.Skill Ripjaws memory and a SSD for the Windows 7 Pro 64bit OS, plus another 1TB of storage with a WD Black SATA III HD.

I am wondering if the Intel HD 4000 graphics can be upgraded to the current HD 4600 version?
john_dennis wrote on 8/9/2013, 9:06 AM
Because the HD4000 is on the processor die, it's silicon and can't be changed except by changing the processor. Since Intel changed the socket from 1155 pins in Ivy Bridge to 1150 for Haswell, a processor swap without changing motherboards is not possible.
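The socket constraint above can be reduced to a one-line compatibility check. A minimal sketch (the socket assignments are the commonly cited ones for these CPU generations, not taken from this thread):

```python
# Minimal sanity check of the socket constraint described above.
# Socket assignments are the commonly cited ones for these generations.
sockets = {
    "Sandy Bridge": "LGA1155",
    "Ivy Bridge": "LGA1155",
    "Haswell": "LGA1150",
}

def swap_possible(old_gen, new_gen):
    """A drop-in CPU swap needs the same socket (BIOS support permitting)."""
    return sockets[old_gen] == sockets[new_gen]

print(swap_possible("Ivy Bridge", "Haswell"))       # False: 1155 vs 1150
print(swap_possible("Sandy Bridge", "Ivy Bridge"))  # True: both LGA1155
```

So moving from an i7-3770 to a 4770 means a new motherboard as well, exactly as described above.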
John222 wrote on 8/9/2013, 4:01 PM
Until recently, I too was using the on board HD4000 graphics on my GigaByte MB and it worked just fine. I edited several 2hr HD videos for BluRay without a problem. I've since upgraded to a Radeon HD 6870. But I really only did it so I could use multiple monitors.
MSmart wrote on 8/9/2013, 6:56 PM
But I really only did it so I could use multiple monitors.

From the Multiple Display FAQ...

Starting with the 3rd Generation Intel® Core™ Processors with Intel® HD Graphics 4000/2500, three displays support may be possible depending on your computer's configuration. Check with your computer manufacturer if three displays are supported on your computer and which display combinations are supported.

http://www.intel.com/support/graphics/sb/CS-031040.htm

When I finally get 'round to building my new video editing rig I plan to use the integrated HD4600 graphics at first.
Pete Siamidis wrote on 8/9/2013, 7:26 PM
@MSmart, I don't recall the version number but it was the latest version at the time I did the test, which was on 6/16/13. I'd have to yank my 670 from the system and reboot to get the Intel Control panel again to know for sure which version of the drivers are installed, as I've never been able to get the 670 + 4600 working together at the same time.
wwaag wrote on 8/9/2013, 11:19 PM
"I'd have to yank my 670 from the system and reboot to get the Intel Control panel again to know for sure which version of the drivers are installed, as I've never been able to get the 670 + 4600 working together at the same time."

No need to remove your card. In the future: select Start, right-click Computer, select Properties, then Device Manager, expand Display adapters, right-click your adapter, select Properties, and finally the Driver tab. Your currently installed driver version will be displayed.
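
If you'd rather not click through Device Manager, a quick sketch can pull the same information from the command line. This assumes Windows with the wmic tool on the path; the adapter names and version numbers in the sample are made up for illustration, and parse_adapters() itself is plain text parsing that runs anywhere:

```python
# Sketch: list display adapters and driver versions without opening
# Device Manager. Assumes Windows with the wmic tool available.
import csv
import io
import subprocess

def parse_adapters(wmic_csv):
    """Parse the CSV output of:
    wmic path win32_VideoController get Name,DriverVersion /format:csv
    Returns a list of (name, driver_version) tuples."""
    lines = [ln for ln in wmic_csv.splitlines() if ln.strip()]
    reader = csv.DictReader(io.StringIO("\n".join(lines)))
    return [(row["Name"], row["DriverVersion"]) for row in reader]

def list_adapters():
    # Windows-only: shell out to wmic and parse its CSV output.
    out = subprocess.run(
        ["wmic", "path", "win32_VideoController",
         "get", "Name,DriverVersion", "/format:csv"],
        capture_output=True, text=True, check=True).stdout
    return parse_adapters(out)

# Illustrative sample only -- these names and versions are made up:
sample = """Node,DriverVersion,Name
MYPC,9.18.13.2049,NVIDIA GeForce GTX 670
MYPC,9.18.10.3220,Intel(R) HD Graphics 4600"""
for name, version in parse_adapters(sample):
    print(name, version)
```

On a machine where both adapters are active, both should show up in the listing.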

I've had no problems getting a 650 and 4000 working together (http://www.sonycreativesoftware.com/forums/ShowMessage.asp?MessageID=866636&Replies=1), although our motherboards may differ.

wwaag

MSmart wrote on 8/10/2013, 12:41 AM
Interesting results there, wwaag. I may have to rethink my position on going integrated to start.

Which 650 board specifically do you have? If you don't mind saying.
Pete Siamidis wrote on 8/10/2013, 1:30 AM
@wwaag, on my machine if the Nvidia 670 is plugged in, then the Intel 4600 will not appear at all in the device manager display adapter area. It may be a motherboard thing, perhaps there is a bios setting that automatically disables the Intel 4600 if any external gpu is plugged in. I'll have to look into that and see if there is some way to enable both with my motherboard.
wwaag wrote on 8/10/2013, 10:26 AM
Here is the video card I'm using. http://www.newegg.com/Product/Product.aspx?Item=N82E16814125444

and the motherboard, the same the OP is considering. http://www.newegg.com/Product/Product.aspx?Item=N82E16813128545

In the bios settings, there are 3 options for internal graphics--Auto, Disabled, and Enabled. If you choose Auto, the internal graphics will be disabled if an external card is found, so the option to choose is Enabled--at least for my mobo.

Once you have both display adapters shown in Device Manager, use the Windows Display control panel (right-click on the Desktop, select Screen Resolution) to configure the two adapters in terms of extend/clone options. Initially, only your main display will be shown. Click the Detect button and a gray box will appear to the right of your blue display box. Click on the gray box and it will show what additional display outputs are available. Hope this helps.
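
As a cross-check before arranging things in the Screen Resolution dialog, a small sketch can list which display devices Windows has registered and whether each is attached to the desktop. This is Windows-only (it calls EnumDisplayDevicesW in user32.dll via ctypes); on other platforms it simply returns an empty list:

```python
# Sketch: enumerate the display devices Windows currently sees, to confirm
# outputs from both adapters are registered. Windows-only; returns an
# empty list elsewhere.
import ctypes
import sys

def enumerate_display_devices():
    if sys.platform != "win32":
        return []  # EnumDisplayDevicesW is a Windows API
    from ctypes import wintypes

    class DISPLAY_DEVICEW(ctypes.Structure):
        _fields_ = [
            ("cb", wintypes.DWORD),
            ("DeviceName", wintypes.WCHAR * 32),
            ("DeviceString", wintypes.WCHAR * 128),
            ("StateFlags", wintypes.DWORD),
            ("DeviceID", wintypes.WCHAR * 128),
            ("DeviceKey", wintypes.WCHAR * 128),
        ]

    ATTACHED_TO_DESKTOP = 0x1
    devices = []
    i = 0
    dd = DISPLAY_DEVICEW()
    dd.cb = ctypes.sizeof(DISPLAY_DEVICEW)
    # Iterate device indexes until the API reports no more devices.
    while ctypes.windll.user32.EnumDisplayDevicesW(None, i, ctypes.byref(dd), 0):
        attached = bool(dd.StateFlags & ATTACHED_TO_DESKTOP)
        devices.append((dd.DeviceName, dd.DeviceString, attached))
        i += 1
    return devices

for name, desc, attached in enumerate_display_devices():
    print(name, desc, "attached" if attached else "detached")
```

With the bios option set to Enabled, outputs from both the Intel and the Nvidia adapter should appear in this list.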

wwaag

John222 wrote on 8/10/2013, 11:20 AM
Maybe so, but my desktop pc motherboard only had support for one monitor. You might be referring to laptops where multiple monitor support is typical.
Pete Siamidis wrote on 8/10/2013, 11:51 AM
Wwaag, thanks for the info. I did find two settings in the bios related to using the iGpu alongside an external gpu. One is the setting you describe; the other sets whether or not the iGpu is the primary display device. I'll have to play with that later. Is your setup stable when using both the cpu's integrated gpu and an external gpu? My pc is rock solid, so I don't want to risk introducing weird issues with Vegas Pro.
wwaag wrote on 8/10/2013, 3:59 PM
I've only been running this configuration for a few days, but so far so good.

My bios also has a second setting which controls which display you use during the initial boot. In this case, I've chosen the internal graphics. So right now, I use the HD4000 for bootup and the primary monitor. The Nvidia card drives the second desktop monitor as well as the preview device--the TV. Seems to work OK.

Best thing to do is just experiment and see what works. Good luck.

wwaag

Pete Siamidis wrote on 8/10/2013, 5:51 PM
I just tested it out: I set the bios iGpu setting to 'enabled', set the primary gpu to PCIE, rebooted, and it all worked fine. I installed the latest Intel video driver just to be sure, and my NVidia 670 was on the latest driver as well. Vegas Pro 12 runs fine, and I can switch the timeline between the two GPUs to compare them directly. On my machine there is no comparison though: with the Intel 4600 the frame rate is choppier, maybe 10 fps or so, but with the Nvidia 670 it runs at a full 60fps. This is with a 1920x1080 28mbps 60fps source with sharpen and color correction filters applied, and video preview set to "Best". So no competition there; the NVidia 670 is far better than the Intel 4600 on the timeline in my case. I tried doing some encoding tests, but I would get an error any time I tried to use Intel Quicksync. So ultimately I went back to my previous setup of having the iGpu disabled and just using the 670; for my particular needs it just works much better.
wwaag wrote on 8/10/2013, 7:33 PM
Remember that to use Quick Sync you have to have at least one of your monitors (if you have 2) driven by the HD4600. I have one driven by the HD4000 and the other--the second display, which I use for preview--driven by the 650. In my case too, the Nvidia seems better than the Intel.

wwaag

Pete Siamidis wrote on 8/10/2013, 10:26 PM
Yeah I had my primary monitor driven by the 670, and the secondary monitor was driven by the 4600. Perhaps it has to be the reverse...who knows. In any case the 4600 isn't bad, it's certainly far better than cpu alone on the timeline. In my case cpu alone was basically unusable in "best" mode once the effects were stacked on, whereas if I had to only use the 4600 I could manage with that. But now that Vegas seems to support the 670 it just runs so well on the timeline, better than my previous 560ti which I have since sold. My best guesstimates for timeline performance in my situation are around 10fps for the 4600, around 30fps for the 560ti and seems to be full speed 60fps for the 670. I wish they made a motherboard with Intel's HD 5200 Iris Pro, I'd be curious to see how that performs with Vegas Pro 12.
burchis13 wrote on 8/11/2013, 11:12 AM
Will I be able to use the 'Quick Sync' feature with only having one monitor and Vegas 9.0 Pro?
wwaag wrote on 8/11/2013, 3:55 PM
AFAIK Vegas 9 does not support any type of GPU acceleration, although I might be mistaken.


kodack10 wrote on 8/17/2013, 4:05 PM
Pete please tell me how you are getting your GPU to work. I have a nearly identical setup, a 4770k/HD4600 but I'm using an Nvidia GTX660 for video.

Ever since I put the 660 in my system, GPU acceleration has stopped working. It still shows up as an option and still reports as available when I try to render, but the speed is exactly the same as CPU only. For comparison, my outdated GTX260 rendered roughly twice as fast as my newer, faster card, and that was on an old Core 2 quad-core cpu.

I have the latest drivers for everything and cannot get it to render with GPU acceleration ever since putting the 660 in my system.