Automation to use graphics card

Hi!

I have a problem with my system using the wrong graphics chip for Automation.

I have an i7 processor in my laptop with a Haswell integrated chip, and an nVidia GT 520M alongside it. The funny thing is, I can tell the nVidia control panel to kick in the dedicated card when the .exe is launched, just like I do for other games like Minecraft and Battlefield, but Automation still refuses to use the nVidia card.

SpeedFan shows core 0 going up to 80 degrees and core 1 to 74, with cores 2 and 3 a bit lower, while my GPU sits around 60 (just from the heat of the CPU), so it's clearly using my Haswell chip.

Is there something I'm missing to tell Automation to use the graphics card? Could you build a tool into the options panel that detects the graphics cards (or chips) and lets the game use one or the other? I'm not even running FXAA, SSAO and HD resolution + high quality, but I really wish I could!
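
For what it's worth, detecting the chips shouldn't be the hard part. Here is a minimal sketch (just standard Windows DXGI calls, nothing from Automation's own code, and the adapter names in the comments are only examples) of how an options panel could list both the Intel chip and the nVidia card and offer the choice:

```cpp
// Minimal sketch: list the display adapters Windows knows about via DXGI.
// A settings panel could show these descriptions and let the player pick one.
#include <windows.h>
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main()
{
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC desc = {};
        adapter->GetDesc(&desc);
        // Description reads e.g. "Intel(R) HD Graphics" or "NVIDIA GeForce GT 520M";
        // dedicated video memory is a rough way to spot the discrete card.
        wprintf(L"Adapter %u: %s (%llu MB dedicated)\n", i, desc.Description,
                (unsigned long long)(desc.DedicatedVideoMemory / (1024 * 1024)));
        adapter->Release();
    }
    factory->Release();
    return 0;
}
```

On an Optimus laptop the driver still decides which chip actually renders, though, so listing the adapters would only be half of the picture.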

Does someone have a solution, or could you make that option available in the options menu?

I found one "solution", which is to deactivate the Haswell chip in the BIOS, but if I do that it will dramatically increase my power usage, as the laptop will run on the graphics card all the time. Of course that's not the solution I'm looking for. So maybe zeussy can tell me whether it's possible to add the option to the options menu as suggested in the OP? Or does someone else have a solution for this?

Hey Bart!
Actually, what you did in the Control Panel should work… it did work when I tried it months ago. Have you tried the other way: right-click on Automation and "Run with… High-Performance GPU" or something like that? And if you've already tried that too… then I guess there needs to be some kind of fix.

I actually tried that a while ago, but that menu also says the nVidia one is the default… Tried it again just now and it's still the same issue as before…

The CPU heat build-up is just from the CPU usage; if you were running on your integrated chip you would be forced down to minimum settings.

What sort of settings are you running and what framerate are you getting? I’d expect very poor framerates on anything but the very lowest settings if it is using the integrated chip.

The reason I believe it's using the integrated chip is that I saw the same symptoms when I ran Minecraft on it (Minecraft is really CPU intensive): temperatures around 85 and only 7-15 FPS. Since switching Minecraft to my graphics card, CPU temps are around 60-65 and I get 140+ FPS.

As for Automation, I understand it uses the CPU, but surely not as intensively as Minecraft with 60 mods running in the background (Feed the Beast). Plus, my framerate is also pretty low: around 15-20, sometimes dropping below 10 for a second. As for settings: I run 1280x720 windowed, no SSAO and currently WITH FXAA, with the quality slider just under 50%. If I set it to max quality with SSAO and FXAA at 1600x900 (my screen size), FPS drops to 1-5 and my CPU heats up to almost 90. So it's clearly not running on my GPU.

Bump de la bump.

Wizzy, you of all people should know that bumping threads is kinda frowned upon here, especially when they are the 2nd post on the page.

Mmm, that does sound like it must be running on the integrated chip then. Not sure why; on our nVidia/Intel integrated dual-GPU laptop it seems to use the nVidia GPU… Will get Caz to look into it.
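
For reference, the common developer-side fix for exactly this situation is to export a couple of symbols from the game .exe; the Optimus (and AMD switchable-graphics) drivers look for them and run the process on the dedicated GPU without any control panel overrides. A minimal sketch, assuming a standard Windows C++ build, which may not match how Automation is actually built:

```cpp
// Sketch of the standard driver hints for switchable graphics.
// Exporting these symbols from the game executable asks nVidia Optimus and
// AMD PowerXpress/Enduro to run the process on the high-performance GPU.
#include <windows.h>

extern "C"
{
    // nVidia Optimus: 0x00000001 requests the discrete GPU.
    __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;

    // AMD equivalent hint.
    __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}
```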

Thanks Daff, that would be nice… Also, for what it's worth, it's the same situation before and after a complete fresh install of Windows 7 64-bit, including reinstalling all the drivers for my nVidia card and such.

If it's of any help, I tend to see quite high CPU temps on my desktop when the game is running, and only certain parts heat up the GPU significantly. The game is also bottlenecked by the CPU on my laptop, rather than the GPU.

Will the rendering be improved later on? Even the menu needs full graphics power, for example, even though it looks quite simple.

Automation hammers the CPU. It only seems to use two cores, but it will load them both right up when you're in the designer. It did that with my Phenom II 965 and still does with my i5, though the framerate is totally stable with the i5 (it wasn't with the Phenom: variable framerate, sound skipping, etc.).

Yeah, unfortunately Automation is pretty brutal on machines.

The long-term plan is to port Automation to a different game engine (Unity), which should allow for much better graphics scaling than we currently have, and hopefully not be so hard on the hardware requirements.

Oh great! So I guess you're finally taking the step to Unity :slight_smile:
I suggested this to Daffy about 8 months ago. He wasn't so sure back then, though. Good to see this! Better lighting, shadows, faster code execution, easy plugins like NGUI and better sound implementation coming up!

Been playing Kerbal Space Program lately, Unity does a great job.

I have come across this issue now, and by the sounds of it it differs by machine rather than being an issue with the game. On my Windows 8.1 machine it doesn't switch over to the NVIDIA chip at all by default; it just sits there with low FPS and hammers the integrated chip. However, if I manually add Automation Launcher.exe and tell it to start on the High Performance GPU, it works fine that way. I am on driver 340.52 for the NVIDIA chip and 10.18.10.3540 for the Intel chip. This is also a Haswell machine, so I wonder if maybe Optimus just works better with 8.1?

Anyway, as mentioned this probably won't be an issue at all in the long term; this is just another report of the same experience.

I only get some frame drops when morphing a car, and even then not very often.

My system is an i5, 16 GB DDR3 and an HD 7970, running Windows 8.1 Pro.