Excessive GPU Loading and can't disable shadows

Discussion in 'General Technical Support' started by Prinsn, Dec 13, 2012.

    • E5500 @ 2.8 GHz
    • Radeon HD 6850
    • 6 GB RAM
    • HDD (not solid state)
    Up until now, the game has always reported CPU throttling, but I've been able to play on High+ ("Ultra") settings since launch without it ever mentioning the GPU. The GPU fan was never noticeably loud, either.

    Come patch, GPU throttling kept popping up, so I decided to turn down my settings.

    I've turned my settings to everything from Low to High (Ultra is defaulting to Low) with shadowquality=0.

    Results
    • Shadows are always on
    • FPS is identical to the previous patch in low-activity areas, but the indicator now says GPU throttle instead of CPU throttle in certain circumstances
    • FPS visibly stutters but is numerically identical to the previous patch in high-activity areas; GPU throttle when looking away from a busy base, CPU throttle when looking at it.
    • GPU temps up to 90C while looking at the warp gate (looking at the ground), with the game claiming GPU throttle
    • Limiting FPS (to 30 or below, normal play is 20-25) only lowers GPU MAX temp by 5 degrees.
    • FPS is identical between Low and High, no change in GPU activity, same throttle indicators
    • Manually limiting the FPS to below the GPU limited threshold does not prevent GPU heating
    There is no reason a 6850 should hit 90 degrees with everything on low, in sanctuary, at 3 AM EST.
  1. Go to UserOptions.ini and set ShadowQuality=0 to disable shadows.
    You may also want to decrease RenderQuality to 85%.
    Also disable Motion Blur and Ambient Occlusion.

    Your CPU is weak, and throwing all the load on your GPU will not give you decent FPS.
    Try overclocking your CPU to around 3.5 GHz (with a good aftermarket heatsink!) and try again.
    90C is too much for your GPU; maybe you have cooling issues in your case. Install more fans and clean out all the dust inside.
  2. Well, for one, I agree with the above.

    Two, you've bottlenecked your GPU; it's going to take the load off your CPU anyway :)
    Set the process priority to High if that helps any.
  3. Pretty sure you failed to read my post.

    ShadowQuality is set to 0; I said that.

    The 90C only happens in PlanetSide, at all settings, regardless of GPU load, and only since the patch.

    I'm not trying to increase FPS by loading the GPU. I turn up GPU settings because I can; they have no effect on FPS because the FPS is CPU-locked.

    I am getting the same FPS as before, but now the game is indicating GPU lock in non-graphically-intense instances.

    The GPU is also running at 90C with everything on Low, in non-graphically-intense instances.


    I am well aware my CPU is weak, but it handled the game well enough to play before the patch.

    Changing my settings from Low to High gives no FPS increase or decrease, despite verifiable changes in graphical quality: no change in whether the CPU or GPU is the limiting factor, and little to no change in GPU heat.


    How am I bottlenecking the GPU if changing the settings from anything to anything else has no impact on GPU-limited FPS?

    I had a 100% pure CPU bottleneck yesterday; now I have a GPU bottleneck regardless of my graphical settings, on a card that yesterday could run the game on Ultra without any problems at all.

    I also have the game set to high priority automatically on launch, through another utility.
  4. You seem to have no clue what you're talking about..

    A. The bottleneck in your case is the CPU; your CPU communicates with your GPU to process it all.. So you can have a million-buck GPU with a one-buck CPU in there, and it won't do anything great.

    B. You can't be FPS CPU-locked or GPU-locked; where did you get that idea from? Seriously.. If you're getting bad FPS, it's the entire combination.. Maybe your GPU should be able to handle it, but your CPU clearly does not. So upgrade your CPU and your GPU will have more breathing space.

    C. Temperatures are normal, don't worry about it; these cards can stand 110 degrees, and at full load yours isn't awful.

    D. Upgrade. Simple as that, upgrade your system.

    If you ran it fine in the beta, that's normal; people seem to forget that the BETA is a fraction of the entire game, so it's logical that you could play it better than the full-blown game.

    You have one bottleneck, your CPU, which is bottlenecking the GPU. Your GPU should be able to run it; your CPU won't.. So there's no way you can run the game without your CPU choking on it, since it can't process it.
    If you set your video settings high, your CPU will have a hard time, since it DOES calculate everything before and after it goes through the GPU.

    Simple as that. Read into it, and draw your own conclusion.

  5. Guys if you want to disable shadows you need to change 2(!!) options!

    100% guaranteed fix TRUST me:

    ShadowQuality=0
    OverallQuality=-1

    This tells the game ''I'm running a custom setting (OverallQuality=-1) with no shadows (ShadowQuality=0)''.

    If you don't do this there will always be shadows regardless of your .ini file.

  6. A) I know. I said this. Read.

    B) Yes you can. The in-game FPS counter will even indicate which is the limiting factor. You clearly don't know what you're talking about, because a GPU not getting information from the CPU has all the breathing space in the world; it has nothing to do.

    C) 90C doing no work on rendering is not normal

    D) "Learn to play" Way to help out there, guy.

    The game is effectively still in beta; this is a beta-quality release. I didn't play in the beta; I played before update #1 and it was fine. Now it's not, and I'm getting excessive GPU loading.

    I know I have a CPU bottleneck, but the game is now indicating a GPU bottleneck and loading the GPU excessively compared to before.

    My conclusion: You have no reading comprehension.

    I thought Graphics Quality was the determining factor... will try.
  7. Dude. Please read... You have ''OverallQuality=3''; it should be ''OverallQuality=-1'', so that's a minus one. That means you want to use a CUSTOM setting ;) Trust me, it's a 100% guaranteed fix. It's helped many others, and it's the ONLY way to turn off shadows completely.

    Once and for all:


    If someone wants to disable shadows COMPLETELY go to your UserOptions.ini and change the following:

    ShadowQuality=0
    OverallQuality=-1 (that's a minus one)

    Save the file. Restart client. Enjoy. Thank me and the others who found this out later.
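    For reference, the two settings together in UserOptions.ini would look something like the fragment below. The [Rendering] section name is my assumption; put the lines wherever ShadowQuality already appears in your own file.

```ini
[Rendering]
; -1 = custom preset, so the client stops overwriting hand-edited values
OverallQuality=-1
; 0 = shadow rendering off entirely
ShadowQuality=0
```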
  8. Ignore cyba, he knows nothing; if he wants to let his cards overheat, let him. Keep that 6850 UNDER 80C while playing. It is possible that they have implemented some graphics option that was not in before and that is causing your GPU to overheat; in my own experience 68xx cards throttle from heat at around 86C. Just to make sure, and I don't know you, but the fans are working on the card, right?

    The first thing I would do is run a FurMark benchmark with burn-in and note your temps. If it takes longer than 50 seconds to hit 84C, your card "SHOULD" be cooling fine, and the next step is to attempt to trace it back to an option in the .ini. If you can find no option that brings temps back to normal, then send the devs a detailed description of your hardware WITH a dxdiag AND your clock speeds and voltages for both CPU and GPU, unless of course they are stock. Just PM the info to one of the CSR guys (twist is extremely helpful) and they will forward it to the proper people.

    As others have mentioned, setting OverallQuality=-1 just means the game will not overwrite the values you set in the .ini.
  9. If I knew nothing, I would not be a network engineer while at the same time being a systems engineer AND a graduated 3D artist who clearly knows what he's talking about..

    The in-game debug that shows you what your limit is isn't actually a limit counter; it simply shows you which is having a harder time: either the GPU or the CPU.

    My post was correct, and I stand by it.
    90C for a video card isn't bad; however, when it's not rendering at full quality BUT your CPU is the lacking factor, your GPU can actually be forced up to a higher rendering level trying to keep up with the data, even though quality might be set low.

    You guys should stop being ignorant and actually read about how hardware works compared to software.
    I'm not sucking this out of my thumb; I'm certified to know this crap.

    Go home. I so hate free games for having idiots who know nothing come in and yell about stuff they don't know anything about.
    Pain in the butt.. Go play CoD or something..

    Btw, even if you don't agree with me, your CPU sucks and needs an upgrade. You can't yell about anything else before you've actually gotten rid of your bottleneck, which is the CPU. Other tests won't do much, including the stupid FurMark benchmark the other guy posted, because your CPU will drive your GPU up to higher temps and processing attempts; it's how the system works. Relieve your GPU of tension by upgrading your CPU, get your side of the system into a correct state, and then yell on a forum about your issues.

    The issue right now is yourself.
  10. You seem to have a lot of degrees to know nothing. The video card cannot process information that has not been sent to it. That's why he has the same FPS on all settings: the video card can handle all the information it's getting. Also, if my GPU temps were 90C, I would be stressing.
  11. It is what you want to believe. You seem to be forgetting RAM as well. GPUs don't work the way you think they work..
    As a 3D artist you're supposed to know how a system renders. If your CPU is a bottleneck, it stresses out the GPU, and vice versa.

    Simple as that.

  12. Actually the issue is the ignorant attempting to school the ignorant. Your post is a prime example of that. Being a network engineer does not lend any credence to your knowledge, as there is no overlap between network engineering and software engineering, or the interaction of software with hardware.

    You're correct, you're not sucking it out of your thumb. You are in fact blowing it out of your ***.

    GPUs exist for the sole purpose of saving your CPU the task of rendering. If your CPU is lacking, you will simply fail to pump enough data to the GPU to reach 100% load and the max framerates your GPU is capable of.

    PS2 at lower settings will attempt to perform some rendering calculations CPU-side to compensate for a GPU that is lacking in processing power. Higher settings off-load more tasks to the GPU. If you have an uber GFX card, turning down the settings will not improve quality or performance. For mid-range systems these options exist solely to let you balance the load between your GPU and CPU to reach the maximum possible framerates. This is where most people fail when tuning their settings.

    As I understand it, and to the best of my ability to test: High/Ultra=GPU, Low/Medium=CPU. I may be partially incorrect, as there might be some variance between individual settings in the game where Medium is also GPU.

    PS2, while on DX9, uses the full range of DX9 capabilities. It also uses PhysX. PhysX calculations are extensive. Having an AMD GPU puts you at a disadvantage here, because the PhysX calculations are still being done, but on your CPU. There is no possibility of off-loading these calculations to the GPU.

    Having the readout flicker between CPU/GPU is perfect. It means your CPU is able to push enough data to your GPU to fully utilize the card's capabilities, and that your CPU is being utilized to the fullest extent possible (perhaps not 100%, but as much as is required for the GPU to do its job). If you are 100% CPU-bound and showing less than 100% load on your GPU, you may benefit from using High/Ultra settings. If you're 100% GPU-bound, you may benefit from lowering some settings and off-loading them to the CPU.

    Most people will find they are GPU bound when nothing is happening and CPU bound when all hell is breaking loose.

    Now you have to understand why rendering a simple scene can generate more heat than a complex scene. GPU dies have many areas that carry out instructions. The current generations of both NVIDIA and AMD GPUs have an internal structure with several hundred to several thousand identical processing structures that can be programmed to perform tasks.

    If I repeatedly call a simple rendering routine, typically I'll see one area of the die used over and over again for that same function. This simple routine takes very few clock cycles to complete and then paint the image. Then it starts over, using the same area, repeatedly, causing localized rapid heating of the die.

    Now, a very complex scene is different in how it uses the internal structure of the GPU. It spreads the work over more of the internal processing units and takes many more clock cycles to complete. This causes heating over a larger area and a longer span of time. That increased time interval, as well as area, is what allows the cooler to pull the heat out of the die and radiate it to the air.

    You can reach 100% GPU load by asking each of the operating structures in the die to paint one pixel. Depending on your GPU, it may be able to paint 2048 pixels @ 10,000 FPS. And it will get very hot, very fast.

    Now back to PS2. PS2 is pushing the draw capabilities of both the software interface (DX9, CPU-intensive) and the rendering capabilities of the GPU. SOE is using nearly every DX9 function in every frame rendered. This loads your GPU to process the frames. Some of the effects PS2 asks your GPU to render are done so repetitively that you're hotspot-loading the GPU die faster than heat can be spread and dissipated. There isn't an effective solution for this issue; it's inherent to the way the rendering has to be programmed and processed.

    Optimizing the software routines to reduce CPU overhead and GPU processing cycles increases framerates. That also increases the heat generated. It's a direct relationship and can't be helped.

    "My system can play game X, which looks better and has better GFX than PS2, and can do it at NN framerate; PS2 is broken!"

    The rendering structure of game X is not the same as PS2's. Game X uses a different engine. Game X may be using pretty effect Y, which can be done 500 times on 500 units in the GPU at the same time and generate 120 FPS. PS2 may be using effect Z, which can only be done on 10 units in the GPU and has to be done N times per frame to generate the desired result. The result: more heat and slower framerates than game X.

    All that aside, PS2, while not being the "best looking" (subjectively) game or using the most recent DirectX version, can and will tax your GPU heavily and will generate more than a few BTU of wasted heat.

    PS2 can currently be described as a "worst-case scenario" for generating heat on both CPU and GPU, as it performs piles of the same functions on both. Nothing to be done about it. Crying isn't going to help. It will get somewhat better eventually, but you're asking software engineers to handle issues that are due to a hardware/software interface in an API they didn't write, for hardware they didn't design.

    Now to the thermal issue.

    90°C is quite hot. I don't know what the thermal throttling threshold on the 6850 is, but what you're describing sounds like you're hitting the ceiling: the GPU is clocked back until temperatures drop, then normal processing resumes. Ensure you have adequate case ventilation. Even a clean GPU cooler won't provide adequate cooling if case temperatures are 55°C.

    Heatsink assembly on many GPUs is penny-pinched by many manufacturers. Removing the cost-effective and "usually adequate" thermal pads between the GPU's heat spreader and the cooling solution and replacing them with a higher-quality thermal paste or liquid metal may improve the situation. Improving case ventilation may improve the situation. If not, then upgrading the cooling solution on the GPU may be necessary to maintain performance. A larger copper/aluminum cooler or liquid cooling may be in order. Perhaps just modding the existing cooler to take a larger fan may provide the solution. This should fix the "stuttering".

    To the CPU issue: 2.8 GHz is marginal for what PS2 demands. I have an E6700 (2.66 GHz) clocked to 3.33 GHz. It's satisfactory until I upgrade my GPU. You should be able to push ~3.2 GHz without thermal issues on the E5500 if your hardware will allow it. It's not hard; just spend a day reading about overclocking. If not, then do the best with what you have.

    ******** at SOE isn't going to make them perfect the engine any faster than they currently are. Quitting the game isn't going to motivate them. Not buying SC isn't going to motivate them; it will do quite the opposite. If you want the game improved, there are three steps.

    #1 Play. More players=more popular=more SOE resources allocated to PS2.
    #2 Report. Report performance changes at each patch. Don't ***** and whine; just give the information.
    #3 Don't change 50 settings and post the results. Just your system specs, your settings, and the difference between pre- and post-patch. "I tried every setting possible and it's all junk" has no value to the devs.

    And for the benefit of the community: don't be afraid to not know something. I've read so many posts like the one quoted that I feel like smashing my head into the desk every time I see one. Spend a few weeks studying microprocessor manufacturing and technology; understand what a "bump" is and what the substrate does and why picking the right combination of the two is important. Understand how processor architecture, be it GPU or CPU, affects thermal limits and thermal management solutions. Understand what specific heat capacity and BTU/h mean before you post here and fill people's heads with nonsense, which they then take as fact and regurgitate on this and other forums.
  13. Most of your story is correct.. In fact, I've been saying the same damn thing about the GPU/CPU ratios..

    Meh, it doesn't matter; believe whatever you guys want to believe.

    Fact is still: the CPU sucks. Upgrade the CPU and your problems will be 80% less.
    No rocket science needed.
  14. Pretty sure you're ********.

    @YB: Yeah, I edited it and noticed it had changed after the fact. It actually works, so thanks for that. Initially it crashed the game, so I had to re-do the entire .ini file.

    @WallofTextman

    The info is useful, but it's largely not relevant. I understand all the points, but you address none of the spontaneous changes from one version to the next, or the unique instances.

    My CPU isn't an "issue"; I'm aware that it's a major bottleneck. The issue is that, with a patch, the game suddenly says the GPU is the bottleneck and heats it up beyond sane limits, and that thermal output cannot be altered regardless of settings, render load, or render quality.

    Further, the GPU indicator is not correlated with card heat: it'll say GPU as soon as I boot up, CPU when looking in another direction (one that has more things to render), and the game shows no performance decrease at that level of heat.

    And saying that asking them to fix it is unreasonable makes little sense: optimizing the game should lower loads, and thus lower thermal output by taking stress off the GPU. What we're seeing here is a de-optimization across a patch, while they are themselves intent on optimizing.
  15. Yeah, I live in the desert and only get a max of 53C. Are you sure your monitoring isn't set to read in Fahrenheit?

    For the guy who is really bent on the CPU thing: you're right that it would be beneficial to upgrade; however, your thought process is completely wrong. He needs to upgrade only because it would be beneficial. What you fail to understand is that the CPU has to process information from the server to your computer; then the graphics-related information is sent to the GPU. If your processor is way slower than the GPU, then the GPU should not be receiving enough data to overheat it.

    If the card is actually getting that hot, then I would suggest s/he look into whether or not the heatsink is actually in contact. More to the point, the devs still haven't properly implemented multi-core, so it doesn't matter right now. I know I am CPU-bound because the software won't touch more than 30% of my system resources. Fact: the game has a lot of work to be done on the software side to make it run right, especially if you're not on XP.

    If I knew how to make XP access all my hardware, I would certainly set up a dual boot for this game. Not sure that it would matter, though, since current graphics cards aren't built around DX9. You can install DX9 and the other support crap to make it better, but current graphics cards have to do a full render on DX9 info, whereas DX9-era cards manage per-instance rendering. This is why I keep saying we need to be pushing for DX10/11 support. DX9 expired a long time ago and puts undue stress on the newer cards. Also, the lack of multi-core support puts unbalanced heat stress on the CPU, which will eventually lead to micro-fractures in the crystal structure it is made of, which of course will break electrical flow and burn out the CPU prematurely.

    Part of the trouble is that they cheated a bit on the engine: it is an old engine meant for a different game, long gone. That limitation of age is half the problem with current-day support. Half the reason we have hackers so early on is that the engine was already cracked in another game.
  16. Then you're wrong. You may think that I'm ********, but look, even Kowalsker gave you a very decent explanation of how things work..
    I do admit my explanation actually sucked; if I read back, I need to rethink what I was actually trying to say.

    What the hell are you trying to achieve here? Everything people reply to you gets ignored or flooded with stupid arguments..
    Do you want your system to work fine?

    Btw, you still ignore the fact that the BETA was a lot less stressful than the full game, so the performance SHOULD be lower now.
    Not to mention your CPU doesn't provide much performance. Simple as that..

    What answer do you actually expect to get here? You seem to know best; why ask others on a public forum if you think they're all dumbasses?

    This is wrong, btw; where the hell did you get that info?
    DX9 is inside DX11; DX11 is a newer version with, of course, additional support for better rendering and processing, but that doesn't mean new GPUs don't support DX9. That's bull.

    @TopicStarter:
    You are right that your GPU has a high temperature, sure. But if that's the only thing being odd, other than your CPU being crap, what do you expect to achieve here? In the time you've been posting on this forum about your issues, you could've gotten 5 new CPUs already and fixed the damn thing.
  17. So let's hypothesize for a moment, then. Pre-patch, function X is done 100% CPU-side as coded and takes 50ns to complete. Post-patch, function X is re-written for GPU calculation; it is integral to every scene, called n,000 times per scene, and still takes 50ns on your specific GPU, still called n,000 times per scene. Now your CPU is doing less and your GPU is doing more, generating more heat. The time to complete the render of each frame remains unchanged, therefore the framerate remains unchanged on your specific CPU/GPU combination. Others may see a performance drop or increase depending on their specific CPU/GPU combination. Which is exactly the scenario we see with this patch, as well as the typical responses in beta.

    20 people say "Great Job SOE I was getting 25 FPS in big battles, now I'm getting 40!"
    20 other people say "WTF did you jackwads do? I can't even play now!"
    20 other people say "I don't even notice a difference!"
    20 other people say "I see slightly more FPS in battles, but overall my framerates dropped a little when nothing is happening around me"
    20 other people say "My frame rates seem the same now, but I'm getting stuttering!" (Thermal cycling on the GPU @ the thermal throttling threshold)
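    The function-X hypothesis above can be sketched as a toy model (all numbers are the post's hypothetical ones, with n chosen arbitrarily; this says nothing about PS2's actual renderer):

```python
# Hypothetical cost of "function X": 50 ns per call, n,000 calls per frame.
NS_PER_CALL = 50
CALLS_PER_FRAME = 5_000  # n = 5, picked arbitrarily for the sketch

def frame_time_ns(cpu_other_ns, gpu_other_ns, x_on_gpu):
    """Total frame time if the CPU and GPU stages run back to back."""
    x_ns = NS_PER_CALL * CALLS_PER_FRAME
    cpu = cpu_other_ns + (0 if x_on_gpu else x_ns)
    gpu = gpu_other_ns + (x_ns if x_on_gpu else 0)
    return cpu + gpu

# If X costs the same 50 ns per call on either processor, frame time (and
# therefore FPS) is unchanged -- but 250 us of work per frame now heats
# the GPU instead of the CPU.
before = frame_time_ns(2_000_000, 1_000_000, x_on_gpu=False)
after = frame_time_ns(2_000_000, 1_000_000, x_on_gpu=True)
print(before == after)  # True
```

    On a machine where X runs slower or faster on the GPU than on the CPU, the two totals diverge, which matches the mixed "faster/slower/same" reports after the patch.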

    Everyone's hardware is a unique instance. Few people have the exact same MB/RAM/CPU/GPU/HD/SSD & overclocking combinations & game settings. Most people will see varied responses to the current round of optimizations. It all depends on so many factors. CPU speed can be one factor, but so can the amount of L2 & L3 cache on your CPU. If we're calling the same functions, CPU- or GPU-side, and the function is the same, and often the results are the same, more cache could generate more framerate. GPU architecture and how it executes operations can have a direct correlation with how any given change affects performance and heat.

    If I had to guess what is causing heat and loading for the general player, I'd suspect something simple, like a pixel shader being applied to a great many pixels, perhaps a full-screen application of one or more pixel shaders. They execute fast, perhaps 5-10 clock cycles? 1920x1080 = ~2 million pixels on a 900 MHz GPU would be roughly 43-87 FPS for one shader, and 90-180 million executions of that shader per second.
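    A quick check of that back-of-envelope arithmetic (keeping the post's simplification of a single shader unit retiring 5-10 cycles per pixel at 900 MHz; real GPUs run many units in parallel, so treat this purely as the post's model):

```python
PIXELS = 1920 * 1080  # ~2.07 million pixels per full-screen pass
CLOCK_HZ = 900e6      # 900 MHz core clock

def fps_for(cycles_per_pixel):
    """Frames per second if every pixel costs this many clock cycles."""
    return CLOCK_HZ / (PIXELS * cycles_per_pixel)

# 5-10 cycles per pixel works out to roughly 43-87 FPS for one shader
# pass, i.e. 90-180 million shader executions per second.
for cycles in (5, 10):
    fps = fps_for(cycles)
    print(f"{cycles} cycles/pixel: {fps:.0f} FPS, "
          f"{fps * PIXELS / 1e6:.0f}M shader runs/s")
```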

    Does this give you more insight into what may have, and quite likely did, happen to your system with the patch?
