Auto-detect video settings in PC games

MindFragged

New member
Apr 2, 2009
104
0
0
Just wondering: does anyone else find the graphics settings that games auto-detect from your GPU wildly optimistic?

I've recently played with the recommended settings for Dishonored, Far Cry 3 and Deus Ex: HR, and all of them have been languishing between 20-35 FPS. I was especially surprised by the last two, since those games are meant to be optimised for AMD (or so they say).


Is this a common problem, or do I need to fix some things? I'm using an AMD Radeon HD 6770 (not exactly great, but not as bad as all that).
 

crono738

New member
Sep 4, 2008
550
0
0
I seem to have the opposite problem from time to time. Auto-detect sets everything to laughably low settings when the game runs perfectly on my rig at max settings.
 

wintercoat

New member
Nov 26, 2011
1,691
0
0
I have an...outdated GPU and CPU, and auto-detect always puts the settings at medium/medium-high, even though I barely meet the minimum requirements in most cases. The lag is so bad that it takes me several minutes to adjust the settings to the point where the mouse pointer isn't jumping around the screen like it's teleporting. It's probably because my screen resolution is set to 1920x1080, so the game assumes it will play fine at higher settings.
 

Images

New member
Apr 8, 2010
256
0
0
Auto-detect rarely gets it right for me. Its estimate is far above or below reality in most cases, so I usually tweak things myself.
 

Doom972

New member
Dec 25, 2008
2,312
0
0
Auto-detection tends to get things wrong. It's usually recommended to let the game choose a preset for you by auto-detection and then change specific settings until you are satisfied. The first thing to change is resolution, since games tend to choose either the screen's native resolution (which is usually too high) or the game's minimum resolution. I get a much smoother framerate by using 1600x900 instead of 1920x1080 (my native screen resolution). Afterwards it's time to mess with Anti-Aliasing, Anisotropic Filtering, Dynamic Shadows, etc.
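
To put numbers on the resolution change: fragment-shading cost scales very roughly with pixel count, so 1600x900 renders only about 69% of the pixels of 1920x1080. Here's a back-of-the-envelope sketch in Python; the linear-scaling assumption is mine, since real frame times also depend on geometry, CPU load and memory bandwidth:

```python
# Rough sketch: why dropping from 1920x1080 to 1600x900 helps.
# Assumption (mine): GPU shading cost scales ~linearly with pixel
# count, which is only an approximation of real game behaviour.

def pixels(width, height):
    return width * height

native = pixels(1920, 1080)  # 2,073,600 pixels
lower = pixels(1600, 900)    # 1,440,000 pixels

ratio = lower / native
print(f"1600x900 renders {ratio:.0%} of the pixels of 1080p")  # ~69%

# If a game runs at 25 FPS at native resolution and is purely
# pixel-bound, the naive estimate after the drop is:
fps_native = 25
print(f"naive estimate after the drop: {fps_native / ratio:.0f} FPS")  # ~36
```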
 

chukrum47

New member
Jun 10, 2011
52
0
0
I have never once played a game with the default/recommended settings. I've always found that by turning off both Anti-Aliasing and Anisotropic Filtering I get MASSIVE performance increases and can afford to increase the other settings instead.
 

Da Orky Man

Yeah, that's me
Apr 24, 2011
2,107
0
0
I basically always have to alter the settings to make the game either playable or not look like 70s Doctor Who. Skyrim in particular decided to set everything except resolution to the lowest possible, while cranking resolution to 1920x1080. After shifting settings up to medium/high, I was getting 30-40 fps, and this is on a laptop.
 

DoPo

"You're not cleared for that."
Jan 30, 2012
8,665
0
0
chukrum47 said:
I have never once played a game with the default/recommended settings. I've always found that by turning off both Anti-Aliasing and Anisotropic Filtering I get MASSIVE performance increases and can afford to increase the other settings instead.
Indeed, AA and AF both carry a performance hit (AA far more than AF), though you should at least run bilinear or trilinear texture filtering (weirdly, I've seen some games that don't offer trilinear); that gives you a better picture for pretty much no performance hit. The anisotropic options above that will probably drop your FPS.

I also tend to turn off shadows if possible. They tend to make things run much more slowly, and older games often had really glitchy shadows that would be projected on the wrong side of a wall, or jerk around the place. That's not really the case any more, but the point is, I'm used to not having dynamic shadows.
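
DoPo's ranking (trilinear nearly free, higher anisotropic levels progressively costlier) matches how many texture samples each mode can take per screen pixel. A simplified Python sketch using textbook worst-case tap counts; real GPUs have dedicated filtering hardware and texture caches that flatten these numbers considerably:

```python
# Simplified per-pixel texture sample counts for common filtering
# modes. These are textbook worst-case figures; actual hardware cost
# is much flatter thanks to filtering units and texture caches.

FILTER_TAPS = {
    "bilinear": 4,   # 2x2 texels from one mip level
    "trilinear": 8,  # 2x2 texels from each of two mip levels
}

# N-x anisotropic filtering takes up to N trilinear probes along the
# axis of anisotropy, so the worst case is 8 * N samples.
for n in (2, 4, 8, 16):
    FILTER_TAPS[f"{n}x anisotropic"] = 8 * n

for mode, taps in FILTER_TAPS.items():
    print(f"{mode:>15}: up to {taps:3d} texel samples per pixel")
```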
 

Terratina.

RIP Escapist RP Board
May 24, 2012
2,105
0
0
Somehow auto-detect decided that I needed a letterbox resolution (my monitor is 1920x1080). No thank you. My eyes may be small, but a narrow letterbox resolution is not what I need. It happens almost every time on first-time setup.
 

Phrozenflame500

New member
Dec 26, 2012
1,080
0
0
In general, auto-detect always lands slightly lower than the settings I normally go for.

That may just be because I have a high tolerance for low FPS, though.
 

clippen05

New member
Jul 10, 2012
529
0
0
Usually I set it higher than what the game auto-detects. After playing for a bit, I make adjustments as necessary to make sure the framerate is sufficient.
 

Zipa

batlh bIHeghjaj.
Dec 19, 2010
1,489
0
0
I usually have the opposite problem: it sets everything to low despite my system being only a year old.
 

GundamSentinel

The leading man, who else?
Aug 23, 2009
4,448
0
0
It's always pessimistic for me. Seems to forget I have a second GPU (fair is fair, my second GPU sometimes forgets it's there as well :/). Ah well, I never play with recommended settings anyway.
 

spartandude

New member
Nov 24, 2009
2,721
0
0
It usually gets it right with new games; older games, however, tend to get it wrong.

For example, I'm maxing out Skyrim and The Witcher 2 (except for ubersampling), to name a few, but Oblivion auto-detected me to the lowest settings.
 
Aug 1, 2010
2,768
0
0
I find auto-detect to be one of the most crap features ever created.

On my old gaming laptop, which chugged while chopping down a Minecraft tree, it would always go crazy optimistic. FPS games were set to ultra; I would start the game and crash within seconds.

On my new desktop computer, which essentially resembles the monolith from 2001 and can run damn near anything at any setting, it goes stupidly pessimistic. I load up Thomas Was Alone, a Unity game that consists of colored squares and rectangles, and it goes to the lowest possible settings.

So yeah.

Fuck auto-detect.
 

Owyn_Merrilin

New member
May 22, 2010
7,370
0
0
Doom972 said:
Auto-detection tends to get things wrong. It's usually recommended to let the game choose a preset for you by auto-detection and then change specific settings until you are satisfied. The first thing to change is resolution, since games tend to choose either the screen's native resolution (which is usually too high) or the game's minimum resolution. I get a much smoother framerate by using 1600x900 instead of 1920x1080 (my native screen resolution). Afterwards it's time to mess with Anti-Aliasing, Anisotropic Filtering, Dynamic Shadows, etc.
Pretty much this. Once you learn which features tend to blow up your computer and which ones don't affect it all that much, you get a better idea of what to do. This is one way to do that.
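
One way to "learn which features tend to blow up your computer" is to start from the auto-detected preset and toggle one setting at a time, noting the FPS change. A hypothetical sketch of that loop in Python; measure_fps and its cost table are made-up stand-ins, since in practice you'd read the numbers off an FPS overlay (Steam's, MSI Afterburner's, etc.) while the game runs:

```python
import random  # stand-in noise source for a real measurement

# Hypothetical helper: the cost numbers here are invented for
# illustration. In reality you'd read FPS off an overlay in-game.
def measure_fps(settings):
    base = 60.0
    cost = {"shadows": 18, "anti_aliasing": 15, "anisotropic_filtering": 2}
    fps = base - sum(cost[k] for k, v in settings.items() if v)
    return fps + random.uniform(-1, 1)  # noise, like a real benchmark

preset = {"shadows": True, "anti_aliasing": True, "anisotropic_filtering": True}

baseline = measure_fps(preset)
print(f"auto-detected preset: {baseline:.0f} FPS")

# Toggle one setting at a time to see what each one actually costs.
for name in preset:
    trial = dict(preset)
    trial[name] = False
    gain = measure_fps(trial) - baseline
    print(f"disabling {name:>22}: {gain:+.0f} FPS")
```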
 

Owyn_Merrilin

New member
May 22, 2010
7,370
0
0
DoPo said:
chukrum47 said:
I have never once played a game with the default/recommended settings. I've always found that by turning off both Anti-Aliasing and Anisotropic Filtering I get MASSIVE performance increases and can afford to increase the other settings instead.
Indeed, AA and AF both carry a performance hit (AA far more than AF), though you should at least run bilinear or trilinear texture filtering (weirdly, I've seen some games that don't offer trilinear); that gives you a better picture for pretty much no performance hit. The anisotropic options above that will probably drop your FPS.

I also tend to turn off shadows if possible. They tend to make things run much more slowly, and older games often had really glitchy shadows that would be projected on the wrong side of a wall, or jerk around the place. That's not really the case any more, but the point is, I'm used to not having dynamic shadows.
You know, I don't think I've ever seen a game where cranking AF did all that much to the framerate. Even 2X AA is a pretty big hit, but I run just about every game at 16X AF on wildly underpowered hardware, and it doesn't seem to do anything but make my textures look prettier.

Shadows, lighting, and shaders can all be huge hits to your framerate, though, with some engines being worse about it than others.
 

Weaver

Overcaffeinated
Apr 28, 2008
8,977
0
0
crono738 said:
I seem to have the opposite problem from time to time. Auto-detect sets everything to laughably low settings when the game runs perfectly on my rig at max settings.
Basically this for me. It's usually WAY off; I can significantly boost the settings from recommended and still run at 120 FPS at 1080p.
 

DarkhoIlow

New member
Dec 31, 2009
2,531
0
0
I never use auto-detect, because I know it's all a bunch of bullshit. I always fiddle with the settings to suit my own preferences.

For example, I couldn't give a toss about shadows in games; most of the time they're the first thing I turn off in the video settings to gain more frames per second. I know shadows help a lot with the atmosphere in certain games, but in all the games I've played in my lifetime I've never found a need for them.

So yeah, no auto-detect for me. I tend to spend a lot of time tweaking my settings to get the best visuals and performance out of the games I play. I like to set texture quality as high as possible and lower the other settings to balance out the FPS drop.
 

lacktheknack

Je suis joined jewels.
Jan 19, 2009
19,316
0
0
It's WAY too optimistic on my laptop, and WAY too low-ball on my desktop.

Dangit, Bethesda! I can Ultra Skyrim on my desktop and medium it on my laptop, so why do you always set everything to "high"?