Poll: What's more important, higher resolution or anti-aliasing?


Drathnoxis

I love the smell of card games in the morning
Legacy
Sep 23, 2010
6,023
2,235
118
Just off-screen
Country
Canada
Gender
Male
Aliasing is annoying. Really annoying. It's probably the most irritating thing about graphics for me. The weird thing is that there seem to be more games out now with horrible aliasing than there were 10 years ago. It must be because everybody is pushing for higher and higher resolutions and making tradeoffs, right? (Or maybe it's just that I just got a New 3DS. Aaaagh, why is the aliasing so baaaaad???)

I mean, look.
Here's Wind Waker:

And here's Skyward Sword, 9 years later:

Why?! Why is everything so jagged?
 

SmallHatLogan

New member
Jan 23, 2014
613
0
0
Well, I find aliasing is less noticeable at higher resolutions. I'm playing Xenoblade Chronicles on the 3DS at the moment and it looks like hot garbage, but I play plenty of PC games with AA off (because I don't have a gaming PC and have to do what I can to get a decent framerate) and I don't really notice it. So my vote goes to resolution.

Edit: I should clarify that I have a laptop and always stick with the native resolution so it's not really a decision I have to make. If I had a gaming PC with more options I might take a resolution hit for some better AA.
 

The Lunatic

Princess
Jun 3, 2010
2,291
0
0
I generally go for resolution, but I play on PC, so I don't often have to choose between the two.
 

Bad Jim

New member
Nov 1, 2010
1,763
0
0
Depends on the level of AA. 4x looks nice and you don't have to lower your resolution much to get it. Going up to 32x isn't really worth it, though.
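
A rough way to see those diminishing returns (a sketch, assuming a simple supersampling model where n samples per pixel give n + 1 distinguishable coverage shades along an edge; the exact numbers depend on the AA technique):

[code]
# Sketch of why AA levels hit diminishing returns. Assumes a simple
# supersampling model: n samples per pixel -> n + 1 coverage shades.
for samples in (2, 4, 8, 16, 32):
    levels = samples + 1
    step = 1 / samples  # brightness step between adjacent edge shades
    print(f"{samples}x: {levels} edge shades, step {step:.3f}, "
          f"~{samples}x the sample cost")

# 2x -> 4x halves the step size for double the cost; 16x -> 32x only
# shrinks the step from 0.0625 to 0.03125 while the cost keeps scaling.
[/code]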
 

CaitSeith

Formely Gone Gonzo
Legacy
Jun 30, 2014
5,374
381
88
Well, at lower resolutions, anti-aliasing becomes more important. Anyone who played the Wii on both SD and HD TVs can tell you the frustration of having their game experience worsened on the latter by the lack of anti-aliasing.
 

Chimpzy_v1legacy

Warning! Contains bananas!
Jun 21, 2009
4,789
1
0
I'm fortunate enough (after having spent a little over a year saving my hard-earned money for it) to have a PC powerful enough that I rarely have to choose between the two.

But that wasn't always the case, and back then I generally preferred (and still do) to sacrifice AA rather than resolution. I'll happily sacrifice both in favor of a good framerate if needed.
 

Poetic Nova

Pulvis Et Umbra Sumus
Jan 24, 2012
1,974
0
0
Favoring resolution over anti-aliasing here. It's a pet peeve of mine when a game doesn't run in native res.
 

JohnnyDelRay

New member
Jul 29, 2010
1,322
0
0
I've gamed on all kinds of setups, with graphics from the very bottom to the very top. My personal preference has always been resolution, and even more so as I see the kind of AA we get in bigger titles nowadays. It's just nasty for some reason. But I've never felt the need to play at anything higher than 1080p, unless it's on a massive living room TV from not too far away.

Despite all that, FPS is still king, and I'd rather have everything look like a labyrinth staircase running up and down people's arms than have a laggy game. From more recent memory, I've played sessions of Killing Floor on a laptop at 800x600; it wasn't pretty, but at least it wasn't a slideshow. Of course, back in the days of Half-Life 1 I played at some pretty crazy resolutions on my 14" CRT.
 

TotalerKrieger

New member
Nov 12, 2011
376
0
0
As you crank up the resolution, you don't need as much anti-aliasing. A greater number of smaller pixels makes edges appear far smoother, and at a high enough resolution it becomes impossible to perceive the edges of individual pixels. At 1080p it can take 4x to 8x MSAA to eliminate jagged edges, which is extremely taxing on any PC or console. At 4K, cheap and easy FXAA is supposedly enough to smooth whatever jaggies remain ...you just need a PC or console powerful enough to render at 4K. Having said all that, screen size also affects how visible jaggies are, since a massive screen means larger pixels or more space between them.
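
A back-of-the-envelope sketch of that point (the 27" screen size, 30" viewing distance, and the ~60 pixels-per-degree acuity rule of thumb are assumptions for illustration, not measurements):

[code]
import math

def pixels_per_degree(horizontal_px, screen_width_in, view_dist_in):
    """How many pixels span one degree of visual angle."""
    px_per_inch = horizontal_px / screen_width_in
    # Width of one degree of visual angle at this distance, in inches.
    inches_per_degree = 2 * view_dist_in * math.tan(math.radians(0.5))
    return px_per_inch * inches_per_degree

# A 27" 16:9 monitor is roughly 23.5" wide; assume a 30" viewing distance.
for name, width_px in [("1080p", 1920), ("4K", 3840)]:
    ppd = pixels_per_degree(width_px, 23.5, 30)
    print(f"{name}: {ppd:.0f} pixels per degree")

# Prints roughly 43 for 1080p and 86 for 4K. With ~20/20 vision resolving
# about 60 px/degree, 4K at this distance passes the point where
# individual stair-steps stop being visible, so light FXAA is enough.
[/code]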

Since 4K TVs are becoming more common, I'm thinking next-gen consoles will be 4K capable. For the games and hardware available today, I would choose a reasonable level of AA over resolutions higher than 1080p. I really love the new DSR and VSR options that Nvidia and AMD have recently developed; they somehow seem to be less taxing than even 4x MSAA.
 

wizzy555

New member
Oct 14, 2010
637
0
0
I think that Wind Waker pic has been smoothed over for promotional use.

People are complaining about AA on handhelds instead of being amazed it actually exists... I'm so old.
 

Drathnoxis

I love the smell of card games in the morning
Legacy
Sep 23, 2010
6,023
2,235
118
Just off-screen
Country
Canada
Gender
Male
wizzy555 said:
I think that Wind Waker pic has been smoothed over for promotional use.
I don't know, maybe it has, but right now I'm playing Luigi's Mansion on the GameCube and it has far less aliasing than Skyward Sword.
 

veloper

New member
Jan 20, 2009
4,597
0
0
Make the pixels small enough and aliasing won't even be noticeable.

AA is just a workaround that exists because pixels are still too big on many screens. Its use is limited: no amount of AA, or even SSAA, is going to make tiny text readable at low resolutions.
 

DoPo

"You're not cleared for that."
Jan 30, 2012
8,665
0
0
veloper said:
Make the pixels small enough and aliasing won't even be noticeable.
And that would be the correct answer. I'm not really sure why the question is posed as either/or. Not saying that it couldn't be, but this one is just odd. Anti-aliasing is fundamentally "correcting" an image that doesn't have a high enough resolution.

With the problem being jagginess in the image, either increasing the resolution or applying AA will smooth it out. Realistically, of course, it's a mix of both. Resolution is generally the better way of handling it, but hardware limitations often get in the way, hence the resolution isn't high enough. This is where AA comes in: it "covers up" and provides a sort of pseudo higher resolution by filling in the missing bits.

So, if you could literally have only one or the other, then the answer would be, or even should be, "higher resolution", since that would make AA obsolete. At least in theory. At the same time, though, that's also the cheap[footnote]figuratively[/footnote] way out, as it amounts to "throw more hardware[footnote]also, "money"[/footnote] at the problem".

Is it really either/or? No, not really. It's a balance in the end. There's also the fact that art style can compensate when the resolution is well utilised, but yet again it's a balancing act between these three factors, and other factors in the system as well.
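
A minimal sketch of that "pseudo higher resolution" idea, done as plain supersampling (the toy diagonal-edge renderer and the 4x factor are illustrative assumptions):

[code]
import numpy as np

def render_edge(size):
    """Toy 'renderer': a hard diagonal edge, 1.0 on one side, 0.0 on the other."""
    y, x = np.mgrid[0:size, 0:size]
    return (x > y * 0.5).astype(float)

def ssaa(size, factor):
    """Render at factor x the target size, then box-filter down."""
    hi = render_edge(size * factor)
    # Average each factor-by-factor block into one output pixel.
    return hi.reshape(size, factor, size, factor).mean(axis=(1, 3))

aliased = render_edge(8)      # hard 0/1 stair-steps along the edge
smoothed = ssaa(8, 4)         # 4x4 = 16 samples per output pixel
print(np.round(smoothed, 2))  # edge pixels now hold fractional coverage
[/code]

The high-resolution render is what fills in the "missing bits"; the downsample just averages them back into the coarse grid, which is exactly the trade described above.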
 

PacDwell

New member
May 16, 2009
32
0
0
My monitor is a 16:10 (1920x1200) screen. Whilst playing The Witcher 2 I knocked the resolution down to 1680x1050 (still 16:10) and added some FXAA.

It ran much better than at 1920x1200 without AA, and looked just as good.
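
The arithmetic behind that tradeoff (illustrative numbers only, taken from the resolutions above):

[code]
# Pixels shaded per frame at each resolution.
native  = 1920 * 1200   # 2,304,000
reduced = 1680 * 1050   # 1,764,000
print(f"{1 - reduced / native:.0%} fewer pixels per frame")  # ~23%

# FXAA is a single cheap post-process pass over the finished image,
# so it costs far less than the shading work those pixels would need.
[/code]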
 

VarietyGamer

New member
May 25, 2016
32
0
0
Higher resolution is always preferable; anti-aliasing is what we rely on when higher resolution isn't possible for whatever reason. A clear, crisp image beats a soft, mushy one every time.

I game on a 3440x1440 ultra-wide, so I always take the fps hit and go native rather than relying on 2560x1200 with AA. The image is just crisper and more pleasant on the eyes. Once 8K displays become commonplace (heck, 4K is enough) and GPUs can run everything at native res, AA will go the way of the dodo. It won't be required.
 

EHKOS

Madness to my Methods
Feb 28, 2010
4,815
0
0
I've been going back to some Unreal games on the PS3 while waiting for Burning Blood... DEATH TO ALIASING!!!