Can we all agree that Temporal AA is the greatest graphics innovation in the last 5 years?
>perfect image quality
>zero jaggies
>much cheaper than MSAA
Would it be going too far to say MSAA is obsolete now?
>>perfect image quality
I don't know what temporal AA is but that looks blurry as fuck. Do not want.
AA off looks better; a little bit of jaggedness is worth the clarity... just going off this one image
In Fallout 4, which your example is from, temporal AA introduces a ridiculous amount of blur while in motion. It is much cheaper than MSAA, but if you only care about still images then SMAA is better: it also eliminates jaggies when still and barely blurs the screen at all.
it is okay for games like r6 siege, but blur is really fucking annoying
>Ghosting/Smudge
>Blurry
Thanks but fuck that shit.
The jaggy surfaces are the only thing that reminds you that you're still playing a video game and not real life.
Adding to this, SMAA is even cheaper than temporal.
>Just blur my shit up
Might as well just turn on FXAA if you want a blurred mess
This is idiotic
You can take a N64 game, render it at 4K and apply 8xMSAA and it sure as shit is still going to look like a game.
There is no such thing as good AA
>little bit of jaggedness is worth the clarity..
You're exaggerating. The loss of clarity is not a big deal.
Sometimes it even makes the graphics more photorealistic. Look at pic related: A real film camera would not capture as much detail as the hyper-detailed shot on the right.
No, It's blurry as fuck
We have regressed to N64 levels of AA because consoles lack bandwidth
MFAA is the best AA by a mile
It's fucking shit and blurry. TAA works by predicting the next frame, so it only works when you move the camera and does nothing when you are standing still, leading to a clear, somewhat aliased image when you are standing still and a blurry, vaseline-smeared antialiased image when you are moving. It's very jarring.
Define good.
...
When you're sitting in front of your TV at a normal distance, you won't notice a bit of blur. You will notice crawling and jaggies, though.
Only autists care about pixel-perfect texture quality.
That image is a terrible comparison
It's one flat texture
Please tell me this is sarcasm
I was joking
These shitty "AA" copouts used in recent games that just make shit blurry are awful, bring back SSAA and MSAA
>in front of your TV at normal distance
Part of the problem. Stop catering to weak console hardware
>TV
nigger what?
I'm a brainlet, please explain how MSAA, SMAA, FXAA and all these things work
wtf i hate aa now
>armchair rendering engineers think they understand TXAA
If it's so bad, why is it the default AA method in Unreal and Unity? Why have Capcom, Square Enix and Bethesda all shipped AAA games with it?
And Temporal AA.
Temporal AA is the fucking worst. Even FXAA is better. T-AA kills all jaggies sure but also makes the game a blurry shitty mess.
>TAA
>ghosting
>blurry
yeah haha no
>zero jaggies
Can you even see the picture you posted?
Why do developers bother using anything but SMAA?
It looks as good as MSAA and runs without a problem
Because console
The tech illiteracy of the majority of Sup Forums users means I rarely doubt posts like yours. Guess the joke's on me.
Is it true that AA is less of a problem with 4k?
Yeah I've been wondering the same. It's barely more resource intensive than FXAA and creates image quality comparable to MSAA.
Because it is extremely cheap like OP says and barely impacts the framerate. But that doesn't stop it from looking like shit.
Because consoles can't handle traditional AA and MSAA is difficult to implement in modern engines that use deferred rendering (though not impossible).
>Not just rendering the game at 4 times native resolution and supersampling
Even a 1080ti isn't capable of doing this in modern games, unless you're using a 480p monitor
Personally I'm reminded by the fact I'm staring at a monitor, and controlling my character's actions by clicking buttons.
I think no AA looks perfectly fine. I hate all this blurry shit they keep adding into games, such as motion blur, bloom, depth of field, etc. All of that can go in the trash.
A form of AA where the removal of jaggies is pretty much perfect is fine too, but none of that inbetween shit.
>MSAA
Find pixels covered by more than one triangle and take extra samples for them, effectively rendering the edges at a higher res to remove jaggies. Modern lighting techniques (deferred rendering) have made it hard to use in current engines.
>SMAA, FXAA
Try to detect jaggies in the finished image and just blur them. Both work the same basic way, but SMAA is better at not smearing the whole screen.
>Temporal AA
Blend multiple frames together. Looks great when the camera is perfectly still, but blurs everything when it's moving.
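The temporal part really is just a running blend of frames. A minimal numpy sketch of the idea (no motion-vector reprojection or camera jitter, which real TAA needs; names here are made up):
```python
import numpy as np

def taa_accumulate(history, current, alpha=0.1):
    # Exponential blend of the new frame into the accumulated history.
    # Real TAA also reprojects `history` using motion vectors and jitters
    # the camera by sub-pixel amounts each frame; this sketch skips both.
    return (1.0 - alpha) * history + alpha * current

# toy usage: accumulate a few noisy renders of the same still scene
frames = [np.random.rand(4, 4) for _ in range(8)]
history = frames[0]
for frame in frames[1:]:
    history = taa_accumulate(history, frame)
```
With the camera still, the blend converges to a clean average; once things move, stale history bleeds into the new frame, which is the blur everyone here is complaining about.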
Motion blur is the worst thing ever, it literally makes me sick.
t. Brainlet
4 times 480p is only a bit over half the pixels in a 1080p image. Do you think a 1080ti struggles running 1080p?
I liked it on bloodborne (though the game ran at 30fps).
What do you all have against Jaggies?
Who the fuck uses anything but supersampling in current year? Take your blurry shit and fuck off
I dunno, I remember something called TXAA from Black Ops II on PC and that was probably the best AA I've ever seen
Motion blur is one of the best post-processing effects if used sparingly and not to hide low framerate. IIRC Source engine games used it in a lowkey manner
>slap blurry foggy filter ontop of everything
>HAHA! ISNT THAT SO MUCH BETTER! I CANT SEE JAGGIES!
Not too well versed on supersampling and anti-aliasing but do you need MSAA x2/x4/etc...on if you are playing at 1440p or 4k?
>Can we all agree that Temporal AA is the greatest graphics innovation in the last 5 years?
Maybe if you only want to take bullshots.
In-game it's by far the one that looks the worst; it shits itself all over the place when something is in motion, creating something even jaggier than motion blur.
4K is basically the equivalent of MSAA x2 all the time at native. If you're supersampling from 4K down to 1080p then no
Alright thanks for clarifying.
but you're wrong. It's better than MSAA x2. Supersampling is better than all MSAA amounts.
Native > SuperSampling > MultiSample
Well, it's not that simple really. Jaggies have more to do with PPI than the actual resolution. If you have a massive monitor and sit close to it at 4K resolution, you will see jaggies.
>pour soap on the image
>call it "perfect graphics"
once gpus and displays are able to render 4k at 140 or more hertz to deal with blurring and ghosting, it's gonna be the best thing ever
txaa is the only thing so far that can handle the shit on the left side of pic related without shitting itself
yeah dude looks great
i too like putting virtual vaseline all over my display
TAA looks amazing if you super sample by 150-200%.
Otherwise it blurs the image too much for my liking and introduces lots of ghosting in areas where there are many moving parts. (i.e. windy forests)
jesus who the fuck cares
Resolution alone does not matter for aliasing; what's important is pixel density. The higher the density, the less noticeable aliasing is. I have a 27" 1440p monitor which has a density of 108 pixels per inch, and aliasing and crawling are still noticeable without AA. My 49" 4K TV has only 89 pixels per inch, so the aliasing and crawling are even worse without AA.
To get by completely without AA you'd need something like 4K on a 15" screen
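Those density figures are easy to check yourself; quick sketch, using the sizes and resolutions from the post above:
```python
import math

def ppi(width_px, height_px, diag_inches):
    # pixels per inch = diagonal resolution / diagonal size
    return math.hypot(width_px, height_px) / diag_inches

print(ppi(2560, 1440, 27))  # ~108.8 for the 27" 1440p monitor
print(ppi(3840, 2160, 49))  # ~89.9 for the 49" 4K TV
```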
Doesn't Nvidia have its own form of multisampling? TXAA or something?
Dunno mate, i guess some of us just can't handle it. i never get sick, not from driving or sailing, but games with motion blur give me headaches and nausea.
>Can't put in actual anti-aliasing
>Use post-process blur filter
>Can't put in actual shadows
>Use excessive ambient occlusion
How very insightful. Go home everyone; you can't discuss topics this autist doesn't like
That shit is blurry.
SSAA
Supersampling looks perfect if the resolution is high enough. It doesn't blur anything and completely removes the jaggies.
Sadly, it's a kind of brute force, and isn't worth trying on new games unless you have a 3-way SLI gtx1080 xp.
They could improve performance with eye tracking, supersample the part where you are looking, and downsample everything else. It would probably even increase performance, and it would look much better (for one person).
They might also be able to edge detect and only supersample those parts.
As far as I know, the gtx1000 series are able to render different parts at different resolutions.
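For what it's worth, the eye-tracking idea is easy to mock up in software. A toy sketch of the concept (grayscale only, all names and parameters made up, nothing to do with how the gtx1000 multi-resolution feature actually works):
```python
import numpy as np

def foveate(image, gaze_xy, radius, block=4):
    # Average the whole frame down in block x block tiles (a stand-in for
    # rendering the periphery at lower resolution), then paste the original
    # full-resolution pixels back inside a circle around the gaze point.
    h, w = image.shape
    coarse = image[:h - h % block, :w - w % block]
    coarse = coarse.reshape(h // block, block, w // block, block).mean(axis=(1, 3))
    coarse = np.repeat(np.repeat(coarse, block, axis=0), block, axis=1)
    out = image.copy()
    out[:coarse.shape[0], :coarse.shape[1]] = coarse
    ys, xs = np.mgrid[0:h, 0:w]
    fovea = (xs - gaze_xy[0]) ** 2 + (ys - gaze_xy[1]) ** 2 <= radius ** 2
    out[fovea] = image[fovea]
    return out

frame = np.random.rand(64, 64)                      # pretend this is a rendered frame
result = foveate(frame, gaze_xy=(32, 32), radius=12)
```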
There's only one good type of AA and that's higher resolution. I run my games at 1440p or higher on my 1080p display, way better than shitty AA.
Temporal filtering is great but the image gets a bit grainy, depending on the person it may not be optimal at all. Personally, I don't really care since the extra 50 fps if compared to MSAA are really fucking great.
>how do you fix clipping and jaggies?
>just fucking blur everything lol
TXAA is temporal AA, niggus. Nvidia has MFAA, which only works if MSAA is enabled. Supposedly it's better
TAA is a cheaper process because it doesn't work while in motion; it relies solely on focusing the screen again when the camera stops moving. This is a painfully lazy way to get performance, but since you won't notice 99% of the time, developers have been sucking dick about it.
>zero jaggies
then what the fuck is all that shit with TAA on.
The performance advantage over MSAA or SMAA is also only something like 5%. Even shit like FXAA looks better than TAA. Being the lightest doesn't mean shit if it looks like crap both in still shots (a blurry mess that still has jaggies) and in motion (where the AA is effectively null and the picture still looks like shit).
Congratulations, OP. Truly you're either retarded or poor enough that your first experience with even moderate anti-aliasing happens to be temporal anti-aliasing, as you seem to never have been able to run multisampling or supersampling or even FXAA.
This
All the bullshit techniques nvidia or whoever keep coming up with are fucking blur
Downsample or nothing
It hugely depends on how far you're looking at the screen from.
PPD (pixels per degree) is the best measurement for that.
At 150 PPD you most likely couldn't notice any aliasing, but you can test it yourself.
Put up a jagged image, then increase your distance until the aliasing is gone, measure that distance, and put your numbers into this calculator:
phrogz.net
Select PPD, and you will see how much you need for maximum clarity.
Then you can calculate resolutions at different screen sizes and stuff.
For example, for 150 PPD on a 27" display looking from 80cm, you'd need at least 6145*3457 resolution.
This is also the maximum resolution that makes sense on a 27" desktop display, as nobody has better eyesight than that. Past that point you literally cannot detect a single white pixel on a black screen.
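If you don't want to trust the calculator, the 6145*3457 figure falls out of basic trig. A rough sketch, assuming a flat screen and 16:9 aspect:
```python
import math

def required_resolution(diag_inches, distance_cm, target_ppd, aspect=(16, 9)):
    # Horizontal pixels = target PPD * horizontal angle the screen subtends;
    # vertical pixels just follow from the aspect ratio.
    aw, ah = aspect
    width_in = diag_inches * aw / math.hypot(aw, ah)
    distance_in = distance_cm / 2.54
    h_angle = 2 * math.degrees(math.atan((width_in / 2) / distance_in))
    h_px = target_ppd * h_angle
    return round(h_px), round(h_px * ah / aw)

print(required_resolution(27, 80, 150))  # ~ (6145, 3457), give or take a pixel of rounding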
Because it's a cheap way of making still shots look like they have a moderate amount of AA while completely nullifying the effect in motion for performance gains, which makes the entire picture look like shit both still and in motion.
What, are you retarded? Don't you fucking know about profit margins? Cheaper option to extract more money; this ain't fucking rocket science.
Before reading the post I thought this was to prove that AA off is better.
This is how your options should be set.
Textures: Max
Anisotropic Filtering 16x
Object Detail: Min
Object Quantity: Min
Particles: Min
Shadows: On
Shaders: Off
Post Processing: Off
MSAA: 0x
V-sync: Off
Nah, OP is trying to validate his poorfag rig with shitty TAA by saying it's better, when it's the cheapest shit option for consoles and actually looks the most outright shitty.
>Anti-aliasing: on/off
>It's actually TXAA
Would you rather have your eyes scratched by jaggies out the ass or have them covered in vaseline?
>v-sync off with 90% of the most taxing systems off
This isn't the third world anymore. If you don't even use v-sync to prevent tearing when turning off 90% of shit, I doubt you could get up to even the baseline hertz of your monitor for it to matter at this point. Simply, I'm at a loss for words. Do you live in literally fucking Ghana, the dumping ground of the world? Where do you get your electricity? Has anyone actually sought you out to see if your house is still intact? Holy shit, fucking what.
Clearly was, OP just used reverse psychology to get replies
If it's TAA I just turn it off entirely and turn on my gpu SMAA or MSAA, ya know like a reasonable person.
>poorfag pretends running a game at minimum settings is better than running it at max settings
>V-sync: Off
good job making your GPU work harder for something you won't take advantage of if you're gaming at 60hz. I hope you like screen tearing too.
You're retarded
I have pretty good eyesight, and mine came out as 120PPD.
I also used a distance where I definitely couldn't tell pixels apart, and the jaggies were definitely gone.
That means... I will never need a higher resolution TV, as I have an oled 4K 55", and it's already above 120PPD from 2 meters.
This is fucking awesome!
Tearing will always occur with vsync regardless of framerate.
Shader effects and post-processing absolutely ruin the clarity of games. It's just over- and under-saturated garbage.
I have a 144hz gsync display, I cap my games at 142fps to stay in gsync range rather than risking buffering frames.
I fixed my AA problem with 1440p monitor.
with vsync off*
unless you have an adaptive sync display**
>Tearing will always occur with vsync regardless of framerate.
How high are you? How much coke did you snort to see the world like you do?
>Fallout 4
>Aliasing out the arse
>Either use FXAA (a literal blur filter) or TXAA (which wrecks my poor 840m)
see
>game doesn't implement anisotropic-filtering properly
>Distinguishing details at 150ppd would require 20/8 vision. According to [3], the theoretical upper limit of human visual acuity lies somewhere between 20/10 and 20/8 vision.
TFW in your lifetime humanity reached the maximum resolution you can biologically see.
We are living in the future.
Isn't temporal AA basically the same thing as that checkerboard rendering meme consoles use for their fake 4K? You render at a much lower resolution and then upscale it and add a bunch of filtering to hide this.
All I know is that on PC in 1080p it looks blurry as all fuck. It's really bad.
>Enable any AA option
>Get at least 20 FPS drop
Once we get 4k 32inch screens, Anti-aliasing will be obsolete.
the worst are the streamers who immediately disable vsync then start complaining about tearing and blaming the game for their own retardation
>All these years of playing vidya on PC
>Still have no idea what Texture filtering does
Name a game.
you know how significant vsync delay is? you can feel it very easily
you fuckers without 144hz monitors have no idea how pointless adaptive sync is, you can't see tearing at 144hz
it is still technically there, but you can't see it - too fast
all these sync methods add another buffer to controls -> worse gameplay. The point of a high refresh rate is not to run at higher fps, it is to have less delay in controls, and it feels nice