Why don't all games come with SMAA?
>Low cost
>Looks great, isn't blurry dogshit like FXAA that is in everything
Because it's proprietary.
It does look fantastic, but doesn't MSAA look better?
It's just a customized MLAA.
Because it's heavier than FXAA while not smoothing edges as much.
SMAA = sharper image, more aliasing
FXAA = smoother image, less aliasing
Because some of us don't have rich daddies that buy us 10k gaming setups.
Just bloody play Open Arena ye CoD weabo.
You can just inject that shit with ReShade in any game.
SMAA is highly efficient
I fucking reshade. If you configure it correctly, you can make any game look way better.
>blurrier image, less aliasing
Fixed that for you.
SMAA is okay, but MSAA isn't supported much any longer because the way games render has changed. The future is shader-based anti-aliasing. SMAA still works because it's a post-processing-based algorithm. You can always inject SMAA yourself using ReShade, for example, but that alone doesn't fix shitty image quality.
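For anyone wondering what "post-processing-based" actually means: the filter only ever sees the finished frame, so the first thing SMAA/MLAA-style algorithms do is hunt for edges from luminance differences. Here's a toy Python sketch of that first pass; it's not the real SMAA code, and the 0.1 threshold is just a made-up default for illustration:

```python
# Toy sketch of the edge-detection pass a post-process AA filter
# (MLAA/SMAA style) runs on the finished frame. Real SMAA works on the
# GPU with a configurable luma threshold; 0.1 here is illustrative only.

def luma(rgb):
    """Perceptual luminance from an (r, g, b) tuple in [0, 1]."""
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def detect_edges(image, threshold=0.1):
    """Return (x, y) pixels whose left or top neighbour differs in luma."""
    h, w = len(image), len(image[0])
    edges = set()
    for y in range(h):
        for x in range(w):
            l = luma(image[y][x])
            if x > 0 and abs(l - luma(image[y][x - 1])) > threshold:
                edges.add((x, y))
            if y > 0 and abs(l - luma(image[y - 1][x])) > threshold:
                edges.add((x, y))
    return edges

# 2x2 image: black left column, white right column -> vertical edge at x=1.
img = [[(0, 0, 0), (1, 1, 1)],
       [(0, 0, 0), (1, 1, 1)]]
print(sorted(detect_edges(img)))  # [(1, 0), (1, 1)]
```

The later SMAA passes then classify the edge shapes and blend along them, which is why it stays cheap: it never needs extra geometry samples like MSAA does.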
You are literally talking to someone who is currently posting from a 2003 eMac, and accessing the internet via a first-generation MacBook with a burnt-out GPU.
Then you're not running the games proposed to include SMAA anyways, so what does it matter to you?
Because people don't realize that games are not programmed efficiently anymore.
>no adblock
Newer versions of UE4 support MSAA. Problem is that it literally kills performance more than just increasing the resolution on GCN AMD cards.
> AA thread
Alright faggots
What the hell is EQAA and CSAA and how can I use them?
SMAA is the best AA at the moment; MSAA, even with MFAA, is too demanding
FXAA can sometimes be good if devs are careful with it. In Rainbow Six Siege FXAA looks incredible, but in most games it's blurry
Don't even get me started on TAA, it's just a big blur filter with added ghosting
Well good thing nobody's using a gamecube amd card
No it isn't. SMAA was made as a collaboration between academic graphics researchers (Jimenez et al.) and Crytek.
It's hard to get adblock on TenFourFox. Also, I don't really care about adblock on this machine; I don't use it for browsing, mainly GNU Emacs.
kek, basically any AMD card past the 6970 can't into MSAA without getting 0fps
>Our method shows for the first time how to combine morphological antialiasing (MLAA) with additional multi/supersampling strategies (MSAA, SSAA) for accurate subpixel features
CSAA is an Nvidia only meme, EQAA is an AMD meme. They're both MSAA but marketed as revolutionary technology that you can only get by Playing it the way it was meant to be played(tm) or Evolving your gaming(tm).
So it can't actually be forced on every game and the performance drop is just as big as 4x MSAA?
EQAA sucks; CSAA was decent but Nvidia dropped support
Nvidia now uses MFAA, which is incredibly underrated. As long as you keep your FPS above 30, it works wonders, turning 2xMSAA into almost 4x
Can you actually force MFAA? Even though there's an option to turn it on in the control panel, I think games must support it as well.
Unless MFAA applies to MSAA forced through Nvidia Inspector?
As long as you have MFAA turned on in the Nvidia control panel, any DX11 game that has MSAA supports it
Just put on 2xMSAA, it makes it 4xMSAA for the cost of 2x
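The trick behind that "2x for the cost of 2x becomes ~4x" claim is that MFAA alternates the sample positions between frames and blends temporally. A toy 1-D Python sketch of the idea; the sample offsets here are invented for illustration, not Nvidia's actual programmed patterns:

```python
# Toy 1-D sketch of the MFAA idea: alternate the 2x sample positions each
# frame, then blend two frames, so coverage approximates 4 samples.
# Offsets are made up for illustration, not Nvidia's real sample patterns.

def coverage(edge_pos, offsets):
    """Fraction of sample points that fall inside geometry covering [0, edge_pos)."""
    return sum(1 for o in offsets if o < edge_pos) / len(offsets)

edge = 0.6
frame_a = coverage(edge, [0.25, 0.75])              # even frame's 2x pattern
frame_b = coverage(edge, [0.125, 0.625])            # odd frame's shifted 2x pattern
mfaa = (frame_a + frame_b) / 2                      # temporal blend of the two
msaa4 = coverage(edge, [0.125, 0.25, 0.625, 0.75])  # true 4x with same points
print(mfaa, msaa4)  # 0.5 0.5 -> the blend matches 4x coverage here
```

That temporal blend is also why MFAA falls apart at low framerates: when frames stop arriving quickly enough, the alternating patterns stop averaging cleanly and you see flicker instead of extra samples.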
why don't all games come with 2x, 4x, and 8x SMAA
Does that even exist?
There's SMAA 1x, SMAA T2x, SMAA S2x and SMAA 4x
>SMAA
>low cost
Not true.
It renders the image at a higher resolution. With deferred rendering (most modern engines) this is quite possibly the slowest option available.
There's nothing slower than TXAA
thats SSAA
You don't know what you're talking about. SMAA is post processing.
AFAIK SSAA = FSAA = simply rendering the game at a higher resolution.
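Right, and that's all SSAA/FSAA is under the hood: render big, average down. A minimal Python sketch of ordered-grid 2x2 supersampling on a grayscale image:

```python
# Minimal sketch of SSAA/FSAA: render at 2x in each axis, then average
# each 2x2 block down to one output pixel (ordered-grid supersampling).

def downsample_2x(hi):
    """Average 2x2 blocks of a grayscale image rendered at double resolution."""
    h, w = len(hi) // 2, len(hi[0]) // 2
    return [[(hi[2*y][2*x] + hi[2*y][2*x+1] +
              hi[2*y+1][2*x] + hi[2*y+1][2*x+1]) / 4
             for x in range(w)] for y in range(h)]

# Hard diagonal edge at 2x resolution...
hi_res = [[0, 0, 0, 1],
          [0, 0, 1, 1],
          [0, 1, 1, 1],
          [1, 1, 1, 1]]
print(downsample_2x(hi_res))  # [[0.0, 0.75], [0.75, 1.0]] -> softened edge
```

Which is also why it's so expensive: 2x per axis means 4x the pixels shaded, unlike MSAA, which only multiplies the coverage/depth samples.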
Then how come I only lose a few FPS when I inject it in modern games?
>SMAA
Shit you're right. My bad.
>attention whoring with your piece of shit computer
boo hoo, don't come into a thread about graphics options if you can barely run diablo 2 faggot
because 1080p is on its way out, and once you go 1440p and higher you don't need anti-aliasing anymore, especially the higher you go, so it's a waste of money to invest in a technology that gets more and more obsolete by the day
Even in 4K you still need a little bit of post-processing AA to completely get rid of all the jaggies. FXAA looks great because the bigger the resolution, the less blur it causes. SMAA looks even better, obviously.
God tier:
>running your games at a higher resolution than your display
High tier:
>MFAA
Everything else tier:
>everything else
Blurry shit tier
>TAA
Absolute trash tier
>FXAA
>God tier:
>>running your games at a higher resolution than your display
Unless it's 4x as big it blurs the image. Without smoothing you get weird artifacts. That's why there's a smoothing option in the Nvidia control panel. It only looks right if it's 4x with smoothing at 0%.
In defense of TAA it COMPLETELY eliminates jaggies. All of them. Doom is completely jaggy free
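Both sides of the TAA argument fall out of the same mechanism: each frame gets blended into an exponential history buffer, which averages away jaggies over time but also lets stale pixels linger as ghosts. A toy Python sketch; the 0.1 blend weight is a made-up value in the ballpark of what engines typically use:

```python
# Toy sketch of why TAA both kills jaggies and ghosts: every frame is
# blended into an exponential history buffer, so stale pixel values fade
# out over several frames instead of disappearing instantly.
# alpha is a made-up blend weight for illustration.

def taa_accumulate(frames, alpha=0.1):
    """Blend a sequence of 1-pixel-row frames into a running history."""
    history = frames[0]
    for frame in frames[1:]:
        history = [(1 - alpha) * h + alpha * f for h, f in zip(history, frame)]
    return history

# A bright pixel that disappears after frame 0 still lingers in history.
frames = [[1.0], [0.0], [0.0], [0.0]]
result = taa_accumulate(frames)
print(round(result[0], 3))  # 0.729 -> the "ghost" of the vanished pixel
```

Real implementations fight that ghosting with motion-vector reprojection and neighbourhood clamping, which is exactly the part that's easy for devs to get wrong.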
>Once you go 1440p or higher you don't need anti aliasing any more
Do you own a monitor better than 1080p? I have both a 1440p and 4k monitor. Aliasing is abundant.
no I only own an 8k desu and you don't need AA at this resolution m8. Jaggies are for poorfags.
I know for a fact you're not playing vidya on one of (if not the only) affordable 8k monitors on the market.
>Unless it's 4x as big it blurs the image.
Depends what you're scaling with. GeDoSaTo supports lanczos scaling and it works really well with 1440->1080.
I've wanted to try gedosato scaling for a while but does it support every DX11 game? What about DX12, Vulkan and OpenGL?
I'm the attention whore?
You are clearly trolling; that is otherwise known as 'attention-whoring'.
I am just giving a different point of view. Is it a bad thing to think differently? I am just talking about why millennial programmers/gamers don't know fucking shit about shit.
Also, why am I the faggot...
...if YOU are the one that sucks Activision's cock for shitty, inefficient games?
Also, doesn't downscaling usually go better with Mitchell or Catmull-Rom? AFAIK Lanczos is more useful for upscaling, not downscaling.
Also, it is not a piece of shit computer; it runs better than your fucking Chromebook.
Why is it that namefags are bigger faggots than tripfags?
It only supports from DX (8 or 9? not sure) to DX11. Most games work, but you have to fuck with a few settings to get them to start working at all. A good way to begin is to use PCGamingWiki to figure out the game's engine, then copy the settings from another game on that engine that already has a GeDoSaTo profile. The default CS:GO profile causes the game to not start, though, because the HUD intercept gets caught as a cheat, so if you wanna play CS you gotta remove that first.
No idea, I've just found that Lanczos is way less blurry than bilinear and bicubic when running at a res between 1x and 2x native.
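Part of why Lanczos looks sharper: it's a windowed sinc, so near zero it behaves like the ideal reconstruction filter, and it dips negative between lobes, which gives edges that slightly "ringy" sharpened look. A small Python sketch of the kernel itself (a=2 and a=3 are the common lobe counts; which one GeDoSaTo uses internally I'm not sure):

```python
# Sketch of the Lanczos resampling kernel: sinc(x) * sinc(x/a) windowed
# to |x| < a. The negative lobes are what make it look sharper (and
# occasionally ring) compared to bilinear/bicubic.
import math

def lanczos(x, a=3):
    """Lanczos kernel value at offset x with a lobes."""
    if x == 0:
        return 1.0
    if abs(x) >= a:
        return 0.0
    px = math.pi * x
    return a * math.sin(px) * math.sin(px / a) / (px * px)

# Peak at 0, zero beyond the window, negative between lobes.
print(round(lanczos(0.0), 3), round(lanczos(1.5), 3), lanczos(3.0))
```

Bilinear, by contrast, is a plain triangle kernel with no negative lobes, so it can only ever blur; it never sharpens.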
>broken Mac with no GPU
>thinking their view matters for PC gaming
Shoo buddy, if you want to talk about PC gaming stop being a poorfag and get a PC.
>I'm the attention whore?
>eMac
>eMac (Hostile)
>hurddurrrr the good old times
so, this is the power of autism
So how has DX12 been doing? I don't think i've played a game that used it yet
Worse than Vulkan
It's okay, but it's nothing close to what was promised.
Neither AMD nor Nvidia has the full feature spec for DX12 & both their drivers are shit
Volta & Navi should be way better on DX12 in 2018
>but it's nothing close to what was promised.
For developers it's what was promised. The issue is the drivers and how hard it is to properly code for pseudo-low-level APIs like DX12 and Vulkan
If I recall correctly, Battlefield 4 definitely runs better _with a weaker CPU_ in Mantle versus DX11. After all, the whole point of these new low-level APIs was to reduce CPU overhead, not improve raw performance. If the CPU you're using is a 7700K at 5GHz, then it won't make any difference.
What I don't understand is why most games that have DX12 actually run slower than DX11
>wanting post-process antialiasing
disgusting
Doesn't Mantle just crash on any modern AMD card?
>What I don't understand is why most games that have DX12 actually run slower than DX11
I've noticed this too; these next-gen APIs seem kinda lame. Even in emulators like Dolphin, OpenGL is a lot smoother than Vulkan during shader generation.
interesting, if i wanted to test DX12 performance any games you guys recommend?
Vulkan is pretty awesome, DOOM runs amazing.
>if i wanted to test DX12 performance any games you guys recommend?
Off the top of my head I can only remember two: Rise of the Tomb Raider and Battlefield 1
ROTTR runs worse on DX12, dunno about BF1
>interesting, if i wanted to test DX12 performance any games you guys recommend?
There really isn't a lot to choose from.
>dunno about BF1
It does too.
I think Hitman is the only game that actually runs better in DX12
DX12 isn't magic, especially if you're not CPU bottlenecked in DX11. If the game is worse at handling the GPU in DX12 than the driver is in DX11, it could run worse. DX11/OGL drivers probably contain shitloads of optimizations and hardware-specific tricks/hacks/shortcuts in various scenarios or games.
>all the DirectX 12 exclusive games are from Microsoft Studios
kek, reminds me of Halo 2 Vista.
the fuck halo wars 2 came to PC, how is that?
>DX11/OGL drivers probably contain shitloads of optimizations and hardware-specific tricks/hacks/shortcuts in various scenarios or games.
They definitely do. OpenGL in particular is so customized on the Nvidia side that games tested only on Nvidia often have glitches on the AMD side. For example, Rage.
>Not having SLI 1080tis and supersampling 8K
>really do have SLI 1080 Tis
>1440p 144hz master race
I could never go back
>wanting micro stutters
no, thank you
>microstutter when running capped
Better than macro stutters from underpowered hardware.
>wanting input lag
no, thank you
SMAA is still a post-process AA solution which will never look as good as real AA
deferred rendering is actually going out of favour and people are starting to use forward rendering again
No discernible difference with GSync/Freesync and either an in-game FPS limit or RTSS, user-chen.
>wanting large FPS swings in any case