GPUs BTFO. You now only need this cable to do anti-aliasing on your PC instead of wasting GPU power.

>pcper.com/reviews/General-Tech/Marseille-mCable-Gaming-Edition-Remove-Aliasing-HDMI-Cable

Ew, FXAA-quality AA? Pass, I'd rather have no AA.

>$150 for GTX1080 graphics
bargain of the century

The cable is sharper than FXAA.

And let me guess, you can't see faster than 30 FPS?

>$150 is worth a little more anti-aliasing

You've basically described the GPU industry.

FXAA doesn't look that bad to my eye at 4K.

>HDMI
>pricing
>doesn't work as well

The cable doesn't make consoles magically run games at 60 fps.

Does this increase input latency?

That's most likely a gimmick product.

>pcper.com/reviews/General-Tech/Marseille-mCable-Gaming-Edition-Remove-Aliasing-HDMI-Cable

It's significantly better than having no AA, and it doesn't have the same overall blur/smoothing effect that FXAA does. In Hitman it looked close to the in-engine SMAA, which is pretty impressive. If PCPer's assessment is right that the signal processing adds no latency, then this is a pretty big jump forward. Separate ASIC edge smoothing would be a huge boon to the gaming market, and if this tech keeps evolving it could have pretty big implications for the display and GPU industries.
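For anyone wondering what "separate ASIC edge smoothing" would actually look like: the chip only ever sees the finished RGB frame, no depth buffer, no geometry, so it has to be pure image processing. A minimal numpy sketch of that kind of pass (my guess at the approach; the luma weights are standard Rec. 709, the threshold value is made up):

[code]
import numpy as np

def edge_smooth(frame, threshold=0.1):
    """frame: float32 RGB array, shape (H, W, 3), values in [0, 1]."""
    # Edge detection runs on luma (Rec. 709 weights) -- aliasing is
    # most visible as stair-stepped brightness changes.
    luma = frame @ np.array([0.2126, 0.7152, 0.0722], dtype=np.float32)

    # Local contrast: largest difference against the 4 direct neighbours.
    p = np.pad(luma, 1, mode="edge")
    contrast = np.maximum.reduce([
        np.abs(p[1:-1, 1:-1] - p[:-2, 1:-1]),   # up
        np.abs(p[1:-1, 1:-1] - p[2:, 1:-1]),    # down
        np.abs(p[1:-1, 1:-1] - p[1:-1, :-2]),   # left
        np.abs(p[1:-1, 1:-1] - p[1:-1, 2:]),    # right
    ])

    # 3x3 box blur of the whole frame.
    h, w = luma.shape
    pf = np.pad(frame, ((1, 1), (1, 1), (0, 0)), mode="edge")
    blurred = sum(pf[dy:dy + h, dx:dx + w, :]
                  for dy in range(3) for dx in range(3)) / 9.0

    # Blend in the blur only where the contrast marks an edge.
    edge = (contrast > threshold)[..., None]
    return np.where(edge, blurred, frame)
[/code]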

>using anti-aliasing

Some cable boxes have really shitty upscaling; it's usually better to leave it to the TV, assuming the TV isn't a cheap piece of shit.

But if it is, this might be an option.

>not using AA free from any performance impact
If this little ASIC could muster an 8x pass, that'd pretty much solve the issue of jaggies forever. Resolution and pixel density are increasing; we aren't going to need much more to push image fidelity to a level where it's indistinguishable from reality.

I'd be super interested in seeing this decapped and examined by Chipworks. I'd wager it's using commodity-tier 40nm bulk silicon. They could likely hit that kind of performance right now if they had the money to pay for a newer process.

If you connect two together, can you upscale 480p to 4k?

Would this have any use on a retropie system, all them old games getting new crisp graphics, or am I out of my mind?

What if you connected 4 of them, can you see into the future?

It would probably just look like one of those shitty emulator screen filters.

Nah, this shitty product is doing a simple edge-detect blur.
It's pretty obvious in this comparison image:
pcper.com/files/review/2017-09-25/hitman-comparison.png

While the outlines of character models are nicely softened, the edges of the road, the circular paving stones, and the bench around the tree are hardly touched.
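Quick numbers on why those get skipped: edge-detect AA keys off luma contrast, and grey-on-grey simply doesn't clear the threshold. (Colours below are eyeballed from the screenshot; the threshold value is an assumption.)

[code]
import numpy as np

def luma(rgb):
    # Rec. 709 luma weights
    return np.dot(rgb, [0.2126, 0.7152, 0.0722])

threshold = 0.1  # typical FXAA-style edge threshold (assumed)

# dark character silhouette against bright background: ~0.64, smoothed
print(abs(luma([0.10, 0.10, 0.12]) - luma([0.70, 0.75, 0.80])))

# grey paving stone against grey road: ~0.06, below threshold, ignored
print(abs(luma([0.42, 0.42, 0.40]) - luma([0.48, 0.48, 0.46])))
[/code]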

It's pretty much shit. And you have to power it.

It will definitely increase input latency because it is modifying the frames as they pass through it.

>only the highest quality snakeoil

>simple edge detect blur
That's literally how FXAA works. In-engine edge detection can be depth-aware, though; the cable only gets the final image.

You can run at a lower resolution without AA.

>It will definitely increase input latency
>Comparing a standard HDMI cable to the mCable, we saw no additional lag introduced by the signal processing.
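Which is plausible if it's a line-buffered filter rather than one that buffers whole frames (that's an assumption, but it's how most video processors of this kind work):

[code]
# 1080p60: latency from buffering a handful of scanlines
lines_per_frame = 1125                     # total lines incl. blanking
line_time_ms = (1000 / 60) / lines_per_frame
buffer_lines = 8                           # assumed filter window plus margin
print(f"{buffer_lines * line_time_ms:.3f} ms added")  # ~0.119 ms
# Over a hundred times shorter than a 16.7 ms frame -- far below
# anything a controller-to-screen lag test could resolve.
[/code]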

Look at their fucking test though.

It's fucking PONG.

>Sharpening halos
>Loss of detail
>The mCable screenshot doesn't show the HUD elements, probably because it fucks them up

How about no

>m'cable
kek

...

>Buy absolute bottom-tier GPU
>Then buy a $150 cable to reduce jaggies

Should've put that money toward a better GPU for the same effect.

GAMING
EDITION

Don't forget it also increases latency.

>paying this much money so virtual grass looks slightly less shitty

Why are gamers so stupid?