SONY FANS ON SUICIDE WATCH
youtube.com
Uncharted 4: A Thief's End Upscaled to 4K via Xbox One S
Should I buy a GTX 1060 or 1070?
HAHAHAHAHAHAHAHAHA
With 500ms of introduced input lag, who gives a fuck
also jesus christ Xbone owners, take the fucking sticker off the front of the system. These dipshits probably leave the little manufacturing stickers all over laptops as well.
Seriously, Sony fans will still buy the PS4 Pro? The Xbone is better and cheaper.
Show me some PS2 games like Ace Combat 5 upscaled to 4k, then we'll talk.
GTX 1080 if you want 4K at a reasonable frame rate.
There's literally no evidence this isn't just a ps4 with the HDMI in on the xbone
That's not the point
A 1080 can't do 4k at a reasonable framerate.
If you're okay with medium settings then even a 1060 would be okay for 4k.
>People thinking 4k and 1080p upscaled to 4k are the same thing.
top fucking kek
The real question is:
What looks better?
1080p upscaled to 4k on a 50" 4k TV,
or 1080p native on a 50" 1080p TV.
>using a camera to show detail of an upscale
wew this laddie
So.. What happens if I have a PS4 Pro being upgraded by an Xbox One S?
>Upscaled
Who cares
>Who cares
Who wants a PS4 Pro?
do you have a 4k monitor
you can't screenshot anything that's not a game on Xbox One, so he has no choice but to take a picture
I don't get it, what's the difference between this and just using the ps4 on a 4k tv
Simple solution.
Don't fucking do it.
I'd love to see the comparison between the PS4 Pro and standard upscaling.
As someone with a 62" 1080p TV, 1080p looks like trash on it.
I'd never plug my Playstation into my Xbox, the input adds latency and if it's the same on the One S as it was on the regular Xbone, it'll actually drop frames as well.
What's so good about 4K again?
Graphics are fine right now
For some reason normies want 4k when fucking consoles can't even hit 1080p most of the time, let alone 60fps, or often anything higher than 30fps.
No idea.
That's literally what it is in the video
>all this muh upscale
When will you people realize native 4K is not cost effective at all yet?
>Horrendous input lag
Why would you ever do this?
is that your wife's son in the pictures?
>sony/samsung/xbox chips
What's so good about 1080p again?
Graphics are fine right now
- Some retard, 2006
They don't have anything other than buzzwords.
>upscaling to 4k is worth anything
You're fucking retarded.
>only 50 fps with 2x 1080s
>game doesn't even look that good
top tier optimization
>one terribly unoptimized game
>should I get shit, or something mediocre
...really? What fucking answer are you expecting?
Seems to be like HD-DVD and Blu-ray, where people are shitting themselves over "MUH NEW TECH!" that the vast majority won't even have for some time, and just keep using the same stuff.
Same people are rushing out to buy the upgraded Playstation and Xbox One while likely using cheap 720p TVs, and still claiming they're getting 4k, since it's more of a marketing meme than anything at this point, like blast processing.
Man, people were using CRTs and complaining about being unable to read the text in Dead Rising at launch. It's heavy marketing to get people to upgrade to something that doesn't make much difference in the grand scheme of things: there isn't a huge leap from 720p to 1080p, and 4k means jack shit when most devs and films won't utilize it correctly, let alone when consumers lack the hardware to view it properly.
>DX11
basically pedantic autists about resolution and fps memes
youtu.be
>this game already looked good at 60 frames per second
Is this guy legitimately retarded or what?
The average Sony fanboy's eye can't see more than 30FPS.
1080p was a huge jump, granted. I tried playing Dead Rising back then on a shitty hotel TV and the text was unreadable.
But 4k? It has no practical gaming enhancements. Shit is just sharper.
The performance hit on my GTX 980 isn't worth it when ultra graphics and AA already look fine at 1080p.
Downsampling is a thing.
Clarity. Less jaggies. You can use a larger monitor.
He could actually do this with other hardware.
The same way people get around recording/streaming things off hardware that doesn't support it.
Ok but why is he pretending the game is running at 60fps? I don't get it.
That's why you don't use a 980 for 4k.
Fuck, the new Titan is underpowered for 4k.
It's not the resolutions fault that you have a GPU that can't handle it.
That won't work well because Ace Combat's framebuffer resolution is most likely 512x448. Uncharted works well because 1920x1080 gives plenty of samples to upscale from, plus 4K is a clean 2x scale.
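For what it's worth, the clean 2x case really is trivial: every 1080p pixel just becomes a 2x2 block of 4K pixels, no filtering needed. A minimal sketch of the idea (Python/numpy assumed, names made up, not what any console actually runs):

```python
import numpy as np

def upscale_2x(frame: np.ndarray) -> np.ndarray:
    """Nearest-neighbour 2x upscale: duplicate every pixel into a 2x2 block.
    No interpolation is needed because the scale factor is an exact integer."""
    return np.repeat(np.repeat(frame, 2, axis=0), 2, axis=1)

# 1920x1080 in -> 3840x2160 out, which is exactly why 1080p -> 4K is "clean"
frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)
assert upscale_2x(frame_1080p).shape == (2160, 3840, 3)
```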
Higher resolution = smoother image. Graphics were "fine" in 2005 too, but overspending on visual effects while cutting down on everything else is how we do things right now, so publishers are going to push for higher resolutions and better image quality even if it's not necessary.
>Downsampling is a thing.
ITT: Sony should at least announce that all games will receive SSAA, if nothing else.
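SSAA is conceptually just the reverse of the upscale above: render at a higher internal resolution, then average groups of samples down to the display resolution, which is what kills the jaggies. A rough sketch of the clean 2x case (again numpy, hypothetical names, not Sony's actual pipeline):

```python
import numpy as np

def ssaa_downsample_2x(frame: np.ndarray) -> np.ndarray:
    """Box-filter a 2x supersampled frame down to the display resolution:
    every 2x2 block of rendered samples is averaged into one output pixel."""
    h, w, c = frame.shape
    blocks = frame.reshape(h // 2, 2, w // 2, 2, c).astype(np.float32)
    return blocks.mean(axis=(1, 3)).astype(np.uint8)

# render internally at 2160p, display at 1080p -> 4 samples per output pixel
internal = np.zeros((2160, 3840, 3), dtype=np.uint8)
assert ssaa_downsample_2x(internal).shape == (1080, 1920, 3)
```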
>adding even more input delay instead of just using the TV's built in scaler
maybe he has TruMotion on
Apparently his TV doesn't have a built in upscaler.
He clearly says his TV doesn't support that.
He made a simple mistake. Calm your autism.
...
>Needing a scaler when the resolution is just 2x the horizontal and vertical pixels
? so... it doesn't do anything
The text was perfectly readable on 720p displays.
4k is just as big of a jump as 1080p was to anyone who cares about quality. The level of detail on textures being produced nowadays is being completely wasted and blurred at 1080p.
Damn, Gothic 2 has aged really poorly.
no card can
they're just milking it hard
>Let's downsample an 1800p image to 1080p and then upscale it to 4k
I've got a feeling I didn't get your point.
>no card can
You're fucking idiots. Just turn down some useless settings and you'll get amazing 4k performance. I bet you're the kind of people who look at 4k benchmarks with fucking MSAA x8 turned on.
You proved his point. If Witcher 3, a year-old game, can just barely run 4k @ 60fps on a Titan, that sure as fuck means a Titan cannot do 4k gaming. Just as my old HD6950 can't do 4k gaming, even though it could probably run Minecraft at that resolution.
A proper 4k card needs to be able to run actually new games, and even upcoming games, at that resolution.
I have a fucking 1080. I can't run Witcher 3 at 4k 60fps even with AA turned off.
fyi ps4p can render internally at 1440-1800p
Because western devs don't know how to make engaging and challenging gameplay with good controls to save their lives, so every time they rehash one of their latest movies they have to put in something easily discernible, and it just so happens that gwaphix are what get the most attention from your average American idiot.
PS4 Pro upscales to 4k as well.
>PC screenshot of Witcher 3 being labeled as console
kek
>hair works off
>PC gaming no compromises
Hahaha nice.
That's what I've heard. Also why I thought it was a weird post.
Why would a PS4P downsample the resolution if it aims for upscaled 4k? Unless he meant that if you use it with a 1080p monitor, it will downsample to 1080p.
>hurrrr a year old game, because graphics advanced so much since then
This is a reasonable framerate. I'm sure it can be OC'd to get to 60+ in most new games.
Put foliage draw distance from Ultra to High. The difference is almost zero and you'll get like 10 fps out of that alone. Put Shadow Details at the absolute lowest, it makes literally zero difference and you'll get 5~7 fps out of that as well. Hairworks is also a massive hog. I have a fucking 970 and I can run it at 4k with a completely stable 30 fps with mods, so you sure as hell can do it with a 1080.
The Titan X CAN run new games and upcoming games at 4k.
I never get this whole argument where "HURR, IF YOUR GPU CAN'T RUN NEW GAMES ON ABSOLUTE MAX AT 4k AT 60 FPS THEN IT'S JUST AS BAD AS MY SHITTY 6 YEAR OLD GPU THAT CAN'T RUN MODERN GAMES AT 60 FPS AT EVEN 1080P" is somehow a defensible position.
And I say that as someone with a 280x.
As someone who only wants 1080 60fps Ultra because they're used to 720 low settings, what card should I get? I was originally considering a 1070, but that seems to be for 4k/1440p.
that's the thing, there isn't a difference.
this is what you call shitposting, and dumb console-war posters will always take the bait.
The PS4 Pro doesn't render at native 3840x2160. It renders at some arbitrary horizontal resolution by 1800p at minimum, which is then upscaled to 3840x2160.
The majority of those cards can easily handle 2560x1440 and should be more than capable of handling an arbitrary-by-1800p resolution at 30fps, given that 99% of console games will run at 30fps at this high a resolution.
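Note that 1800p -> 2160p is a 1.2x factor, so unlike the clean 2x case earlier, pixels can't just be duplicated; the scaler has to actually resample. A hypothetical illustration with Pillow (the console's scaler is fixed-function hardware, this is just to show the arithmetic):

```python
from PIL import Image

# hypothetical internal render target: 3200x1800 ("1800p" at 16:9)
internal = Image.new("RGB", (3200, 1800))

# 2160 / 1800 = 1.2 -- a non-integer factor, so the scaler must
# interpolate (bilinear, lanczos, ...) instead of duplicating pixels
output = internal.resize((3840, 2160), Image.LANCZOS)
assert output.size == (3840, 2160)
```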
>you have a 1080p monitor
you get 1800p SSAA
>you have a 4k hdr tv
you get 2k upscaled via checkerboarding (rough sketch of the idea below)
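Checkerboarding, very roughly: each frame only shades half the pixels in a checkerboard pattern and reconstructs the other half, with real implementations also reprojecting them from the previous frame. A toy sketch of just the fill step, assuming numpy and made-up names (nothing like Sony's actual hardware-assisted version):

```python
import numpy as np

def checkerboard_fill(frame: np.ndarray, parity: int) -> np.ndarray:
    """Toy checkerboard reconstruction: pixels where (y + x) % 2 == parity
    were shaded this frame; the rest get the average of their left/right
    shaded neighbours. Real CBR reprojects the previous frame instead."""
    h, w, c = frame.shape
    out = frame.astype(np.float32)
    ys, xs = np.mgrid[0:h, 0:w]
    missing = (ys + xs) % 2 != parity
    left = np.roll(out, 1, axis=1)    # shaded neighbour to the left (wraps at edges)
    right = np.roll(out, -1, axis=1)  # shaded neighbour to the right
    out[missing] = ((left + right) / 2)[missing]
    return out.astype(np.uint8)
```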
they forgot to upscale his legs
>using the old version of that picture that is missing a glove
come on, user. we all expect more from you.
i love the guy in OP's video, he really appreciates the quality. i love it because i consider him a normie and i didn't know normies cared about graphics that much
DELETE THIS
Just get a 1070 so if you decide in the future to go 1440 you're already good
Even a 980 could run games in 4k30 with console settings. But apparently the PS4 is weaker than that so it won't.
MS could pull it off since the Scorpio has something better than the 980 (6 tflops).
>Even a 980 could run games in 4k30 with console settings
Hell, in some games a 980 could run them at 4k60 on console settings.
>$1200 gpu vs $399 gpu
Some nice clarity man.
>Even a 980 could run games in 4k30 with console settings.
you'll get burned alive suggesting anything less than unstable 40-60fps on pc though
>4k gaming channel
>it's just videos with off screen TV footage of console games
kek
>399 gpu
You mean full gaming machine and controller.
It doesn't matter how much graphics have or haven't advanced. Only that it can hardly manage today's games, or even yesterday's games. How the hell is it going to handle tomorrow's games?
The Titan is a fucking monster of a GPU. But that doesn't change that it is not a proper 4k card. It can run 4k, but anything can run 4k if you diddle the settings. Doesn't make it a 4k card.
>But that doesn't change that it is not a proper 4k card.
It is, even in the graph you posted the absolute lowest framerate was ~39 FPS.
Literally all you'd have to do to average 55-60 FPS in FO4 on that GPU is turn the volumetric fog setting down a notch or 2.
The Titan XP is indeed a 4k-capable card on high or even max settings.
4k movies have been a thing for a while now
>How the hell is it going to handle tomorrow's games?
You should know that graphics stagnate throughout every console generation. They aren't going to get better. In fact, right now they are unoptimized because devs are still brute forcing shit. The card will probably do better with future games.
>xbox owners have to connect to a PS4 to get exclusives
Because pushing the boundaries with the Cell and new tech almost bankrupted them; pandering to technophiles isn't the way.
>everyone needs 4kbd
>ps3 had tons of useless shit like printer and linux support
PS4 Pro RENDERS at 4K. There's a difference.
I could actually be totally wrong, but I thought that's why they were hyping it up so much. Doesn't the PS4 Slim upscale to 4K?
Every time I see that it makes me angry. The first thing I did on mine was remove the fucking sticker.