He makes a youtube video shilling linux

>he makes a youtube video shilling linux
>you can see mass screen-tearing the whole way through

what do you say in your defense, freetards?


Screen tearing happens in W**dows as well. If you say otherwise you are a brainwashed shill

Post video. Thanks.

he didn't turn on vsync

>what is vsync

This is the fault of the driver makers, not the OS

>screen-tearing the whole way through
Mmmm

>what is input lag

something that doesn't matter unless you're in the top 5% and playing competitively

Disable compositing.

Sup Forums here.

When you have v-sync disabled, you will see tearing at low fps, no matter whether it's Windows or GNU+Linux.

Youtube videos are usually re-encoded/uploaded at 30 fps, so of course it's gonna look like that.

V-sync gives input lag, so you should disable it and run the game at high fps so you can't notice tearing. At about 120 fps or more tearing becomes unnoticeable. This is not OS specific.

>Youtube videos are usually re-encoded/uploaded at 30 fps, so of course it's gonna look like that.
Sup Forums, you're a moron.

>Youtube videos are usually re-encoded/uploaded at 30 fps

I watch all kinds of videos, even Hulu, and never get it. On Xfce, too.

maybe he disabled it
i only enable vsync in slow-paced games like turn-based strategy; for FPS games i always disable vsync

Please stay in Sup Forums. In fact, this stupid thread belongs in Sup Forums.

This is so false I dunno whether to laugh or cry

Enlighten us

>install ubuntu
>screen tearing up the ass on desktop
>try to rearrange monitors
>display image is corrupted
sounds about right

he's not wrong, aside from YouTube supporting 60fps video (though there's still a lot of 30fps video on it)

Encoding or transcoding video will drop frames to reach the desired framerate (assuming you're maintaining video length) so it won't induce tearing.

>he's not wrong
But that's where you're wrong.

You fucking faggots can't even rearrange your monitors without fucking it up? Pathetic.

which part of what he said was wrong?

The part about tearing at low fps, the part about uploading to Youtube at 30fps making tearing a thing, the part about 120fps making tearing unnoticeable.

What's at fault here that causes the screen tearing? The monitor and the operating system, the game, or the stream/video?

Granted I had 2 monitors, one on the iGPU and one on an R9 380, but both Windows and macOS handle this just fine on the exact same system.
You'd figure that on one of the most supported Linux distros, simple shit like vsync and monitor arrangement should just work

he didn't say low fps causes tearing, but tearing is more noticeable at low fps, which is true
30fps recordings are low fps; a 30fps recording of a 120fps session will have more noticeable tearing than the 120fps session itself

The monitor refreshes at its own rate: 60, 70, 120 Hz, whatever. The framebuffer updates as often as the graphics card completes a frame.
If the framebuffer is updated at the same time the monitor is pulling its contents, the monitor gets part of the old frame and part of the new frame. The difference between the two frames shows up as the tear.
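
The mechanism described above can be sketched as a toy model (the refresh rate, resolution, and swap times below are made-up example numbers, not anything from the thread):

```python
# Toy model of tearing: a 60 Hz display scans a frame out top-to-bottom
# over ~16.67 ms. If the framebuffer is swapped mid-scanout, every
# scanline below the swap point comes from the new frame -- the visible
# seam between old and new content is the tear.

REFRESH_HZ = 60
SCANOUT_MS = 1000 / REFRESH_HZ  # time to scan one full frame
LINES = 1080                    # vertical resolution (example value)

def tear_line(swap_time_ms):
    """Scanline at which a buffer swap at swap_time_ms (measured from
    the start of some refresh) becomes visible."""
    t = swap_time_ms % SCANOUT_MS       # position within the current refresh
    return int(LINES * t / SCANOUT_MS)

print(tear_line(0.0))   # swap at the start of a refresh (what vsync
                        # arranges): seam at line 0, i.e. no visible tear
print(tear_line(8.33))  # swap mid-scanout: tear roughly mid-screen
```

This is also why vsync hides tearing: it holds the swap until scanout finishes, so the seam always lands at line 0.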

>30fps recordings are low fps; a 30fps recording of a 120fps session will have more noticeable tearing than the 120fps session itself
No. That's pure bullshit. Tearing is a result of progressive scanning. Screen recording is not done progressively (unless you are using a physical camera and god I hope you are not using a physical camera to record your gaming) so it will not introduce or increase the visibility of tearing.

I'm running Windows with a 1080 Ti and I noticed screen tearing in Portal and Portal 2. I don't think Linux or his drivers are at fault there.

If you don't vsync of course it'll tear.

Portal is a high-speed game; if you have a graphics card that can put out hundreds of FPS and your monitor can't keep up, then it's gonna tear noticeably. That's unavoidable.

Or get a GSync monitor.

I just wish we could get Freesync support on Linux

A few days ago Phoronix published the kernel they used to test the drm-next amdgpu DC driver that has been merged, which includes FreeSync etc.

phoronix.net/downloads/drm-next-4.15-dc-2/

kernel flag is:
amdgpu.dc=1

Hahaha oh wow this makes me laugh but also so mad. You pay out for a (relatively) decent graphics card, run a 10 year old game like Portal on it, and still get screen tearing because you don't know how to vsync. What really makes me mad though is thinking that's normal. Jesus Christ I hope you're literally retarded for your sake, because this is inexcusable.
Fuck I am so mad.

Oh that's cool, thanks. Is it hard to change the kernel though? And couldn't it cause errors? I never messed with kernels so idk

I'm curious, couldn't you just cap the FPS to your monitor's refresh rate and it would produce the same results as vsync, or is that not how it works?

just read up on how to enter kernel flags for GRUB at startup (normally, when you select the kernel to boot, press e to edit the entry and add the flag where it belongs) and it should work. If it doesn't, you can just boot the normal kernel you've got installed and it won't change a thing, so it's pretty safe
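
If you want the flag persistent instead of typing it at every boot, a sketch (paths and defaults assume a Debian/Ubuntu-style GRUB setup; file names may differ on your distro):

```shell
# Make the kernel flag permanent instead of entering it at every boot.
# 1. Add the flag to the default kernel command line in /etc/default/grub:
#      GRUB_CMDLINE_LINUX_DEFAULT="quiet splash amdgpu.dc=1"
# 2. Regenerate the GRUB config:
sudo update-grub
# 3. After rebooting, confirm the flag is active:
grep -o 'amdgpu.dc=1' /proc/cmdline
```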

sudo nano /etc/X11/xorg.conf.d/20-radeon.conf

Section "Device"
    Identifier "Radeon"
    Driver "radeon"
    Option "TearFree" "on"
EndSection

Problem solved. Idk about Nvidia, probably a similar fix.

Liar

>sudo nvidia-settings
>Check "Force Full Composition Pipeline" under Advanced in the X Server Display Configuration
Done
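
Same setting from a terminal, if you'd rather skip the GUI (the metamode attribute is the Nvidia driver's; "nvidia-auto-select +0+0" is an example metamode, adjust for your monitor layout):

```shell
# Force the full composition pipeline via a metamode assignment.
nvidia-settings --assign CurrentMetaMode="nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }"
```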

There you have it. I don't understand the Linux bashing. I've had the same amount of problems with Windows, and they're often more difficult to track down. Don't get what the fuss is about.

Pretty much, but your monitor's refresh must be in sync with your framebuffer updates, which is what vsync does.

>144hz monitor
>framerate lock to 120
>tearing still present

I like this bait, it's hardbait.

You mean good, but don't bother.

This actually says it's 25 fps, which supports my point even more strongly.

what vsync does is provide feedback from the GPU about precisely when each frame is scanned out to the display, allowing software to flip (switch to the next complete frame, assuming double+ buffering) exactly /between/ display refreshes
capping just tells the software to render no faster than X fps. Without vsync it doesn't know /when/ the display refreshes, and it also doesn't account for clock drift (when two clocks running at what should be the same frequency drift apart from one another; your display may not, for example, be running at /exactly/ 60Hz)

>What's at fault here that causes the screen tearing?
using xorg/x11 instead of wayland

Does screen tearing actually bother anyone? I see it all the time on every platform and I just thought of it as a graphical glitch.
Only recently have I seen people complaining about it

What is adaptive sync

not a common feature

>what is fast sync or gsync?

It's literally supported by every program when forced through the driver control panel. I find it's kinda shit though, since you still get screen tearing below your monitor's refresh rate.
What I've found works best is capping my framerate at 120fps on my 144Hz monitor, then enabling G-Sync.

It bothers me, yes. It's not so bad in video games when you're focused on the action more than the visuals, but in any kind of video it's obnoxious

>Or get a GSync monitor.
Won't work. I had to use fast sync with Portal because it would literally black screen with gsync enabled. Not sure why but it's the only game I've had do that.

Recognized the video:
youtube.com/watch?v=3esPpe-fclI

OP is full of shit; the video talks about how to use a compositor to reduce screen tearing, and the screenshot is literally from the section demonstrating screen tearing

Could be a capture card in the middle of its draw cycle suddenly being presented with a new frame

>furfag profile pic

Triple buffering gives like 1 frame of input lag at worst
Nothing noticeable, unless you're esports cancer
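
For scale, the "1 frame at worst" figure is just one refresh interval (60 Hz assumed as the example display rate):

```python
# Worst-case extra latency from one queued frame with triple buffering
# on a 60 Hz display: one refresh interval.
REFRESH_HZ = 60
worst_case_extra_ms = 1000 / REFRESH_HZ
print(round(worst_case_extra_ms, 2))  # ~16.67 ms
```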

> Driver makers don't want to support an OS where everything newer than a fucking teletype requires layer after layer of abstraction and hacks to work.
> This is their fault somehow.

works on my machine