Expect that OpenGL will be simpler than WebGL

>expect that OpenGL will be simpler than WebGL
>it's harder

Where the fuck are the vertex and fragment shaders in this boilerplate code which is already able to render things??

Sup Forums is that way

nobody understands your technobabble

It's technobabble mostly related to video games. I didn't know you were everybody though, thanks for the heads-up.

What the fuck are you even talking about you fucking faggot? Your drivel bores me.

HOLY SHIT A TALKING DUCK

>3d
Voodoo shills pls go.

It's the weekend, don't bother asking here.

I literally understood none of that

isn't WebGL just a JavaScript API?
why would you expect it to be easier?

>Expect that Assembly will be simpler than Game Maker
well gee

A vertex shader takes the vertices of a 3D mesh and works out where each one lands on screen from your perspective,
e.g. making things big when they're close up and small when they're far away.

A fragment shader then colours in the pixels covered by the triangles the vertex shader positioned.
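Concretely, a minimal pair looks something like this (a sketch assuming GLSL 3.3; the names aPos and mvp are made up for illustration):

```glsl
// vertex shader: runs once per vertex, outputs its on-screen position
#version 330 core
layout (location = 0) in vec3 aPos;
uniform mat4 mvp;   // model-view-projection matrix
void main() {
    gl_Position = mvp * vec4(aPos, 1.0);  // the perspective part happens here
}
```

```glsl
// fragment shader: runs once per covered pixel, outputs its colour
#version 330 core
out vec4 FragColor;
void main() {
    FragColor = vec4(1.0, 0.5, 0.2, 1.0);  // solid orange
}
```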

I assumed it would be easier because I'd be programming without the clusterfuck that is JavaScript, and the API wouldn't have been weirdly modified to be browser-safe.

Turns out though that implementations of OpenGL can take much stranger forms.

OpenGL is super low-level.

what the fuck man

Fortunately I have boilerplate code that does almost all that crazy stuff for me.

What I can't figure out, though, is what's hardware-accelerated and what isn't in a code sample like this.

Does OpenGL even have distinct vertex and fragment shaders like WebGL, or is it looser?

the memory is distant, but you have to write the shaders in a different file.
anyway there should be a million tutorials everywhere. here's this: opengl-tutorial.org/

are you using the fixed-function pipeline? (glBegin(), glEnd() etc.). you shouldn't do that. use the programmable pipeline.
the vertex and fragment shaders should each be stored as single strings.
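The compile-and-link dance that tutorial boilerplate hides looks roughly like this (a sketch using the standard GL entry points; error handling trimmed, and it assumes a loader like GLEW/glad and a live GL context — this won't run on its own):

```c
/* shader source lives in plain C strings, as said above */
const char *vs_src = "...vertex shader GLSL...";
const char *fs_src = "...fragment shader GLSL...";

GLuint compile(GLenum type, const char *src) {
    GLuint s = glCreateShader(type);
    glShaderSource(s, 1, &src, NULL);  /* hand GL the string */
    glCompileShader(s);                /* compiled at runtime by your driver */
    GLint ok;
    glGetShaderiv(s, GL_COMPILE_STATUS, &ok);
    /* if !ok: glGetShaderInfoLog(s, ...) tells you why */
    return s;
}

GLuint make_program(void) {
    GLuint vs = compile(GL_VERTEX_SHADER, vs_src);
    GLuint fs = compile(GL_FRAGMENT_SHADER, fs_src);
    GLuint p  = glCreateProgram();
    glAttachShader(p, vs);
    glAttachShader(p, fs);
    glLinkProgram(p);
    glDeleteShader(vs);  /* the linked program keeps what it needs */
    glDeleteShader(fs);
    glUseProgram(p);
    return p;
}
```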

learnopengl.com/

really this is not related to videogames. go to Sup Forums or /agdg/

there's no shader there. it's the pure fixed-function pipeline.

start here
opengl.org/wiki/Shader_Compilation

What in the programmable pipeline replaces those two functions?
And I don't see why not to use a programmable pipeline as long as it's still possible to get the shaders working.

And yet it is capable of rendering an animated triangle.

*why to

yes. this is how they did graphics before shaders existed.
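Side by side, the old immediate-mode calls and roughly what replaces them (a sketch; the modern half assumes a GL 3+ context with a VAO bound and a shader program already compiled):

```c
/* old fixed-function / immediate mode: one API call per vertex */
glBegin(GL_TRIANGLES);
glColor3f(1.0f, 0.0f, 0.0f);
glVertex3f(-0.5f, -0.5f, 0.0f);
glVertex3f( 0.5f, -0.5f, 0.0f);
glVertex3f( 0.0f,  0.5f, 0.0f);
glEnd();

/* modern replacement: upload the vertices once, then draw with your shaders */
float verts[] = { -0.5f,-0.5f,0.0f,  0.5f,-0.5f,0.0f,  0.0f,0.5f,0.0f };
GLuint vbo;
glGenBuffers(1, &vbo);
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, (void*)0);
glEnableVertexAttribArray(0);
glDrawArrays(GL_TRIANGLES, 0, 3);  /* vertex + fragment shaders run here */
```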

>wants to make a game engine
>expects things to be easy
I hope tho you're not writing all this trash to make the actual game, do you?

just read the tutorials faggot. you might as well have made this thread on Sup Forums.

Here's something that bugs me as someone who obsesses over efficiency of code.
Most 3D renderers have a depth buffer that is used to determine which fragments are covered and therefore don't need to be drawn.
But if the thing in front has transparent bits, then regardless of the depth buffer the thing behind it still has to be drawn.

If the depth buffer first rejects things, then when something draws and turns out to have a transparent part that exposes something already rejected, what happens?

alpha blend