Why didn't you need a GPU for this game?

I don't remember needing one. Are GPUs just a scam to make us spend more?

You needed a video card for that, you dumb fuck. At the time they called it a VGA card.

Your computer had an integrated one. And games back then were really well engineered because they were made by real engineers instead of today's shitty "designing" art teams.
DOS games didn't really need anything but basic integrated video to function

because it was 2d and optimized by carmack

carmack was a genius

A better question is why we still need CPUs with such good GPUs right now.

Lots of games had a software mode that made the textures look worse, but you didn't need a dedicated video card to run it

You don't really know how computers work, do you?

I do.

I have a masters in computer science.

It's because it uses raytracing, which is simple enough that it can be done on the CPU. For more advanced graphics more processing power is needed. That power can come from the CPU, but since it isn't optimised to do the kind of calculations advanced graphics need, it probably won't be able to keep up with rendering a frame every 1/60th of a second. Since GPUs are specifically made for those kinds of calculations, they can easily keep up with rendering a frame every 1/60th of a second.
> inb4 'replying to bait'

>raytracing
It's actually called raycasting.

afaik Doom wasn't done with raycasting but with binary space partitioning of the game world to resolve visibility

If so, why didn't he make it an actual 3d game like System Shock?

Pics or it didn't happen

Optimization and use of well-known processing shortcuts like the quick inverse square root.
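For reference, the quick inverse square root mentioned above actually comes from Quake III Arena (1999), not Doom, but it is the best-known example of id's numeric shortcuts. A sketch of the trick, using memcpy instead of the original pointer cast so it stays well-defined C:

```c
#include <math.h>
#include <stdint.h>
#include <string.h>

/* Fast approximation of 1/sqrt(x), as popularized by Quake III Arena.
 * memcpy reinterprets the float's bits as an integer without UB. */
float q_rsqrt(float number)
{
    float x2 = number * 0.5f;
    float y  = number;
    uint32_t i;

    memcpy(&i, &y, sizeof i);        /* view the float's bits as an integer */
    i = 0x5f3759df - (i >> 1);       /* the magic constant: a first guess */
    memcpy(&y, &i, sizeof y);
    y = y * (1.5f - (x2 * y * y));   /* one Newton-Raphson refinement step */
    return y;
}
```

One refinement iteration already brings the error under roughly 0.2%, which was plenty for lighting calculations.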

Ok fucking underaged faggots. I will assume that you are either children or just plain braindead (probably both).
A GPU is a graphics processor, what was at one time known as an "accelerator card". It was called that because before 3D accelerators, video cards DID NOT process graphics.
A GPU isn't a "VGA card"; video cards back then only told the monitor what to display but didn't do any processing of the actual graphics whatsoever.
All the calculations to render Doom's engine were done by the CPU; it didn't matter what video card you had.
That doesn't mean the GPU was integrated into the CPU like today, there was NO GPU, integrated or dedicated.
Educate yourselves.

Fuck off faggot

That's Doom 2. Doom 1 used raycasting.

Your mother.

stay butthurt brainlets

You are still a fag though.

...

And next time mention that you are talking about 3D accelerator cards.

I'm not OP idiot.
I just saw your plain retarded wrong posts and sperged the fuck out, doesn't mean I'm not right.

Doesn't matter.

no, it was Doom 1 already

And listen up, faggot. If you look it up on Wikipedia you will see 2D cards in that article.

en.wikipedia.org/wiki/Graphics_processing_unit

Now go suck a fat dick.

The original DOOM did NOT use raycasting, except for hit detection when firing a weapon.
It uses a BSP tree to determine paint order; raycast rendering is stupidly fucking expensive and would never have been used.
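The BSP idea is simpler than it sounds: at each node the renderer recurses first into the side of the splitter the camera is on, so leaves come out nearest-first. A minimal sketch with a hypothetical node layout (not Doom's actual structures):

```c
#include <stddef.h>

/* Hypothetical BSP node: a splitter line (point + direction) dividing
 * space into a front and back subspace; leaves carry a label. */
typedef struct Node {
    double px, py, dx, dy;        /* a point on the splitter and its direction */
    struct Node *front, *back;    /* child subspaces (NULL = none) */
    const char *name;             /* leaf label, for demonstration */
} Node;

/* Which side of the splitter is (x, y) on? Sign of the 2D cross product. */
static int on_front(const Node *n, double x, double y)
{
    return (x - n->px) * n->dy - (y - n->py) * n->dx <= 0.0;
}

/* Collect leaf names nearest-first relative to the camera at (cx, cy):
 * always descend into the camera's side of the splitter before the other. */
void walk(const Node *n, double cx, double cy, const char **out, int *count)
{
    if (!n) return;
    if (!n->front && !n->back) { out[(*count)++] = n->name; return; }
    if (on_front(n, cx, cy)) {
        walk(n->front, cx, cy, out, count);
        walk(n->back,  cx, cy, out, count);
    } else {
        walk(n->back,  cx, cy, out, count);
        walk(n->front, cx, cy, out, count);
    }
}
```

With front-to-back order, the engine can mark screen columns as filled and stop early, which is exactly why this beats casting a ray per slice.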

>brainlet quotes wikipedia because he doesn't know shit
Why don't you quote me your blog as well? I'm sure it is a very reliable source.

You underaged assholes suck. Fuck off

64453724isfuckingstupid.tumblr.com

One way to understand is to look at how a computer does calculations.

There are good videos on YouTube that, for example, show the mechanism computers use to add numbers, demonstrated with dominoes.

Then you understand that the computer is built to control electricity in configurations that allow it to do calculations. Addition requires one configuration (of transistors really, or at a higher abstraction: gates such as NAND gates) while multiplication requires another. (Sidenote: you can take it many steps deeper, like how a transistor works, and you get into physics.)
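The adder mechanism those videos demonstrate can be sketched in a few lines: a one-bit full adder built from XOR/AND/OR operations, chained into a ripple-carry adder:

```c
/* A one-bit full adder expressed purely as logic operations, mirroring
 * the gate configurations described above. */
typedef struct { unsigned sum, carry; } BitAdd;

BitAdd full_adder(unsigned a, unsigned b, unsigned cin)
{
    BitAdd r;
    r.sum   = a ^ b ^ cin;                /* two XOR gates */
    r.carry = (a & b) | (cin & (a ^ b));  /* AND and OR gates */
    return r;
}

/* Chain eight full adders into a ripple-carry adder for 8-bit numbers,
 * exactly how a simple hardware adder is wired. */
unsigned ripple_add8(unsigned a, unsigned b)
{
    unsigned carry = 0, result = 0;
    for (int i = 0; i < 8; i++) {
        BitAdd r = full_adder((a >> i) & 1, (b >> i) & 1, carry);
        result |= r.sum << i;
        carry = r.carry;
    }
    return result & 0xFF;   /* overflow past 8 bits wraps around */
}
```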

Turns out that 3D graphics require a lot of multiplication. Some call it "matrix multiplication" but that's just many multiplications whose results are added together.

So... a GPU is a piece of hardware with a lot of these configurations of gates that can do one thing: multiply and add. And their architecture allows them to do many multiplications at the same time.
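A minimal illustration of that "many multiplications added together" idea: a small matrix-vector multiply, where every output element is just a chain of multiply-adds — the single operation GPUs replicate thousands of times in parallel:

```c
/* Multiply a 2x2 matrix by a 2-vector. Each output element is a
 * sequence of multiply-then-add steps, the fundamental GPU operation. */
void mat2_mul_vec2(double m[2][2], double v[2], double out[2])
{
    for (int row = 0; row < 2; row++) {
        out[row] = 0.0;
        for (int col = 0; col < 2; col++)
            out[row] += m[row][col] * v[col];   /* multiply, then accumulate */
    }
}
```

In 3D rendering the same pattern runs with 4x4 matrices per vertex, for millions of vertices per frame, which is why dedicated parallel hardware pays off.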

I think that one reason the GPU is split from the CPU (and in some cases it isn't, look at APUs) is that you really do not need that amount of multiplication and addition unless you're gaming or doing other work that benefits from it, such as deep learning or encoding video. The CPU can do it too, just not as fast.

GPUs as we know them didn't exist back then

only shitty VGA cards did which didn't accelerate shit

it was raycasting with a BSP tree to slice up the map and determine visibility, instead of a simple grid/array map as used by the earlier "3d" games like wolfenstein

>raycast rendering is stupidly fucking expensive and would never have been used.
ray tracing is expensive, casting isn't (it carves the screen into vertical slices and uses one ray for each slice instead of one per pixel)

>ray tracing is expensive, casting isn't
/thread

That said, did VGA cards from this time offer any gimmick functions besides passing pixels to the monitor?

>Why didn't you need a GPU for this game?
Because it relied on graphics that were simple enough that the CPUs of the day could do them fast enough.

You don't need a GPU for today's games either -- you can render them entirely in software. It's just too slow, given the overwhelming amount of graphical detail in modern games. You need a GPU for that because GPUs are optimized to do certain particular computations very quickly, much faster than a CPU can.

Because GPUs are rubbish for everything else. The things they can do efficiently are quite restricted.

Because a CPU can easily render pixel graphics, but more complex shit such as polygons and complex effects requires dedicated, specifically designed processing, you stupid dumbass.

Acceleration for Windows applications was probably one of the first really big instances in mainstream consumer video controllers. Beyond that, early true “GPUs” as we know them today were mostly used in CAD accelerators, paired with a more typical video controller to handle most of the display functions while the GPU focused only on the applications that supported it.

Ah you're right, I was thinking of Wolfenstein 3D. My bad.

Carmack leveraged some black magicks to optimize the shit out of this engine.

>are GPUs just a scam to make us spend more?
To a certain degree, yes: developers put graphical features into games that only work on certain graphics cards (Nvidia HairWorks, for example). Honestly I don't see the point of buying a high-end graphics card; you can run modern games perfectly well on medium settings at 60 fps with a card under 200 euros.

Also this topic is borderline Sup Forums related.

according to how he talks about it now, some places were pretty hacky

Did those magical VGA cards offer hardware doublebuffering?

>tfw window decorations are more GPU intensive than its content
:^)

>mfw this thread made me want to play Doom
thanks guys

Today we have tons and tons of abstractions because of frameworks and engines. Games have never looked better and were never easier to develop, but they also consume more resources than ever and somehow manage to be shit anyway.

Sort of on-topic: did anyone other than 3dfx make "3D accelerators"? From what I can tell all the other manufacturers were doing combined 2D/3D cards, whereas the first Voodoos kept it 3D only.

Because Doom is basically a top-down shooter with clever rendering. Each map is basically a completely flat plane; you can set textures of surfaces like the floor, walls and ceiling. The level had info about which parts of it should be drawn from where you are, and where to put objects/enemies in between. From that point on they just had to draw slightly distorted tiles on a flat 2D screen, moved up/down based on the level "geometry". So no real 3D, just sprites.
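Those "slightly distorted tiles" boil down to one formula: a wall slice's on-screen height is a projection constant divided by its distance from the viewer. A sketch with a hypothetical scale factor:

```c
/* Perspective projection for a single vertical wall slice: on-screen
 * height falls off as 1/distance. projection_scale is a hypothetical
 * constant derived from the screen size and field of view. */
double wall_column_height(double wall_height, double distance,
                          double projection_scale)
{
    return projection_scale * wall_height / distance;  /* halves as distance doubles */
}
```

One divide per screen column, done with fixed-point math and lookup tables on a 486, is cheap enough to hit playable frame rates without any 3D hardware.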

Duke Nukem wasn't flat plane

so?

he is right dip shit

Yes, there were the S3 ViRGE and Rendition Vérité, the Nvidia NV1 and the ATI Rage, out at the same time as the Voodoo cards.

I remember first playing Doom on a friend's Tandy about a hundred years ago. I didn't even know what a GPU was back then.

mechwarrior 2 was my first 3d accelerated game.

Whiplash/Fatal Racing is a gorgeous example of early 3D CPU-only rendering. Played on the right hardware it's incredibly smooth and never drops a frame, but it will make any sub-120MHz CPU cry for mercy.
One of the first games to support hardware acceleration as well; the 3dfx version loses some texture definition but gains even smoother motion and transparency. Damn good game too.

>the absolute state of Sup Forums

Because System Shock can't run on a SNES or even on a GBA.

Software rendering. Are you really that dense?

>en.wikipedia.org/wiki/ATI_Mach

The ATi Mach line was a series of 2D graphics accelerators for personal computers developed by ATI Technologies.

DN 3D doesn't run on the same engine as Doom 1/2. Still, it's basically the same shit, just with slopes.

sure, it is still an accelerator, but games using the software renderer with an ordinary card still only use the CPU for the actual calculations

the more resources you give programmers the less they optimize and the more they waste.

You don't need a GPU to play Half-Life / Counter-Strike either. It's all done via software rendering.

It was rendered in 2D. Pretty sure it DID use the GPU. It used VESA acceleration which gave more direct access to the GPU.

>GPU
>they called that a VGA card
The absolute state of Sup Forums

>It used VESA acceleration
No it didn't, it ran in mode 13h
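Mode 13h is the classic 320x200, 256-color VGA mode with a linear framebuffer at segment A000: a pixel's byte offset is just y*320 + x, which is exactly why it was so friendly to software renderers like Doom's. A sketch using an in-memory stand-in for the VGA buffer:

```c
#include <stdint.h>

/* Mode 13h geometry: 320x200, one byte per pixel, linearly laid out. */
enum { MODE13_W = 320, MODE13_H = 200 };

/* Byte offset of pixel (x, y) into the framebuffer. On real DOS hardware
 * this indexes the VGA memory mapped at 0xA0000. */
uint32_t mode13_offset(int x, int y)
{
    return (uint32_t)y * MODE13_W + x;
}

/* Plot a pixel into a buffer standing in for the VGA framebuffer. */
void put_pixel(uint8_t *framebuffer, int x, int y, uint8_t color)
{
    framebuffer[mode13_offset(x, y)] = color;
}
```

No banking, no planes, no acceleration: the CPU just writes bytes, and the card scans them out to the monitor.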

No you don't

The modern GPU is there for hardware acceleration. Anything the GPU can process your CPU can process, it is just slower.

thanks for taking the time to write this out so i dont have to. unfortunately i doubt anyone who doesn't already know this will actually read it. such is the life of Sup Forums shit posters.

Hello embryo.

>That said, did VGA cards from this time offer any gimmick functions besides passing pixels to the monitor?

Some of them did IDCT acceleration and other video accel functions. It wasn't until '96 or so, with the Voodoo, that they became capable of doing dedicated texturing tasks for games. There was hardware that could do that before, but it was for CAD usage.

GPUs are good for highly parallel processing of large blocks of data, which makes them good for certain specialized tasks including rendering, or as of recently cryptocurrency mining (which benefits from parallel computation of hashes)
CPUs exist because not all tasks are better oriented to running on hundreds of cores.

>video graphics accelerator
He's not wrong

...

>mode 13h
good times

>video graphics accelerator
You're not right.

How to be wrong and obnoxious at the same time: the post

Holy fuck. Seeing a screenshot of games like Doom and Duke3D gives me an instant dopamine hit. The colors, the sprites, the MIDIs... I remember I spent so much time making maps for these games. I got really good and creative; I wanted to make a living doing that when I grew up.

Now I'm old and a NEET. If only I had known how much of a pipe dream that was.

it's all over.

Everyone in this thread is subtly wrong. The old 2D cards were graphics cards, and you could certainly call them GPUs if you wanted to, since they do have a processor that does computations associated with graphics. But the other user is right in that no one called them GPUs back then. When it comes to 3D graphics, they were indeed called 3D accelerators when they first came out; "GPU" is purely a marketing term popularized by Nvidia.

So there are 2D cards that you could retroactively call a GPU, way back to the ISA days (the term VPU, visual processing unit, was actually the common one). This whole argument is pedantic bullshit. On the one hand it's wrong to call a pre-3D-accel card a GPU because no one was calling them that back then. On the other hand, if your definition of a GPU is something with a processor dedicated solely to graphical computations, then it's perfectly fine to call them that. Take your pick depending on what kind of faggot you are.

i played the moon man doom mod the other week, couldn't stop laughing

it's like 8th grade all over again for me

loved those games and still play them every now and then for the feels

thank fuck they were open sourced so people could keep them alive on newer hardware (still mad that unreal 1/ut wasn't open sourced even though they released their latest engine, reee)

You are wrong. All of the bad things that are currently going on are a symptom of the misunderstanding of how VGA chipsets worked back then and the voluble statements of some people in the industry. They are not a cause; they are an effect. I assert that VGA cards (or chipsets, or however you want to call them) have a penchant for counterinsurgency and clandestine operations, even though that presupposes a dialectical intertwinement to which a cheeky turn of mind is impervious. Is that such a difficult concept? It's doubtlessly astounding that he has somehow found a way to work the words "hexosemonophosphoric" and "anatomicochirurgical" into his vaporings.

All those boards were 2D/3D. As far as I can remember the PowerVR was the only other card that was 3D accel only.

In approximate order of appearance
- Indexed color to RGB conversion (in

Doesn't vanilla zdoom just run in software mode like the original?

DN3D used Build, a more advanced engine, but still based on a 2D plane. You could do some hacks to overlay rooms one over another, but with a ton of restrictions.

Virge was a graphics decelerator though.

The PowerVR chips went through some really ridiculous metamorphoses. PCI add-on cards -> Dreamcast -> Failed integrated GPUs -> Mobile GPUs -> Every fucking iPhone

Yeah, they pivoted well, I guess. Same with Matrox. I googled them a few months ago and was surprised to see they're still in business. I remember lusting over the G400 and its beautiful bump mapping.

No, the graphics are getting better faster than the hardware is.

>tfw there are like a dozen old boxes with G450 dual-monitor cards at work waiting to be disposed of

Kinda sad to see them go, dual monitors were the shit back in 2001. I'll probably try to swipe one for my junk collection.

From where, some chink university? I bet you can power through Microsoft Word.

>TFW when you make a shitpost and cause Sup Forums to sperg out

Are you this big of an ass to your friends too?

It's a chipset that runs a framebuffer. That's all the card is.

Even if you did have a 2D/3D card, which was extremely rare back when Doom launched, the software did not use it anyway. All of the rendering was done by code that relies entirely on the CPU. This is why the jump from Wolf3D to Doom required the upgrade from a 286 to a 486 CPU. Back in those days, reliance on certain expansion cards (e.g. Sound Blaster) meant that support had to be coded by the developer, which is the opposite of how it is done today. This is why if you look at old 3D computer games they only list say 6 or 7 cards that are supported for 3D acceleration, with a little disclaimer saying that if you didn't have those cards then it was software rendering for you. Nowadays it is up to the manufacturers (Nvidia, AMD, Matrox, Intel et al.) to ensure their drivers support the software you are trying to run.

even if you meant 3d accelerator, doom isn't a 3d game, or at least, it's not a 3d engine

loved that game, though it chugged along on my 486DX2-66

I didn't know it and I read it. Can I have a cookie?

game devs didn't write Sound Blaster drivers, they just shipped them with the game
they did have to write in support for interfacing with that driver though
and the difference is more that we now abstract things like sound and graphics away behind OS APIs; this is a requirement for true multitasking as well, so it was inevitable one way or another

Not really.
All the Sound Blaster driver did was set an environment variable to tell the game the hardware configuration of the card (the SET BLASTER line) and set up the Sound Blaster volume levels.
The game accesses the Sound Blaster I/O ports, responds to the IRQs, etc. manually.

>when I played it with a two-graphics-card combination

Why did you need IRQs?

False.

A blitter is a GPU.
Any PC video card made after MDA, CGA and MGA is basically a GPU.

Your wikipedia knowledge is lacking.