Writing O(n^2) algorithms where O(n log n) is a known solution

>writing O(n^2) algorithms where O(n log n) is a known solution
I seriously hope you guys don't do this
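
To spell out the gap for anyone lurking, here is a toy sketch (my own example, not from any book): checking a list for duplicates with nested loops vs. sort-then-scan.

# Naive O(n^2): compare every pair.
def has_duplicate_quadratic(xs):
    for i in range(len(xs)):
        for j in range(i + 1, len(xs)):
            if xs[i] == xs[j]:
                return True
    return False

# O(n log n): sort first, then any duplicates sit next to each other.
def has_duplicate_sorted(xs):
    ys = sorted(xs)
    return any(a == b for a, b in zip(ys, ys[1:]))

Same answer either way, but the first blows up quadratically while the second scales fine.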

there was a guy recently who unironically suggested an O(n^3) sorting algorithm
Sup Forums has become a pleb shithole

Just O(n^3).

Care to elaborate? What book can I read to be more informed on the subject?

Computer science books

Which do you recommend?

sites.google.com/site/scienceandmathguide/subjects/computer-science

Totally a waste of time, computers today are all super fast, so there's no need to write efficient algorithms.

>what is scaling
It'll take you a while but you'll learn

>using O(n log n) just because it's "better" when an O(n^2) might outperform it for appropriate data sets
this is the "copy-paste solutions" mentality showing its ugly self again
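
A rough timing sketch of that claim (pure Python; the numbers and the crossover point depend entirely on your machine and data, these are not universal): insertion sort wins on tiny inputs because of its low overhead, merge sort wins once n grows.

import random, timeit

def insertion_sort(xs):                      # O(n^2), tiny constant factor
    xs = list(xs)
    for i in range(1, len(xs)):
        key, j = xs[i], i - 1
        while j >= 0 and xs[j] > key:
            xs[j + 1] = xs[j]
            j -= 1
        xs[j + 1] = key
    return xs

def merge_sort(xs):                          # O(n log n), more bookkeeping
    if len(xs) <= 1:
        return list(xs)
    mid = len(xs) // 2
    left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

for n in (8, 64, 4096):
    data = [random.random() for _ in range(n)]
    print(n,
          round(timeit.timeit(lambda: insertion_sort(data), number=200), 4),
          round(timeit.timeit(lambda: merge_sort(data), number=200), 4))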

>what is amortized complexity
>what is average complexity
O(n^2) is in most cases simply a lazy solution from an inexperienced dev
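
For anyone who hasn't met amortized complexity: the classic example is a dynamic array that doubles its capacity. A single append is occasionally O(n), but averaged over a long run of appends it is O(1). Minimal sketch of the textbook doubling argument (a toy, not production code):

class DynArray:
    def __init__(self):
        self.cap, self.size = 1, 0
        self.buf = [None]

    def append(self, x):
        if self.size == self.cap:        # rare O(n) step: copy into a bigger buffer
            self.cap *= 2
            new = [None] * self.cap
            new[:self.size] = self.buf
            self.buf = new
        self.buf[self.size] = x          # usual O(1) step
        self.size += 1

n appends trigger only O(log n) resizes whose copies total roughly 2n element moves, so the amortized cost per append is O(1).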

I asked for a book name, not for a google search. If you can't name one, it's okay, just stop pretending you know stuff.

Algorithms by Sedgewick and some other guy

Algorithms 4th edition

Skip the Java bullshit.

Fuck off retard

Oh they totally do, just slap a web interface on it and call it 'modern'

>wasting time coding overcomplicated solutions with no real world benefit so you can feel superior for no reason

I sure hope you don't do this

>rewriting algorithms
Who /uses standard data structures for all their computation/ here?
Oh wait, you're looking for jobs. So am I, I guess.

>There are people who unironically still use O(n^2) for matrix-vector multiplication.
>You can actually do a good approximation in O(n log n)
I seriously hope you guys do not do this.
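
Not sure which approximation scheme that post has in mind, but the textbook case where a matrix-vector product genuinely drops from O(n^2) to O(n log n) is a structured matrix, e.g. a circulant one, where the product is a circular convolution and the FFT computes it exactly. Toy numpy sketch (my own illustration, not anyone's claimed method):

import numpy as np

def circulant_matvec(first_col, x):
    # C @ x for a circulant matrix C (each column is a cyclic shift of first_col)
    # is a circular convolution, which the FFT evaluates in O(n log n).
    return np.fft.ifft(np.fft.fft(first_col) * np.fft.fft(x)).real

# Sanity check against the dense O(n^2) product on made-up data.
n = 8
c, x = np.random.rand(n), np.random.rand(n)
C = np.column_stack([np.roll(c, j) for j in range(n)])
assert np.allclose(C @ x, circulant_matvec(c, x))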

CLRS. Read that book from cover to cover and back again. You and the book shall become one, such is its importance for a programmer.

Sometimes O(n^2) is better than O(n log n) for small data size

Quick sort is a typical counter-example.

I read somewhere that 60% of the internet is powered by WordPress... That's even sadder than your algorithms

I was so proud when I implemented my own art gallery problem code, only for my teacher to subtly diss me for doing it in O(n^2)

Define sometimes

SICP

If we can guarantee that N is small, it probably doesn't matter.

Be pragmatic, not autistic.

omg, Jill of the Jungle! I loved that series.

It's a shame the third one had that game-breaking bug in level 12.

O(1) and O(log n) are the wrong colors on the graph here.

>he doesn't know about the constant factors
Hey OP, did you know that you can speed up quicksort by adding an O(n^2) insertion sort?
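
That one is real: cutting over to insertion sort on small subarrays is a common trick in library sorts, because the O(n^2) method has almost no overhead when the range is tiny. Rough sketch (the cutoff of 16 is a made-up number; real implementations tune it empirically):

import random

CUTOFF = 16

def insertion_sort_range(xs, lo, hi):        # O(k^2) on a k-element range, tiny overhead
    for i in range(lo + 1, hi + 1):
        key, j = xs[i], i - 1
        while j >= lo and xs[j] > key:
            xs[j + 1] = xs[j]
            j -= 1
        xs[j + 1] = key

def quicksort(xs, lo=0, hi=None):
    if hi is None:
        hi = len(xs) - 1
    while lo < hi:
        if hi - lo < CUTOFF:                 # small range: hand it to insertion sort
            insertion_sort_range(xs, lo, hi)
            return
        pivot = xs[random.randint(lo, hi)]
        i, j = lo, hi
        while i <= j:                        # classic two-pointer partition
            while xs[i] < pivot: i += 1
            while xs[j] > pivot: j -= 1
            if i <= j:
                xs[i], xs[j] = xs[j], xs[i]
                i += 1; j -= 1
        quicksort(xs, lo, j)                 # recurse on the left part
        lo = i                               # keep looping on the right part

data = [random.randint(0, 999) for _ in range(1000)]
quicksort(data)
assert data == sorted(data)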

>Talking about an O(n^2) algorithm I made in a coding interview
>Only realize later that I called it O(2n) by mistake

It's fine, who wants to do .NET anyway? I'll just write frontend JS forever

Why isn't anyone talking about Jill of the Jungle?

Not hardcoding the function's range so that it becomes O(1)

>not precomputing all of computing
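
A concrete (toy) version of what those two posts are getting at: if the input domain is small and known up front, precompute every answer once and each call becomes an O(1) table lookup.

# Example: number of set bits for every possible byte value.
POPCOUNT = [bin(i).count("1") for i in range(256)]   # built once over the whole domain

def popcount_byte(b):
    return POPCOUNT[b]                               # O(1) per query

print(popcount_byte(0b10110010))                     # -> 4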

here is an illustration, my brainlet friend

>what are constant factors

>ITT: people fall for the fewer-CPU-cycles meme and don't care about space complexity or I/O ops.

Every time

One of my personal old school favs. Although that pic is from Jazz Jackrabbit.

Nice bait into spoonfeeding, you got a lot of catches.