"1" + "1" - "1" = 10

"1" + "1" - "1" = 10

only in javascript

>being a retard that uses type coercion
>expects numbers written as strings in formulas to render the same result as numbers
you're the retard here, not js, you fucking babby.

"1" + "1" = "11" and that's nothing fucking new
then you do "11" - 1 so of course js just tries to interpret the string as a number

gtfo

(OP)
Not a JS programmer nor OP but:
OP:
>"1" + "1" - "1" = 10
Angry webpleb:
>then you do "11" - 1

Its no "11" - 1 but "11" - "1"

+ works for strings and math. Because "1" and "1" start off as strings, they get concatenated to "11".

- only works for numbers. "11" - "1" results in them being cast as numbers, so 11 - 1 = 10.

"-" isn't an available operator for strings so javascript infers by your use of it that you meant to use numbers and then coerces the string to numbers.

For some reason the web had to be incredibly liberal with what it accepted, so now we live in a world where you don't have to close HTML tags and javascript automatically converts strings to ints for you (actually no such thing as ints in js but you get the point)

>low hanging fruit
Sup Forums are you even trying?
> [1,2,3,4,5,6,7,8,9,10].sort()
[1, 10, 2, 3, 4, 5, 6, 7, 8, 9]

sorts by converting each element to a string and comparing code units. "10" < "2" lexicographically, so 10 comes before 2. Also fucking retarded
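You can see the string comparison directly:

> String(10) < String(2)
true
> [10, 2, 1].sort()
[1, 10, 2]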

Fuck, do you realize the proper response would be a type casting exception? It should not assume things this way.

Look man, I only ever use C++ unless I'm forced to use something else (Python at work sometimes), but that's an explanation of why javascript does what it does.

I am sure OP knew HOW it computed the 10; the question he is asking is WHY anyone (especially the language designers) would think it's desired behavior.

>the language should not attempt to do something sensible and predictable when given poor instructions. I should have to hand-hold the language through every little thing that I want it to do

The way javascript handles this makes perfect sense to anyone who knows javascript. If you don't want automatic type coercion to happen, that's fine. Don't write code that causes it.
But for fuck's sake don't pretend like all languages have to conform to the typing system you're personally used to and are required to not attempt to make sense of input.

Language should complain about poor instructions. If I had a slave who didn't understand what I wanted and I had to explain it twice, I would be mildly annoyed. If he just did the wrong thing I would be very angry.

Javascript was an early language, and they didn't want to be throwing exceptions everywhere. Now it's too late to change.

>Actually defending this shit.

ok explain pic related, just typed it into the chrome console. Please explain how and why javascript does things this way.

>mfw [object Object]1

It's not hard to explain how (except the !{} + !{}, I have no idea when it comes to that one).

But I'm really interested in why.

Hey you got this from my post on /r/Sup Forums didn't you?

Javascript came after Java (hence the name) and Java had exceptions.

Dynamic typing is bad for many reasons, but simple behavioral eccentricities like this shouldn't be an issue if you write decent tests.

You do write unit tests of course, don't you?

My bad, I meant it was an early dynamically typed language, not an early language.

Also, Java and JavaScript came out on the same day and JavaScript was originally named something else.

>It should not assume things this way.
It should because it serves in the purpose of the language. Don't like it? Get over it or use something else.

>mfw my company requires 100% test coverage
>mfw the python fags can just write unit tests that cover things the compiler should
>mfw the c++ devs have to write real tests for everything

So is javascript shit or are you shit for not understanding it?

Really makes you think doesn't it?

Javascript actually does not derive from java, it stole the name for marketing purposes. Not defending the shitty typing, just saying.

No. Most bugs occur at the boundary of an API rather than in the units that are easily isolated by unit tests. Also a unit test only tests a programmer's expectations of what goes wrong, and by definition a bug is something the programmer didn't expect to go wrong.

In other words, favor integration/functional tests over unit tests every time.

If {} is converted to boolean, it becomes true (because it's an existing object, not null.)

Therefore !{} is false.

Therefore !{}+!{} is the same as false + false.

When false is converted to number, it becomes 0.

Therefore false + false is 0.
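In the console:

> !{}
false
> false + false
0
> !{} + !{}
0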

Unit tests are mandatory when you're at a large org though, so random people won't inadvertently break your shit that they don't know or care about.

>It does not matter this feature is a common source of a bug, you should write unit tests all the time!
>It does not matter this food gives you diarrhea, you should have toilet paper all the time!

Both are necessary to have any degree of confidence in your logic.

>Most bugs occur at the boundary of an API rather than easily isolated in unit tests

That's simply not true. You seem to be under the impression that a programmer has infallible expectations about the code he writes. Even the most experienced programmers in the world can't be trusted not to have simple mistaken intentions.

I didn't mean to imply that, I'm just saying that it arose in the post-Java programming culture.

The main usefulness of unit testing is to force the programmer to think through all the cases.

It's true that the programmer will only find the bugs that he's expecting to find. But writing the unit tests will help the programmer to broaden the cases that he's considering.

And if someone else writes the unit tests, then so much the better, because it confirms that two different people understand the API in the same way, which promotes another dimension of robustness.

So, yeah, there are good reasons why unit tests are used.

en.wikipedia.org/wiki/JavaScript#Simple_examples
var x; // defines the variable x and assigns to it the special value "undefined" (not to be confused with an undefined value)

Operands that don't work with the operator being applied to them are converted up the type tree until they do. Many times this results in them all being converted to strings, often after one or both of them are converted to something else first.

{} + {} Object types are not defined for the addition operator. Both are cast to strings using their built-in toString() methods, for which the default is to return the string "[object Object]", at which point the strings can be added.
!{} + {} The "!" operator will take anything falsey and return true, and return false otherwise. An empty object is not falsey, so it becomes false. An object cannot be added to a boolean, so they are both cast to strings and added ("false" + "[object Object]").
!{} + !{} Both are cast to booleans in the same way as above. Addition is actually defined for booleans (there is no further type coercion) with false treated as 0 and true treated as 1.
[] + {} + 1 Addition is not defined for an array and an object. An empty array becomes an empty string by its default toString.
[].push(null) + {} + 1 Array.push returns the new size of the array (1). All the toString() logic is the same as above.
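For reference, what those evaluate to (note that a bare {} at the very start of a console line can be parsed as an empty block instead of an object, so the first one is wrapped in parens to force expression context):

> ({} + {})
"[object Object][object Object]"
> !{} + {}
"false[object Object]"
> !{} + !{}
0
> [] + {} + 1
"[object Object]1"
> [].push(null) + {} + 1
"1[object Object]1"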

>Even the most experienced programmers in the world can't be trusted not to have simple mistaken intentions.

This is so true, and half of Sup Forums seems to think otherwise. If you are 100% sure you are writing without any mistakes, you are writing fizzbuzz all the time.

Again, the vast majority of bugs are from external services, databases, configuration file issues, mismatched platforms/dependencies in production, issues in the deployment process, etc.

I don't mind unit tests, but I think they have very little value compared to the time it takes to write them. There are no units in modern development; there is a mountain of abstracted layers living underneath all code that's written, and they leak out constantly.

I average about 20% unit test coverage on any given project with 100% integration tests. I'm a consultant, so I value time to feature and the ability to make significant changes midway through a project. Unit tests simply aren't that valuable in my position.

Also dynamically typed langs are a meme, unless that language is purely immutable like Erlang.

Yeah, and !!{} is equal to 1 too btw.
user@pc $ nodejs
> -!![]>>>!!{}
2147483647
> !![]
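Unpacking that one step by step (standard coercion, nothing engine-specific assumed):

> !![]
true
> -!![]
-1
> -1 >>> 1
2147483647

Unary minus coerces true to 1, and >>> reinterprets -1 as the unsigned 32-bit value 4294967295 before shifting, giving 0x7FFFFFFF.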

>get webdev job because you need money
>add JavaScript experience to CV
>expect CV to increase in value
>it's actually -1 now

I'm sure that's probably good enough for your clients.

If you feel confident enough to stake your work on the assumption that you'll never write a stupid off-by-one error or a corner case that never gets hit by an end-to-end test, more power to you.

But on the off chance that you do (and everyone does) it will be way more expensive for the people maintaining your code to fix it than it is for you to write unit tests.

I think you'll find that a weakly typed language having the same operator for string concatenation and addition is the source of the retardation.

Take a real scripting language, Perl, for example: + is reserved for addition while . is reserved for string concatenation. That makes a whole hell of a lot more sense in a language where conversions are made all over the place.

I get paid to write code to solve business problems. The minimum amount of testing required to achieve that goal is the right amount of testing. TDD is a crutch for amateur developers to learn how to modularize their code. Except just because your code is more modular doesn't mean it solves the problem better. Tests also give you a false sense of security ("tests pass, that means it works"). This mentality is harmful.

>someone actually understands javascript in a javascript thread

stop the fucking presses.

I don't disagree with you, testing is extremely expensive. But I think the mentality "I'm an experienced developer so I'll never accidentally forget a null check" is equally as harmful.

The only thing more expensive than testing is not testing.

YOU ARE FUCKING ILLITERATES

READ A SPEC

>I think you'll find that a weakly typed language having the same operator for string concatenation and addition is the source of the retardation.

PHP uses + for addition and . for string concatenation and it has retarded type shit too.

> "a" == 0
true

> "20 bears" - "1 cat"
19

> !!0.0
false
> !!'0.0'
true
> 0.0 == '0.0'
true

> "a".(3)
"a3"
> "a".3
PHP Parse error: syntax error, unexpected '.3' (T_DNUMBER), expecting ',' or ')'

is JavaScript even a programming language? or just a scripting language? the only shit i see javascript "programmers" doing is simple apps that you can do with php, python, rails, etc. Even Go... much faster than Node btw...

>I'm not a programmer, the post

You're conflating the idea of "unit test" and "test". Also when you design by contract you don't worry about NPEs because the code simply doesn't accept them.

haha ok so since i use everything BUT javascript that makes me not a programmer?

No, since you're an idiot who thinks that scripting languages aren't also programming languages you're not a programmer.

small world isn't it

scripting languages are used to write scripts
programming languages are used to write programs
You're an idiot if you can't tell the fucking difference.

i dont consider javascript a real programming language, no

I laughed, thanks guys.

They are not mutually exclusive things. Not all programming languages are scripting languages, but all scripting languages are programming languages.
This is "HTML isn't code" level retarded

It doesn't matter what you consider it. You're wrong.

I said
>real scripting language

FUCK,
I meant to tag not my own post. Ignore that

good luck writing a real program in javashit that isn't going to be a pile of shit

>true scotsman

10 + 100 = 6

What the fuck? Are computers retarded? It equals 110, not 6, you fucking retarded computer shit

>nu-male webdev who spends all day working in Starbucks on his macbook thinks he's a real programmer

I made an xbox one game in JS once.

It was a pile of shit.

>It equals 110, not 6
>It equals six, not six
computers r shit

ITT: Redneck cavemen

It's time to step into 2017, Hicks.

there is literally no good reason to use javascript outside of client front end stuff (jquery). i'm willing to be convinced, though. Floor is yours. If i were to step outside the standard php, asp.net, python stacks, i would go with GO since it's more powerful than Node...big league.

I have 3 years of full-time PHP experience because I need the money but I'm ashamed to have it on my CV
Maybe I should put "not proud" next to PHP

Absurd image
Node OS is based on Linux

lol...

And humans are based on Neanderthals, but I guess you never evolved past that

not him but what makes javascript such a milestone for programming? what are these evolved magical javascript secrets you're talking about? the only thing actually good about Node is the logo.

dude youre a pissed off little virgin thinkes hes oldfag huh

i think OP was admiring the languages common sense style of interpritation

Wana know how I know you're fat?

ha i just realized OP's post was b8, cunning OP

lol im glad im not a fat kid like you, people always accuse others of what they are guilty of because they arent smart enough to compute anything else

dude all im saying is you aint that smart, i build virtual FPGAs in Haskell and im here talking to you geniuses

Wana know how I know you're fat?

That isn't true. Black people have almost no (or in some cases none at all) Neanderthal DNA.

"A team of scientists comparing the full genomes of the two species concluded that most Europeans and Asians have between 1 to 2 percent Neanderthal DNA. Indigenous sub-Saharan Africans have none, or very little Neanderthal DNA because their ancestors did not migrate through Eurasia."

I would hire u if you were saying this in an interview. Most people go nuts here.

kek OP BTFO

JavaScript is the superior programming language, hence why the best operating system was developed in it and you're using some operating system that's like so old

That's why they aren't humans, they're an old scion of sapiens that didn't evolve beyond Neanderthals.

which one is used for apps

Here's how you calculate pi in JavaScript:

((++[+[]][+[]]+[]+ ++[+[]][+[]]+[])* ++[+[]][+[]])*(++[+[]][+[]]+ ++[+[]][+[]])/((+[+[]]+'x'+(![]+[])[[+!+[]+!+[]]*[+!+[]+!+[]]])/(++[+[]][+[]]+ ++[+[]][+[]]));

Javascript alert(1)
[][(![]+[])[+[]]+([![]]+[][[]])[+!+[]+[+[]]]+(![]+[])[!+[]+!+[]]+(!![]+[])[+[]]+(!![]+[])[!+[]+!+[]+!+[]]+(!![]+[])[+!+[]]][([][(![]+[])[+[]]+([![]]+[][[]])[+!+[]+[+[]]]+(![]+[])[!+[]+!+[]]+(!![]+[])[+[]]+(!![]+[])[!+[]+!+[]+!+[]]+(!![]+[])[+!+[]]]+[])[!+[]+!+[]+!+[]]+(!![]+[][(![]+[])[+[]]+([![]]+[][[]])[+!+[]+[+[]]]+(![]+[])[!+[]+!+[]]+(!![]+[])[+[]]+(!![]+[])[!+[]+!+[]+!+[]]+(!![]+[])[+!+[]]])[+!+[]+[+[]]]+([][[]]+[])[+!+[]]+(![]+[])[!+[]+!+[]+!+[]]+(!![]+[])[+[]]+(!![]+[])[+!+[]]+([][[]]+[])[+[]]+([][(![]+[])[+[]]+([![]]+[][[]])[+!+[]+[+[]]]+(![]+[])[!+[]+!+[]]+(!![]+[])[+[]]+(!![]+[])[!+[]+!+[]+!+[]]+(!![]+[])[+!+[]]]+[])[!+[]+!+[]+!+[]]+(!![]+[])[+[]]+(!![]+[][(![]+[])[+[]]+([![]]+[][[]])[+!+[]+[+[]]]+(![]+[])[!+[]+!+[]]+(!![]+[])[+[]]+(!![]+[])[!+[]+!+[]+!+[]]+(!![]+[])[+!+[]]])[+!+[]+[+[]]]+(!![]+[])[+!+[]]]((![]+[])[+!+[]]+(![]+[])[!+[]+!+[]]+(!![]+[])[!+[]+!+[]+!+[]]+(!![]+[])[+!+[]]+(!![]+[])[+[]]+(![]+[][(![]+[])[+[]]+([![]]+[][[]])[+!+[]+[+[]]]+(![]+[])[!+[]+!+[]]+(!![]+[])[+[]]+(!![]+[])[!+[]+!+[]+!+[]]+(!![]+[])[+!+[]]])[!+[]+!+[]+[+[]]]+[+!+[]]+(!![]+[][(![]+[])[+[]]+([![]]+[][[]])[+!+[]+[+[]]]+(![]+[])[!+[]+!+[]]+(!![]+[])[+[]]+(!![]+[])[!+[]+!+[]+!+[]]+(!![]+[])[+!+[]]])[!+[]+!+[]+[+[]]])()

I have seen some projects where the dev did not document the testing procedures in any way. No manual testing guidelines, no automated test cases.
When you look at the code, which you have to maintain, you will have no idea why he chose a specific solution. When there are automated tests that run in a matter of seconds, you are able to restructure the code in a way that makes sense to you, and it will not take long.
That is why unit tests are more important than integration tests. Integration tests usually take a couple of minutes to run, if not hours. You won't be confident enough to refactor if it takes ages to run a simple test suite.
If you need external resources for your project, like a DB, there are plenty of patterns that will help you decouple the DB logic from the application, with the added value of flexibility.

Automated tests are important. I prefer unit tests for the business logic, and integration tests for the boundaries.

>- only works for numbers.
Literally an arbitrary convention. Subtracting strings could also mean this

// this function takes two strings and returns a string
function minus(a, b) {
    if (a.endsWith(b)) return a.substring(0, a.length - 1);
    return a;
}


Also why does it assume you want 11-1 when you explicitly put "11" - "1"? Implicit type conversions are dangerous anyway, so why do them when the programmer explicitly specified the arguments to "-" to be of type string? Seems like a design flaw to me.

Actually this is shit because it doesn't take into account strings b of length > 1. I was only thinking of a single character as the subtrahend. But you get the idea.
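For what it's worth, a sketch that handles a suffix of any length (purely illustrative; the name is made up and this is obviously not how JS actually defines "-"):

// takes two strings; strips b off the end of a if it's there
function minus(a, b) {
    if (b.length > 0 && a.endsWith(b)) return a.substring(0, a.length - b.length);
    return a;
}

> minus("foobar", "bar")
"foo"
> minus("foobar", "baz")
"foobar"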

"potato" + 5


hurr durr y javaskript bad

[1,2,3,4,5,6,7,8,9,10].sort(function(n1, n2){ return n1 - n2; });
[1, 2, 3, 4, 5, 6, 7, 8, 9, 10]

that's not how it works faggot

>adding objects and arrays together
I can do that too

#include <vector>

class Hurr {};

int main() {
    new Hurr() + std::vector<int>();
}

wtf it doesn't compile
C++ BTFO
I dare anyone to defend this

This desu
OP is dumb, as per usual.

If using an aspect of the language is stupid then the language is stupid.

It's more concerning to me that .sort() modifies the array it's called on, instead of returning a new array like pretty much every other Array prototype method. Since .sort() returns the array instance like those other methods, it's easy to overlook

Use typescript nigga, it wouldn't allow this shit

But that's the thing, it doesn't compile and rightfully so since adding a vector and an arbitrary object doesn't make sense, unless you use operator overloading.
In Javascript however, these statements don't produce any errors, they just silently produce some bullshit, and that's bad behavior.
Like a user said earlier, ultimately you should write unit tests anyway, but a good dynamic type system should prevent errors like this from happening before you write tests.

That's quite the easy trap to fall in.
Just because people can explain WHY this happens this doesn't mean that this is reasonable behavior.

On a side note, I wish javascript had natsort.
That's pretty much the one useful function PHP has.
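There is a rough equivalent via localeCompare's numeric option (assuming an Intl-aware engine, which is most of them these days):

> ["img12", "img2", "img1"].sort(function(a, b) { return a.localeCompare(b, undefined, { numeric: true }); })
["img1", "img2", "img12"]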

ITT: bootcampfags trying to defend their life choices

What do you expect a weakly typed language to do?

Imagine you're a scripting language, you're locked inside an OS; you can't directly speak with the almighty Programmer, you can't see Him, nor hear Him.

You have doubts about what the fuck he's trying to convey to you, but what can you do? Nothing, so you just consult the almighty language spec codex and do your job.
Because you know that if the almighty Programmer wanted a strongly typed language He would just have used a fucking strongly typed language.

It seems I did the wrong thing creating a thread, I will ask in here then:

I have these two functions:

function1:

var insert = function (user, done) {
    if (validator(user) == true) {
        console.log('good');
    } else {
        console.log('fuck');
    }
}


function2:

function validator(user) {
    connection.query('select * from user_record where user_name = ?', [user.email], function (err, rows) {
        console.log(rows);
        console.log("above row object");
        if (err) {
            return false;
        }
        if (rows.length) {
            console.log('That email is already taken.');
            return false;
        } else {
            console.log('That email is free.');
            return true;
        }
    });
}


and they aren't working as I expected them to.
The first function just prints 'fuck' whenever I execute it with a non-existent user, while 'That email is free' is correctly printed by the second one as it should.
I cannot figure out why the fuck; it's like the "return true/false" of the query just remains in the scope of the callback. How do I make this work?

Why are you doing math on strings?

Didn't they teach you math in school?

>returning inside a callback
Put the return value in a variable declared inside validator()
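That won't get you far if connection.query is asynchronous (which the callback style suggests): by the time validator() returns, the query callback hasn't even run yet, so there's nothing to put in a variable. One way out is to give validator a callback of its own. Rough sketch, reusing the names from the post above:

function validator(user, done) {
    connection.query('select * from user_record where user_name = ?', [user.email], function (err, rows) {
        if (err) return done(false);
        done(rows.length === 0); // true if the email is free
    });
}

var insert = function (user, done) {
    validator(user, function (isFree) {
        console.log(isFree ? 'good' : 'fuck');
    });
};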