So this is the true power of JavaScript!?!

It's assuming you're doing + for string concatenation. The minus is a math-only operator, so it coerces the "1" to a number. The same thing happens if you do 1 + -"1". There are better examples of JavaScript's fucked-up type coercion.
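
What that looks like from a console, for reference:

"1" + 1    // "11" — a string operand makes + concatenate
"1" - 1    // 0    — there's no string "-", so "1" becomes 1
1 + -"1"   // 0    — unary minus coerces "1" to -1 first, then 1 + (-1)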

Better to do stupid shit than to throw an error in a language without error handling.

> 1000000000000000000000.03 == 1000000000000000000000.09
True


heh, so I guess this is the future

Yes, but the true question here is, as a programmer, are you skilled enough to wield it effectively?

That would probably work in C

> {}+{}
NaN
> {}+[]
0
> []+{}
"[object Object]"
> []+[]
""
> {}-[]
-0
> []-{}
NaN
> []-[]
0
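
Presumably these came from an old-school browser console; the trick is that at the start of a line {} parses as an empty block, not an object literal (some modern REPLs wrap input in parentheses, which changes the results):

{}+[]    // parsed as {}; +[]  → +"" → 0
{}+{}    // parsed as {}; +{}  → +"[object Object]" → NaN
{}-[]    // parsed as {}; -[]  → -"" → -0
[]+{}    // genuine addition: "" + "[object Object]"
({}+{})  // parenthesized, {} is an object: "[object Object][object Object]"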

Python 2.7.10 (default, Feb 7 2017, 00:08:15)
[GCC 4.2.1 Compatible Apple LLVM 8.0.0 (clang-800.0.34)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> 1000000000000000000000.03 == 1000000000000000000000.09
True

>not using a library for arbitrary-precision numbers
>complaining about precision errors

>python 2

...

Python 3.6.1 (default, Apr 4 2017, 09:40:51)
[GCC 4.2.1 Compatible Apple LLVM 8.0.0 (clang-800.0.42.1)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> 1000000000000000000000.03 == 1000000000000000000000.09
True
>>>

I wasn't implying it would work in Python 3; I was just laughing at you because you're still using 2.

>python 3

>125x93

sorry, I'm new
is this better?

This is actually not surprising in any regular programming language.

> printf (" %.20f \n", 3.6);
> 3.60000000000000008882

Explanation here and in section right after the one linked:
en.wikipedia.org/wiki/Floating-point_arithmetic#IEEE_754:_floating_point_in_modern_computers
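
The same double-precision behavior is reproducible from a JavaScript console; a quick sketch (digit strings as V8 prints them):

(3.6).toFixed(20)   // "3.60000000000000008882" — 3.6 has no exact binary form

// Around 1e21 adjacent doubles are 2^17 = 131072 apart, so the .03
// and .09 are rounded away and both literals map to the same double:
1000000000000000000000.03 === 1000000000000000000000.09   // true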

I'm a noob and don't know any Java or programming (learning C++ though).
Can somebody explain why this isn't 1?
The first one adds 1 to it; the second one should remove the zeroes from the number, but it should stay the same, right?

Try again

>>> from decimal import Decimal
>>> Decimal('1000000000000000000000.03') == Decimal('1000000000000000000000.09')
False
>>>

it doesn't work in C++

If you mean the OP, it's 'cause of the types involved.

One of these is just interpreted as string concatenation.

I'm going to blow your nips off

0.1 + 0.2
0.30000000000000004

it's pronounced yavascript. soft J.

>Not a built-in type
>Have to construct a Decimal from a string

pig disgusting.

Explain this JSFags

2.toString();
>>> Uncaught SyntaxError: Invalid or unexpected token


2 .toString();
"2"

...

>Python 2.7.10

>-0
Wat..? How does that happen..?

Oh... Wait...

>Language works exactly as intended

>W-W-WHY AREN'T ALL PARADIGMS THE SAME ACROSS ALL PROGRAMMING LANGUAGES

function getObject()
{
    return          // ASI inserts a semicolon right here: "return;"
    {
        foo : "bar" // so this is parsed as a block, never returned
    };
}
// getObject() returns undefined

Why would they intend the language to work as a totally broken piece of shit?

Weakly typed = Weak language

this
Dynamic typing can be good, but weak typing is absolute shit. Getting an error when you fuck up >>>> blindly continuing.

Use float literals instead of doubles

If a programming language is supposed to do something retarded, it's objectively a bad language.

Web languages are so extraordinarily awful that they have created a job market by requiring way too much labor to maintain them.

Weak typing was a mistake.

That is flawed logic. The operator should dispatch on the first operand's type. So to produce "11", the first operand should be the string; otherwise it should be addition.

There is no "-" operator for strings, so in order for it to subtract a string it has to convert it to an integer.

There is no way to add a string and an integer, so in order for it to add a string to an integer, it has to convert the integer to a string or the string to an integer.
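
A console sketch of those rules: only + is overloaded for strings; every other arithmetic operator forces its operands to numbers:

"3" + 1     // "31" — any string operand makes + concatenate
"3" - 1     // 2    — no string "-", so "3" is coerced to 3
"3" * "2"   // 6    — same for *, / and %
"a" - 1     // NaN  — "a" can't be coerced, so the result is NaN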

Garbage in, garbage out

How about you don't do all these implicit conversions and raise an error on ambiguity?

lol, you really expect someone to actually write "2 .toString()"? That goes against every style guide in existence.

Because it is convenient

JavaScript also adds semicolons if you forget them

It's not convenient for your code to silently do something you didn't expect. Even the mess that is C++ only does implicit conversions when they're unambiguous.
>JavaScript also adds semicolons if you forget them
Yeah, which leads to shit like this
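
For the getObject() example above, the fix is a one-liner: keep the brace on the return line so ASI has nothing to insert (sketch; getObjectFixed is a hypothetical name):

function getObjectFixed() {   // hypothetical name for the corrected version
    return {                  // brace on the same line: no semicolon inserted
        foo: "bar"
    };
}
getObjectFixed().foo   // "bar"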

I suppose it's treating the literal as a float in the first example?

Doesn't matter I guess, you'd never call toString on a numeric literal.

JavaScript was famously made in about ten days.
The parser (for the initial features) is as stupid as it gets.

Left to right, greedy.
The lexer grabs "2." as the start of a number token, so the toString() jammed right against it, with no separator of any kind, can't be parsed.

In the second case, the whitespace lets "2" be matched on its own as a number token, the "." as the member-access dot, "toString" as a name, and the two parentheses as themselves.

Another way to write the second example would be "2..toString()".
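
Putting that together, any separator that ends the number token makes it work (console sketch):

2.toString()     // SyntaxError — the lexer eats "2." as a number literal
(2).toString()   // "2" — parentheses end the literal
2 .toString()    // "2" — the space ends it
2..toString()    // "2" — the first dot belongs to the literal "2."
2.0.toString()   // "2"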

In all fairness, C and C++ fuck this up too
#include <stdio.h>
int main() {
    printf("%d\r\n", 1000000000000000000000.03 == 1000000000000000000000.09);
    return 1;
}
$ gcc a.c
$ ./a.out
1
-----------------------------------------------------------
#include <iostream>
int main() {
    std::cout << (1000000000000000000000.03 == 1000000000000000000000.09) << std::endl;  // prints 1
    return 0;
}

obviously because they are both interpreted as IEEE 754 doubles

this guy claimed C++ was not affected

retards, please read the IEEE 754 specification

>JavaScript
Such an achievement.

>It's the year 2050
>Websites still use javascript
kill me

but muh "static typing is too hard!!!!"

>types shit code nobody with half a brain would type because they know it's vague
>is surprised the result is vague
gee it's almost like you're just asking for it

>JUST DON'T TYPE STUPID BAD STUFF
if only there was a way to automatically detect and forbid stupid bad stuff

That's just an issue with floating-point numbers, not with any particular language. No language will handle it "correctly" unless it goes full retard, takes a massive performance hit, and uses an arbitrary-precision library for everything.

Implicit casts and overloaded operators were a mistake. Even C's relatively conservative use of them trips people up. They just introduce too many arbitrary rules in the name of intuitive behavior.

It's really not that much of an issue to write an explicit cast when you need to work with different data types. Nor is it overly hard to pass strings to a function that concatenates them. I'm pretty sure operator-based concatenation is only added to languages to make them seem "hip." It's slightly more typing, which a good editor can eliminate anyway, in return for fewer random rules to remember.

And don't even bring up "typeless" languages. If a language says it doesn't have types, it's full of shit.

Y'all niggas need typescript

Why don't languages implicitly convert number == number to char[] == char[]? This way they would never fail OP's example.

Type inference is cancerous and shouldn't be allowed.
I don't understand why some people thought it was a good idea to explicitly make the code less readable and far slower.

...

type inference is fine, just don't treat strings like numbers and vice versa

>type inference is fine
how so?
int a = 2;
// versus
a = 2;

Tell me why type inference is a good thing then.

You can infer that a is an int by virtue of it being assigned the value 2, so you get to save on typing

What's the issue?

Who's responsible for the modern trend of weak typing, implicit casts, and overloaded operators?

So it's useless then...
People wasted time developing this retarded feature just for the sake of it and sacrificed readability and performance for it...

...

but I don't want to have to convert my if statements to bools

753*
You don't belong here.

Bash isn't much better, but then I don't see people evangelizing Bash, so I guess JS deserves the hate it gets.

How about this? All numbers are floating point, so you get this shit:
> 9007199254740993 == 9007199254740992
true
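
That's 2^53 + 1: past Number.MAX_SAFE_INTEGER every other integer stops being representable. A quick sketch (BigInt, where available, stays exact):

9007199254740993 === 9007199254740992     // true — 2^53 + 1 rounds to 2^53
Number.MAX_SAFE_INTEGER                   // 9007199254740991 = 2^53 - 1
Number.isSafeInteger(9007199254740993)    // false
9007199254740993n === 9007199254740992n   // false — BigInt compares exactly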

nigga did you even attempt to check if 753 exists? It doesn't