Is JSON the best way to structure data before sending it over a network?

It certainly looks less ugly than XML, at least.

yes, it's pretty much the standard

jason is a meme for node.js hipsters

>parsing xml in any language
huge meme

but it just werks

If you're sending text then definitely use json.

Just use CSV or something like that

yes

CSV is flat, you can't represent the structure in OP's image in csv.

No, use Apache Thrift, Apache Avro, or Google Protocol Buffers.

angle bracket tax

>CSV is flat,
CSV is actually arbitrarily-dimensioned, you can use as many distinct separators as you want.
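
For illustration, a quick Python sketch of that idea: rows split on ';', cells on ','. The separators are an arbitrary choice here, and real CSV quoting is ignored entirely.

# encode a 2-D structure with two distinct separators: ';' for rows, ',' for cells
matrix = [[1, 2, 3], [4, 5, 6]]
encoded = ';'.join(','.join(str(x) for x in row) for row in matrix)
# encoded == '1,2,3;4,5,6'
decoded = [[int(x) for x in row.split(',')] for row in encoded.split(';')]
assert decoded == matrix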

Using numpy with some external simulation packages, tensors, matrices, and larger data structures are all passed via CSV.
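
Presumably something like this: np.savetxt/np.loadtxt are the stock numpy CSV round trip (the filename here is made up).

import numpy as np

a = np.arange(6).reshape(2, 3)
np.savetxt('matrix.csv', a, delimiter=',')   # write a 2-D array as CSV
b = np.loadtxt('matrix.csv', delimiter=',')  # read it back (as floats)
assert (a == b).all()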

Oh, you can, it's just a giant pain. You need to use either generic columns and references, or a large number of columns and references.

I'm an SI; I have some customers that do this. It's batshit insane.

i mean it's bretty gud
use compression if possible

pretty much
there's a reason almost every major website with any form of api uses it

>tfw writing a one-off console app that molests some SVG files by directly modifying the xml to fix cropping issues

This desu

JSON wastes a fuckton of bytes repeating the same key names in every record.

json is for brogrammers who can only write "apps" in html and ruby.

.txt for the master race programmers who know their shit.

Yes, you can. You're obviously not a very brilliant person. I wish the hierarchical-data meme would just die; people have had more than enough time to learn object-relational data by now.

It's a stupid data structure IMO because the parsing isn't straightforward, and readability isn't that much better than XML's. Like another user suggested, CSV is better. Whoever designed JSON and thought it was a good data structure for internet use is an idiot, because it has no advantages over simpler, more efficient formats.

Yeah, but not everything is numeric computation, and not everything deals with matrices of numbers.
If you want a matrix, then obviously use something like CSV that easily represents a matrix. But if you want remotely "rich" data, then JSON is literally perfectly fine: it's readable and concise, easy to parse, and has lots of implementations in lots of languages. Hating it because "simplicity is better and CSV is more simple!" is dumb.

>I don't know what I'm talking about, the post

Define "best".
If you're looking for the most efficient way on the wire, then binary would be the best.

Binary is shit. Hexadecimal is better.

Base64 is Best64

Considering that most JSON is machine-generated and effectively has no whitespace, I actually found it small enough for web configuration stuff vis-à-vis CSV, with the upside of getting to be lazy about parsing it in PHP or jQuery.

Binary isn't a data structure.

>But if you want remotely "rich" data then JSON is literally perfectly fine, is readable and concise, easy to parse, has lots of implementations in lots of languages.

It's neither easily readable, nor concise, nor easy to parse. The fact that it's popular doesn't necessarily mean it's good. Tables are always easier to read and more efficient.

JSON is trivial to parse, with a well-defined, minimal set of control characters and a well-defined, minimal set of data containers.

It's only barely harder than string.split.
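
In practice you rarely even write the parser yourself; in Python, for instance, both end up being one call (json is stdlib, and the sample strings here are made up):

import json

row = '1,2,3'.split(',')                    # CSV the string.split way: a flat list of strings
doc = json.loads('{"posts": [{"no": 1}]}')  # JSON: typed, nested containers in one call
print(row, doc['posts'][0]['no'])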

>not using HJSON
wew lad

hjson.org/

Are there even any decent alternatives to JSON?

Can a parser trivially distinguish between a string and a number?
How does it encode newline? Tab?
Why are there comments in a computer generated serialization format?

Somebody is going to argue YAML, another person s-exprs. They'd have a point if those parsers weren't slower and more complex than XML's.

>Json is trivial to parse
>It's only barely harder than string.split
not nearly as trivial as CSV, which is _literally_ string.split or substring
>minimal set of control characters
not as minimal as CSV's: "," and ";"
+ have to escape any string with the " symbol in it
>well defined minimal set of data containers
data containers?
you mean fields?

>JSON
Easy to use universal structured data format.

>XML
For MARKUP, like if you need to send custom text document types.

>INI/YAML
For app configuration.

>roll your own binary format
If you absolutely need the performance.

JSON all the way babyyyy

.txt is for wannabe hackers in their mom's basement; use HDF5 instead, it's what scientists use.
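
For anyone curious, a minimal sketch with h5py (one common Python binding for HDF5; the file and dataset names here are made up):

import h5py
import numpy as np

# write a large array under a named group/dataset path
with h5py.File('results.h5', 'w') as f:
    f.create_dataset('run1/temperature', data=np.random.rand(1000, 1000))

# read it back; HDF5 supports partial reads, but [:] grabs everything
with h5py.File('results.h5', 'r') as f:
    temps = f['run1/temperature'][:]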

>You mean fields
No, I mean string, number, array, unordered map

>Literally string.split
>Except for newline
>except for strings containing the delimiter
>except for partial lines and empty elements
>except for anything having to do with utf8
You might as well decode base64 for all the "easy" CSV transmission will give you
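
To make that concrete, a quick Python sketch (csv is stdlib; the sample line is invented):

import csv, io

line = '1,"hello, world",3'
print(line.split(','))                      # ['1', '"hello', ' world"', '3'] -- naive split breaks on quoted commas
print(next(csv.reader(io.StringIO(line))))  # ['1', 'hello, world', '3'] -- a real CSV parser gets it right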

>Not using hl7, the superior health care format

What does best mean? It's certainly quite nice if the data is required to be human readable, but if you're looking for pure serialization/deserialization performance, then one of the libraries that uses some binary wire protocol that doesn't have to be human readable is probably going to be faster and produce a smaller output message (like Protocol Buffers). If you really want to get down to it you could probably write your own application-specific protocol from scratch and have absolutely minimal overhead.
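
As a rough sketch of the "write your own protocol" end of that spectrum, here's what a fixed-layout binary record could look like in Python; the field names and layout are invented for illustration:

import struct

# hypothetical record: u32 id, f64 price, u16 qty -> exactly 14 bytes, no key names on the wire
RECORD = struct.Struct('<IdH')

wire = RECORD.pack(42, 19.99, 3)
assert len(wire) == RECORD.size             # 14 bytes; the same record is roughly 40 bytes as JSON
item_id, price, qty = RECORD.unpack(wire)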

JSON is good enough for web RPC.

CSV, although it looks simple, can have more overhead, because you have to write your own parser and escape any embedded delimiters.

JSON parsing is done by the browser itself.

As for binary vs text, don't bother.
Instead, go for HTTP/2 and save yourself the complexity of data compression.

>he doesn't use the jsonapi specification
jsonapi.org/

>standard

now select some value by key.

it's piss easy to parse, I wrote my own library for that in like 50 lines of code (and could probably make it even shorter)
anyway, if y'all want something compact and easy to parse, s-expressions are clearly the best
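
For reference, here's a sketch of what such a ~50-line parser might look like in Python (recursive descent, no error handling, and \uXXXX escapes are skipped):

import re

NUM = re.compile(r'-?\d+(\.\d+)?([eE][+-]?\d+)?')
ESCAPES = {'"': '"', '\\': '\\', '/': '/', 'n': '\n', 't': '\t', 'r': '\r', 'b': '\b', 'f': '\f'}

def skip_ws(s, i):
    while i < len(s) and s[i] in ' \t\n\r':
        i += 1
    return i

def parse(s, i=0):
    i = skip_ws(s, i)
    c = s[i]
    if c == '{':  return parse_object(s, i)
    if c == '[':  return parse_array(s, i)
    if c == '"':  return parse_string(s, i)
    if s.startswith('true', i):  return True, i + 4
    if s.startswith('false', i): return False, i + 5
    if s.startswith('null', i):  return None, i + 4
    m = NUM.match(s, i)   # anything else must be a number
    text = m.group()
    return (float(text) if '.' in text or 'e' in text.lower() else int(text)), m.end()

def parse_string(s, i):
    out, i = [], i + 1                      # skip the opening quote
    while s[i] != '"':
        if s[i] == '\\':
            out.append(ESCAPES[s[i + 1]])   # \uXXXX left as an exercise
            i += 2
        else:
            out.append(s[i])
            i += 1
    return ''.join(out), i + 1

def parse_array(s, i):
    out, i = [], skip_ws(s, i + 1)
    while s[i] != ']':
        val, i = parse(s, i)
        out.append(val)
        i = skip_ws(s, i)
        if s[i] == ',':
            i = skip_ws(s, i + 1)
    return out, i + 1

def parse_object(s, i):
    out, i = {}, skip_ws(s, i + 1)
    while s[i] != '}':
        key, i = parse_string(s, i)
        i = skip_ws(s, i)                   # s[i] is now the ':'
        val, i = parse(s, i + 1)
        out[key] = val
        i = skip_ws(s, i)
        if s[i] == ',':
            i = skip_ws(s, i + 1)
    return out, i + 1

print(parse('{"a": [1, 2.5, true, null], "b": "x"}')[0])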

>strawman

Still doesn't change the fact that CSV is easier to parse than JSON.

do you know what a strawman is? the first guy said it's _literally_ string.split(). idk what you think "_literally_" means, but he addressed the argument exactly as it was presented

{
  "the most": ["annoying", "thing", "about", "json"],
  "is_how": "anyone",
  "in_their_right_mind": null,
  "could_think_that_this": {
    "is": [
      {"a": {"readable": false}},
      {"desirable": "data serialization format"},
      {"to_the_point": [
        {"where": "they would type it out manually and consider it convenient and clever and amazing"},
        {"kind_of": "like how proud of their genetically assigned role in society gammas, deltas, and epsilons were to be in brave new world"}
      ]}
    ]
  }
}

>parsing csv tree structures is easier than simple dict lookups
import json  # assuming data = json.loads(...) was called on the thread JSON
print('\n'.join(post['com'] for post in data['posts']))

Tell me how you're going to print all the comments in a thread as elegantly as that using CSV. And that's a simple task.

nobody types JSON out manually, you dumb shit. It's generated from objects

dumbass = you[0]

>he addressed the argument exactly as it was presented

No he didn't. He ignored "or substring".

Whitespace is insignificant in JSON; nothing in the standard requires a generator to emit any.

Just from my own use, it's quick and easy to parse with Ruby, and it's the format for setting policy in AWS. It has much broader use than Node, even though you're likely just trolling.

If you are not using json you are a meme

In my experience, it's actually pretty rare to actually need tree structured data. Most data people want can be represented easily as key=value pairs or csv.

If you need tree structure, JSON is only better because every scripting language supports it. If you need to do any parsing yourself, use s-expressions.

Everyone nowadays uses JSON, OP. People saying otherwise are either trolling or NEETs who don't even work.

>readability
you have to be trolling

>Whitespace is insignificant in JSON
That has nothing to do with my post. I'm not making the JSON, I'm accessing it.

JSON is fine for most things. Has the advantage of being decently human-readable and standard.

If you are sending a lot of data you might want to look into Transit.

>into transit.
I meant MsgPack

Is JSON a good idea for a protocol that works over UDP?

Protocol buffers is definitely the way to go

This

Doesn't that use a library provided by Google?

Could someone explain to me what the difference is between plain text in the body and JSON, if you change a JS object to a normal string with JSON.stringify before sending it anyway?
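
If the question is what actually differs on the wire: typically just the Content-Type header; the body bytes can be identical. A Python sketch with requests (the endpoint URL is hypothetical):

import json
import requests

payload = {'a': 1}
url = 'https://example.com/api'  # hypothetical endpoint

# both requests send the body '{"a": 1}'; only the declared Content-Type differs
requests.post(url, data=json.dumps(payload), headers={'Content-Type': 'text/plain'})
requests.post(url, json=payload)  # requests serializes it and sets application/json itself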

Do they actually call the function "stringify"? Damn, what a bunch of hilarious, silly, all-casual chucklemasters the json developers must be.

XML takes more space too, I would guess, because of the closing tags.

JSON is a presentation-layer protocol. It doesn't have anything to do with getting your data transferred from A to B. Your image makes no sense.

>lines separated by commas

This triggers my autism. What kind of language forces you to use commas so often? I only feel like I've made a complete statement after using a semicolon.

>protocol buffers
Just use json schema you neet

Hell, we literally use JSON strings as XML attribute values in our XML-based config files (.NET stuff), because it's often easier to read and deal with complex configuration that way than to implement an equivalent XML structure.

What's the best way to do server-client communication if you're coding in C?

I don't know man, why don't you googlify it?

the beauty of json is that you can have fairly complex nested data structures of arrays and hashes, convert it to json, send it over the wire, and then with one line of code turn it back into your data structure.
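
In Python, for example, that round trip really is one line each way:

import json

data = {'users': [{'name': 'anon', 'posts': [1, 2, 3]}]}
wire = json.dumps(data)   # serialize before sending over the wire
back = json.loads(wire)   # one line to rebuild the structure on the other side
assert back == data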

nice bait

>Why are there comments in a computer generated serialization format?
Because it can now be used as a configuration format.

>Is JSON the best way to structure data before sending it over a network?
yes

Well it does waste a lot

>It certainly looks less faggy than XML at least.
Until you need to store rich text content. Then, all of a sudden, you'll start sucking XML/HTML5's dick soon enough.

He has a point, though. It's not exactly a compact format.

>not writing your own parser that is more efficient than xml, json or csv.

Plebs. My datastream had an overhead of 3 bytes per nested object and parsed 2 to 5 megabytes per second depending on tree depth.
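
Plausibly something like a length-prefixed TLV layout. Here's a Python sketch with exactly 3 bytes of framing per node (1-byte tag + 2-byte length); this is my guess at such a scheme, not this anon's actual format:

import struct

# hypothetical framing: 1-byte tag + 2-byte big-endian payload length = 3 bytes per node
LEAF, NODE = 0, 1

def encode(tree):
    if isinstance(tree, bytes):                      # leaf payload
        return struct.pack('>BH', LEAF, len(tree)) + tree
    body = b''.join(encode(child) for child in tree) # inner node: concatenated children
    return struct.pack('>BH', NODE, len(body)) + body

def decode(buf, i=0):
    tag, length = struct.unpack_from('>BH', buf, i)
    i += 3
    if tag == LEAF:
        return buf[i:i + length], i + length
    end, children = i + length, []
    while i < end:
        child, i = decode(buf, i)
        children.append(child)
    return children, i

wire = encode([b'hello', [b'nested', b'world']])
print(decode(wire)[0])                               # [b'hello', [b'nested', b'world']]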

GraphQL