I don't have a data cap at home either. Compression does help make my upload speed less awful though.
John Williams
.zip - best for compatibility generally
.7z - best for decent compression ratio on Windows
.rar - best for (scene) warez (especially on Windows)
tar.gz - best for compatibility on linux
.gz - best for speed on linux
.Z - best for ancient packages on Unix
.xz - best for compression ratio on linux (xz -9)
.bzip2 - totally useless, never use this
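rough sketches of how you'd create a few of those, from memory (paths and names are placeholders, check the man pages):
$ tar -czvf archive.tar.gz somedir/
$ 7z a -mx=9 archive.7z somedir/
$ xz -9 somefile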
Nolan Morgan
thanks for this =)
Christopher Robinson
Bzip2 would be good if it weren't so slow. The only way to make it useful is to make it multithreaded, which apparently changes the output somewhat so that it's not the same as the reference implementation.
I was messing with 7zip's implementation of Bzip2 and found that it compresses images better than LZMA for some reason, but other than that LZMA beats it at everything.
Elijah Torres
zpaq for storing shit, zip/tar/whateverthefuckisavailable for packing multiple files to send them wherever i need
Nathaniel Rogers
>tar.gz
tar is not compression
>.rar - best for (scene) warez (especially on Windows)
The free unrar implementation works well, as does the non-free one. rar is used in the scene for its multipart support
Dominic Gray
>The only way to make it useful is to make it multithreaded
tar -I lbzip2 -cvf archive.tar.bz2 somedir/
Connor Stewart
Or you can use --symmetric instead of --encrypt to make use of a passphrase if the recipient does not have access to the public key of user Faggot. Correct?
Grayson Allen
The recipient would need access to the private key of user Faggot, but yes, --symmetric makes it use a passphrase instead.
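Rough sketch for anyone following along (key id and filenames made up):
$ gpg --symmetric --cipher-algo AES256 secrets.tar
$ gpg --encrypt --recipient 0xDEADBEEF secrets.tar
The first prompts for a passphrase; the second needs the recipient's public key in your keyring.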
Jacob Long
Thanks, and goodnight.
Luke Rodriguez
RAR5
Cooper Hill
Got 100mbit both up and down here. But I do see your point.
Adrian Robinson
Bump
Luis Moore
there are some cases where you pay for bandwidth, and compression is interesting there.
Levi Edwards
tar when i want to bundle
tar.gz if i also want to compress
zip if im using gui
Jeremiah Perry
osx =(
Aiden Peterson
7z LZMA2
Jordan Diaz
>bandwidth
>implying you backup in the clouds®
Jaxson Reyes
>no dar
>no zpaq
into the trash it goes
Wyatt Murphy
>encryption: my own program which uses a secure hash function in CTR mode and some group theory for mixing.
Wow, someone fell for my meme, nice. You still need to use a MAC though.
Samuel Thompson
>.zip - best for compatibility generally
This is .gz
>.rar - best for (scene) warez (especially on Windows)
Bad for anything. The free unrar is total crap.
Chase Sanchez
This^ Top kek
Colton Perry
>Wow, someone fell for my meme, nice.
I don't know what you are talking about, I would never take any advice from Sup Forums. I chose SHA-2 because it can be implemented far more easily than AES, therefore it's less error-prone but is still secure enough according to research papers.
Levi Ward
>I chose SHA-2 because it can be implemented far more easily than AES
Chacha20 is easier to implement than both, same for BLAKE and Keccak (though more complex than chacha20).
Xavier Perry
My main goal wasn't to choose the simplest algorithm but one that has gone through thorough cryptanalysis and is still considered strong. Sponge-based algorithms are relatively new and aren't well tested, therefore I will stick with SHA-2 while I can.
Luis Jackson
>Sponge-based algorithms
The sponge construction by itself is provably secure. Also, BLAKE and Chacha20 do not use the sponge construction.
>but one that has gone through thorough cryptanalysis and is still considered strong
All of the ones that I mentioned fit this.
Austin Brown
Why hasn't there been any advancement in compression technology for the past 20 years?
Nicholas Hill
It has
Camden Brooks
They was.
Robert Lewis
Idk, whatever comes by default on Ubuntu based Linux distros.
Tyler Johnson
Because you're a fucking retard. Retard.
Cooper Richardson
Honestly, I use 7zip, because that's the one I'm most familiar with, and it supports encryption.
Luke King
PNG is not lossy, and using lossless video compression is retarded. You only have a lossless video source if you own a high-end camera or if you're recording your screen to begin with. I guess you are right about lossy audio compression being retarded, but that's still justified if you put your music on a device with limited space, such as a smartphone.
Grayson Stewart
pbzip2 is the fastest out of any of them and works great.
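Something like this should work in a pipe (untested sketch; -p sets the thread count, paths are placeholders):
$ tar -c somedir/ | pbzip2 -c -p4 > somedir.tar.bz2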
John Sanchez
There has been some advancement, there are some new image formats such as webp or FLIF that are better at losslessly compressing images. There has also been advancement in the world of lossy compression with things like h.265 or webp (it has a lossy and a lossless variant). There will never be any huge advancement though, because you are fundamentally fucked in the ass by the pigeonhole principle: there are 2^n possible n-bit files but fewer than 2^n shorter ones, so no lossless compressor can shrink every input.
What does Sup Forums use to optimize their PNGs? zopfli here.
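for reference, basic usage is just (filenames are placeholders; -m is optional and only adds more iterations):
$ zopflipng -m input.png output.png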
Carter Williams
>zopfli
Post a png you optimized with zopfli. I'll losslessly reduce it even more.
Thomas Robinson
...
Colton Hill
If you're compressing your files using -Jcvf for arguments, your file extension should be .tar.xz
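i.e. something like (somedir/ being a placeholder for whatever you're packing):
$ tar -cJvf archive.tar.xz somedir/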
Jayden Young
here you go
$ compare -metric MAE 1504646467153.png out.png /dev/null ; echo
0 (0)
I've additionally removed all the pixels with full transparency set, so it's not strictly lossless since you cannot "undelete" pixels hidden beneath 100% transparency. You can view the difference if you run
$ convert [input] -alpha off [output]
on both your image and mine. In mine, pixels with 100% transparency had their RGB reset to black. I could have shrunk it anyway even without this trick. The toolset is a kde-dev script (optimizegraphics) plus pngwolf-zolfi at the end.
Noah Howard
>pngwolf-zolfi
*pngwolf-zopfli
Jaxson Watson
btw, removing everything beneath alpha 0 is the default mode for FLIF. Example: this is an (unoptimized) PNG with hidden info...
Christian Adams
..this is the (unoptimized) output of FLIF after having encoded the previous image to FLIF. It's visually identical, but...
Xavier Young
..this is the first image with Alpha Channel removed...
Jose Ortiz
...and this is the second image (FLIF output) without alpha.
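For anyone wanting to reproduce this, encode/decode is roughly (from memory, check flif --help; filenames are placeholders):
$ flif -e input.png output.flif
$ flif -d input.flif output.png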
Bentley Hughes
incredible, some minutes have passed and yet not a single "it's not a pelican, it's a seagull" comment
Dominic Brooks
What is the best way to compress a bunch of linux isos?
Jackson Perry
This shit terrifies me because I stress over all the PNGs I've compressed that might have had something cool in them.
Camden Barnes
I think it would be j for bz2 but I'm too lazy to check the man page. Too bad there's such an atrocious number of dependencies for libsdl2-dev, which is required to build the lib for anything that supports viewing flifs.
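for the record, it is j for bzip2; e.g. (path is a placeholder):
$ tar -cjvf archive.tar.bz2 somedir/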
Nathaniel Brown
Sorry, I meant to build viewflif. (libsdl2-dev is a dependency)
Jacob Walker
Does anyone here remember the 8chan board dedicated to archiving music, movies, and almost every other type of media? I stumbled on it months ago, but I don't remember the name. Basically, these people were thinking that most of the content currently accessible online would eventually be taken down because of the upcoming enforcement of copyright laws. The connection between this thread and that board is that they were also discussing the best ways to compress files, depending on their formats.
Jaxson Fisher
/eternalarchive/
Dominic Brooks
Thanks, that's the one. Here's the thread I was talking about: /eternalarchive/res/263.html
Justin Lopez
>not using .rar
I bet you didn't even purchase a WinRAR license. Fucking pleb
Joshua Clark
redpill me on gz vs bz2 vs xz vs lz4 vs lzo
Jaxson Gomez
Timely bump for an interesting thread
Joshua Ortiz
>For encryption i use gpg-zip
Why not block device encryption?
Brandon Collins
For text, usually bz2 but sometimes xz. For some reason bzip2 beats xz in my experience, but ONLY on text files. I use gpg for encryption, it has built-in support for several compression methods as well. If you're autistic you can use lrzip, which makes for smaller files but is not supported by anything other than itself. It also has zpaq support for supreme autism. LZO is actually really good, it can compress faster than copying a file but has the worst compression ratio. I use it for when I need to move a large uncompressed file across the internet (to myself). I don't use much other than those. zip for interacting with normalfags. rar is complete trash, never use it. For images, I use the following:
>mozjpeg for jpeg
>gifsicle for gif
>optipng for png
Pic related is a test I ran on a large, mostly text (I think?) tarball.
>gz
Good compatibility for Linux, most web browsers can open html compressed with gzip
>bz2
Strange middle ground, I only found use for it in text files
>xz
Good compression but takes longer than others
>lz4
Never used it, think I have heard of it
>lzo
Really fast and good for when you're time-limited (either by compression time or upload/download time). I think openvpn uses it.
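the LZO-over-the-wire trick looks roughly like this (untested sketch; host and paths are placeholders):
$ tar -c bigdir/ | lzop -c | ssh user@host 'lzop -dc | tar -x'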
Colton Campbell
Does Sup Forums have any ideas on how to compress ebooks? Optimize PDFs and epubs?
Camden Baker
>compress ebooks
>Optimize PDFs
Define "optimizing" please. qpdf is a tool capable of linearizing and it can perform some lossless optimizations. ghostscript will entirely rebuild the pdf. This isn't a 100% lossy procedure if the PDF contains jpegs. Plus, there are a few twists about image compression in gs. exiftool alone may remove some metadata, but you'll have to feed the output to qpdf in order to suppress recovery of the metadata you stripped. For example, it's possible to linearize and slim down Brian Abraham's pdf in the chinkshite general from 14M to 3.1M with a quick gs+qpdf, as sketched below. Embedded data in embedded jpegs may survive if the pdf isn't entirely reprocessed.
MAT is a tool that (if you compile it from source) can _still_ remove cruft and "anonymize" pdfs; there's a chance it misses something in a defined scenario (which led from a pre-emptive removal of the pdf "anonymization" feature in the MAT version available in the debian repo) and the process isn't lossless. It's always better to start from the original .ps and from the original images, if any.
For other ebooks, the best route is always to convert to ps and then back into your final format. In the .ps you'll do all the necessary cleaning (most of it would be done by the interpreter/converter itself). Or read MAT's paper and follow a similar approach.
For endured compatibility and for archival purposes, PDF/A is suggested. A free as in freedom validator is veraPDF.
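A minimal gs+qpdf pass would look something like this (settings are just an example; /ebook downsamples images, so it's not lossless):
$ gs -sDEVICE=pdfwrite -dPDFSETTINGS=/ebook -o slim.pdf input.pdf
$ qpdf --linearize slim.pdf final.pdf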
Leo Taylor
The reason I use 7z for anything that is not tiny is that it compresses a header containing a list of contents, so I don't have to unpack the whole archive just to know what is inside it. Are there any other tools that can do that? I would also like something that compresses but also adds an option to have some kind of redundancy, so if a couple of bits get flipped I can still recover my data.
Benjamin Reed
>mozjpeg
beware that in some older versions of mozjpeg, jpegtran's process wasn't entirely lossless. At least, not in -fastcrush runs. They've fixed it now, but it would be nice to perceptually compare (i.e. fetching graphicsmagick/imagemagick compare's output) mozjpeg's results with the source before overwriting the source with them. If you're compressing pngs to jpegs, guetzli beats smallfry (jpeg-recompress, the algo allegedly used in jpegmini).
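a rough example of a jpegtran pass plus a compare check afterwards (filenames made up):
$ jpegtran -copy none -optimize input.jpg > output.jpg
$ gm compare -metric mse input.jpg output.jpg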
Camden Powell
>Are there any other tools that can do that?
Is there any other tool that does not do that? Tar does it, unzip does it, zpaq does it.
>I would also like something that compresses but also adds an option to have some kind of redundancy so if a couple of bits get flipped I can still recover my data.
par2cmdline
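e.g., with 10% redundancy (filename is a placeholder):
$ par2 create -r10 archive.tar.xz
$ par2 verify archive.tar.xz.par2
$ par2 repair archive.tar.xz.par2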
Adrian Howard
>This isn't a 100% lossy
*This isn't a 100% lossless
>which led from
*which led to
Joshua White
>Tar does it, unzip does it, zpaq does it
I mean keep a header containing the contents separately. Fair enough, tar does that, but if you run it through gz for example it becomes useless, as you have to decompress everything in order to get to it. Zip does it, but I see no advantage over 7z. The 7z format is far more modern and supports many more compression algorithms.
>zpaq, par2cmdline
Never heard of them, will look them up. Thanks user.
Zachary Walker
zpaq is also pretty resistant in case of corruption (i.e. some bit flips won't ruin the entire archive). It's also de-duplicating, incremental and extremely efficient, at the cost of being slow in the compression phase.
One of the problems with compression for archival purposes is that anything that ain't plain zip (or zpaq) will suffer a lot from flipped bits unless you have some parity laying around. rar notoriously can add a "recovery header", but rar isn't free as in freedom and its max compression is lower than 7z or xz + parity.
Note that parity can be added to single archives, but better yet to collections of archives. i.e. you can create backup DVDs/BDs with dvdisaster (adds reed-solomon correction codes at the fs level) and parity (at the file level). You can create some RAID-alike scenarios with par2cmdline alone. par3 is a proposed upgrade on par2 but it's not ready yet, and par2 is rock solid and ancient enough to be considered well-tested.
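basic zpaq usage is something like this (from memory; -method 5 is the slow/strong end, paths are placeholders):
$ zpaq add backups.zpaq somedir/ -method 5
$ zpaq extract backups.zpaq
re-running the add appends an incremental, de-duplicated version instead of rewriting the whole archive.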
Ethan Edwards
>plain zip
*plain tar
Joshua King
I read a bit about zpaq and it looks REALLY good. Far more features than I need, but it looks much better than 7z in the majority of cases. Still trying to picture in my mind how I can build a robust backup/archival solution using a combination of snapraid, zpaq, par2cmdline, etc.
I would love to be able to 'merge' various type of media (hd, dvds, cloud, etc) in a single volume and assign the files inside it different levels of "importance" which controls how strongly they should be protected against loss. Also be able to manually assign each file a score on how readily accessible it should be and so on. Would be real neat to have something like that working, but it would take far too much effort. Probably will settle for something simpler that I can do using existing tools.
Wyatt Lee
This, par2 is good shit. If you're backing up to optical, just fill the remaining space with parity data. That way, if you scratch the fuck out of the disc or get rot, you're still good to go.
Nathaniel Richardson
Bad idea
>no kdf
>cbc
>no mac
>aes
Jordan Smith
From fast with bad compression to slow with good compression (this is for compression speed only):
lz4
gz
bz2
xz
Matthew Thompson
>witch type of file
>witch
Wyatt Mitchell
ECT. It's basically zopfli, but better and way faster. It even works on jpg files (uses mozjpeg's jpegtran I think).
Jack Williams
Fuck off fatso
Leo Rogers
Go drink some bleach, shit stain.
Ethan Moore
I'd rather use dvdisaster to protect the DVD at the filesystem level and then create a separate DVD with all the parity, e.g. 4 DVDs + 1 DVD containing 25% parity of the others - now you can lose any DVD and still recover everything (RAID4-alike).
Another option (more costly, space-wise) would be to distribute 4 DVDs' worth of data over 5 DVDs + partial parity of the whole array (RAID5-alike); you can still lose up to one DVD but you'll need to add additional parity, so rather than parity=25% you'll need sum(x=1)->∞ 100*(25/100)^x = 33% (and given that you won't increase the size of the dvds, you'll end up with less data saved); adding parity for the whole batch on each medium is less convenient.
I'd keep dvdisaster in the background because of pic related. BlockHashLoc ( github.com/MarcoPon/BlockHashLoc ) may or may not serve a similar purpose
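if anyone wants to try dvdisaster from the CLI, augmenting an image is roughly (untested, from memory):
$ dvdisaster -i backup.iso -mRS02 -c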
it's a collage of various different tools and it's worse than pngwolf-zopfli iirc.
William Jones
>so rather than parity=25% you'll need sum(x=1)->∞ 100*(25/100)^x = 33% (and given that you won't increase the size of the dvds, you'll end up with less data saved);
well kek, I went retardo while considering a different scenario (parity spread on each medium without spreading data over the fifth DVD as well); loss of parity won't impair recovery if you're going to lose 1/5 of the actual data rather than 1/4 (and 25% is sum(x=1)->∞ 100*(20/100)^x = 25%)
time to sleep
Gabriel Rodriguez
>Using macOS
>not using keka
You're doing it wrong
Julian Barnes
don't you thread on me
Charles Jenkins
Just found out yesterday that tar -xvf will overwrite files with the same path/filename without confirming. RIP
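GNU tar has -k / --keep-old-files if you want it to refuse to overwrite instead, e.g.:
$ tar -xkvf archive.tar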
rar is objectively best compression algorithm, though linux fags will never use it cause "muh freedom"
Brody Mitchell
You're making me so angry. STOP!
Ayden Kelly
>SHA-2 because it can be implemented
Are you using a library? If not, why not?
(not that it matters, but your decryption program may be prone to side-channel attacks if it's remotely accessible - crypto is hard yo)
Josiah Martin
friendly reminder that your UID and GID names get stored unless you specify --numeric-owner
>tfw your tar crafted as pedo:loli gets investigated by a fellow sysadmin
also, just use -cavf and forget all the compression-related syntax
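both together would be something like (path is a placeholder; -a picks the compressor from the archive suffix):
$ tar --numeric-owner -cavf backup.tar.xz somedir/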
Easton Reed
Gotta use zpaq. Everything else is for normies.
Caleb Foster
>objectively best
Horse shit, how can you say this?
It sucks.
Dominic Morales
If you are using zpaq, you probably want to use lrzip; one of the options it has is to use the zpaq backend with the compression-enhancing prefilter from rzip.
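e.g. (from memory; -z selects the zpaq backend, filename is a placeholder):
$ lrzip -z huge.tar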