How do you transfer big files (more than a few GB) between devices on the same wifi network?

httpd. Maybe SMB/AFS/NFS/FTP. With sftp you don't even have to install anything on most Unices.
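
For reference, a minimal sftp session looks roughly like this (host, user, and paths are placeholders):

sftp user@192.168.1.10                        # placeholder host; opens an interactive session
sftp> put /path/to/bigfile /remote/path/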

You can just share a drive with public permissions, then run: \\IP\Drive

smb://192.168.x.x/directory

rsync

Rsync with gzip compression over SSH transport
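
A sketch of that, assuming ssh access to the other box (host and paths are placeholders):

rsync -avz --progress /path/to/bigfile user@192.168.1.10:/destination/   # -z enables compression, transport is ssh by default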

scp with no bandwidth limiting or compression of the files
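
That is, a plain scp with neither -l (bandwidth limit) nor -C (compression), roughly:

scp /path/to/bigfile user@192.168.1.10:/destination/   # placeholder host and paths; no -l, no -C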

>put files on FTP server over gigabit ethernet
>cry downloading it over wifi at a few MB/s max

with external drives

What switches do you use to disable bandwidth limiting?

To add to the OP question, how do you do it on Windows with automatic file integrity verification?

plot twist: one of the devices is an android phone

> (OP)
>with external drives
god level, no worrying about your wifi connection failing after hours of copying

Windows home network

Format a drive as NTFS and use it as a Samba share.
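
If the drive needs formatting first, on Linux that is roughly (the device name is a placeholder, triple-check it before running):

sudo mkfs.ntfs -f -L bigshare /dev/sdX1   # -f = quick format, -L = volume label; /dev/sdX1 is a placeholder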

python -m SimpleHTTPServer
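
That is the Python 2 module name; on Python 3 the equivalent is:

python3 -m http.server 8000   # serves the current directory over HTTP on port 8000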

Ftp

My 5 GHz WiFi does 55 Mb/s, it's wicked fast.

Same network is not so bad - but I have yet to find a simple way to transfer large files to tech-illiterate friends

WebRTC service or something? Fuck, man, this should be simple

Bittorrent

I sometimes reach ~72 Mbit/s

scp

Tar over netcat.
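
Roughly like this, assuming traditional netcat on both ends (IP and port are placeholders; BSD netcat drops the -p):

nc -l -p 9999 | tar xvf -                           # run on the receiving machine first
tar cvf - /path/to/files | nc 192.168.1.10 9999     # then on the sending machine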

Instant.io
file.pizza
et al

Make a torrent.
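
For example with transmission-cli installed (tracker URL, paths, and output name are placeholders; -p sets the private flag):

transmission-create -p -t http://tracker.example/announce -o bigfile.torrent /path/to/bigfile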

why not email them a torrent and have them manually add your IP?

I assume by your addition of "IP", you meant to imply creating a *private* torrent.

Just set up a samba share on your computer, and access it using a file manager from your android phone.
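
A minimal share definition in /etc/samba/smb.conf could look like this (share name, path, and user are assumptions; restart smbd afterwards):

[bigfiles]
   path = /mnt/bigdrive
   read only = no
   valid users = youruser

The phone's file manager can then open it as smb://<your-ip>/bigfiles.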

Unless you are transferring something highly compressible like raw text, compression will slow down your transfer. The size difference will be small and the CPUs on both sides become potential bottlenecks.

Most modern file types (pictures, videos, audio, documents) already incorporate compression, so they cannot be compressed further to any significant degree.
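
A rough way to check whether gzip can even keep up with the wire on your box (gigabit is about 117 MB/s; the file path is a placeholder):

dd if=/path/to/bigfile bs=1M count=1024 | gzip -c > /dev/null   # dd prints its throughput at the end; if it's below your link speed, skip compression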

Probably rsync still

There are p2p file transfer websites. I once tried to send a file to my normie friend using one of those, but it didn't work out because his connection kept dropping, since he was on wi-fi (lmao). If the file is small enough, just put it on Google Drive.

Yeah, I've never tried sharing a private torrent, but seeing how you can manually add peers, that was my line of thinking.

You can resume transfers with rsync. But yeah, the fastest data transfer I ever did was wheeling a 100 TB storage unit across campus to a different datacenter.
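
For the resuming part, something along these lines (host and paths are placeholders):

rsync -av --partial --progress /path/to/bigfile user@192.168.1.10:/destination/   # re-run the same command after a drop and it reuses the partial file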

Torrents are only great if you have multiple seeds or multiple destinations.

From the reading I have done on it, the private flag has literally no enforcement mechanism in practice. It's just a convention. In theory, any asshole peer/seed could take the source code of, say, qBittorrent, ignore the private flag, and continue operating fully open on the network, exposing the hash and its associated peers to the DHT. So I think the private flag is the wrong way to go, myself. Instead, if you really want security on the data, encrypt the file and share the PSK with the others over a traditional secured medium.
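
For example with gpg, sharing the passphrase out of band (filenames are placeholders):

gpg --symmetric --cipher-algo AES256 bigfile   # writes bigfile.gpg, prompts for a passphrase
gpg --output bigfile --decrypt bigfile.gpg     # on the receiving end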

I would disagree with you on that point. Torrents are great because the transfer is split into checksummed blocks, meaning any given block (say 512 KiB) can fail in transit and your client can react accordingly by requesting that block again from the swarm.

>cat file1, pipe to base64 and output to text file
>print the text file
>scan reams of paper onto second device, making sure to maintain correct page order
>use image recognition on resulting pdf file to generate raw text strings, write these strings to another text file
>cat that text file to base64 --decode, write resulting data to file
>profit
You can also store the paper for cold backups.

Gr8 b8, m8.

I prefer encoding the data by punching into gold-plated reels, myself.

Is this pasta?