Why aren't we publishing the new nyaa.se in zeronet?

>Free
>Open Source
>Decentralized
>Free of shitty SJW govt regulations
zeronet.io/

Gaown den, publish it, I'll use it. As long as my shitty Australian internet doesn't have to download a massive database with torrents, torrent info, comments, etc., which will probably be >500MB in size.

>Page response time is not limited by your connection speed.

What did they mean by this?

Soon user soon

>Why aren't we
That always means "not me tho"

Has it ever occurred to you that OP might be too tech illiterate to install a toaster and is just asking a question? Why not just answer his question and stop acting like an annoying shit.

And I gave an answer...somewhat.

Why not be honest and ask
"Why aren't you publishing the new nyaa.se in zeronet?"

I assume you didn't look at the unique posters number? It's okay, counting is hard. I'm not him; I haven't a clue why one would or wouldn't, but I also can't stand people who mock someone for asking a question instead of answering it.

I'll give a decent answer.

A good reason is that it can't cater to normies, myself included. First-time visitors need to download all the data the site has, which could easily run upwards of 500MB, and for people with slow internet that hella sucks. On later visits you only need to grab the newest additions to the files (I believe, could be wrong), which makes returning much quicker. There's also the issue of having to install ZeroNet itself before you can use it; obviously it has to be an installed program for this kind of network to exist at all, but normies just want to hit a URL, download, and quit. A really major plus for ZeroNet is its semi-dynamic capabilities: ZeroFrame is pretty cool, and having accounts that work network-wide is quite appealing. It would make new releases from a release group easy to spot, and you'd know a release actually came from that group and not some random.
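The "grab only the newest additions" part can be sketched roughly like this. This is a hypothetical manifest diff, not ZeroNet's actual protocol; the manifest shape (path → content hash) is just modeled on the general idea behind its content.json:

```python
import hashlib

def sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def files_to_update(local_manifest, remote_manifest):
    """Return the paths whose content hash is new or changed upstream.

    Manifests map file path -> content hash.
    """
    return [path for path, digest in remote_manifest.items()
            if local_manifest.get(path) != digest]

# First visit: the local manifest is empty, so everything gets fetched
# (this is the painful ~500MB initial download).
local = {}
remote = {"index.html": sha256(b"v1"), "db/torrents.json": sha256(b"big")}
assert files_to_update(local, remote) == ["index.html", "db/torrents.json"]

# Later visit: only the file that actually changed gets fetched.
local = dict(remote)
remote["db/torrents.json"] = sha256(b"bigger")
assert files_to_update(local, remote) == ["db/torrents.json"]
```

So the first load hurts, but every load after that is proportional to what changed, not to the size of the site.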

I looked into ZeroNet initially but concluded it probably wouldn't catch on with many people, because it would succeed or fail on adoption, and for that we'd probably need the normies. Instead I looked into IPFS, which is also quite promising, but it isn't catered to actual websites people can contribute to; it's more for plain file sharing. It does have a PubSub feature for realtime communication, which is similar in nature to ZeroFrame but not quite the same. We had very lengthy discussions about using IPFS, one of which was very ideal but had its caveats.
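For the "realtime communication" bit, the pattern both IPFS's (experimental) PubSub and ZeroFrame push updates with is plain publish/subscribe: subscribers get notified of new releases without re-polling the whole database. A toy in-process sketch (all names here are made up, this is the concept, not either API):

```python
from collections import defaultdict

class PubSub:
    """Minimal topic bus: callbacks registered per topic, fired on publish."""
    def __init__(self):
        self.topics = defaultdict(list)

    def subscribe(self, topic, callback):
        self.topics[topic].append(callback)

    def publish(self, topic, message):
        for cb in self.topics[topic]:
            cb(message)

bus = PubSub()
seen = []
bus.subscribe("releases/anime", seen.append)
bus.publish("releases/anime", "Some Show 01 [720p]")
bus.publish("releases/other", "irrelevant")     # not delivered to our topic
assert seen == ["Some Show 01 [720p]"]
```

In a real P2P setting the bus spans the network rather than one process, but the subscribe/publish shape is the same.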

I think the technology isn't quite there yet. The one thing that could possibly work well is Ethereum, but that relies on "fuel" (gas fees) to run, which I guess is similar to renting a server anyway, so it might be worth looking into.

>p2p solution isn't viable because muh normies
How has that ever been an issue? Back in the days of 150 kB/s connections, when normies didn't even have computers, it never was. All this takes is the dedication of a few.

Thanks my dude, real choice info you gave there.

p2p = adoption. I doubt release groups will publish to it if nobody uses it. Sure, you'll have people who can be assed to download everything so they're covered, but if nobody commits to it, it will die.

Normies are just starting to use VPNs because they're waking up to privacy issues. With KAT going down recently, and the TPB drama from years ago, they'll start realising they need decentralisation too. With Nyaa going down now as well, this would be a great time to start pushing for it. But before going out into the blue to build something on this, there needs to be a set design laid out: how it'll deal with big databases, moderation (malicious uploads, bad releases, etc.), how release groups will publish, and how accessible it can be. I still think having to download a big database on first launch is always going to be an issue, and I don't see a great way around it yet.
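The "how release groups publish" and moderation questions are related: if groups publish under known keys, "bad release from a random" is filterable. A real design would use actual signatures (e.g. ed25519); in this sketch a hash fingerprint of a made-up key stands in for the whole scheme, and the group name is hypothetical:

```python
import hashlib, json

# Moderation as a pinned trust list: accept a release only if it's
# published under a key whose fingerprint we already trust.
TRUSTED_GROUPS = {
    "ExampleSubs": hashlib.sha256(b"examplesubs-public-key").hexdigest(),
}

def release_id(record: dict) -> str:
    """Stable content ID for dedup and moderation: hash of canonical JSON."""
    blob = json.dumps(record, sort_keys=True).encode()
    return hashlib.sha256(blob).hexdigest()

def accept(record: dict, claimed_key: bytes) -> bool:
    """True only when the claimed key matches the group's pinned fingerprint."""
    fingerprint = hashlib.sha256(claimed_key).hexdigest()
    return TRUSTED_GROUPS.get(record["group"]) == fingerprint

rec = {"group": "ExampleSubs", "title": "Some Show 01", "infohash": "ab" * 20}
assert accept(rec, b"examplesubs-public-key")     # legit group key
assert not accept(rec, b"random-anon-key")        # some random impersonating
```

It doesn't solve the big-database problem, but it gives clients something mechanical to moderate with instead of trusting every peer.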

I'm all for this, don't get me wrong. I want it to happen; I just don't see it catching on *just* yet, and the technologies behind it need to mature a bit more. ZeroNet is still very early, as are IPFS and the other options.

you are a good boy

why the fuck are you even on this website?

4U

redpill me on zeronet?

is it the same as freenet?

what is freenet anyway?

by peer2peer you mean like winmx and kazaa?

>upwards of 500MB and for people with slow internet that hella sucks
that's 1 or 2 anime episodes, they can't watch anime if they can't download 500MB, stop being an idiot

different user here
He does have a point, this kind of system won't work very well if it's required to download hugefuck databases before it's possible to use the actual site. Not only because fuck waiting for a website to load, but think about the bigger picture; what if you have about 20 such sites saved on your hard drive? What if it becomes 50? Why would you want to store entire fuckhuge websites locally? The way it works now, it's as if fucking everyone is forced to be a seeder, rather than just a peer. Private trackers would go nuts for a system like that.

Peers should be able to just download snips and pieces from a website, taking only what is required. You're not required to download all files mentioned in a torrent, why should you be forced to in zeronet?
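What that would look like: keep only the site's manifest locally and pull individual files on demand, exactly like deselecting files in a torrent. A minimal sketch, where `network_get` is a hypothetical stand-in for the actual P2P transfer:

```python
class LazySite:
    """Store the manifest, fetch file contents only when asked for."""
    def __init__(self, manifest, network_get):
        self.manifest = manifest          # path -> content hash
        self.cache = {}                   # only the files actually requested
        self.network_get = network_get    # callable: path -> bytes

    def get(self, path: str) -> bytes:
        if path not in self.manifest:
            raise KeyError(path)
        if path not in self.cache:        # fetch on first access only
            self.cache[path] = self.network_get(path)
        return self.cache[path]

# Simulated remote site: a tiny page plus a huge torrent database.
store = {"index.html": b"<html>", "db/torrents.json": b"x" * 10}
site = LazySite({p: "" for p in store}, store.__getitem__)
site.get("index.html")
assert set(site.cache) == {"index.html"}  # the big database was never pulled
```

The trade-off is availability: if everyone browses lazily, rarely requested files end up with few hosts, which is presumably why ZeroNet leans on full replication.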

>500MB is a lot
Are you retarded? That's less than a movie, or 2 series episodes. You're not hosting this on a phone ffs.

It's simple: people who have the capacity to host that many sites will host them. And 2GB isn't a lot, so literally anyone can host at least four sites. 2GB is an average mobile bandwidth cap, so it's nothing for desktops. Unlimited bandwidth is cheap as fuck nowadays, and so are HDDs.
Obviously it has its flaws, but everything does. It will serve just fine for a lot of websites.

>people who have the capacity for hosting that many sites will host them
wouldn't accessing a site mean you also host it?

What exactly are 'SJW govt regulations'? Do tell.

IPFS + i2p seems like the most realistic outcome. People have already been experimenting with ZeroNet and it hasn't really improved, GNUnet seems dead in the water, and nobody really uses Freenet. IPFS exists and is good enough to use now; they're actively improving it too, and it's seeing some adoption. I feel like browser integration could happen, and that would be big.

>Anime
Hi.
Go to hell.

2difficult4me

Why not create a DC++ hub where you can share files p2p

Good question, OP, but why don't we all just go back to using Usenet?

Hi.
Go back to Rabbit.

Yes. I doubt many sites would need that much storage anyway. And there probably wouldn't be as many sites; there's no need for 10 different alternatives performing the same functions (no need for multiple social media, torrent, or video sites like we have on the regular www).

>The year of our Lord twenty hundred and seventeen.
>Using torrents
Gnushare
