Not mass downloading Sup Forums images

>not mass downloading Sup Forums images

Post the script. I've always wanted to build an archive

>accidentally downloading the donut man

The question is, why would I want to do that?

I've done this before. It's a massive time sink.

>images
I download archives of the whole site. I wrote a custom scraper for it that uses the API to collect boards and thread IDs, then scrapes everything.

What kind of storage setup do you have?

>officer I swear I didn't know!

>accidentally download Certain Pictures
>visit from the party van

Do you have any semblance of a life?

Source?

the data is analysed and scrapped afterwards

Why would I want to save every image on here?

But I am.

OP, can you share the script so others can use it?

...

I'm not a pedo; I don't spend time hunting for random images.
I use thread-archiver to download specific threads I like, like this guy.

post the script please

lol so needy, jesus.

github.com/bibanon/BASC-Archiver

Not OP, but I've heard of this one.

What if you save see pee by accident?

n-not FBI here

>the data is analysed

In what way?

>not mass downloading Sup Forums images on your phone

If the image doesn't look like a little girl 12 years old or younger, it doesn't get downloaded.

Share the script, user-kun.

Hoarding is a mental disease.

Everyone and their mother has written an image scraper for Sup Forums, buddy.

>not mass downloading Sup Forums mass downloaders

>he didn't make his own Sup Forums image downloader
Pic related; it keeps downloading until the thread 404s or is archived.
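
For anyone wondering what that loop looks like: a minimal sketch using only the Python stdlib, polling the public read-only API (the a.4cdn.org/i.4cdn.org endpoints follow the API repo linked later in the thread; the board and thread number here are made up).

import json
import time
import urllib.error
import urllib.request

BOARD, THREAD = 'g', 12345678  # hypothetical board and thread number

def fetch_posts():
    # fetch the thread's post list from the read-only JSON API
    url = f'https://a.4cdn.org/{BOARD}/thread/{THREAD}.json'
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)['posts']

seen = set()
while True:
    try:
        posts = fetch_posts()
    except urllib.error.HTTPError as err:
        if err.code == 404:
            break  # thread is gone
        raise
    for post in posts:
        # posts with an attached file carry a 'tim' (server-side name) field
        if 'tim' in post and post['tim'] not in seen:
            name = f"{post['tim']}{post['ext']}"
            urllib.request.urlretrieve(f'https://i.4cdn.org/{BOARD}/{name}', name)
            seen.add(post['tim'])
    if posts[0].get('archived'):
        break  # archived threads won't get new images
    time.sleep(60)  # poll politely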

>Sup Forums
>anything worth download
Really moves the cogs if you know what I mean.

Looks pretty good, user. I like the minimalism.

Pic related; it keeps downloading until Sup Forums 404s or is archived -_-

pretty damn sexy tbqhfwy

>accidentally saving cheese pizza
yeah no...

>"accidentally"
I hate it when that happens.

>what is browser cache

This thread is fucking pathetic. One of the first programs I ever made was a Sup Forums image downloader, and I still occasionally use it.

Everyone here should code one up.

>implying

You can write a script to do it using basc-py4chan in five minutes.
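
Something like this, going by BASC-py4chan's Board/get_thread/files() interface (the thread number is made up):

import os
import urllib.request

import basc_py4chan  # pip install basc-py4chan

board = basc_py4chan.Board('g')
thread = board.get_thread(12345678)  # hypothetical thread number

os.makedirs('images', exist_ok=True)
for url in thread.files():  # full image URL for each post that has a file
    dest = os.path.join('images', url.rsplit('/', 1)[-1])
    urllib.request.urlretrieve(url, dest)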

for what purpose

>I've done this before. It's a massive time sink.
only if you're a fucking retard... or pajeet.

>subtly trying to brag about your shitty, 30-line script

>there are actually people in this thread who think using the Sup Forums api to download images is hard
Sup Forums - desktop threads and consumerism, everyone.

but I made my own json parser
but I made my own Sup Forums lib
but it works in just under 150 lines of code

congratulations, how does that have anything to do with what I said?

...

...

I did it in 15 lines of bash.
*shrug*

Why don't you show us the code?

It's against the rules.

Just go on GitHub and search for Sup Forums Downloader, you'll find a lot of projects about it.

I only wrote a downloader for images since I didn't want to click save on every image on /wg/.

Did it with beautifulsoup; is there a Sup Forums package for Python or what?

There is a read-only API for it. Go google it. There's probably a library for it somewhere, but honestly, if all you want is to download images, a simple API call will be enough.

>Did it with beautifulsoup; is there a Sup Forums package for Python or what?
Probably.

But you don't need to use bs or scrape the HTML for the images; you can request and parse the JSON file directly to get the information about the thread, including file names, original file names, MD5 hashes and file sizes.

Hell, you can download an entire board, or all of Sup Forums, this way, without any third-party libraries, just the json and urllib modules that are already included with Python.
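
For reference, here's the bare-bones version of that, stdlib only; the field names (tim, filename, ext, md5, fsize) follow the thread JSON described above, and the thread number is made up:

import json
import urllib.request

URL = 'https://a.4cdn.org/g/thread/12345678.json'  # hypothetical thread

with urllib.request.urlopen(URL) as resp:
    posts = json.load(resp)['posts']

for post in posts:
    if 'tim' in post:  # posts without a file have no 'tim' field
        print(post['tim'],       # server-side name (stem of the image URL)
              post['filename'],  # original file name
              post['ext'],       # extension, e.g. .jpg
              post['md5'],       # base64-encoded MD5 of the file
              post['fsize'])     # file size in bytes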

I didn't know that. Still fairly new to this shit. I will look into it once I have enough time. Archiving some threads on my laptop (or simply counting different generals and their activity) should be quite fun.

If you're super lazy, you can just use xidel to parse the JSON.

What? It's objectively not a waste of time; the script takes 5 minutes.

What does the 0 in your PS1 mean? Status code of the previous command?

Yeah, it's whatever code the previous command returned.

rstatus() {
    local code=$?  # grab it immediately; the [[ ]] test below would overwrite $?
    if [[ $code -eq 0 ]]; then
        echo 0
    else
        echo "${PR_RED}${code}"  # PR_RED: color escape assumed defined elsewhere
    fi
}

github.com/fellchase/Sup Forums-media-downloader

github.com/fellchase/Sup Forums-media-downloader
got it for you

An empty hard drive is a wasted hard drive.

Enjoy your IP ban, faggots.

Enjoy your viruses

Is it yours?
It's good, but consider using the Sup Forums API instead; it's easier to parse, and there's a lot of information about the thread/board in it that may be useful to you and to the user.

Thanks, it's my first project, the first time I ever made something useful :P
I started it in April of this year. I don't want to add more dependencies; actually, I was going to replace requests with urllib so that there'd be no dependencies, but I like bs4 and requests very much :D, that's why I kept them.
I made this project to learn Python and download threads from /gif/ :) please commit, bro

You can use urllib and json; they're already included with Python, so there's no need for third-party libs.

You can easily expand it to work on other boards, and even other imageboards as well.

I've made one with a GUI (PyQt4) and it's quite easy too, so you can do a lot with it, and it makes things very easy for normies and illiterate people (if that's what you want).

What's the point of this? Do people like hoarding images or what?

This desu

>mass downloading from boards where 90% of images are irrelevant
why?

Bro, you're on GitHub?

DownThemAll!

Bash + cURL + jq
Why even bother going into Python for something so simple?

For buggers too lazy to google:
github.com/Sup Forums/4chan-API - read-only Sup Forums API, enjoy.

Well yeah, gotta grow my Sup Forums.
It's a good idea to check for duplicates every once in a while.
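
A rough sketch of that duplicate check, hashing everything in a flat images/ directory (the directory name is just for illustration) and keeping the first copy seen:

import hashlib
import os

seen = {}
for name in sorted(os.listdir('images')):
    path = os.path.join('images', name)
    with open(path, 'rb') as f:
        digest = hashlib.md5(f.read()).hexdigest()
    if digest in seen:
        os.remove(path)  # byte-identical duplicate of seen[digest]
    else:
        seen[digest] = name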

Tfw I made one using Qt and C++. Still use it since it's ez to use and werks.

Also, if you have to manually put in the board and thread ID, you did a shit job and should redo it.
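
One common way around that is to paste the whole thread URL and pull the board and ID out of it; a small sketch, assuming the usual /<board>/thread/<id> URL shape:

import re

def parse_thread_url(url):
    # extract (board, thread_id) from a pasted thread URL
    m = re.search(r'/([a-z0-9]+)/thread/(\d+)', url)
    if not m:
        raise ValueError('not a thread URL: %r' % url)
    return m.group(1), int(m.group(2))

print(parse_thread_url('https://boards.4chan.org/g/thread/12345678'))
# -> ('g', 12345678)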

This.
Babby's first script.

>Auto-Fetch all Images from Sup Forums to get hands free

kek

cp harvesting on Sup Forums

>I downloaded a bunch of images using a super easy to use public API, aren't I special?

It can be done in more or less any competent language in about 40-50 lines tops; it might take you maybe 15 minutes to have it all done and dusted.

>If the image DOESN'T look like a little girl 12 years old or younger, it DOESN'T get downloaded.

Are you that guy of ours from int, ex-yu?

>ex-yu
Careful, he might be Deki.

>inb4 albozerg

This. I've written shit like this for a whole bunch of websites.

>but I made my own json parser
Why would you reinvent the wheel?
Also: import json.

Deki > Fredi