I’m trying to build an anime scan downloader. I know I’m building the right URL, because when I print the variable ‘link’ it gives me a URL, and if I paste that URL into Chrome it shows me the image.
But when I try to download it with the Python program below, it doesn’t work.
Can anyone help me figure out why it doesn’t work?
import sys
import subprocess
import requests

def clean_screen():
    # Clear the terminal with the right command for the current platform.
    operating_system = sys.platform
    if operating_system == 'win32':
        subprocess.run('cls', shell=True)
    elif operating_system in ('linux', 'darwin'):
        # The original `elif operating_system == 'linux' or 'darwin':` was
        # always true: a non-empty string like 'darwin' is truthy on its own.
        subprocess.run('clear', shell=True)

def download(link, filename):
    # Take the URL and target name as parameters instead of globals, and
    # name each file after its page so pages stop overwriting each other
    # (the original wrote every page to the same manga + '.jpg' file).
    img_data = requests.get(link).content
    with open(filename + '.jpg', 'wb') as handler:
        handler.write(img_data)

clean_screen()
urlbase = "https://sushiscan.net/wp-content/uploads/"
manga = input("Manga (e.g. Berserk): ")
chapter = input("Chapter: ")
asktype = input("1 - Volume 2 - Chapter ")
n_pages = int(input("Number of pages: "))

if asktype == "1":
    manga_type = "Tome"
else:
    manga_type = "chapter"

link_list = []
for page in range(1, n_pages + 1):
    # zfill pads with leading zeros ("001", "010", "100"), which also fixes
    # the original if/elif chain that skipped page 10 (`elif page > 10:`).
    str_page = str(page).zfill(3)
    link = urlbase + manga + manga_type + chapter + "-" + str_page + ".jpg"
    link_list.append(link)
    print(link)
    download(link, manga + manga_type + chapter + "-" + str_page)
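One thing worth ruling out is that the request itself is being rejected: requests.get() will happily write an error page’s bytes into a .jpg, so a file appears but is unreadable. Below is a minimal sketch of a download() variant that fails loudly on a bad status code and sends a browser-like User-Agent header, on the (unconfirmed) assumption that sushiscan.net blocks the default python-requests User-Agent; the header string is just an example value:

import requests

def download(link, filename):
    # Pretend to be a regular browser; some image hosts return 403
    # for the default "python-requests/x.y" User-Agent.
    headers = {"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"}
    response = requests.get(link, headers=headers)
    # Raise an HTTPError instead of silently saving an error page to disk.
    response.raise_for_status()
    with open(filename + '.jpg', 'wb') as handler:
        handler.write(response.content)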
So this is my code, and I was expecting it to download all the images locally. I’ve already built the same type of downloader before and it worked, so I don’t understand why it doesn’t work in this case.