python.su forum
How can I check several URLs at once with this script?
from urllib.request import Request, urlopen
from urllib.error import URLError, HTTPError

req = Request("http://gu.mon.mos.ru/zabbix.php")
try:
    response = urlopen(req)
except HTTPError as e:
    print('The server could not fulfill the request.')
    print('Error code: ', e.code)
except URLError as e:
    print('We failed to reach a server.')
    print('Reason: ', e.reason)
else:
    print('Website is working fine')
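One straightforward way to check several URLs at once is to move that same try/except into a function and loop over a list. A minimal sketch, assuming a hypothetical URLS list (swap in the addresses you actually want to monitor):

from urllib.request import Request, urlopen
from urllib.error import URLError, HTTPError

# Hypothetical list of URLs to check; replace with your own.
URLS = [
    "http://gu.mon.mos.ru/zabbix.php",
    "http://www.example.com/",
]

def check(url):
    # Report whether a single URL responds without an error.
    try:
        urlopen(Request(url), timeout=10)
    except HTTPError as e:
        print(url, '- the server could not fulfill the request, code:', e.code)
    except URLError as e:
        print(url, '- we failed to reach the server, reason:', e.reason)
    else:
        print(url, '- website is working fine')

for url in URLS:
    check(url)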
Ah, here we go, found something:
import urllib.request

URLS = ['http://www.foxnews.com/',
        'http://www.cnn.com/',
        'http://europe.wsj.com/',
        'http://www.bbc.co.uk/',
        'http://some-made-up-domain.com/']

# Retrieve a single page and report the URL and contents
def load_url(url, timeout):
    with urllib.request.urlopen(url, timeout=timeout) as conn:
        return conn.read()
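That snippet looks like the first half of the ThreadPoolExecutor example from the concurrent.futures documentation; on its own it only defines load_url and never calls it. A sketch of the missing part, which submits every URL to a thread pool and reports each result as it completes (the worker count and timeout below are just example values):

import concurrent.futures
import urllib.request

URLS = ['http://www.foxnews.com/',
        'http://www.cnn.com/',
        'http://europe.wsj.com/',
        'http://www.bbc.co.uk/',
        'http://some-made-up-domain.com/']

def load_url(url, timeout):
    # Retrieve a single page and return its contents.
    with urllib.request.urlopen(url, timeout=timeout) as conn:
        return conn.read()

with concurrent.futures.ThreadPoolExecutor(max_workers=5) as executor:
    # Start all downloads and map each future back to its URL.
    future_to_url = {executor.submit(load_url, url, 60): url for url in URLS}
    for future in concurrent.futures.as_completed(future_to_url):
        url = future_to_url[future]
        try:
            data = future.result()
        except Exception as exc:
            print('%r generated an exception: %s' % (url, exc))
        else:
            print('%r page is %d bytes' % (url, len(data)))

This checks all the URLs "at once" in the sense that the slow network waits overlap, instead of running one after another.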
The socket module:
from socket import gethostbyname, gaierror
from urllib.parse import urlparse

URLS = ['http://www.foxnews.com/',
        'http://www.cnn.com/',
        'http://europe.wsj.com/',
        'http://www.bbc.co.uk/',
        'http://some-made-up-domain.com/']

for url in URLS:
    # gethostbyname() expects a bare hostname, not a full URL
    host = urlparse(url).hostname
    try:
        gethostbyname(host)
    except gaierror:
        print("Connection error for", url, "- go and check your urls")
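Note that gethostbyname only proves the name resolves in DNS; it says nothing about whether a server is actually answering. If that distinction matters, a small variation using socket.create_connection tests whether a TCP connection can be opened at all (port 80 and the 5-second timeout are just example values):

from socket import create_connection
from urllib.parse import urlparse

URLS = ['http://www.foxnews.com/', 'http://some-made-up-domain.com/']

for url in URLS:
    host = urlparse(url).hostname
    try:
        # Try to open a TCP connection on port 80 with a 5-second timeout.
        with create_connection((host, 80), timeout=5):
            print(url, '- host is reachable')
    except OSError as e:
        print(url, '- connection error:', e)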
# Life loop
while alive:
    if (fun > boredom) and money:
        pass_day(fun, boredom, money)
        continue
    else:
        break