Directory    : /usr/local/lib/python3.8/urllib/__pycache__/
Current File : /usr/local/lib/python3.8/urllib/__pycache__/robotparser.cpython-38.pyc
U p©ßaÐ$ ã @ s\ d Z ddlZddlZddlZdgZe dd¡ZG dd„ dƒZG dd„ dƒZ G d d „ d ƒZ dS )a% robotparser.py Copyright (C) 2000 Bastian Kleineidam You can choose between two licenses when using this package: 1) GNU GPLv2 2) PSF license for Python 2.2 The robots.txt Exclusion Protocol is implemented as specified in http://www.robotstxt.org/norobots-rfc.txt é NÚRobotFileParserÚRequestRatezrequests secondsc @ sr e Zd ZdZddd„Zdd„ Zdd„ Zd d „ Zdd„ Zd d„ Z dd„ Z dd„ Zdd„ Zdd„ Z dd„ Zdd„ ZdS )r zs This class provides a set of methods to read, parse and answer questions about a single robots.txt file. Ú c C s2 g | _ g | _d | _d| _d| _| |¡ d| _d S )NFr )ÚentriesÚsitemapsÚ default_entryÚdisallow_allÚ allow_allÚset_urlÚlast_checked©ÚselfÚurl© r ú./usr/local/lib/python3.8/urllib/robotparser.pyÚ__init__ s zRobotFileParser.__init__c C s | j S )z·Returns the time the robots.txt file was last fetched. This is useful for long-running web spiders that need to check for new robots.txt files periodically. )r ©r r r r Úmtime% s zRobotFileParser.mtimec C s ddl }| ¡ | _dS )zYSets the time the robots.txt file was last fetched to the current time. r N)Útimer )r r r r r Úmodified. s zRobotFileParser.modifiedc C s&