
Notes: scrapy extensions

Posted: 2018-11-11 13:11:31


 

1.      extensions

1.1.    Overview

The extensions framework provides a mechanism for inserting your own custom functionality into Scrapy.

Extensions are just regular classes that are instantiated at Scrapy startup, when extensions are initialized.

To register an extension with Scrapy, add it to the EXTENSIONS setting. Each entry maps the full import path of an extension class to an order value:

EXTENSIONS = {
    'scrapy.extensions.corestats.CoreStats': 500,
    'scrapy.extensions.telnet.TelnetConsole': 500,
}

The value sets the loading order of the extension classes; since extensions usually do not depend on each other, it rarely matters.

To disable an extension (including one enabled by default via EXTENSIONS_BASE), set its value to None in EXTENSIONS:

EXTENSIONS = {
    'scrapy.extensions.corestats.CoreStats': None,
}

 

1.2.    Writing a custom extension

First, it helps to know where Scrapy loads these custom extension classes.

The chain starts in crawler.py with self.extensions = ExtensionManager.from_crawler(self).

Following it to the end, we arrive at:

    @classmethod
    def from_settings(cls, settings, crawler=None):
        mwlist = cls._get_mwlist_from_settings(settings)
        middlewares = []
        enabled = []
        for clspath in mwlist:
            try:
                mwcls = load_object(clspath)
                if crawler and hasattr(mwcls, 'from_crawler'):
                    mw = mwcls.from_crawler(crawler)
                elif hasattr(mwcls, 'from_settings'):
                    mw = mwcls.from_settings(settings)
                else:
                    mw = mwcls()
                middlewares.append(mw)
                enabled.append(clspath)
            except NotConfigured:
                # an extension that raises NotConfigured is skipped
                # (error reporting in the real method is omitted here)
                pass

 

The key line is mw = mwcls.from_crawler(crawler). The official documentation describes it as follows:

Each extension is a Python class. The main entry point for a Scrapy extension (this also includes middlewares and pipelines) is the from_crawler class method which receives a Crawler instance. Through the Crawler object you can access settings, signals, stats, and also control the crawling behaviour.

Typically, extensions connect to signals and perform tasks triggered by them.

Finally, if the from_crawler method raises the NotConfigured exception, the extension will be disabled. Otherwise, the extension will be enabled.
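The connect-and-dispatch pattern the docs describe, where callbacks are registered for a signal and invoked when that signal fires, can be sketched in plain Python. This is a simplified illustration only, not Scrapy's actual SignalManager (which is richer and built on pydispatcher):

```python
class MiniSignalManager:
    """Toy sketch of the connect/send pattern used by Scrapy signals."""

    def __init__(self):
        self._receivers = {}  # signal -> list of registered callbacks

    def connect(self, receiver, signal):
        # register a callback to run whenever `signal` is sent
        self._receivers.setdefault(signal, []).append(receiver)

    def send(self, signal, **kwargs):
        # invoke every callback registered for `signal`
        return [receiver(**kwargs) for receiver in self._receivers.get(signal, [])]


# usage: mimic the spider_opened signal
spider_opened = object()  # a signal can be any unique sentinel object
mgr = MiniSignalManager()
events = []
mgr.connect(lambda spider: events.append("opened %s" % spider), spider_opened)
mgr.send(spider_opened, spider="quotes")
```

After send, events contains one entry per connected receiver, which is exactly the behaviour extensions rely on when they connect methods to crawler.signals.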

In other words, an extension class should provide a from_crawler class method: that is the entry point where Scrapy instantiates it (falling back to from_settings or a plain constructor call, as the loop above shows).

1.2.1.   A worked example

Here is an example extension class:

import logging

from scrapy import signals
from scrapy.exceptions import NotConfigured

logger = logging.getLogger(__name__)


class SpiderOpenCloseLogging(object):

    def __init__(self, item_count):
        self.item_count = item_count
        self.items_scraped = 0

    @classmethod
    def from_crawler(cls, crawler):
        # first check if the extension should be enabled and raise
        # NotConfigured otherwise
        if not crawler.settings.getbool('MYEXT_ENABLED'):
            raise NotConfigured

        # get the number of items from settings
        item_count = crawler.settings.getint('MYEXT_ITEMCOUNT', 1000)

        # instantiate the extension object
        ext = cls(item_count)

        # connect the extension object to signals
        crawler.signals.connect(ext.spider_opened, signal=signals.spider_opened)
        crawler.signals.connect(ext.spider_closed, signal=signals.spider_closed)
        crawler.signals.connect(ext.item_scraped, signal=signals.item_scraped)

        # return the extension object
        return ext

    def spider_opened(self, spider):
        logger.info("opened spider %s", spider.name)

    def spider_closed(self, spider):
        logger.info("closed spider %s", spider.name)

    def item_scraped(self, item, spider):
        self.items_scraped += 1
        if self.items_scraped % self.item_count == 0:
            logger.info("scraped %d items", self.items_scraped)

 

What does it do? from_crawler checks that the extension is enabled, reads its settings, and instantiates the class. Then these three lines determine when the extension's methods are invoked:

        crawler.signals.connect(ext.spider_opened, signal=signals.spider_opened)
        crawler.signals.connect(ext.spider_closed, signal=signals.spider_closed)
        crawler.signals.connect(ext.item_scraped, signal=signals.item_scraped)

Each connected method defines what happens when its signal fires: log on spider open and close, and count scraped items, logging every item_count of them.

For details on signals, see the scrapy signals documentation.
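To wire the example up, it would be enabled in the project's settings like this (a sketch: the module path myproject.extensions is hypothetical, so substitute wherever SpiderOpenCloseLogging actually lives in your project):

```python
# settings.py -- enabling the example extension
MYEXT_ENABLED = True    # from_crawler raises NotConfigured without this
MYEXT_ITEMCOUNT = 500   # log every 500 scraped items instead of the default 1000

EXTENSIONS = {
    # hypothetical path; point this at your own extensions module
    "myproject.extensions.SpiderOpenCloseLogging": 500,
}
```

With MYEXT_ENABLED missing or False, from_crawler raises NotConfigured and the extension is silently skipped, exactly as the from_settings loop shown earlier handles it.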

 


Original: https://www.cnblogs.com/wodeboke-y/p/9941771.html
