Can a Scrapy parser call another parser and pass it the response without a new request?
I'm new to Scrapy. Can anyone tell me how to make the following code work? In my spider, it stops at parse1.
Thanks!
def parse1(self, response):
    response.meta['addedKey'] = addedData1
    self.parse_all(response)

def parse2(self, response):
    response.meta['addedKey'] = addedData2
    self.parse_all(response)

def parse_all(self, response):
    yield FormRequest(self.url, formdata={'key': response.meta['addedKey']}, callback=self.someparser)
You have to yield the requests from inside parse1, because parse_all is a generator:
def parse1(self, response):
    response.meta['addedKey'] = addedData1
    for item in self.parse_all(response):  # parse_all is a generator
        yield item
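For completeness, here is a minimal, self-contained sketch of a spider built around that pattern. The spider name, URLs, the 'addedData' strings and the someparser body are placeholders assumed for illustration; only the yield-through-the-generator idea comes from the answer above:

import scrapy
from scrapy.http import FormRequest


class ExampleSpider(scrapy.Spider):
    name = 'example'                            # placeholder spider name
    url = 'http://example.com/search'           # placeholder form endpoint
    start_urls = ['http://example.com/page1']   # placeholder start page

    def parse(self, response):
        # Default callback; hand the response on to parse1 to wire the example together.
        for request in self.parse1(response):
            yield request

    def parse1(self, response):
        response.meta['addedKey'] = 'addedData1'  # placeholder value
        # parse_all is a generator, so its requests must be yielded back to the engine.
        for request in self.parse_all(response):
            yield request

    def parse2(self, response):
        response.meta['addedKey'] = 'addedData2'  # placeholder value
        # On Python 3 the explicit loop can be shortened to yield from.
        yield from self.parse_all(response)

    def parse_all(self, response):
        yield FormRequest(
            self.url,
            formdata={'key': response.meta['addedKey']},
            callback=self.someparser,
        )

    def someparser(self, response):
        # Placeholder: just log which form response came back.
        self.logger.info('form response from %s', response.url)

On Python 3, yield from self.parse_all(response) is equivalent to the explicit for loop: it forwards every request the generator produces back to the Scrapy engine.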