- To inspect the latest log, go to the Stats page >> View log >> Tail.
PROJECT (sourcing_v2), SPIDER (auto_selling_autotrader.co.uk)
2026-03-16 11:11:11 [scrapy.utils.log] INFO: Scrapy 2.11.2 started (bot: SourcingV2)
2026-03-16 11:11:11 [scrapy.utils.log] INFO: Versions: lxml 5.2.2.0, libxml2 2.12.6, cssselect 1.2.0, parsel 1.9.1, w3lib 2.1.2, Twisted 24.3.0, Python 3.11.2 (main, Mar 13 2023, 12:18:29) [GCC 12.2.0], pyOpenSSL 24.1.0 (OpenSSL 3.2.1 30 Jan 2024), cryptography 42.0.7, Platform Linux-5.15.0-1098-azure-x86_64-with-glibc2.36
2026-03-16 11:11:11 [auto_selling_autotrader.co.uk] INFO: Starting auto_selling_autotrader.co.uk spider
2026-03-16 11:11:11 [twisted] CRITICAL: Unhandled error in Deferred:
2026-03-16 11:11:11 [twisted] CRITICAL:
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/dist-packages/twisted/internet/defer.py", line 2003, in _inlineCallbacks
    result = context.run(gen.send, result)
  File "/usr/local/lib/python3.11/dist-packages/scrapy/crawler.py", line 155, in crawl
    self.spider = self._create_spider(*args, **kwargs)
  File "/usr/local/lib/python3.11/dist-packages/scrapy/crawler.py", line 169, in _create_spider
    return self.spidercls.from_crawler(self, *args, **kwargs)
  File "/usr/local/lib/python3.11/dist-packages/scrapy/spiders/__init__.py", line 62, in from_crawler
    spider = cls(*args, **kwargs)
  File "/usr/src/app/crawlers/spiders/auto_selling_autotrader.py", line 92, in __init__
    raise ValueError(
ValueError: Invalid retailerId: None, probably due to an invalid url_to_scrape
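The traceback shows the spider failing inside its __init__ (auto_selling_autotrader.py, line 92) because no retailerId could be derived from url_to_scrape. The actual spider code is not shown in the log, so the following is only a minimal sketch of what such a guard might look like; the URL pattern, the helper name extract_retailer_id, and the class shape are all assumptions, not the real implementation.

```python
import re

def extract_retailer_id(url_to_scrape):
    """Pull a numeric retailer id out of a dealer URL.

    Returns None when the URL is missing or does not match the
    expected pattern -- presumably the condition that leads to the
    ValueError seen in the log. The pattern below is hypothetical.
    """
    if not url_to_scrape:
        return None
    match = re.search(r"[?&]dealer=(\d+)|/dealers/(\d+)", url_to_scrape)
    if not match:
        return None
    return match.group(1) or match.group(2)

class AutoSellingAutotraderSpider:
    """Sketch of the __init__ guard that produces the logged error."""

    def __init__(self, url_to_scrape=None, **kwargs):
        self.retailer_id = extract_retailer_id(url_to_scrape)
        if self.retailer_id is None:
            # Mirrors the message in the log above.
            raise ValueError(
                f"Invalid retailerId: {self.retailer_id}, "
                "probably due to an invalid url_to_scrape"
            )
```

Under this reading, the fix is on the scheduling side: the job was started with a url_to_scrape that the spider cannot map to a retailer, so the URL passed as a spider argument should be checked before the crawl is queued.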