2026-03-12 14:58:00 [scrapy.utils.log] INFO: Scrapy 2.11.2 started (bot: SourcingV2)
2026-03-12 14:58:00 [scrapy.utils.log] INFO: Versions: lxml 5.2.2.0, libxml2 2.12.6, cssselect 1.2.0, parsel 1.9.1, w3lib 2.1.2, Twisted 24.3.0, Python 3.11.2 (main, Mar 13 2023, 12:18:29) [GCC 12.2.0], pyOpenSSL 24.1.0 (OpenSSL 3.2.1 30 Jan 2024), cryptography 42.0.7, Platform Linux-5.15.0-1098-azure-x86_64-with-glibc2.36
2026-03-12 14:58:00 [arval.uk] INFO: Starting spider arval.uk
2026-03-12 14:58:00 [scrapy.addons] INFO: Enabled addons:
[]
2026-03-12 14:58:00 [asyncio] DEBUG: Using selector: EpollSelector
2026-03-12 14:58:00 [scrapy.utils.log] DEBUG: Using reactor: twisted.internet.asyncioreactor.AsyncioSelectorReactor
2026-03-12 14:58:00 [scrapy.utils.log] DEBUG: Using asyncio event loop: asyncio.unix_events._UnixSelectorEventLoop
2026-03-12 14:58:00 [scrapy.extensions.telnet] INFO: Telnet Password: 72032cd29c8971a3
2026-03-12 14:58:00 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
'scrapy.extensions.telnet.TelnetConsole',
'scrapy.extensions.memusage.MemoryUsage',
'scrapy.extensions.feedexport.FeedExporter',
'scrapy.extensions.logstats.LogStats',
'scrapy.extensions.closespider.CloseSpider']
2026-03-12 14:58:00 [scrapy.crawler] INFO: Overridden settings:
{'BOT_NAME': 'SourcingV2',
'CLOSESPIDER_TIMEOUT': 7200,
'DOWNLOAD_MAXSIZE': 52428800,
'DOWNLOAD_WARNSIZE': 10485760,
'FEED_EXPORT_ENCODING': 'utf-8',
'LOG_FILE': '/var/log/scrapyd/logs/sourcing_v2/arval.uk/2026-03-12T14_57_57.log',
'LOG_FORMATTER': 'crawlers.log_formatter.SourcingLogFormatter',
'MEMUSAGE_LIMIT_MB': 2048,
'MEMUSAGE_WARNING_MB': 1536,
'NEWSPIDER_MODULE': 'spiders',
'REQUEST_FINGERPRINTER_CLASS': 'scrapy_zyte_api.ScrapyZyteAPIRequestFingerprinter',
'REQUEST_FINGERPRINTER_IMPLEMENTATION': '2.7',
'SPIDER_MODULES': ['spiders', 'auth_check'],
'TWISTED_REACTOR': 'twisted.internet.asyncioreactor.AsyncioSelectorReactor',
'USER_AGENT': ''}
2026-03-12 14:58:00 [scrapy_zyte_api.handler] INFO: Using a Zyte API key starting with '5857011'
2026-03-12 14:58:00 [scrapy_zyte_api.handler] INFO: Using a Zyte API key starting with '5857011'
2026-03-12 14:58:00 [scrapy.middleware] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.offsite.OffsiteMiddleware',
'scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
'scrapy_zyte_api.ScrapyZyteAPIDownloaderMiddleware',
'scrapy.downloadermiddlewares.retry.RetryMiddleware',
'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
'scrapy.downloadermiddlewares.stats.DownloaderStats']
2026-03-12 14:58:00 [crawlers.middlewares.id_gen_middleware] INFO: Setting up IdGenerationMiddleware
2026-03-12 14:58:00 [scrapy.middleware] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
'scrapy_zyte_api.ScrapyZyteAPISpiderMiddleware',
'crawlers.middlewares.monitoring_spider_middleware.MonitoringSpiderMiddleware',
'scrapy.spidermiddlewares.referer.RefererMiddleware',
'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
'scrapy.spidermiddlewares.depth.DepthMiddleware',
'crawlers.middlewares.photo_download_middleware.PhotoDownloadMiddleware',
'crawlers.middlewares.report_download_middleware.ReportDownloadMiddleware',
'crawlers.middlewares.id_gen_middleware.IdGenMiddleware']
2026-03-12 14:58:00 [azure.identity._credentials.environment] INFO: Incomplete environment configuration for EnvironmentCredential. These variables are set: AZURE_TENANT_ID, AZURE_CLIENT_ID
2026-03-12 14:58:00 [urllib3.connectionpool] DEBUG: Starting new HTTPS connection (1): login.microsoftonline.com:443
2026-03-12 14:58:00 [urllib3.connectionpool] DEBUG: https://login.microsoftonline.com:443 "POST /8ea908c1-4e85-4692-bc3f-3646b9b40891/oauth2/v2.0/token HTTP/1.1" 200 2102
2026-03-12 14:58:00 [azure.identity._credentials.chained] INFO: DefaultAzureCredential acquired a token from WorkloadIdentityCredential
2026-03-12 14:58:01 [crawlers.pipelines.translation_pipeline] INFO: Loading translations for language: en
2026-03-12 14:58:01 [crawlers.pipelines.item_rules_pipeline] INFO: Setting up ItemRules Pipeline
2026-03-12 14:58:01 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: invalid_location_for_country.json
2026-03-12 14:58:01 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: damaged_cars_from_auction_title.json
2026-03-12 14:58:01 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: puretech_for_country.json
2026-03-12 14:58:01 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: puretech_for_fr.json
2026-03-12 14:58:01 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: missing_photos.json
2026-03-12 14:58:01 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: damaged_from_info.json
2026-03-12 14:58:01 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: makes_not_allowed.json
2026-03-12 14:58:01 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: not_operable_from_info.json
2026-03-12 14:58:01 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: makes_models_not_allowed.json
2026-03-12 14:58:01 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: keywords_from_title.json
2026-03-12 14:58:01 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: imported_cars.json
2026-03-12 14:58:01 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: invalid_currency.json
2026-03-12 14:58:01 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: missing_mileage.json
2026-03-12 14:58:01 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: keywords_from_auction_title.json
2026-03-12 14:58:01 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: invalid_country_of_origin.json
2026-03-12 14:58:01 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: puretech_for_pt.json
2026-03-12 14:58:01 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: electric_cars.json
2026-03-12 14:58:01 [crawlers.filter_rules.rules_loader] INFO: Loaded rule: missing_color.json
2026-03-12 14:58:01 [crawlers.pipelines.post_to_api] INFO: Setting up PostToApi Pipeline pointing to https://api.alx.dev-cluster.alx.tech/api/command/Alx.Cars.Contracts.Internal.Sourcing.AddListing
2026-03-12 14:58:01 [scrapy.middleware] INFO: Enabled item pipelines:
['crawlers.pipelines.translation_pipeline.TranslationPipeline',
'crawlers.pipelines.item_rules_pipeline.ItemRulesPipeline',
'crawlers.pipelines.post_to_api.PostToApiPipeline']
2026-03-12 14:58:01 [scrapy.core.engine] INFO: Spider opened
2026-03-12 14:58:01 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2026-03-12 14:58:01 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6023
2026-03-12 14:58:01 [scrapy.extensions.memusage] INFO: Peak memory usage is 139MiB
2026-03-12 14:58:01 [scrapy-playwright] INFO: Starting download handler
2026-03-12 14:58:01 [scrapy-playwright] INFO: Starting download handler
2026-03-12 14:58:06 [scrapy-playwright] INFO: Launching browser firefox
2026-03-12 14:58:06 [scrapy-playwright] INFO: Browser firefox launched
2026-03-12 14:58:06 [scrapy-playwright] DEBUG: Browser context started: 'default' (persistent=False, remote=False)
2026-03-12 14:58:07 [scrapy-playwright] DEBUG: [Context=default] New page created, page count is 1 (1 for all contexts)
2026-03-12 14:58:07 [scrapy-playwright] DEBUG: [Context=default] Request: <GET https://autoselect.arval.co.uk/> (resource type: document)
2026-03-12 14:58:07 [scrapy-playwright] DEBUG: [Context=default] Response: <407 https://autoselect.arval.co.uk/>
2026-03-12 14:58:07 [scrapy-playwright] DEBUG: [Context=default] Request: <GET https://autoselect.arval.co.uk/> (resource type: document)
2026-03-12 14:58:07 [scrapy-playwright] DEBUG: [Context=default] Response: <200 https://autoselect.arval.co.uk/>
2026-03-12 14:58:08 [scrapy-playwright] DEBUG: [Context=default] Request: <GET https://cdn.cookielaw.org/scripttemplates/otSDKStub.js> (resource type: script, referrer: https://autoselect.arval.co.uk/)
2026-03-12 14:58:08 [scrapy-playwright] DEBUG: [Context=default] Aborted Playwright request <GET https://cdn.cookielaw.org/scripttemplates/otSDKStub.js>
2026-03-12 14:58:08 [scrapy-playwright] DEBUG: [Context=default] Request: <GET https://autoselect.arval.co.uk/styles/fonts/bnpp-sans/bnpp-sans.woff2> (resource type: font, referrer: https://autoselect.arval.co.uk/)
2026-03-12 14:58:08 [scrapy-playwright] DEBUG: [Context=default] Request: <GET https://autoselect.arval.co.uk/styles/fonts/bnpp-sans/bnpp-sans-bold.woff2> (resource type: font, referrer: https://autoselect.arval.co.uk/)
2026-03-12 14:58:08 [scrapy-playwright] DEBUG: [Context=default] Request: <GET https://autoselect.arval.co.uk/styles/fonts/bnpp-sans/bnpp-sans-light.woff2> (resource type: font, referrer: https://autoselect.arval.co.uk/)
2026-03-12 14:58:08 [scrapy-playwright] DEBUG: [Context=default] Request: <GET https://autoselect.arval.co.uk/styles/fonts/bnpp-sans-condensed/bnpp-sans-cond-v2.woff2> (resource type: font, referrer: https://autoselect.arval.co.uk/)
2026-03-12 14:58:08 [scrapy-playwright] DEBUG: [Context=default] Request: <GET https://autoselect.arval.co.uk/styles/fonts/bnpp-sans-condensed/bnpp-sans-cond-bold-v2.woff2> (resource type: font, referrer: https://autoselect.arval.co.uk/)
2026-03-12 14:58:08 [scrapy-playwright] DEBUG: [Context=default] Request: <GET https://autoselect.arval.co.uk/styles/arval_styles/arval.css> (resource type: stylesheet, referrer: https://autoselect.arval.co.uk/)
2026-03-12 14:58:08 [scrapy-playwright] DEBUG: [Context=default] Request: <GET https://autoselect.arval.co.uk/scripts/arval_scripts/polyfills.js> (resource type: script, referrer: https://autoselect.arval.co.uk/)
2026-03-12 14:58:08 [scrapy-playwright] DEBUG: [Context=default] Request: <GET https://autoselect.arval.co.uk/scripts/arval_scripts/runtime.min.js> (resource type: script, referrer: https://autoselect.arval.co.uk/)
2026-03-12 14:58:08 [scrapy-playwright] DEBUG: [Context=default] Request: <GET https://autoselect.arval.co.uk/scripts/arval_scripts/arval.js> (resource type: script, referrer: https://autoselect.arval.co.uk/)
2026-03-12 14:58:08 [scrapy-playwright] DEBUG: [Context=default] Request: <GET https://autoselect.arval.co.uk/scripts/arval_scripts/vandor.min.js> (resource type: script, referrer: https://autoselect.arval.co.uk/)
2026-03-12 14:58:08 [scrapy-playwright] DEBUG: [Context=default] Request: <GET https://autoselect.arval.co.uk/images/arval-autoselect-logo.png> (resource type: image, referrer: https://autoselect.arval.co.uk/)
2026-03-12 14:58:08 [scrapy-playwright] DEBUG: [Context=default] Aborted Playwright request <GET https://autoselect.arval.co.uk/images/arval-autoselect-logo.png>
2026-03-12 14:58:08 [scrapy-playwright] DEBUG: [Context=default] Request: <GET https://autoselect.arval.co.uk/CountriesData/UK/images/banners/banner.png> (resource type: image, referrer: https://autoselect.arval.co.uk/)
2026-03-12 14:58:08 [scrapy-playwright] DEBUG: [Context=default] Aborted Playwright request <GET https://autoselect.arval.co.uk/CountriesData/UK/images/banners/banner.png>
2026-03-12 14:58:08 [scrapy-playwright] DEBUG: [Context=default] Request: <GET https://autoselect.arval.co.uk/CountriesData/UK/images/newuk/images/Spring/AutoSelectHomeSpring.jpg> (resource type: image, referrer: https://autoselect.arval.co.uk/)
2026-03-12 14:58:08 [scrapy-playwright] DEBUG: [Context=default] Aborted Playwright request <GET https://autoselect.arval.co.uk/CountriesData/UK/images/newuk/images/Spring/AutoSelectHomeSpring.jpg>
2026-03-12 14:58:08 [scrapy-playwright] DEBUG: [Context=default] Request: <GET https://js.monitor.azure.com/scripts/b/ai.2.min.js> (resource type: script, referrer: https://autoselect.arval.co.uk/)
2026-03-12 14:58:08 [scrapy-playwright] DEBUG: [Context=default] Response: <200 https://autoselect.arval.co.uk/styles/fonts/bnpp-sans/bnpp-sans.woff2>
2026-03-12 14:58:08 [scrapy-playwright] DEBUG: [Context=default] Response: <200 https://autoselect.arval.co.uk/styles/fonts/bnpp-sans/bnpp-sans-light.woff2>
2026-03-12 14:58:08 [scrapy-playwright] DEBUG: [Context=default] Response: <200 https://autoselect.arval.co.uk/scripts/arval_scripts/runtime.min.js>
2026-03-12 14:58:08 [scrapy-playwright] DEBUG: [Context=default] Response: <200 https://autoselect.arval.co.uk/scripts/arval_scripts/arval.js>
2026-03-12 14:58:08 [scrapy-playwright] DEBUG: [Context=default] Response: <200 https://autoselect.arval.co.uk/styles/arval_styles/arval.css>
2026-03-12 14:58:08 [scrapy-playwright] DEBUG: [Context=default] Response: <200 https://autoselect.arval.co.uk/styles/fonts/bnpp-sans-condensed/bnpp-sans-cond-bold-v2.woff2>
2026-03-12 14:58:08 [scrapy-playwright] DEBUG: [Context=default] Request: <GET https://autoselect.arval.co.uk/images/arval-autoselect-logo.png> (resource type: image, referrer: https://autoselect.arval.co.uk/)
2026-03-12 14:58:08 [scrapy-playwright] DEBUG: [Context=default] Aborted Playwright request <GET https://autoselect.arval.co.uk/images/arval-autoselect-logo.png>
2026-03-12 14:58:08 [scrapy-playwright] DEBUG: [Context=default] Request: <GET https://autoselect.arval.co.uk/CountriesData/UK/images/banners/banner.png> (resource type: image, referrer: https://autoselect.arval.co.uk/)
2026-03-12 14:58:08 [scrapy-playwright] DEBUG: [Context=default] Aborted Playwright request <GET https://autoselect.arval.co.uk/CountriesData/UK/images/banners/banner.png>
2026-03-12 14:58:08 [scrapy-playwright] DEBUG: [Context=default] Request: <GET https://autoselect.arval.co.uk/CountriesData/UK/images/newuk/images/Spring/AutoSelectHomeSpring.jpg> (resource type: image, referrer: https://autoselect.arval.co.uk/)
2026-03-12 14:58:08 [scrapy-playwright] DEBUG: [Context=default] Request: <GET https://www.googletagmanager.com/gtm.js?id=GTM-MVPKT57> (resource type: script, referrer: https://autoselect.arval.co.uk/)
2026-03-12 14:58:08 [scrapy-playwright] DEBUG: [Context=default] Aborted Playwright request <GET https://autoselect.arval.co.uk/CountriesData/UK/images/newuk/images/Spring/AutoSelectHomeSpring.jpg>
2026-03-12 14:58:08 [scrapy-playwright] DEBUG: [Context=default] Aborted Playwright request <GET https://www.googletagmanager.com/gtm.js?id=GTM-MVPKT57>
2026-03-12 14:58:08 [scrapy-playwright] DEBUG: [Context=default] Response: <200 https://autoselect.arval.co.uk/scripts/arval_scripts/vandor.min.js>
2026-03-12 14:58:08 [scrapy-playwright] DEBUG: [Context=default] Response: <200 https://autoselect.arval.co.uk/styles/fonts/bnpp-sans/bnpp-sans-bold.woff2>
2026-03-12 14:58:09 [scrapy-playwright] DEBUG: [Context=default] Response: <200 https://autoselect.arval.co.uk/styles/fonts/bnpp-sans-condensed/bnpp-sans-cond-v2.woff2>
2026-03-12 14:58:09 [scrapy-playwright] DEBUG: [Context=default] Request: <GET https://autoselect.arval.co.uk/styles/arval_icons/android-icon-192x192.png> (resource type: image, referrer: https://autoselect.arval.co.uk/)
2026-03-12 14:58:09 [scrapy-playwright] DEBUG: [Context=default] Aborted Playwright request <GET https://autoselect.arval.co.uk/styles/arval_icons/android-icon-192x192.png>
2026-03-12 14:58:09 [scrapy-playwright] DEBUG: [Context=default] Request: <GET https://autoselect.arval.co.uk/styles/arval_icons/favicon-16x16.png> (resource type: image, referrer: https://autoselect.arval.co.uk/)
2026-03-12 14:58:09 [scrapy-playwright] DEBUG: [Context=default] Aborted Playwright request <GET https://autoselect.arval.co.uk/styles/arval_icons/favicon-16x16.png>
2026-03-12 14:58:09 [scrapy-playwright] DEBUG: [Context=default] Response: <200 https://autoselect.arval.co.uk/scripts/arval_scripts/polyfills.js>
2026-03-12 14:58:09 [scrapy-playwright] DEBUG: [Context=default] Response: <200 https://js.monitor.azure.com/scripts/b/ai.2.min.js>
2026-03-12 14:58:10 [scrapy-playwright] DEBUG: [Context=default] Request: <GET https://arval-prod-euw-appservice-portalapi.azurewebsites.net/api/Announcements/3/?pageSize=12&purchaseOption=release&reservationLabels=available&_=1773327488765> (resource type: xhr, referrer: https://autoselect.arval.co.uk/)
2026-03-12 14:58:10 [scrapy-playwright] DEBUG: [Context=default] Request: <GET https://arval-prod-euw-appservice-portalapi.azurewebsites.net/api/Filters/3?purchaseOption=release&reservationLabels=available> (resource type: xhr, referrer: https://autoselect.arval.co.uk/)
2026-03-12 14:58:10 [scrapy-playwright] DEBUG: [Context=default] Request: <GET https://arval-prod-euw-appservice-portalapi.azurewebsites.net/api/Filters/3?purchaseOption=sale&reservationLabels=available&orderBy=reLeasePriceGross|asc> (resource type: xhr, referrer: https://autoselect.arval.co.uk/)
2026-03-12 14:58:10 [scrapy-playwright] DEBUG: [Context=default] Request: <POST https://dc.services.visualstudio.com/v2/track> (resource type: xhr, referrer: https://autoselect.arval.co.uk/)
2026-03-12 14:58:10 [scrapy.core.engine] DEBUG: Crawled (200) <GET https://autoselect.arval.co.uk> (referer: None) ['playwright']
2026-03-12 14:58:10 [arval.uk] INFO: Scrapy-formatted cookies: [{'name': 'ARRAffinity', 'value': '5f4dbd6a5f3bbb38bdbcbc8f545e992b01d45d94bac4a7318fa89b741ba513a9', 'domain': '.autoselect.arval.co.uk', 'path': '/'}, {'name': 'ARRAffinitySameSite', 'value': '5f4dbd6a5f3bbb38bdbcbc8f545e992b01d45d94bac4a7318fa89b741ba513a9', 'domain': '.autoselect.arval.co.uk', 'path': '/'}, {'name': 'ai_user', 'value': 'cYNUB1dp9C2mH+21JwgBsm|2026-03-12T14:58:10.532Z', 'domain': 'autoselect.arval.co.uk', 'path': '/'}, {'name': 'ai_session', 'value': 'E+7+jfUg7jDF1ACt/6O3XG|1773327490534|1773327490534', 'domain': 'autoselect.arval.co.uk', 'path': '/'}]
2026-03-12 14:58:10 [zyte_api._retry] DEBUG: Starting call to 'zyte_api._async.AsyncZyteAPI.get.<locals>.request', this is the 1st time calling it.
2026-03-12 14:58:13 [scrapy.core.engine] DEBUG: Crawled (200) <GET https://arval-prod-euw-appservice-portalapi.azurewebsites.net/api/Announcements/3?orderBy=salePriceGross%7Casc&pageNumber=1&pageSize=20&purchaseOption=sale&reservationLabels=available> (referer: https://autoselect.arval.co.uk) ['zyte-api']
2026-03-12 14:58:13 [arval.uk] INFO: Found listing with ID: 115424
2026-03-12 14:58:13 [crawlers.middlewares.monitoring_spider_middleware] ERROR: 'int' object is not iterable
Traceback (most recent call last):
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
for r in iterable:
File "/usr/src/app/crawlers/spiders/arval.py", line 191, in parse_listings
if not self.needs_full_scrape(identifier):
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/crawlers/spiders/base.py", line 397, in needs_full_scrape
scrape_type = self._scrape_needed(identifier)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/crawlers/spiders/base.py", line 448, in _scrape_needed
res = self.load_from_azure_tables("ScrapedListings", self.name, identifier)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/crawlers/spiders/base.py", line 251, in load_from_azure_tables
return connect_and_load_data(table, partition_key, row_key)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/common/azure_tables.py", line 215, in connect_and_load_data
return load_data(table_client, partition_key, row_key)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/common/azure_tables.py", line 142, in load_data
row_key = sanitize_key(row_key)
^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/common/azure_tables.py", line 124, in sanitize_key
for c in key:
TypeError: 'int' object is not iterable
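[editor's note] Every traceback in this log shares one root cause: `parse_listings` passes the listing identifier (here the numeric ID `115424`) straight into `load_from_azure_tables`, and `sanitize_key` in `common/azure_tables.py` iterates the row key character by character (`for c in key:`), which raises `TypeError` for any non-string. A minimal sketch of the likely fix, coercing the key to `str` before sanitizing; the disallowed-character set below is an assumption for illustration (Azure Table Storage forbids `/`, `\`, `#`, `?` in keys), not the project's actual list:

```python
# Characters Azure Table Storage disallows in PartitionKey/RowKey values
# (illustrative set; the project's real sanitizer may strip more).
DISALLOWED = set('/\\#?')

def sanitize_key(key):
    # Original code iterated `key` directly, so an int identifier such as
    # 115424 raised "TypeError: 'int' object is not iterable".
    # Coercing to str first accepts numeric identifiers transparently.
    key = str(key)
    return ''.join(c for c in key if c not in DISALLOWED)

print(sanitize_key(115424))   # numeric listing ID now works
print(sanitize_key('a/b#c'))  # disallowed characters stripped
```

Alternatively, the caller could be fixed instead, e.g. `str(identifier)` at the `parse_listings` call site, but sanitizing at the lowest layer guards every caller at once.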
2026-03-12 14:58:13 [crawlers.middlewares.monitoring_spider_middleware] ERROR: 'int' object is not iterable
Traceback (most recent call last):
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
async for r in iterable:
File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
async for item in result:
File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
for r in it:
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
exception_result = self._process_spider_exception(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
result = method(response=response, exception=exception, spider=spider)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
raise exception
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
for r in iterable:
File "/usr/src/app/crawlers/spiders/arval.py", line 191, in parse_listings
if not self.needs_full_scrape(identifier):
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/crawlers/spiders/base.py", line 397, in needs_full_scrape
scrape_type = self._scrape_needed(identifier)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/crawlers/spiders/base.py", line 448, in _scrape_needed
res = self.load_from_azure_tables("ScrapedListings", self.name, identifier)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/crawlers/spiders/base.py", line 251, in load_from_azure_tables
return connect_and_load_data(table, partition_key, row_key)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/common/azure_tables.py", line 215, in connect_and_load_data
return load_data(table_client, partition_key, row_key)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/common/azure_tables.py", line 142, in load_data
row_key = sanitize_key(row_key)
^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/common/azure_tables.py", line 124, in sanitize_key
for c in key:
TypeError: 'int' object is not iterable
2026-03-12 14:58:13 [crawlers.middlewares.monitoring_spider_middleware] ERROR: 'int' object is not iterable
Traceback (most recent call last):
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
async for r in iterable:
File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
async for item in result:
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
exception_result = self._process_spider_exception(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
result = method(response=response, exception=exception, spider=spider)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
raise exception
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
async for r in iterable:
File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
async for item in result:
File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
for r in it:
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
exception_result = self._process_spider_exception(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
result = method(response=response, exception=exception, spider=spider)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
raise exception
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
for r in iterable:
File "/usr/src/app/crawlers/spiders/arval.py", line 191, in parse_listings
if not self.needs_full_scrape(identifier):
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/crawlers/spiders/base.py", line 397, in needs_full_scrape
scrape_type = self._scrape_needed(identifier)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/crawlers/spiders/base.py", line 448, in _scrape_needed
res = self.load_from_azure_tables("ScrapedListings", self.name, identifier)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/crawlers/spiders/base.py", line 251, in load_from_azure_tables
return connect_and_load_data(table, partition_key, row_key)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/common/azure_tables.py", line 215, in connect_and_load_data
return load_data(table_client, partition_key, row_key)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/common/azure_tables.py", line 142, in load_data
row_key = sanitize_key(row_key)
^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/common/azure_tables.py", line 124, in sanitize_key
for c in key:
TypeError: 'int' object is not iterable
2026-03-12 14:58:13 [crawlers.middlewares.monitoring_spider_middleware] ERROR: 'int' object is not iterable
Traceback (most recent call last):
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
async for r in iterable:
File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
async for item in result:
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
exception_result = self._process_spider_exception(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
result = method(response=response, exception=exception, spider=spider)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
raise exception
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
async for r in iterable:
File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
async for item in result:
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
exception_result = self._process_spider_exception(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
result = method(response=response, exception=exception, spider=spider)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
raise exception
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
async for r in iterable:
File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
async for item in result:
File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
for r in it:
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
exception_result = self._process_spider_exception(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
result = method(response=response, exception=exception, spider=spider)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
raise exception
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
for r in iterable:
File "/usr/src/app/crawlers/spiders/arval.py", line 191, in parse_listings
if not self.needs_full_scrape(identifier):
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/crawlers/spiders/base.py", line 397, in needs_full_scrape
scrape_type = self._scrape_needed(identifier)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/crawlers/spiders/base.py", line 448, in _scrape_needed
res = self.load_from_azure_tables("ScrapedListings", self.name, identifier)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/crawlers/spiders/base.py", line 251, in load_from_azure_tables
return connect_and_load_data(table, partition_key, row_key)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/common/azure_tables.py", line 215, in connect_and_load_data
return load_data(table_client, partition_key, row_key)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/common/azure_tables.py", line 142, in load_data
row_key = sanitize_key(row_key)
^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/common/azure_tables.py", line 124, in sanitize_key
for c in key:
TypeError: 'int' object is not iterable
2026-03-12 14:58:13 [crawlers.middlewares.monitoring_spider_middleware] ERROR: 'int' object is not iterable
Traceback (most recent call last):
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
async for r in iterable:
File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/depth.py", line 35, in process_spider_output_async
async for r in result or ():
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
exception_result = self._process_spider_exception(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
result = method(response=response, exception=exception, spider=spider)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
raise exception
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
async for r in iterable:
File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
async for item in result:
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
exception_result = self._process_spider_exception(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
result = method(response=response, exception=exception, spider=spider)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
raise exception
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
async for r in iterable:
File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
async for item in result:
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
exception_result = self._process_spider_exception(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
result = method(response=response, exception=exception, spider=spider)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
raise exception
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
async for r in iterable:
File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
async for item in result:
File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
for r in it:
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
exception_result = self._process_spider_exception(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
result = method(response=response, exception=exception, spider=spider)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
raise exception
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
for r in iterable:
File "/usr/src/app/crawlers/spiders/arval.py", line 191, in parse_listings
if not self.needs_full_scrape(identifier):
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/crawlers/spiders/base.py", line 397, in needs_full_scrape
scrape_type = self._scrape_needed(identifier)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/crawlers/spiders/base.py", line 448, in _scrape_needed
res = self.load_from_azure_tables("ScrapedListings", self.name, identifier)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/crawlers/spiders/base.py", line 251, in load_from_azure_tables
return connect_and_load_data(table, partition_key, row_key)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/common/azure_tables.py", line 215, in connect_and_load_data
return load_data(table_client, partition_key, row_key)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/common/azure_tables.py", line 142, in load_data
row_key = sanitize_key(row_key)
^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/common/azure_tables.py", line 124, in sanitize_key
for c in key:
TypeError: 'int' object is not iterable
2026-03-12 14:58:14 [scrapy.core.scraper] ERROR: Spider error processing <GET https://arval-prod-euw-appservice-portalapi.azurewebsites.net/api/Announcements/3?orderBy=salePriceGross%7Casc&pageNumber=1&pageSize=20&purchaseOption=sale&reservationLabels=available> (referer: https://autoselect.arval.co.uk)
Traceback (most recent call last):
File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/defer.py", line 295, in aiter_errback
yield await it.__anext__()
^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 374, in __anext__
return await self.data.__anext__()
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 355, in _async_chain
async for o in as_async_generator(it):
File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 14, in as_async_generator
async for r in it:
File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 374, in __anext__
return await self.data.__anext__()
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/python.py", line 355, in _async_chain
async for o in as_async_generator(it):
File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 14, in as_async_generator
async for r in it:
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
async for r in iterable:
File "/usr/local/lib/python3.11/dist-packages/scrapy_zyte_api/_middlewares.py", line 206, in process_spider_output_async
async for item_or_request in result:
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
async for r in iterable:
File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 42, in process_spider_output
async for item in result:
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
exception_result = self._process_spider_exception(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
result = method(response=response, exception=exception, spider=spider)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
raise exception
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
async for r in iterable:
File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/referer.py", line 355, in process_spider_output_async
async for r in result or ():
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
exception_result = self._process_spider_exception(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
result = method(response=response, exception=exception, spider=spider)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
raise exception
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
async for r in iterable:
File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/urllength.py", line 30, in process_spider_output_async
async for r in result or ():
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
exception_result = self._process_spider_exception(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
result = method(response=response, exception=exception, spider=spider)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
raise exception
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
async for r in iterable:
File "/usr/local/lib/python3.11/dist-packages/scrapy/spidermiddlewares/depth.py", line 35, in process_spider_output_async
async for r in result or ():
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
exception_result = self._process_spider_exception(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
result = method(response=response, exception=exception, spider=spider)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
raise exception
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
async for r in iterable:
File "/usr/src/app/crawlers/middlewares/photo_download_middleware.py", line 42, in process_spider_output
async for item in result:
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
exception_result = self._process_spider_exception(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
result = method(response=response, exception=exception, spider=spider)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
raise exception
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
async for r in iterable:
File "/usr/src/app/crawlers/middlewares/report_download_middleware.py", line 32, in process_spider_output
async for item in result:
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 121, in process_async
exception_result = self._process_spider_exception(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
result = method(response=response, exception=exception, spider=spider)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
raise exception
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 118, in process_async
async for r in iterable:
File "/usr/src/app/crawlers/middlewares/id_gen_middleware.py", line 20, in process_spider_output
async for item in result:
File "/usr/local/lib/python3.11/dist-packages/scrapy/utils/asyncgen.py", line 17, in as_async_generator
for r in it:
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 109, in process_sync
exception_result = self._process_spider_exception(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 150, in _process_spider_exception
result = method(response=response, exception=exception, spider=spider)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/crawlers/middlewares/monitoring_spider_middleware.py", line 73, in process_spider_exception
raise exception
File "/usr/local/lib/python3.11/dist-packages/scrapy/core/spidermw.py", line 106, in process_sync
for r in iterable:
File "/usr/src/app/crawlers/spiders/arval.py", line 191, in parse_listings
if not self.needs_full_scrape(identifier):
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/crawlers/spiders/base.py", line 397, in needs_full_scrape
scrape_type = self._scrape_needed(identifier)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/crawlers/spiders/base.py", line 448, in _scrape_needed
res = self.load_from_azure_tables("ScrapedListings", self.name, identifier)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/crawlers/spiders/base.py", line 251, in load_from_azure_tables
return connect_and_load_data(table, partition_key, row_key)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/common/azure_tables.py", line 215, in connect_and_load_data
return load_data(table_client, partition_key, row_key)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/common/azure_tables.py", line 142, in load_data
row_key = sanitize_key(row_key)
^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/app/common/azure_tables.py", line 124, in sanitize_key
for c in key:
TypeError: 'int' object is not iterable
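The traceback bottoms out in `sanitize_key` (`common/azure_tables.py`, line 124), which iterates over `key` character by character; `parse_listings` in `arval.py` passes a numeric `identifier` (the Announcement ID from the API) down through `load_from_azure_tables` as the row key, so `for c in key` receives an `int` and raises. A minimal sketch of a defensive fix, assuming `sanitize_key` exists to strip characters Azure Table Storage disallows in keys — the disallowed set and the function body below are assumptions, only the name and the `for c in key` loop appear in the log:

```python
# Hypothetical reconstruction of sanitize_key from common/azure_tables.py.
# Azure Table Storage forbids '/', '\', '#' and '?' in PartitionKey/RowKey
# values; coercing the key to str first makes the function tolerant of the
# integer identifiers the arval.uk spider passes in.
DISALLOWED = set('/\\#?')

def sanitize_key(key) -> str:
    key = str(key)  # int Announcement IDs arrive here too
    return ''.join(c for c in key if c not in DISALLOWED and c.isprintable())
```

Alternatively the coercion could live at the call site (`str(identifier)` in `_scrape_needed`), but hardening `sanitize_key` covers every caller at once.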
2026-03-12 14:58:14 [scrapy.core.engine] INFO: Closing spider (finished)
2026-03-12 14:58:14 [arval.uk] INFO: arval.uk Crawl ended with reason finished, scrape types: {<ScrapeType.NEW: 1>: 0, <ScrapeType.NEW_DUPLICATE_ID: 4>: 0, <ScrapeType.PRICE_UPDATE: 2>: 0, <ScrapeType.AUCTION_UPDATE: 3>: 0, <ScrapeType.SKIPPED: 0>: 0, <ScrapeType.BATCH_SKIPPED: 5>: 0}
2026-03-12 14:58:14 [scrapy.extensions.feedexport] INFO: Stored jsonlines feed (0 items) in: file:///var/lib/scrapyd/items/sourcing_v2/arval.uk/2026-03-12T14_57_57.jl
2026-03-12 14:58:14 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'downloader/request_bytes': 879,
'downloader/request_count': 2,
'downloader/request_method_count/GET': 2,
'downloader/response_bytes': 408829,
'downloader/response_count': 2,
'downloader/response_status_count/200': 2,
'elapsed_time_seconds': 13.186079,
'feedexport/success_count/FileFeedStorage': 1,
'finish_reason': 'finished',
'finish_time': datetime.datetime(2026, 3, 12, 14, 58, 14, 319149, tzinfo=datetime.timezone.utc),
'log_count/DEBUG': 60,
'log_count/ERROR': 8,
'log_count/INFO': 45,
'memusage/max': 146698240,
'memusage/startup': 146698240,
'playwright/context_count': 1,
'playwright/context_count/max_concurrent': 1,
'playwright/context_count/persistent/False': 1,
'playwright/context_count/remote/False': 1,
'playwright/page_count': 1,
'playwright/page_count/max_concurrent': 1,
'playwright/request_count': 27,
'playwright/request_count/aborted': 10,
'playwright/request_count/method/GET': 26,
'playwright/request_count/method/POST': 1,
'playwright/request_count/navigation': 2,
'playwright/request_count/resource_type/document': 2,
'playwright/request_count/resource_type/font': 5,
'playwright/request_count/resource_type/image': 8,
'playwright/request_count/resource_type/script': 7,
'playwright/request_count/resource_type/stylesheet': 1,
'playwright/request_count/resource_type/xhr': 4,
'playwright/response_count': 13,
'playwright/response_count/method/GET': 13,
'playwright/response_count/resource_type/document': 2,
'playwright/response_count/resource_type/font': 5,
'playwright/response_count/resource_type/script': 5,
'playwright/response_count/resource_type/stylesheet': 1,
'request_depth_max': 1,
'response_received_count': 2,
'scheduler/dequeued': 2,
'scheduler/dequeued/memory': 2,
'scheduler/enqueued': 2,
'scheduler/enqueued/memory': 2,
'scrapy-zyte-api/429': 0,
'scrapy-zyte-api/attempts': 1,
'scrapy-zyte-api/error_ratio': 0.0,
'scrapy-zyte-api/errors': 0,
'scrapy-zyte-api/fatal_errors': 0,
'scrapy-zyte-api/mean_connection_seconds': 2.4458578806370497,
'scrapy-zyte-api/mean_response_seconds': 2.481478948146105,
'scrapy-zyte-api/processed': 1,
'scrapy-zyte-api/request_args/customHttpRequestHeaders': 1,
'scrapy-zyte-api/request_args/experimental.responseCookies': 1,
'scrapy-zyte-api/request_args/httpResponseBody': 1,
'scrapy-zyte-api/request_args/httpResponseHeaders': 1,
'scrapy-zyte-api/request_args/sessionContext': 1,
'scrapy-zyte-api/request_args/url': 1,
'scrapy-zyte-api/status_codes/200': 1,
'scrapy-zyte-api/success': 1,
'scrapy-zyte-api/success_ratio': 1.0,
'scrapy-zyte-api/throttle_ratio': 0.0,
'spider_exceptions/TypeError': 1,
'start_time': datetime.datetime(2026, 3, 12, 14, 58, 1, 133070, tzinfo=datetime.timezone.utc)}
2026-03-12 14:58:14 [scrapy.core.engine] INFO: Spider closed (finished)