2025-09-26 00:53:10 [scrapy.utils.log] INFO: Scrapy 2.11.1 started (bot: news_scraper)
2025-09-26 00:53:10 [scrapy.utils.log] INFO: Versions: lxml 6.0.0.0, libxml2 2.14.4, cssselect 1.3.0, parsel 1.10.0, w3lib 2.3.1, Twisted 25.5.0, Python 3.11.13 (main, Jul 15 2025, 19:29:01) [GCC 14.2.0], pyOpenSSL 25.1.0 (OpenSSL 3.5.1 1 Jul 2025), cryptography 45.0.5, Platform Linux-5.15.0-139-generic-x86_64-with
2025-09-26 00:53:10 [scrapy.addons] INFO: Enabled addons:
[]
2025-09-26 00:53:10 [asyncio] DEBUG: Using selector: EpollSelector
2025-09-26 00:53:10 [scrapy.utils.log] DEBUG: Using reactor: twisted.internet.asyncioreactor.AsyncioSelectorReactor
2025-09-26 00:53:10 [scrapy.utils.log] DEBUG: Using asyncio event loop: asyncio.unix_events._UnixSelectorEventLoop
2025-09-26 00:53:10 [scrapy.extensions.telnet] INFO: Telnet Password: 976de417ebe17bb2
2025-09-26 00:53:10 [botocore.hooks] DEBUG: Changing event name from creating-client-class.iot-data to creating-client-class.iot-data-plane
2025-09-26 00:53:10 [botocore.hooks] DEBUG: Changing event name from before-call.apigateway to before-call.api-gateway
2025-09-26 00:53:10 [botocore.hooks] DEBUG: Changing event name from request-created.machinelearning.Predict to request-created.machine-learning.Predict
2025-09-26 00:53:10 [botocore.hooks] DEBUG: Changing event name from before-parameter-build.autoscaling.CreateLaunchConfiguration to before-parameter-build.auto-scaling.CreateLaunchConfiguration
2025-09-26 00:53:10 [botocore.hooks] DEBUG: Changing event name from before-parameter-build.route53 to before-parameter-build.route-53
2025-09-26 00:53:10 [botocore.hooks] DEBUG: Changing event name from request-created.cloudsearchdomain.Search to request-created.cloudsearch-domain.Search
2025-09-26 00:53:10 [botocore.hooks] DEBUG: Changing event name from docs.*.autoscaling.CreateLaunchConfiguration.complete-section to docs.*.auto-scaling.CreateLaunchConfiguration.complete-section
2025-09-26 00:53:10 [botocore.hooks] DEBUG: Changing event name from before-parameter-build.logs.CreateExportTask to before-parameter-build.cloudwatch-logs.CreateExportTask
2025-09-26 00:53:10 [botocore.hooks] DEBUG: Changing event name from docs.*.logs.CreateExportTask.complete-section to docs.*.cloudwatch-logs.CreateExportTask.complete-section
2025-09-26 00:53:10 [botocore.hooks] DEBUG: Changing event name from before-parameter-build.cloudsearchdomain.Search to before-parameter-build.cloudsearch-domain.Search
2025-09-26 00:53:10 [botocore.hooks] DEBUG: Changing event name from docs.*.cloudsearchdomain.Search.complete-section to docs.*.cloudsearch-domain.Search.complete-section
2025-09-26 00:53:10 [botocore.loaders] DEBUG: Loading JSON file: /usr/local/lib/python3.11/site-packages/botocore/data/endpoints.json
2025-09-26 00:53:10 [botocore.loaders] DEBUG: Loading JSON file: /usr/local/lib/python3.11/site-packages/botocore/data/sdk-default-configuration.json
2025-09-26 00:53:10 [botocore.hooks] DEBUG: Event choose-service-name: calling handler
2025-09-26 00:53:10 [botocore.loaders] DEBUG: Loading JSON file: /usr/local/lib/python3.11/site-packages/botocore/data/s3/2006-03-01/service-2.json.gz
2025-09-26 00:53:10 [botocore.loaders] DEBUG: Loading JSON file: /usr/local/lib/python3.11/site-packages/botocore/data/s3/2006-03-01/endpoint-rule-set-1.json.gz
2025-09-26 00:53:10 [botocore.loaders] DEBUG: Loading JSON file: /usr/local/lib/python3.11/site-packages/botocore/data/partitions.json
2025-09-26 00:53:10 [botocore.hooks] DEBUG: Event creating-client-class.s3: calling handler
2025-09-26 00:53:10 [botocore.hooks] DEBUG: Event creating-client-class.s3: calling handler
2025-09-26 00:53:10 [botocore.hooks] DEBUG: Event creating-client-class.s3: calling handler
2025-09-26 00:53:10 [botocore.endpoint] DEBUG: Setting s3 timeout as (60, 60)
2025-09-26 00:53:10 [botocore.loaders] DEBUG: Loading JSON file: /usr/local/lib/python3.11/site-packages/botocore/data/_retry.json
2025-09-26 00:53:10 [botocore.client] DEBUG: Registering retry handlers for service: s3
2025-09-26 00:53:10 [botocore.utils] DEBUG: Registering S3 region redirector handler
2025-09-26 00:53:10 [botocore.utils] DEBUG: Registering S3Express Identity Resolver
2025-09-26 00:53:10 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.memusage.MemoryUsage',
 'scrapy.extensions.closespider.CloseSpider',
 'scrapy.extensions.feedexport.FeedExporter',
 'scrapy.extensions.logstats.LogStats',
 'scrapy.extensions.throttle.AutoThrottle']
2025-09-26 00:53:10 [scrapy.crawler] INFO: Overridden settings:
{'AUTOTHROTTLE_ENABLED': True,
 'BOT_NAME': 'news_scraper',
 'CLOSESPIDER_TIMEOUT': 1800,
 'CONCURRENT_REQUESTS': 4,
 'DOWNLOAD_DELAY': 2,
 'FEED_EXPORT_ENCODING': 'utf-8',
 'LOG_FILE': '/opt/scrapyd/logs/news_scraper/vnexpress_timestamp/25df637e9a7311f086971e907748958e.log',
 'NEWSPIDER_MODULE': 'news_scraper.spiders',
 'REQUEST_FINGERPRINTER_IMPLEMENTATION': '2.7',
 'ROBOTSTXT_OBEY': True,
 'SPIDER_MODULES': ['news_scraper.spiders'],
 'TWISTED_REACTOR': 'twisted.internet.asyncioreactor.AsyncioSelectorReactor'}
2025-09-26 00:53:10 [scrapy.middleware] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.robotstxt.RobotsTxtMiddleware',
 'scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'news_scraper.middlewares.NewsScraperDownloaderMiddleware',
 'scrapy.downloadermiddlewares.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2025-09-26 00:53:10 [scrapy.middleware] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2025-09-26 00:53:10 [scrapy.middleware] INFO: Enabled item pipelines:
[]
2025-09-26 00:53:10 [scrapy.core.engine] INFO: Spider opened
2025-09-26 00:53:10 [botocore.hooks] DEBUG: Changing event name from creating-client-class.iot-data to creating-client-class.iot-data-plane
2025-09-26 00:53:10 [botocore.hooks] DEBUG: Changing event name from before-call.apigateway to before-call.api-gateway
2025-09-26 00:53:10 [botocore.hooks] DEBUG: Changing event name from request-created.machinelearning.Predict to request-created.machine-learning.Predict
2025-09-26 00:53:10 [botocore.hooks] DEBUG: Changing event name from before-parameter-build.autoscaling.CreateLaunchConfiguration to before-parameter-build.auto-scaling.CreateLaunchConfiguration
2025-09-26 00:53:10 [botocore.hooks] DEBUG: Changing event name from before-parameter-build.route53 to before-parameter-build.route-53
2025-09-26 00:53:10 [botocore.hooks] DEBUG: Changing event name from request-created.cloudsearchdomain.Search to request-created.cloudsearch-domain.Search
2025-09-26 00:53:10 [botocore.hooks] DEBUG: Changing event name from docs.*.autoscaling.CreateLaunchConfiguration.complete-section to docs.*.auto-scaling.CreateLaunchConfiguration.complete-section
2025-09-26 00:53:10 [botocore.hooks] DEBUG: Changing event name from before-parameter-build.logs.CreateExportTask to before-parameter-build.cloudwatch-logs.CreateExportTask
2025-09-26 00:53:10 [botocore.hooks] DEBUG: Changing event name from docs.*.logs.CreateExportTask.complete-section to docs.*.cloudwatch-logs.CreateExportTask.complete-section
2025-09-26 00:53:10 [botocore.hooks] DEBUG: Changing event name from before-parameter-build.cloudsearchdomain.Search to before-parameter-build.cloudsearch-domain.Search
2025-09-26 00:53:10 [botocore.hooks] DEBUG: Changing event name from docs.*.cloudsearchdomain.Search.complete-section to docs.*.cloudsearch-domain.Search.complete-section
2025-09-26 00:53:10 [botocore.loaders] DEBUG: Loading JSON file: /usr/local/lib/python3.11/site-packages/botocore/data/endpoints.json
2025-09-26 00:53:11 [botocore.loaders] DEBUG: Loading JSON file: /usr/local/lib/python3.11/site-packages/botocore/data/sdk-default-configuration.json
2025-09-26 00:53:11 [botocore.hooks] DEBUG: Event choose-service-name: calling handler
2025-09-26 00:53:11 [botocore.loaders] DEBUG: Loading JSON file: /usr/local/lib/python3.11/site-packages/botocore/data/s3/2006-03-01/service-2.json.gz
2025-09-26 00:53:11 [botocore.loaders] DEBUG: Loading JSON file: /usr/local/lib/python3.11/site-packages/botocore/data/s3/2006-03-01/endpoint-rule-set-1.json.gz
2025-09-26 00:53:11 [botocore.loaders] DEBUG: Loading JSON file: /usr/local/lib/python3.11/site-packages/botocore/data/partitions.json
2025-09-26 00:53:11 [botocore.hooks] DEBUG: Event creating-client-class.s3: calling handler
2025-09-26 00:53:11 [botocore.hooks] DEBUG: Event creating-client-class.s3: calling handler
2025-09-26 00:53:11 [botocore.hooks] DEBUG: Event creating-client-class.s3: calling handler
2025-09-26 00:53:11 [botocore.endpoint] DEBUG: Setting s3 timeout as (60, 60)
2025-09-26 00:53:11 [botocore.loaders] DEBUG: Loading JSON file: /usr/local/lib/python3.11/site-packages/botocore/data/_retry.json
2025-09-26 00:53:11 [botocore.client] DEBUG: Registering retry handlers for service: s3
2025-09-26 00:53:11 [botocore.utils] DEBUG: Registering S3 region redirector handler
2025-09-26 00:53:11 [botocore.utils] DEBUG: Registering S3Express Identity Resolver
2025-09-26 00:53:11 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2025-09-26 00:53:11 [vnexpress_timestamp] INFO: Spider opened: vnexpress_timestamp
2025-09-26 00:53:11 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6030
2025-09-26 00:53:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2025-09-26 00:53:17 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2025-09-26 00:53:20 [scrapy.core.engine] DEBUG: Crawled (200) (referer: https://vnexpress.net/)
2025-09-26 00:53:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: https://vnexpress.net/tin-tuc-24h)
2025-09-26 00:53:23 [vnexpress_timestamp] INFO: 2025-09-26 06:42:33 smaller than 2025-09-26 07:37:00
2025-09-26 00:53:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: https://vnexpress.net/tin-tuc-24h)
2025-09-26 00:53:25 [vnexpress_timestamp] INFO: 2025-09-26 06:50:10 smaller than 2025-09-26 07:37:00
2025-09-26 00:53:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: https://vnexpress.net/tin-tuc-24h)
2025-09-26 00:53:28 [vnexpress_timestamp] INFO: 2025-09-26 07:00:00 smaller than 2025-09-26 07:37:00
2025-09-26 00:53:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: https://vnexpress.net/tin-tuc-24h)
2025-09-26 00:53:30 [vnexpress_timestamp] INFO: 2025-09-26 07:09:14 smaller than 2025-09-26 07:37:00
2025-09-26 00:53:33 [scrapy.core.engine] DEBUG: Crawled (200) (referer: https://vnexpress.net/tin-tuc-24h)
2025-09-26 00:53:33 [vnexpress_timestamp] INFO: 2025-09-26 00:00:00 smaller than 2025-09-26 07:37:00
2025-09-26 00:53:36 [scrapy.core.engine] DEBUG: Crawled (200) (referer: https://vnexpress.net/tin-tuc-24h)
2025-09-26 00:53:36 [vnexpress_timestamp] INFO: 2025-09-26 00:00:00 smaller than 2025-09-26 07:37:00
2025-09-26 00:53:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: https://vnexpress.net/tin-tuc-24h)
2025-09-26 00:53:38 [vnexpress_timestamp] INFO: 2025-09-26 00:00:00 smaller than 2025-09-26 07:37:00
2025-09-26 00:53:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: https://vnexpress.net/tin-tuc-24h)
2025-09-26 00:53:40 [vnexpress_timestamp] INFO: 2025-09-26 00:00:00 smaller than 2025-09-26 07:37:00
2025-09-26 00:53:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: https://vnexpress.net/tin-tuc-24h)
2025-09-26 00:53:43 [vnexpress_timestamp] INFO: 2025-09-26 00:01:19 smaller than 2025-09-26 07:37:00
2025-09-26 00:53:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: https://vnexpress.net/tin-tuc-24h)
2025-09-26 00:53:46 [vnexpress_timestamp] INFO: 2025-09-26 00:02:01 smaller than 2025-09-26 07:37:00
2025-09-26 00:53:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: https://vnexpress.net/tin-tuc-24h)
2025-09-26 00:53:49 [vnexpress_timestamp] INFO: 2025-09-26 00:03:00 smaller than 2025-09-26 07:37:00
2025-09-26 00:53:51 [scrapy.core.engine] DEBUG: Crawled (200) (referer: https://vnexpress.net/tin-tuc-24h)
2025-09-26 00:53:52 [vnexpress_timestamp] INFO: 2025-09-26 01:00:00 smaller than 2025-09-26 07:37:00
2025-09-26 00:53:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: https://vnexpress.net/tin-tuc-24h)
2025-09-26 00:53:54 [vnexpress_timestamp] INFO: 2025-09-26 02:54:47 smaller than 2025-09-26 07:37:00
2025-09-26 00:53:57 [scrapy.core.engine] DEBUG: Crawled (200) (referer: https://vnexpress.net/tin-tuc-24h)
2025-09-26 00:53:57 [vnexpress_timestamp] INFO: 2025-09-26 03:00:44 smaller than 2025-09-26 07:37:00
2025-09-26 00:53:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: https://vnexpress.net/tin-tuc-24h)
2025-09-26 00:53:59 [vnexpress_timestamp] INFO: 2025-09-26 04:00:00 smaller than 2025-09-26 07:37:00
2025-09-26 00:54:01 [scrapy.core.engine] DEBUG: Crawled (200) (referer: https://vnexpress.net/tin-tuc-24h)
2025-09-26 00:54:01 [vnexpress_timestamp] INFO: 2025-09-26 05:00:00 smaller than 2025-09-26 07:37:00
2025-09-26 00:54:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: https://vnexpress.net/tin-tuc-24h)
2025-09-26 00:54:04 [vnexpress_timestamp] INFO: 2025-09-26 05:05:24 smaller than 2025-09-26 07:37:00
2025-09-26 00:54:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: https://vnexpress.net/tin-tuc-24h)
2025-09-26 00:54:07 [vnexpress_timestamp] INFO: 2025-09-26 05:20:23 smaller than 2025-09-26 07:37:00
2025-09-26 00:54:09 [scrapy.core.engine] DEBUG: Crawled (200) (referer: https://vnexpress.net/tin-tuc-24h)
2025-09-26 00:54:09 [scrapy.core.scraper] ERROR: Spider error processing (referer: https://vnexpress.net/tin-tuc-24h)
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/scrapy/utils/defer.py", line 279, in iter_errback
    yield next(it)
          ^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/utils/python.py", line 350, in __next__
    return next(self.data)
           ^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/utils/python.py", line 350, in __next__
    return next(self.data)
           ^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/core/spidermw.py", line 106, in process_sync
    for r in iterable:
  File "/usr/local/lib/python3.11/site-packages/scrapy/spidermiddlewares/offsite.py", line 28, in <genexpr>
    return (r for r in result or () if self._filter(r, spider))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/core/spidermw.py", line 106, in process_sync
    for r in iterable:
  File "/usr/local/lib/python3.11/site-packages/scrapy/spidermiddlewares/referer.py", line 352, in <genexpr>
    return (self._set_referer(r, response) for r in result or ())
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/core/spidermw.py", line 106, in process_sync
    for r in iterable:
  File "/usr/local/lib/python3.11/site-packages/scrapy/spidermiddlewares/urllength.py", line 27, in <genexpr>
    return (r for r in result or () if self._filter(r, spider))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/core/spidermw.py", line 106, in process_sync
    for r in iterable:
  File "/usr/local/lib/python3.11/site-packages/scrapy/spidermiddlewares/depth.py", line 31, in <genexpr>
    return (r for r in result or () if self._filter(r, response, spider))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/core/spidermw.py", line 106, in process_sync
    for r in iterable:
  File "/opt/scrapy_projects/news_scraper/spiders/vnexpress_timestamp_spider.py", line 67, in parse_article
    article_valid = item.parse_metadata_with_xpath(
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/scrapy_projects/news_scraper/items.py", line 176, in parse_metadata_with_xpath
    published_date = dparser.parse(published_date_raw, dayfirst=day_first, ignoretz=True, fuzzy=True).date()
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/dateutil/parser/_parser.py", line 1368, in parse
    return DEFAULTPARSER.parse(timestr, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/dateutil/parser/_parser.py", line 643, in parse
    raise ParserError("Unknown string format: %s", timestr)
dateutil.parser._parser.ParserError: Unknown string format: 2025-09-26 05:41 + 07:00
2025-09-26 00:54:11 [scrapy.extensions.logstats] INFO: Crawled 22 pages (at 22 pages/min), scraped 0 items (at 0 items/min)
2025-09-26 00:54:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: https://vnexpress.net/tin-tuc-24h)
2025-09-26 00:54:11 [vnexpress_timestamp] INFO: 2025-09-26 05:42:37 smaller than 2025-09-26 07:37:00
2025-09-26 00:54:13 [scrapy.core.engine] DEBUG: Crawled (200) (referer: https://vnexpress.net/tin-tuc-24h)
2025-09-26 00:54:13 [scrapy.core.scraper] ERROR: Spider error processing (referer: https://vnexpress.net/tin-tuc-24h)
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/scrapy/utils/defer.py", line 279, in iter_errback
    yield next(it)
          ^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/utils/python.py", line 350, in __next__
    return next(self.data)
           ^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/utils/python.py", line 350, in __next__
    return next(self.data)
           ^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/core/spidermw.py", line 106, in process_sync
    for r in iterable:
  File "/usr/local/lib/python3.11/site-packages/scrapy/spidermiddlewares/offsite.py", line 28, in <genexpr>
    return (r for r in result or () if self._filter(r, spider))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/core/spidermw.py", line 106, in process_sync
    for r in iterable:
  File "/usr/local/lib/python3.11/site-packages/scrapy/spidermiddlewares/referer.py", line 352, in <genexpr>
    return (self._set_referer(r, response) for r in result or ())
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/core/spidermw.py", line 106, in process_sync
    for r in iterable:
  File "/usr/local/lib/python3.11/site-packages/scrapy/spidermiddlewares/urllength.py", line 27, in <genexpr>
    return (r for r in result or () if self._filter(r, spider))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/core/spidermw.py", line 106, in process_sync
    for r in iterable:
  File "/usr/local/lib/python3.11/site-packages/scrapy/spidermiddlewares/depth.py", line 31, in <genexpr>
    return (r for r in result or () if self._filter(r, response, spider))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/core/spidermw.py", line 106, in process_sync
    for r in iterable:
  File "/opt/scrapy_projects/news_scraper/spiders/vnexpress_timestamp_spider.py", line 67, in parse_article
    article_valid = item.parse_metadata_with_xpath(
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/scrapy_projects/news_scraper/items.py", line 176, in parse_metadata_with_xpath
    published_date = dparser.parse(published_date_raw, dayfirst=day_first, ignoretz=True, fuzzy=True).date()
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/dateutil/parser/_parser.py", line 1368, in parse
    return DEFAULTPARSER.parse(timestr, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/dateutil/parser/_parser.py", line 643, in parse
    raise ParserError("Unknown string format: %s", timestr)
dateutil.parser._parser.ParserError: Unknown string format: 2025-09-26 06:00 + 07:00
2025-09-26 00:54:16 [scrapy.core.engine] DEBUG: Crawled (200) (referer: https://vnexpress.net/tin-tuc-24h)
2025-09-26 00:54:16 [vnexpress_timestamp] INFO: 2025-09-26 06:00:00 smaller than 2025-09-26 07:37:00
2025-09-26 00:54:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: https://vnexpress.net/tin-tuc-24h)
2025-09-26 00:54:18 [vnexpress_timestamp] INFO: 2025-09-26 06:00:00 smaller than 2025-09-26 07:37:00
2025-09-26 00:54:20 [scrapy.core.engine] DEBUG: Crawled (200) (referer: https://vnexpress.net/tin-tuc-24h)
2025-09-26 00:54:21 [vnexpress_timestamp] INFO: 2025-09-26 06:00:00 smaller than 2025-09-26 07:37:00
2025-09-26 00:54:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: https://vnexpress.net/tin-tuc-24h)
2025-09-26 00:54:23 [vnexpress_timestamp] INFO: 2025-09-26 06:00:00 smaller than 2025-09-26 07:37:00
2025-09-26 00:54:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: https://vnexpress.net/tin-tuc-24h)
2025-09-26 00:54:26 [vnexpress_timestamp] INFO: 2025-09-26 06:00:00 smaller than 2025-09-26 07:37:00
2025-09-26 00:54:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: https://vnexpress.net/tin-tuc-24h)
2025-09-26 00:54:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: https://vnexpress.net/tin-tuc-24h)
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/scrapy/utils/defer.py", line 279, in iter_errback
    yield next(it)
          ^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/utils/python.py", line 350, in __next__
    return next(self.data)
           ^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/utils/python.py", line 350, in __next__
    return next(self.data)
           ^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/core/spidermw.py", line 106, in process_sync
    for r in iterable:
  File "/usr/local/lib/python3.11/site-packages/scrapy/spidermiddlewares/offsite.py", line 28, in <genexpr>
    return (r for r in result or () if self._filter(r, spider))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/core/spidermw.py", line 106, in process_sync
    for r in iterable:
  File "/usr/local/lib/python3.11/site-packages/scrapy/spidermiddlewares/referer.py", line 352, in <genexpr>
    return (self._set_referer(r, response) for r in result or ())
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/core/spidermw.py", line 106, in process_sync
    for r in iterable:
  File "/usr/local/lib/python3.11/site-packages/scrapy/spidermiddlewares/urllength.py", line 27, in <genexpr>
    return (r for r in result or () if self._filter(r, spider))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/core/spidermw.py", line 106, in process_sync
    for r in iterable:
  File "/usr/local/lib/python3.11/site-packages/scrapy/spidermiddlewares/depth.py", line 31, in <genexpr>
    return (r for r in result or () if self._filter(r, response, spider))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/core/spidermw.py", line 106, in process_sync
    for r in iterable:
  File "/opt/scrapy_projects/news_scraper/spiders/vnexpress_timestamp_spider.py", line 67, in parse_article
    article_valid = item.parse_metadata_with_xpath(
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/scrapy_projects/news_scraper/items.py", line 176, in parse_metadata_with_xpath
    published_date = dparser.parse(published_date_raw, dayfirst=day_first, ignoretz=True, fuzzy=True).date()
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/dateutil/parser/_parser.py", line 1368, in parse
    return DEFAULTPARSER.parse(timestr, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/dateutil/parser/_parser.py", line 643, in parse
    raise ParserError("Unknown string format: %s", timestr)
dateutil.parser._parser.ParserError: Unknown string format: 2025-09-26 06:00 + 07:00
2025-09-26 00:54:32 [scrapy.core.engine] DEBUG: Crawled (200) (referer: https://vnexpress.net/tin-tuc-24h)
2025-09-26 00:54:32 [scrapy.core.scraper] ERROR: Spider error processing (referer: https://vnexpress.net/tin-tuc-24h)
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/scrapy/utils/defer.py", line 279, in iter_errback
    yield next(it)
          ^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/utils/python.py", line 350, in __next__
    return next(self.data)
           ^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/utils/python.py", line 350, in __next__
    return next(self.data)
           ^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/core/spidermw.py", line 106, in process_sync
    for r in iterable:
  File "/usr/local/lib/python3.11/site-packages/scrapy/spidermiddlewares/offsite.py", line 28, in <genexpr>
    return (r for r in result or () if self._filter(r, spider))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/core/spidermw.py", line 106, in process_sync
    for r in iterable:
  File "/usr/local/lib/python3.11/site-packages/scrapy/spidermiddlewares/referer.py", line 352, in <genexpr>
    return (self._set_referer(r, response) for r in result or ())
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/core/spidermw.py", line 106, in process_sync
    for r in iterable:
  File "/usr/local/lib/python3.11/site-packages/scrapy/spidermiddlewares/urllength.py", line 27, in <genexpr>
    return (r for r in result or () if self._filter(r, spider))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/core/spidermw.py", line 106, in process_sync
    for r in iterable:
  File "/usr/local/lib/python3.11/site-packages/scrapy/spidermiddlewares/depth.py", line 31, in <genexpr>
    return (r for r in result or () if self._filter(r, response, spider))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/scrapy/core/spidermw.py", line 106, in process_sync
    for r in iterable:
  File "/opt/scrapy_projects/news_scraper/spiders/vnexpress_timestamp_spider.py", line 67, in parse_article
    article_valid = item.parse_metadata_with_xpath(
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/scrapy_projects/news_scraper/items.py", line 176, in parse_metadata_with_xpath
    published_date = dparser.parse(published_date_raw, dayfirst=day_first, ignoretz=True, fuzzy=True).date()
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/dateutil/parser/_parser.py", line 1368, in parse
    return DEFAULTPARSER.parse(timestr, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/dateutil/parser/_parser.py", line 643, in parse
    raise ParserError("Unknown string format: %s", timestr)
dateutil.parser._parser.ParserError: Unknown string format: 2025-09-26 06:21 + 07:00
2025-09-26 00:54:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: https://vnexpress.net/tin-tuc-24h)
2025-09-26 00:54:35 [vnexpress_timestamp] INFO: 2025-09-26 06:24:12 smaller than 2025-09-26 07:37:00
2025-09-26 00:54:36 [scrapy.core.engine] DEBUG: Crawled (200) (referer: https://vnexpress.net/tin-tuc-24h)
2025-09-26 00:54:36 [vnexpress_timestamp] INFO: 2025-09-26 07:25:38 smaller than 2025-09-26 07:37:00
2025-09-26 00:54:36 [scrapy.core.engine] INFO: Closing spider (finished)
2025-09-26 00:54:36 [boto3.s3.transfer] DEBUG: Opting out of CRT Transfer Manager. Preferred client: auto, CRT available: False, Instance Optimized: False.
2025-09-26 00:54:36 [boto3.s3.transfer] DEBUG: Using default client. pid: 171731, thread: 140620065315640
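[editor's note] The four ParserError tracebacks above all fail on strings like "2025-09-26 05:41 + 07:00", where the space inside the "+ 07:00" offset is what dateutil cannot parse. Below is a minimal sketch of one possible workaround; parse_published_date is a hypothetical helper, not the project's actual items.py code, and it assumes dropping the offset is acceptable because the spider already passes ignoretz=True.

    # Hypothetical helper (not the project's items.py): strip a trailing
    # "+ 07:00" / "+07:00" style offset before handing the string to dateutil.
    import re
    from dateutil import parser as dparser

    def parse_published_date(published_date_raw, day_first=True):
        # ignoretz=True discards the offset anyway, so removing it first is safe here.
        cleaned = re.sub(r"\s*[+-]\s*\d{1,2}:\d{2}\s*$", "", published_date_raw.strip())
        return dparser.parse(cleaned, dayfirst=day_first, ignoretz=True, fuzzy=True).date()

    # Example: parse_published_date("2025-09-26 05:41 + 07:00") -> datetime.date(2025, 9, 26)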
2025-09-26 00:54:36 [s3transfer.utils] DEBUG: Acquiring 0
2025-09-26 00:54:36 [s3transfer.tasks] DEBUG: UploadSubmissionTask(transfer_id=0, {'transfer_future': }) about to wait for the following futures []
2025-09-26 00:54:36 [s3transfer.tasks] DEBUG: UploadSubmissionTask(transfer_id=0, {'transfer_future': }) done waiting for dependent futures
2025-09-26 00:54:36 [s3transfer.tasks] DEBUG: Executing task UploadSubmissionTask(transfer_id=0, {'transfer_future': }) with kwargs {'client': , 'config': , 'osutil': , 'request_executor': , 'transfer_future': }
2025-09-26 00:54:36 [s3transfer.futures] DEBUG: Submitting task PutObjectTask(transfer_id=0, {'bucket': 'dagster-output-data', 'key': 'vnexpress_timestamp/vnexpress_timestamp_25df637e9a7311f086971e907748958e_scheduled_2025-09-26.jl', 'extra_args': {}}) to executor for transfer request: 0.
2025-09-26 00:54:36 [s3transfer.utils] DEBUG: Acquiring 0
2025-09-26 00:54:36 [s3transfer.tasks] DEBUG: PutObjectTask(transfer_id=0, {'bucket': 'dagster-output-data', 'key': 'vnexpress_timestamp/vnexpress_timestamp_25df637e9a7311f086971e907748958e_scheduled_2025-09-26.jl', 'extra_args': {}}) about to wait for the following futures []
2025-09-26 00:54:36 [s3transfer.utils] DEBUG: Releasing acquire 0/None
2025-09-26 00:54:36 [s3transfer.tasks] DEBUG: PutObjectTask(transfer_id=0, {'bucket': 'dagster-output-data', 'key': 'vnexpress_timestamp/vnexpress_timestamp_25df637e9a7311f086971e907748958e_scheduled_2025-09-26.jl', 'extra_args': {}}) done waiting for dependent futures
2025-09-26 00:54:36 [s3transfer.tasks] DEBUG: Executing task PutObjectTask(transfer_id=0, {'bucket': 'dagster-output-data', 'key': 'vnexpress_timestamp/vnexpress_timestamp_25df637e9a7311f086971e907748958e_scheduled_2025-09-26.jl', 'extra_args': {}}) with kwargs {'client': , 'fileobj': , 'bucket': 'dagster-output-data', 'key': 'vnexpress_timestamp/vnexpress_timestamp_25df637e9a7311f086971e907748958e_scheduled_2025-09-26.jl', 'extra_args': {}}
2025-09-26 00:54:36 [botocore.hooks] DEBUG: Event before-parameter-build.s3.PutObject: calling handler
2025-09-26 00:54:36 [botocore.hooks] DEBUG: Event before-parameter-build.s3.PutObject: calling handler
2025-09-26 00:54:36 [botocore.hooks] DEBUG: Event before-parameter-build.s3.PutObject: calling handler
2025-09-26 00:54:36 [botocore.hooks] DEBUG: Event before-parameter-build.s3.PutObject: calling handler
2025-09-26 00:54:36 [botocore.hooks] DEBUG: Event before-parameter-build.s3.PutObject: calling handler
2025-09-26 00:54:36 [botocore.hooks] DEBUG: Event before-parameter-build.s3.PutObject: calling handler
2025-09-26 00:54:36 [botocore.hooks] DEBUG: Event before-parameter-build.s3.PutObject: calling handler
2025-09-26 00:54:36 [botocore.hooks] DEBUG: Event before-parameter-build.s3.PutObject: calling handler
2025-09-26 00:54:36 [botocore.hooks] DEBUG: Event before-endpoint-resolution.s3: calling handler
2025-09-26 00:54:36 [botocore.hooks] DEBUG: Event before-endpoint-resolution.s3: calling handler
2025-09-26 00:54:36 [botocore.regions] DEBUG: Calling endpoint provider with parameters: {'Bucket': 'dagster-output-data', 'Region': 'us-east-1', 'UseFIPS': False, 'UseDualStack': False, 'Endpoint': 'https://lake-api.actable.ai/', 'ForcePathStyle': True, 'Accelerate': False, 'UseGlobalEndpoint': True, 'Key': 'vnexpress_timestamp/vnexpress_timestamp_25df637e9a7311f086971e907748958e_scheduled_2025-09-26.jl', 'DisableMultiRegionAccessPoints': False, 'UseArnRegion': True}
2025-09-26 00:54:36 [botocore.regions] DEBUG: Endpoint provider result: https://lake-api.actable.ai/dagster-output-data
2025-09-26 00:54:36 [botocore.regions] DEBUG: Selecting from endpoint provider's list of auth schemes: "sigv4". User selected auth scheme is: "None"
2025-09-26 00:54:36 [botocore.regions] DEBUG: Selected auth type "v4" as "v4" with signing context params: {'region': 'us-east-1', 'signing_name': 's3', 'disableDoubleEncoding': True}
2025-09-26 00:54:36 [botocore.hooks] DEBUG: Event before-call.s3.PutObject: calling handler
2025-09-26 00:54:36 [botocore.hooks] DEBUG: Event before-call.s3.PutObject: calling handler
2025-09-26 00:54:36 [botocore.handlers] DEBUG: Adding expect 100 continue header to request.
2025-09-26 00:54:36 [botocore.hooks] DEBUG: Event before-call.s3.PutObject: calling handler
2025-09-26 00:54:36 [botocore.hooks] DEBUG: Event before-call.s3.PutObject: calling handler
2025-09-26 00:54:36 [botocore.hooks] DEBUG: Event before-call.s3.PutObject: calling handler
2025-09-26 00:54:36 [botocore.endpoint] DEBUG: Making request for OperationModel(name=PutObject) with params: {'url_path': '/vnexpress_timestamp/vnexpress_timestamp_25df637e9a7311f086971e907748958e_scheduled_2025-09-26.jl', 'query_string': {}, 'method': 'PUT', 'headers': {'User-Agent': 'Boto3/1.34.57 md/Botocore#1.34.162 ua/2.0 os/linux#5.15.0-139-generic md/arch#x86_64 lang/python#3.11.13 md/pyimpl#CPython cfg/retry-mode#legacy Botocore/1.34.162', 'Content-MD5': '1B2M2Y8AsgTpgAmY7PhCfg==', 'Expect': '100-continue'}, 'body': , 'auth_path': '/dagster-output-data/vnexpress_timestamp/vnexpress_timestamp_25df637e9a7311f086971e907748958e_scheduled_2025-09-26.jl', 'url': 'https://lake-api.actable.ai/dagster-output-data/vnexpress_timestamp/vnexpress_timestamp_25df637e9a7311f086971e907748958e_scheduled_2025-09-26.jl', 'context': {'client_region': 'us-east-1', 'client_config': , 'has_streaming_input': True, 'auth_type': 'v4', 's3_redirect': {'redirected': False, 'bucket': 'dagster-output-data', 'params': {'Bucket': 'dagster-output-data', 'Key': 'vnexpress_timestamp/vnexpress_timestamp_25df637e9a7311f086971e907748958e_scheduled_2025-09-26.jl', 'Body': }}, 'input_params': {'Bucket': 'dagster-output-data', 'Key': 'vnexpress_timestamp/vnexpress_timestamp_25df637e9a7311f086971e907748958e_scheduled_2025-09-26.jl'}, 'signing': {'region': 'us-east-1', 'signing_name': 's3', 'disableDoubleEncoding': True}, 'endpoint_properties': {'authSchemes': [{'disableDoubleEncoding': True, 'name': 'sigv4', 'signingName': 's3', 'signingRegion': 'us-east-1'}]}}}
2025-09-26 00:54:36 [botocore.hooks] DEBUG: Event request-created.s3.PutObject: calling handler
2025-09-26 00:54:36 [botocore.hooks] DEBUG: Event request-created.s3.PutObject: calling handler
2025-09-26 00:54:36 [botocore.hooks] DEBUG: Event choose-signer.s3.PutObject: calling handler
2025-09-26 00:54:36 [botocore.hooks] DEBUG: Event choose-signer.s3.PutObject: calling handler
2025-09-26 00:54:36 [botocore.hooks] DEBUG: Event before-sign.s3.PutObject: calling handler
2025-09-26 00:54:36 [botocore.hooks] DEBUG: Event before-sign.s3.PutObject: calling handler
2025-09-26 00:54:36 [botocore.auth] DEBUG: Calculating signature using v4 auth.
2025-09-26 00:54:36 [botocore.auth] DEBUG: CanonicalRequest:
PUT
/dagster-output-data/vnexpress_timestamp/vnexpress_timestamp_25df637e9a7311f086971e907748958e_scheduled_2025-09-26.jl

content-md5:1B2M2Y8AsgTpgAmY7PhCfg==
host:lake-api.actable.ai
x-amz-content-sha256:UNSIGNED-PAYLOAD
x-amz-date:20250926T005436Z

content-md5;host;x-amz-content-sha256;x-amz-date
UNSIGNED-PAYLOAD
2025-09-26 00:54:36 [botocore.auth] DEBUG: StringToSign:
AWS4-HMAC-SHA256
20250926T005436Z
20250926/us-east-1/s3/aws4_request
1a01d2ae75baa5668a653d255ae7f9e72e64ca3784633e8d5b696fe98d73984b
2025-09-26 00:54:36 [botocore.auth] DEBUG: Signature: 83feb703c8129cb79ba40cd775de6310214c7ceca1a7843ff2749cf4ec9791b9
2025-09-26 00:54:36 [botocore.hooks] DEBUG: Event request-created.s3.PutObject: calling handler
2025-09-26 00:54:36 [botocore.hooks] DEBUG: Event request-created.s3.PutObject: calling handler
2025-09-26 00:54:36 [botocore.endpoint] DEBUG: Sending http request:
2025-09-26 00:54:36 [botocore.httpsession] DEBUG: Certificate path: /usr/local/lib/python3.11/site-packages/certifi/cacert.pem
2025-09-26 00:54:36 [urllib3.connectionpool] DEBUG: Starting new HTTPS connection (1): lake-api.actable.ai:443
2025-09-26 00:54:36 [botocore.awsrequest] DEBUG: Waiting for 100 Continue response.
2025-09-26 00:54:36 [botocore.awsrequest] DEBUG: 100 Continue response seen, now sending request body.
2025-09-26 00:54:36 [urllib3.connectionpool] DEBUG: https://lake-api.actable.ai:443 "PUT /dagster-output-data/vnexpress_timestamp/vnexpress_timestamp_25df637e9a7311f086971e907748958e_scheduled_2025-09-26.jl HTTP/1.1" 200 0
2025-09-26 00:54:36 [botocore.parsers] DEBUG: Response headers: {'Server': 'nginx/1.24.0 (Ubuntu)', 'Date': 'Fri, 26 Sep 2025 00:54:36 GMT', 'Content-Length': '0', 'Connection': 'keep-alive', 'Accept-Ranges': 'bytes', 'ETag': '"d41d8cd98f00b204e9800998ecf8427e"', 'Strict-Transport-Security': 'max-age=31536000; includeSubDomains', 'Vary': 'Origin, Accept-Encoding', 'X-Amz-Bucket-Region': 'us-east-1', 'X-Amz-Id-2': 'dd9025bab4ad464b049177c95eb6ebf374d3b3fd1af9251148b658df7ac2e3e8', 'X-Amz-Request-Id': '1868AF01C89F9B9D', 'X-Content-Type-Options': 'nosniff', 'X-Ratelimit-Limit': '25637', 'X-Ratelimit-Remaining': '25637', 'X-Xss-Protection': '1; mode=block'}
2025-09-26 00:54:36 [botocore.parsers] DEBUG: Response body: b''
2025-09-26 00:54:36 [botocore.hooks] DEBUG: Event needs-retry.s3.PutObject: calling handler
2025-09-26 00:54:36 [botocore.retryhandler] DEBUG: No retry needed.
2025-09-26 00:54:36 [botocore.hooks] DEBUG: Event needs-retry.s3.PutObject: calling handler
2025-09-26 00:54:36 [s3transfer.utils] DEBUG: Releasing acquire 0/None
2025-09-26 00:54:36 [scrapy.extensions.feedexport] INFO: Stored jsonlines feed (0 items) in: s3://dagster-output-data/vnexpress_timestamp/vnexpress_timestamp_25df637e9a7311f086971e907748958e_scheduled_2025-09-26.jl
2025-09-26 00:54:36 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'downloader/request_bytes': 11096,
 'downloader/request_count': 33,
 'downloader/request_method_count/GET': 33,
 'downloader/response_bytes': 1657522,
 'downloader/response_count': 33,
 'downloader/response_status_count/200': 33,
 'elapsed_time_seconds': 85.818098,
 'feedexport/success_count/S3FeedStorage': 1,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2025, 9, 26, 0, 54, 36, 667155, tzinfo=datetime.timezone.utc),
 'httpcompression/response_bytes': 7638288,
 'httpcompression/response_count': 33,
 'log_count/DEBUG': 143,
 'log_count/ERROR': 4,
 'log_count/INFO': 39,
 'memusage/max': 142225408,
 'memusage/startup': 124547072,
 'request_depth_max': 2,
 'response_received_count': 33,
 'robotstxt/request_count': 1,
 'robotstxt/response_count': 1,
 'robotstxt/response_status_count/200': 1,
 'scheduler/dequeued': 32,
 'scheduler/dequeued/memory': 32,
 'scheduler/enqueued': 32,
 'scheduler/enqueued/memory': 32,
 'spider_exceptions/ParserError': 4,
 'start_time': datetime.datetime(2025, 9, 26, 0, 53, 10, 849057, tzinfo=datetime.timezone.utc)}
2025-09-26 00:54:36 [scrapy.core.engine] INFO: Spider closed (finished)
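[editor's note] The feed-export lines above (S3FeedStorage, the signed PutObject to https://lake-api.actable.ai/, and the final "Stored jsonlines feed (0 items)") are consistent with Scrapy settings roughly like the sketch below. The exact FEEDS URI template and the AWS_* credentials are assumptions for illustration, not the project's real configuration.

    # Hedged sketch of settings.py values that could produce an export like the one
    # logged above; bucket, key prefix and endpoint are taken from the log, while the
    # credentials and the URI template are placeholders.
    FEEDS = {
        "s3://dagster-output-data/vnexpress_timestamp/%(name)s_%(time)s.jl": {
            "format": "jsonlines",
            "encoding": "utf-8",
        },
    }
    AWS_ENDPOINT_URL = "https://lake-api.actable.ai/"  # S3-compatible endpoint seen in the log
    AWS_ACCESS_KEY_ID = "..."      # placeholder
    AWS_SECRET_ACCESS_KEY = "..."  # placeholder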