Description
I have a simple Scrapy script that fails on Ubuntu 18.04 with a weird memory error.
It works fine on my local Mac but fails on the remote host.
It looks like an OpenSSL/cffi issue. Any advice is appreciated.
Steps to Reproduce
Simply run the Scrapy script (a minimal example of the kind of script is sketched below).
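For reference, here is a minimal spider of the kind that triggers the error. This is a hypothetical sketch (the original script is not shown); any HTTPS start URL should do, since the failure happens while building the TLS context, before anything is fetched:

import scrapy


class MinimalSpider(scrapy.Spider):
    # Hypothetical name and URL, for illustration only.
    name = "minimal"
    start_urls = ["https://example.com/"]

    def parse(self, response):
        self.logger.info("got %d bytes", len(response.body))

Run it with `scrapy runspider minimal_spider.py`.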
Expected behavior:
The crawl runs normally.
Actual behavior:
2019-10-31 20:24:51 [scrapy.downloadermiddlewares.robotstxt] ERROR: Error downloading <GET https://xxx.yyy/robots.txt>: Cannot allocate write+execute memory for ffi.callback(). You might be running on a system that prevents this. For more information, see https://cffi.readthedocs.io/en/latest/using.html#callbacks
Traceback (most recent call last):
File "/home/scrapy/env/local/lib/python2.7/site-packages/twisted/internet/defer.py", line 1416, in _inlineCallbacks
result = result.throwExceptionIntoGenerator(g)
File "/home/scrapy/env/local/lib/python2.7/site-packages/twisted/python/failure.py", line 512, in throwExceptionIntoGenerator
return g.throw(self.type, self.value, self.tb)
File "/home/scrapy/env/local/lib/python2.7/site-packages/scrapy/core/downloader/middleware.py", line 43, in process_request
defer.returnValue((yield download_func(request=request,spider=spider)))
File "/home/scrapy/env/local/lib/python2.7/site-packages/scrapy/utils/defer.py", line 45, in mustbe_deferred
result = f(*args, **kw)
File "/home/scrapy/env/local/lib/python2.7/site-packages/scrapy/core/downloader/handlers/__init__.py", line 71, in download_request
return handler.download_request(request, spider)
File "/home/scrapy/env/local/lib/python2.7/site-packages/scrapy/core/downloader/handlers/http11.py", line 68, in download_request
return agent.download_request(request)
File "/home/scrapy/env/local/lib/python2.7/site-packages/scrapy/core/downloader/handlers/http11.py", line 332, in download_request
method, to_bytes(url, encoding='ascii'), headers, bodyproducer)
File "/home/scrapy/env/local/lib/python2.7/site-packages/twisted/web/client.py", line 1732, in request
endpoint = self._getEndpoint(parsedURI)
File "/home/scrapy/env/local/lib/python2.7/site-packages/twisted/web/client.py", line 1715, in _getEndpoint
return self._endpointFactory.endpointForURI(uri)
File "/home/scrapy/env/local/lib/python2.7/site-packages/twisted/web/client.py", line 1590, in endpointForURI
uri.port)
File "/home/scrapy/env/local/lib/python2.7/site-packages/scrapy/core/downloader/contextfactory.py", line 59, in creatorForNetloc
return ScrapyClientTLSOptions(hostname.decode("ascii"), self.getContext())
File "/home/scrapy/env/local/lib/python2.7/site-packages/scrapy/core/downloader/contextfactory.py", line 56, in getContext
return self.getCertificateOptions().getContext()
File "/home/scrapy/env/local/lib/python2.7/site-packages/twisted/internet/_sslverify.py", line 1678, in getContext
self._context = self._makeContext()
File "/home/scrapy/env/local/lib/python2.7/site-packages/twisted/internet/_sslverify.py", line 1709, in _makeContext
ctx.set_verify(verifyFlags, _verifyCallback)
File "/home/scrapy/env/local/lib/python2.7/site-packages/OpenSSL/SSL.py", line 1103, in set_verify
self._verify_helper = _VerifyHelper(callback)
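The traceback bottoms out in pyOpenSSL's _VerifyHelper, which wraps the certificate-verification callback via cffi's ffi.callback(). ffi.callback() needs a page of memory that is writable and executable at the same time, so it fails on any system that enforces W^X. The same error should be reproducible on the affected host without Scrapy at all; a minimal sketch, assuming only that cffi is installed:

from cffi import FFI

ffi = FFI()

# ffi.callback() writes a small native trampoline into a page that must
# be both writable and executable; a W^X-enforcing kernel rejects this
# with "Cannot allocate write+execute memory".
@ffi.callback("int(int)")
def identity(x):
    return x

print(identity(42))

On an unaffected system this prints 42; on the host above it should raise the same MemoryError.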
Reproduces how often:
100%
Versions
$ scrapy version --verbose
Scrapy : 1.6.0
lxml : 4.4.1.0
libxml2 : 2.9.9
cssselect : 1.1.0
parsel : 1.5.2
w3lib : 1.21.0
Twisted : 19.7.0
Python : 2.7.15+ (default, Oct 7 2019, 17:39:04) - [GCC 7.4.0]
pyOpenSSL : 19.0.0 (OpenSSL 1.1.1d 10 Sep 2019)
cryptography : 2.8
Platform : Linux-4.14.117-grsec-grsec+-x86_64-with-Ubuntu-18.04-bionic
Additional context
$ cat /etc/os-release
NAME="Ubuntu"
VERSION="18.04.2 LTS (Bionic Beaver)"
PRETTY_NAME="Ubuntu 18.04.2 LTS"
VERSION_ID="18.04"
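The -grsec suffix in the Platform string above is the likely culprit: grsecurity/PaX MPROTECT forbids mappings that are simultaneously writable and executable, which is exactly what ffi.callback() asks for. One way to confirm (a sketch, assuming a Unix host) is to request such a mapping directly:

import mmap

try:
    # Request an anonymous page that is both writable and executable;
    # PaX MPROTECT and similar W^X policies reject this.
    m = mmap.mmap(-1, 4096,
                  prot=mmap.PROT_READ | mmap.PROT_WRITE | mmap.PROT_EXEC)
    m.close()
    print("write+execute mappings allowed")
except Exception as exc:  # mmap.error on Python 2, OSError on Python 3
    print("W^X enforced: %s" % exc)

If this fails, the restriction comes from the kernel, not from Scrapy or pyOpenSSL.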