I am working on a project that was developed with the Scrapy framework on behalf of my client. The client's server is on PythonAnywhere, and I am running into the following issues, which I cannot seem to resolve.
Any insight into the issue is much appreciated in advance.
case 1: virtualenvwrapper was used to create the virtualenv, and Scrapy was installed into it following the instructions on the PythonAnywhere blog.
case 2: the script runs fine from the console.
case 3: the problem appears when the same script is run as a scheduled task from the PythonAnywhere web interface.
case 4: below is the script he wants to schedule:
run_lycabundles_spider.sh
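The body of the script itself did not survive the paste above; only the code viewer's line numbers came through. Purely as a hypothetical reconstruction from the error output further down (the paths, the variable it sources, and the spider name are all guesses, not the real script):

```shell
#!/bin/bash
# Hypothetical reconstruction -- the real script body is missing above.
# The error "line 3: /scraping_test/bin/activate: No such file or directory"
# suggests line 3 sources the activate file via a prefix (for example
# $WORKON_HOME) that is empty in the scheduled task's environment.
# Using the full absolute path avoids depending on such a variable:
source /home/ratepath/.virtualenvs/scraping_test/bin/activate

cd /home/ratepath/.virtualenvs/scraping_test/projects/lycamobile_stack
scrapy crawl lycabundles   # spider name is an assumption

deactivate                 # "line 10: deactivate" in the log fits a line here
```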
Under "Add a new scheduled task:" the following task was added:
/home/ratepath/.virtualenvs/scraping_test/projects/lycamobile_stack/lycamobile_stack/run_lycabundles_spider.sh
This is the output I am getting:
/home/ratepath/.virtualenvs/scraping_test/projects/lycamobile_stack/lycamobile_stack/run_lycabundles_spider.sh: line 3: /scraping_test/bin/activate: No such file or directory
2016-01-19 13:19:07+0000 [scrapy] INFO: Scrapy 0.16.5 started (bot: lycamobile_spider)
Traceback (most recent call last):
  File "/usr/local/bin/scrapy", line 4, in <module>
    execute()
  File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 131, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 76, in _run_print_help
    func(*a, **kw)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 138, in _run_command
    cmd.run(args, opts)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/commands/crawl.py", line 43, in run
    spider = self.crawler.spiders.create(spname, **opts.spargs)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/command.py", line 33, in crawler
    self._crawler.configure()
  File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 38, in configure
    self.extensions = ExtensionManager.from_crawler(self)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/middleware.py", line 50, in from_crawler
    return cls.from_settings(crawler.settings, crawler)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/middleware.py", line 31, in from_settings
    mw = mwcls.from_crawler(crawler)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/contrib/feedexport.py", line 163, in from_crawler
    o = cls(crawler.settings)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/contrib/feedexport.py", line 143, in __init__
    self.exporters = self._load_components('FEED_EXPORTERS')
  File "/usr/local/lib/python2.7/dist-packages/scrapy/contrib/feedexport.py", line 201, in _load_components
    d[k] = load_object(v)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/utils/misc.py", line 39, in load_object
    raise ImportError, "Error loading object '%s': %s" % (path, e)
ImportError: Error loading object 'lycamobile_stack.exporters.LycamobileItemExporter': No module named exporters
/home/ratepath/.virtualenvs/scraping_test/projects/lycamobile_stack/lycamobile_stack/run_lycabundles_spider.sh: line 10: deactivate: command not found

2016-01-19 13:19:07 -- Completed task, took 1.00 seconds, return code was 127.
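One hedged reading of the first error line: the path it tried, /scraping_test/bin/activate, starts at the filesystem root, which is what you get when a shell-variable prefix expands to empty. Scheduled tasks do not run the startup files an interactive console runs, so a variable like WORKON_HOME (set up by virtualenvwrapper) can be unset there. A small sketch of that failure mode (WORKON_HOME here is an assumption about what the script references):

```shell
#!/bin/bash
# Illustration (not the actual script): a variable the interactive console
# sets, e.g. WORKON_HOME from virtualenvwrapper, is empty in the scheduled
# task's environment, so the prefix vanishes from the path.
unset WORKON_HOME
activate_path="$WORKON_HOME/scraping_test/bin/activate"
echo "$activate_path"   # prints /scraping_test/bin/activate

# Sourcing it then fails, just like line 3 of the scheduled script:
source "$activate_path" 2>/dev/null || echo "source failed: No such file or directory"
```

If that is the cause, the later errors are consistent with it: the virtualenv never activates, so the system-wide Scrapy 0.16.5 under /usr/local runs instead (hence the import error for lycamobile_stack.exporters), and `deactivate` is undefined when line 10 reaches it, giving return code 127.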