
Project not showing up in scrapyd

Stack Overflow user
Asked on 2014-10-03 15:31:52
2 answers · 640 views · 0 followers · 1 vote

I'm new to this, and I've put the following into my scrapy.cfg file.

[settings]
default = uk.settings


[deploy:scrapyd]
url = http://localhost:6800/
project=ukmall

[deploy:scrapyd2]
url = http://scrapyd.mydomain.com/api/scrapyd/
username = john
password = secret
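
For reference, scrapy.cfg is a standard INI file, and scrapyd-deploy treats each `[deploy:<name>]` section as a deploy target. A minimal sketch of that lookup (the `deploy_targets` helper is hypothetical, not part of scrapy — it just shows why the `-l` listing below contains both `scrapyd` and `scrapyd2`):

```python
import configparser

# The scrapy.cfg contents from above.
CFG = """\
[settings]
default = uk.settings

[deploy:scrapyd]
url = http://localhost:6800/
project = ukmall

[deploy:scrapyd2]
url = http://scrapyd.mydomain.com/api/scrapyd/
username = john
password = secret
"""

def deploy_targets(text):
    """Map each [deploy:<name>] section to its url, roughly what
    `scrapyd-deploy -l` prints."""
    cp = configparser.ConfigParser()
    cp.read_string(text)
    return {name.split(":", 1)[1]: cp[name]["url"]
            for name in cp.sections() if name.startswith("deploy:")}
```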

If I run the command below:

$scrapyd-deploy -l

I get:

scrapyd2             http://scrapyd.mydomain.com/api/scrapyd/

scrapyd              http://localhost:6800/

To see all available projects:

scrapyd-deploy -L scrapyd

But this shows nothing on my machine.

Reference: http://scrapyd.readthedocs.org/en/latest/deploy.html#deploying-a-project
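
`scrapyd-deploy -L <target>` asks the target server for its list of deployed projects, so an empty result usually just means nothing has been deployed to that scrapyd yet. A sketch of the underlying query against scrapyd's listprojects.json endpoint (the `parse_projects` helper is mine, and the localhost URL is an assumption taken from the scrapy.cfg above):

```python
import json
from urllib.request import urlopen  # for querying a live scrapyd

def parse_projects(body):
    """Pull the project list out of a scrapyd listprojects.json response."""
    data = json.loads(body)
    if data.get("status") != "ok":
        raise RuntimeError("scrapyd error: %s" % data.get("message"))
    return data.get("projects", [])

# Against a running scrapyd (assumed to be on localhost:6800, per scrapy.cfg):
#   with urlopen("http://localhost:6800/listprojects.json") as resp:
#       print(parse_projects(resp.read().decode()))
```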

And if I run:

anandhakumar@MMTPC104:~/ScrapyProject/mall_uk$ scrapy deploy scrapyd2
Packing version 1412322816
Traceback (most recent call last):
  File "/usr/bin/scrapy", line 4, in <module>
    execute()
  File "/usr/lib/pymodules/python2.7/scrapy/cmdline.py", line 142, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "/usr/lib/pymodules/python2.7/scrapy/cmdline.py", line 88, in _run_print_help
    func(*a, **kw)
  File "/usr/lib/pymodules/python2.7/scrapy/cmdline.py", line 149, in _run_command
    cmd.run(args, opts)
  File "/usr/lib/pymodules/python2.7/scrapy/commands/deploy.py", line 103, in run
    egg, tmpdir = _build_egg()
  File "/usr/lib/pymodules/python2.7/scrapy/commands/deploy.py", line 228, in _build_egg
    retry_on_eintr(check_call, [sys.executable, 'setup.py', 'clean', '-a', 'bdist_egg', '-d', d], stdout=o, stderr=e)
  File "/usr/lib/pymodules/python2.7/scrapy/utils/python.py", line 276, in retry_on_eintr
    return function(*args, **kw)
  File "/usr/lib/python2.7/subprocess.py", line 540, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['/usr/bin/python', 'setup.py', 'clean', '-a', 'bdist_egg', '-d', '/tmp/scrapydeploy-VLM6W7']' returned non-zero exit status 1
anandhakumar@MMTPC104:~/ScrapyProject/mall_uk$ 
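
The traceback above shows what went wrong mechanically: scrapy's deploy command runs `python setup.py clean -a bdist_egg` through `subprocess.check_call`, which raises `CalledProcessError` whenever the child exits non-zero, so the real error is inside setup.py rather than in scrapy itself. A minimal sketch of that failure mode (`run_build` is a hypothetical helper for illustration):

```python
import subprocess
import sys

def run_build(cmd):
    """Run a build step the way check_call does and surface its exit status."""
    try:
        subprocess.check_call(cmd)
        return 0
    except subprocess.CalledProcessError as err:
        # Same exception type as in the traceback above: raised for any
        # non-zero exit status of the child process.
        return err.returncode
```

Running `python setup.py clean -a bdist_egg` by hand in the project directory should surface the underlying error message that the deploy command hides.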

If I do the same for another project, it shows this:

$ scrapy deploy scrapyd
Packing version 1412325181
Deploying to project "project2" in http://localhost:6800/addversion.json
Server response (200):
{"status": "error", "message": "[Errno 13] Permission denied: 'eggs'"}
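
The `[Errno 13] Permission denied: 'eggs'` response means the scrapyd server process cannot write to its `eggs` directory; fixing that directory's ownership or permissions on the server (so the user scrapyd runs as can write there) is the usual cure. A quick way to check writability from Python (`can_write` is a hypothetical helper, and the actual eggs path depends on your scrapyd setup):

```python
import os

def can_write(path):
    """True if the current user can create files inside path -- the check
    that fails with Errno 13 when scrapyd's 'eggs' dir is not writable."""
    return os.path.isdir(path) and os.access(path, os.W_OK | os.X_OK)

# e.g. can_write("eggs") from scrapyd's working directory.
```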

2 Answers

Stack Overflow user

Answered on 2014-10-03 15:48:56

You can only list spiders that have already been deployed. If you haven't deployed anything yet, first deploy your spider with scrapy deploy:

scrapy deploy [ <target:project> | -l <target> | -L ]

vagrant@portia:~/takeovertheworld$ scrapy deploy scrapyd2
Packing version 1410145736
Deploying to project "takeovertheworld" in http://ec2-xx-xxx-xx-xxx.compute-1.amazonaws.com:6800/addversion.json
Server response (200):
{"status": "ok", "project": "takeovertheworld", "version": "1410145736", "spiders": 1}

Verify that the project was installed correctly by hitting the scrapyd API:

vagrant@portia:~/takeovertheworld$ curl http://ec2-xx-xxx-xx-xxx.compute-1.amazonaws.com:6800/listprojects.json
{"status": "ok", "projects": ["takeovertheworld"]}
Votes: 0

Stack Overflow user

Answered on 2015-12-25 11:01:11

I had the same error. As @hugsbrugs said, it was because a folder inside the scrapy project had root rights, so I did:

sudo scrapy deploy scrapyd2
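
An alternative to running the whole deploy under sudo is to find the root-owned files (often build artifacts left behind by an earlier `sudo` run) and hand them back to your user with `sudo chown -R $USER .`. A hypothetical helper to spot them:

```python
import os

def owned_by(root, uid):
    """List paths under root whose owner matches uid (e.g. 0 for root) --
    the files that make an unprivileged `scrapy deploy` fail."""
    hits = []
    for dirpath, dirnames, filenames in os.walk(root):
        for name in dirnames + filenames:
            path = os.path.join(dirpath, name)
            if os.stat(path).st_uid == uid:
                hits.append(path)
    return hits

# e.g. owned_by(".", 0) inside the project directory, then
# `sudo chown -R $USER .` to reclaim the listed paths.
```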

Votes: 0
The original content of this page was provided by Stack Overflow; translation supported by Tencent Cloud's translation engine.

Original link: https://stackoverflow.com/questions/26174934