I am new to Django, Docker, and Scrapy, and I am trying to run a Django app that also uses Scrapy (I basically created a Django app that is also a Scrapy project and try to call the spider from a Django view). Although scrapy is listed in requirements.txt and pip is run from the Dockerfile, the dependencies are not installed in the container before python manage.py runserver 0.0.0.0:8000 runs,
and the Django app fails during the system checks, causing the web container to exit with the following exception:
web_1 | Exception in thread django-main-thread:
web_1 | Traceback (most recent call last):
web_1 | File "/usr/local/lib/python3.7/threading.py", line 926, in _bootstrap_inner
web_1 | self.run()
web_1 | File "/usr/local/lib/python3.7/threading.py", line 870, in run
web_1 | self._target(*self._args, **self._kwargs)
web_1 | File "/usr/local/lib/python3.7/site-packages/django/utils/autoreload.py", line 54, in wrapper
web_1 | fn(*args, **kwargs)
web_1 | File "/usr/local/lib/python3.7/site-packages/django/core/management/commands/runserver.py", line 117, in inner_run
web_1 | self.check(display_num_errors=True)
web_1 | File "/usr/local/lib/python3.7/site-packages/django/core/management/base.py", line 390, in check
web_1 | include_deployment_checks=include_deployment_checks,
web_1 | File "/usr/local/lib/python3.7/site-packages/django/core/management/base.py", line 377, in _run_checks
web_1 | return checks.run_checks(**kwargs)
web_1 | File "/usr/local/lib/python3.7/site-packages/django/core/checks/registry.py", line 72, in run_checks
web_1 | new_errors = check(app_configs=app_configs)
web_1 | File "/usr/local/lib/python3.7/site-packages/django/core/checks/urls.py", line 40, in check_url_namespaces_unique
web_1 | all_namespaces = _load_all_namespaces(resolver)
web_1 | File "/usr/local/lib/python3.7/site-packages/django/core/checks/urls.py", line 57, in _load_all_namespaces
web_1 | url_patterns = getattr(resolver, 'url_patterns', [])
web_1 | File "/usr/local/lib/python3.7/site-packages/django/utils/functional.py", line 80, in __get__
web_1 | res = instance.__dict__[self.name] = self.func(instance)
web_1 | File "/usr/local/lib/python3.7/site-packages/django/urls/resolvers.py", line 579, in url_patterns
web_1 | patterns = getattr(self.urlconf_module, "urlpatterns", self.urlconf_module)
web_1 | File "/usr/local/lib/python3.7/site-packages/django/utils/functional.py", line 80, in __get__
web_1 | res = instance.__dict__[self.name] = self.func(instance)
web_1 | File "/usr/local/lib/python3.7/site-packages/django/urls/resolvers.py", line 572, in urlconf_module
web_1 | return import_module(self.urlconf_name)
web_1 | File "/usr/local/lib/python3.7/importlib/__init__.py", line 127, in import_module
web_1 | return _bootstrap._gcd_import(name[level:], package, level)
web_1 | File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
web_1 | File "<frozen importlib._bootstrap>", line 983, in _find_and_load
web_1 | File "<frozen importlib._bootstrap>", line 967, in _find_and_load_unlocked
web_1 | File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
web_1 | File "<frozen importlib._bootstrap_external>", line 728, in exec_module
web_1 | File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
web_1 | File "/code/composeexample/urls.py", line 21, in <module>
web_1 | path('scrapy/', include('scrapy_app.urls')),
web_1 | File "/usr/local/lib/python3.7/site-packages/django/urls/conf.py", line 34, in include
web_1 | urlconf_module = import_module(urlconf_module)
web_1 | File "/usr/local/lib/python3.7/importlib/__init__.py", line 127, in import_module
web_1 | return _bootstrap._gcd_import(name[level:], package, level)
web_1 | File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
web_1 | File "<frozen importlib._bootstrap>", line 983, in _find_and_load
web_1 | File "<frozen importlib._bootstrap>", line 967, in _find_and_load_unlocked
web_1 | File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
web_1 | File "<frozen importlib._bootstrap_external>", line 728, in exec_module
web_1 | File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
web_1 | File "/code/scrapy_app/urls.py", line 4, in <module>
web_1 | from scrapy_app import views
web_1 | File "/code/scrapy_app/views.py", line 1, in <module>
web_1 | from scrapy.crawler import CrawlerProcess
web_1 | ModuleNotFoundError: No module named 'scrapy'
I have tried using pip3 instead of pip, running pip install --no-cache-dir -r requirements.txt, and changing the order of the statements in the Dockerfile, and I have also checked that Scrapy==1.7.3 appears in requirements.txt. Nothing seems to work.
Here is my Dockerfile:
FROM python:3
ENV PYTHONUNBUFFERED 1
RUN mkdir /code
WORKDIR /code
COPY requirements.txt /code/
RUN pip install -r requirements.txt
COPY . /code/
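For debugging, one way to check whether the pip step actually bakes Scrapy into the image is to build and inspect it directly; this is only a sketch, and the tag composeexample-web below is an illustrative name, not something from the project:
# Build the image from the Dockerfile above and query pip inside it
docker build -t composeexample-web .
docker run --rm composeexample-web pip show Scrapy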
Here is my docker-compose.yml:
version: '3'
services:
  db:
    image: postgres
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - "8000:8000"
    depends_on:
      - db
A bit late, but I ran into this problem and eventually solved it (I am leaving this here for anyone with the same issue).
To rebuild the containers with the updated files, we need to run:
docker-compose rm -f
docker-compose pull
docker-compose up
If that does not work, try the same thing but replace the last line with docker-compose up --build -d.
I got this from this answer.
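If cached layers still get in the way, a hedged variant is to rebuild the image without the cache and then confirm the dependency inside the running container (the service name web is taken from the compose file above):
# Rebuild the web image from scratch, ignoring cached layers
docker-compose build --no-cache web
# Start the stack with the freshly built image
docker-compose up
# In another terminal, confirm scrapy is installed inside the running container
docker-compose exec web pip show Scrapy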
It looks like scrapy is missing from your requirements.txt!
I tried to build a minimal version with all of your components. Hope it helps.
test.py
import scrapy
from time import sleep

def main():
    while True:
        print(scrapy)
        sleep(1)

if __name__ == "__main__":
    main()
requirements.txt
Scrapy==1.7.3
Dockerfile
FROM python:3
ENV PYTHONUNBUFFERED 1
WORKDIR /code
COPY requirements.txt .
RUN pip3 install -r requirements.txt
COPY . ./
CMD [ "python3", "test.py" ]
docker-compose.yml
version: '3'
services:
  db:
    image: postgres
  web:
    build: .
    ports:
      - "8000:8000"
    depends_on:
      - db
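To try the minimal example, something along these lines should work; this is only a sketch, and the exact output depends on the installed Scrapy version and paths:
# Build and start the minimal example
docker-compose up --build
# The web service should then print something like
# <module 'scrapy' from '...'> once per second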