
serverless-python-requirements

License: MIT
Language: JavaScript
Category: Cloud Computing / Serverless
Type: Open-source software
Operating system: Cross-platform

Software Overview

Serverless Python Requirements


A Serverless v1.x plugin to automatically bundle dependencies from requirements.txt and make them available in your PYTHONPATH.

Requires Serverless >= v1.34

Install

sls plugin install -n serverless-python-requirements

This will automatically add the plugin to your project's package.json and to the plugins section of its serverless.yml. That's all that's needed for basic use! The plugin will now bundle the Python dependencies specified in your requirements.txt or Pipfile when you run sls deploy.
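After installation, a minimal serverless.yml looks roughly like this (a sketch; the service name and runtime are placeholders, not taken from this document):

```yaml
service: my-service          # placeholder

provider:
  name: aws
  runtime: python3.7

plugins:
  - serverless-python-requirements
```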

For a more in-depth introduction to this plugin, check out this post on the Serverless Blog.

If you're on a Mac, check out these notes about using Python installed by brew.

Cross compiling

Compiling non-pure-Python modules, or fetching their manylinux wheels, is supported on non-Linux OSs via Docker and the docker-lambda image. To enable Docker usage, add the following to your serverless.yml:

custom:
  pythonRequirements:
    dockerizePip: true

In addition to booleans, the dockerizePip option supports the special value 'non-linux', which enables Docker only on non-Linux environments.

To utilize your own Docker container instead of the default, add the following to your serverless.yml:

custom:
  pythonRequirements:
    dockerImage: <image name>:tag

This must be the full image name and tag to use, including the runtime specific tag if applicable.
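For example, to build against a Python 3.7 Lambda build image (this particular image name mirrors the lambci/lambda images mentioned later in this document; substitute whichever image fits your runtime):

```yaml
custom:
  pythonRequirements:
    dockerizePip: true
    dockerImage: lambci/lambda:build-python3.7
```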

Alternatively, you can define your Docker image in your own Dockerfile and add the following to your serverless.yml:

custom:
  pythonRequirements:
    dockerFile: ./path/to/Dockerfile

The dockerFile value is the path to a Dockerfile, which must be in the current folder (or a subfolder). Please note that dockerImage and dockerFile are mutually exclusive.

To install requirements from private git repositories, add the following to your serverless.yml:

custom:
  pythonRequirements:
    dockerizePip: true
    dockerSsh: true

The dockerSsh option will mount your $HOME/.ssh/id_rsa and $HOME/.ssh/known_hosts as a volume in the Docker container. If your SSH key is password protected, you can use ssh-agent, because $SSH_AUTH_SOCK is also mounted and the env var is set. It is important that the host of your private repositories has already been added to your $HOME/.ssh/known_hosts file, as the install process will otherwise fail due to a host-authenticity failure.

You can also pass environment variables to Docker by specifying them in the dockerEnv option:

custom:
  pythonRequirements:
    dockerEnv:
      - https_proxy


Pipenv support

If you include a Pipfile and have pipenv installed, instead of a requirements.txt this will use pipenv lock -r to generate one. It is fully compatible with all options such as zip and dockerizePip. If you don't want this plugin to generate it for you, set the following option:

custom:
  pythonRequirements:
    usePipenv: false

Poetry support

If you include a pyproject.toml and have poetry installed, instead of a requirements.txt this will use poetry export --without-hashes -f requirements.txt -o requirements.txt --with-credentials to generate one. It is fully compatible with all options such as zip and dockerizePip. If you don't want this plugin to generate it for you, set the following option:

custom:
  pythonRequirements:
    usePoetry: false

Poetry with git dependencies

By default, Poetry generates the exported requirements.txt file with -e entries, and that breaks pip's -t parameter (used to install all requirements into a specific folder). To fix that, we remove every -e from the generated file, but for that to work you need to declare git dependencies in a specific way.

Instead of:

[tool.poetry.dependencies]
bottle = {git = "git@github.com/bottlepy/bottle.git", tag = "0.12.16"}

Use:

[tool.poetry.dependencies]
bottle = {git = "https://git@github.com/bottlepy/bottle.git", tag = "0.12.16"}

Or, if you have an SSH key configured:

[tool.poetry.dependencies]
bottle = {git = "ssh://git@github.com/bottlepy/bottle.git", tag = "0.12.16"}

Dealing with Lambda's size limitations

To help deal with potentially large dependencies (for example: numpy, scipy and scikit-learn) there is support for compressing the libraries. This does require a minor change to your code to decompress them. To enable this, add the following to your serverless.yml:

custom:
  pythonRequirements:
    zip: true

and add this to your handler module before any code that imports your deps:

try:
  import unzip_requirements
except ImportError:
  pass
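A complete handler module then looks roughly like this (a minimal sketch; the function body and message are placeholders, not part of this plugin's API):

```python
# handler.py -- a minimal sketch of a handler that uses zipped requirements.
# `unzip_requirements` is injected by the plugin in the deployed package;
# locally the import fails and is safely ignored.
try:
    import unzip_requirements  # noqa: F401
except ImportError:
    pass

# Imports of bundled dependencies must come *after* the block above,
# e.g. `import numpy` (hypothetical -- substitute your own).
import json


def hello(event, context):
    return {
        "statusCode": 200,
        "body": json.dumps({"message": "dependencies unpacked"}),
    }
```

The only hard requirement is that the try/except block runs before any import of a zipped dependency.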

Slim Package

Works on non-'win32' environments (Docker and WSL included). To remove tests, information files, and caches from the installed packages, enable the slim option. This will strip the .so files and remove __pycache__ and dist-info directories, as well as .pyc and .pyo files.

custom:
  pythonRequirements:
    slim: true

Custom Removal Patterns

To specify additional directories to remove from the installed packages, define a list of patterns in the serverless config using the slimPatterns option and glob syntax. These patterns will be added to the default ones (**/*.py[c|o], **/__pycache__*, **/*.dist-info*). Note that the glob syntax matches against whole paths, so to match a file in any directory, start your pattern with **/.

custom:
  pythonRequirements:
    slim: true
    slimPatterns:
      - '**/*.egg-info*'

To overwrite the default patterns set the option slimPatternsAppendDefaults to false (true by default).

custom:
  pythonRequirements:
    slim: true
    slimPatternsAppendDefaults: false
    slimPatterns:
      - '**/*.egg-info*'

This will remove all folders within the installed requirements that match the names in slimPatterns.

Option not to strip binaries

In some cases, stripping binaries leads to problems like "ELF load command address/offset not properly aligned", even when done in the Docker environment. You can still slim down the package while leaving the *.so files unstripped:

custom:
  pythonRequirements:
    slim: true
    strip: false

Lambda Layer

Another method for dealing with large dependencies is to put them into a Lambda Layer. Simply add the layer option to the configuration:

custom:
  pythonRequirements:
    layer: true

The requirements will be zipped up and a layer will be created automatically. Now just add a reference to the layer in the functions that will use it:

functions:
  hello:
    handler: handler.hello
    layers:
      - Ref: PythonRequirementsLambdaLayer

If the layer requires additional or custom configuration, add them onto the layer option.

custom:
  pythonRequirements:
    layer:
      name: ${self:provider.stage}-layerName
      description: Python requirements lambda layer
      compatibleRuntimes:
        - python3.7
      licenseInfo: GPLv3
      allowedAccounts:
        - '*'

Omitting Packages

You can omit a package from deployment with the noDeploy option. Note thatdependencies of omitted packages must explicitly be omitted too.

This example omits pytest:

custom:
  pythonRequirements:
    noDeploy:
      - pytest

Extra Config Options

Caching

You can enable two kinds of caching with this plugin, both of which are currently ENABLED by default. First, a download cache that caches the downloads pip needs to compile the packages. Second, what we call a "static cache", which caches pip's output after compiling everything for your requirements file. Since requirements.txt files rarely change, you will often see large speed improvements when enabling the static cache feature. These caches are shared between all your projects unless a custom cacheLocation is specified (see below).

Please note: this has replaced the previously recommended usage of --cache-dir in pipCmdExtraArgs.

custom:
  pythonRequirements:
    useDownloadCache: true
    useStaticCache: true

Other caching options

There are two additional options related to caching. You can specify where this plugin caches on your system with the cacheLocation option. By default it determines a location automatically, based on your username and OS, via the appdirectory module. Additionally, you can specify the maximum number of static caches to keep with staticCacheMaxVersions, as a simple attempt to limit disk usage for caching. This is DISABLED (set to 0) by default. Example:

custom:
  pythonRequirements:
    useStaticCache: true
    useDownloadCache: true
    cacheLocation: '/home/user/.my_cache_goes_here'
    staticCacheMaxVersions: 10

Extra pip arguments

You can specify extra arguments supported by pip to be passed along like this:

custom:
  pythonRequirements:
    pipCmdExtraArgs:
      - --compile

Extra Docker arguments

You can specify extra arguments to be passed to docker build during the build step, and docker run during the dockerized pip install step:

custom:
  pythonRequirements:
    dockerizePip: true
    dockerBuildCmdExtraArgs: ['--build-arg', 'MY_GREAT_ARG=123']
    dockerRunCmdExtraArgs: ['-v', '${env:PWD}:/my-app']

Customize requirements file name

Some pip workflows involve requirements files not named requirements.txt. To support these, this plugin has the following option:

custom:
  pythonRequirements:
    fileName: requirements-prod.txt

Per-function requirements

If you have different Python functions with different sets of requirements, you can avoid including all the unnecessary dependencies of your functions by using the following structure:

├── serverless.yml
├── function1
│      ├── requirements.txt
│      └── index.py
└── function2
       ├── requirements.txt
       └── index.py

With the content of your serverless.yml containing:

package:
  individually: true

functions:
  func1:
    handler: index.handler
    module: function1
  func2:
    handler: index.handler
    module: function2

The result is 2 zip archives, with only the requirements for function1 in the first one, and onlythe requirements for function2 in the second one.

Quick notes on the config file:

  • The module field must be used to tell the plugin where to find the requirements.txt file for each function.
  • The handler field must not be prefixed by the folder name (already known through module), as the root of the zip artifact is already the path to your function.
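The notes above can be sketched as follows (a hypothetical function1/index.py; the handler body is a placeholder):

```python
# function1/index.py -- minimal per-function handler sketch.  It is
# referenced as `index.handler` (no `function1.` prefix), because
# `module: function1` already makes this folder the root of the
# zip artifact.
def handler(event, context):
    return {"statusCode": 200, "body": "hello from function1"}
```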

Customize Python executable

Sometimes your Python executable isn't available on your $PATH as python2.7 or python3.6 (for example, on Windows or when using pyenv). To support this, this plugin has the following option:

custom:
  pythonRequirements:
    pythonBin: /opt/python3.6/bin/python

Vendor library directory

For certain libraries, default packaging produces too large an installation, even when zipping. In those cases it may be necessary to tailor-make a version of the module. You can store such modules in a directory and use the vendor option; the plugin will copy them along with all the other dependencies to install:

custom:
  pythonRequirements:
    vendor: ./vendored-libraries
functions:
  hello:
    handler: hello.handler
    vendor: ./hello-vendor # The option is also available at the function level

Manual invocations

The .requirements and requirements.zip (if using zip support) files are left behind to speed things up on subsequent deploys. To clean them up, run sls requirements clean. You can also create them (and unzip_requirements, if using zip support) manually with sls requirements install.

Invalidate requirements caches on package

If you are using your own Python library, you have to clean up .requirements on any update. You can use the following option to clean up .requirements every time you package.

custom:
  pythonRequirements:
    invalidateCaches: true

Mac Brew-installed Python notes

Brew wilfully breaks the --target option, with no apparent intention to fix it, which causes issues since this plugin uses that option. One easy workaround:

  • Create a virtualenv and activate it while using serverless.

Also, brew seems to cause issues with pipenv, so make sure you install pipenv using pip.

Windows dockerizePip notes

To use dockerizePip on Windows, do step 1 only if running serverless on Windows, or both steps 1 and 2 if running serverless inside WSL.

  1. Enable shared volumes in the Windows Docker taskbar settings.
  2. Install the Docker client on Windows Subsystem for Linux (Ubuntu).

Native Code Dependencies During Build

Some Python packages require extra OS dependencies to build successfully. To deal with this, replace the default image (lambci/lambda:python3.6) with a Dockerfile like:

FROM lambci/lambda:build-python3.6

# Install your dependencies
RUN yum -y install mysql-devel

Then update your serverless.yml:

custom:
  pythonRequirements:
    dockerFile: Dockerfile

Native Code Dependencies During Runtime

Some Python packages require extra OS libraries (*.so files) at runtime. You need to manually include these files in the root directory of your Serverless package. The simplest way to do this is to use the dockerExtraFiles option.

For instance, the mysqlclient package requires libmysqlclient.so.1020. If you use the Dockerfile from the previous section, add an item to the dockerExtraFiles option in your serverless.yml:

custom:
  pythonRequirements:
    dockerExtraFiles:
      - /usr/lib64/mysql57/libmysqlclient.so.1020

Then verify the library gets included in your package:

sls package
zipinfo .serverless/xxx.zip

If you can't see the library, you might need to adjust your package include/exclude configuration in serverless.yml.

Optimising packaging time

If you wish to exclude most of the files in your project and include only the source files of your lambdas and their dependencies, you might use an approach like this:

package:
  individually: false
  include:
    - './src/lambda_one/**'
    - './src/lambda_two/**'
  exclude:
    - '**'

This will be very slow. Serverless adds a default "**" include. If you are using the cacheLocation parameter to this plugin, this will result in all of the cached files' names being loaded and then discarded because of the exclude pattern. To avoid this, you can add a negated include pattern, as observed in https://github.com/serverless/serverless/pull/5825.

Use this approach instead:

package:
  individually: false
  include:
    - '!./**'
    - './src/lambda_one/**'
    - './src/lambda_two/**'
  exclude:
    - '**'
