AWS - OS Error permission denied Lambda Script

I'm trying to execute a Lambda script written in Python that uses an imported library, and I'm getting a permission error. I've also received some alarms about the database, but the database query is called after the subprocess, so I don't think they're related. Can someone explain why I'm getting this error?

Alarm message

Alarm:Database-WriteCapacityUnitsLimit-BasicAlarm 
State changed to INSUFFICIENT_DATA at 2016/08/16. Reason: Unchecked: Initial alarm creation

Lambda Error

[Errno 13] Permission denied: OSError
Traceback (most recent call last):
  File "/var/task/lambda_function.py", line 36, in lambda_handler
    xml_output = subprocess.check_output(["./mediainfo", "--full", "--output=XML", signed_url])
  File "/usr/lib64/python2.7/subprocess.py", line 566, in check_output
    process = Popen(stdout=PIPE, *popenargs, **kwargs)
  File "/usr/lib64/python2.7/subprocess.py", line 710, in __init__
    errread, errwrite)
  File "/usr/lib64/python2.7/subprocess.py", line 1335, in _execute_child
    raise child_exception
OSError: [Errno 13] Permission denied

Lambda Code

import logging
import subprocess

import boto3

SIGNED_URL_EXPIRATION = 300     # The number of seconds that the Signed URL is valid
DYNAMODB_TABLE_NAME = "TechnicalMetadata"
DYNAMO = boto3.resource("dynamodb")
TABLE = DYNAMO.Table(DYNAMODB_TABLE_NAME)

logger = logging.getLogger('boto3')
logger.setLevel(logging.INFO)


def lambda_handler(event, context):
    """

    :param event:
    :param context:
    """
    # Loop through records provided by S3 Event trigger
    for s3_record in event['Records']:
        logger.info("Working on new s3_record...")
        # Extract the Key and Bucket names for the asset uploaded to S3
        key = s3_record['s3']['object']['key']
        bucket = s3_record['s3']['bucket']['name']
        logger.info("Bucket: {} \t Key: {}".format(bucket, key))
        # Generate a signed URL for the uploaded asset
        signed_url = get_signed_url(SIGNED_URL_EXPIRATION, bucket, key)
        logger.info("Signed URL: {}".format(signed_url))
        # Launch MediaInfo
        # Pass the signed URL of the uploaded asset to MediaInfo as an input
        # MediaInfo will extract the technical metadata from the asset
        # The extracted metadata will be outputted in XML format and
        # stored in the variable xml_output
        xml_output = subprocess.check_output(["./mediainfo", "--full", "--output=XML", signed_url])
        logger.info("Output: {}".format(xml_output))
        save_record(key, xml_output)

def save_record(key, xml_output):
    """
    Save record to DynamoDB

    :param key:         S3 Key Name
    :param xml_output:  Technical Metadata in XML Format
    :return:
    """
    logger.info("Saving record to DynamoDB...")
    TABLE.put_item(
       Item={
            'keyName': key,
            'technicalMetadata': xml_output
        }
    )
    logger.info("Saved record to DynamoDB")


def get_signed_url(expires_in, bucket, obj):
    """
    Generate a signed URL
    :param expires_in:  URL Expiration time in seconds
    :param bucket:
    :param obj:         S3 Key name
    :return:            Signed URL
    """
    s3_cli = boto3.client("s3")
    presigned_url = s3_cli.generate_presigned_url('get_object', Params={'Bucket': bucket, 'Key': obj},
                                                  ExpiresIn=expires_in)
    return presigned_url

I'm fairly sure this is a restriction imposed by the Lambda execution environment, but it may be possible to work around it by running the binary through a shell. Try passing shell=True to your subprocess call. Note that with shell=True the command should be a single string rather than a list, since on POSIX only the first list element would be treated as the command:

xml_output = subprocess.check_output("./mediainfo --full --output=XML " + signed_url, shell=True)
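One caveat (my addition, not part of the original answer): a presigned S3 URL contains ? and & characters, which the shell will interpret unless the URL is quoted. A minimal sketch of a safer invocation, assuming the Python 2.7 runtime shown in the traceback above (pipes.quote; on Python 3 this would be shlex.quote):

import pipes       # Python 2.7 stdlib; use shlex.quote on Python 3
import subprocess

# Build a single command string, quoting the URL so the shell does not
# treat '&' in the query string as a command separator.
cmd = "./mediainfo --full --output=XML " + pipes.quote(signed_url)
xml_output = subprocess.check_output(cmd, shell=True)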

I ran into a similar situation. I was getting this error message:

2016-11-28T01:49:01.304Z    d4505c71-b50c-11e6-b0a1-65eecf2623cd    Error: Command failed: /var/task/node_modules/youtube-dl/bin/youtube-dl --dump-json -f best https://soundcloud.com/bla/blabla
python: can't open file '/var/task/node_modules/youtube-dl/bin/youtube-dl': [Errno 13] Permission denied

For my Node Lambda project (and any other that bundles third-party libraries), there is a directory called "node_modules" (most tutorials, such as this one, explain in detail how that directory is created) containing all the third-party packages and their dependencies. The same principle applies to the other supported languages (currently Python and Java). These are the files that Amazon actually puts on the Lambda AMIs and tries to use. So to fix this problem, run the following against the node_modules directory (or whichever directory your third-party libraries live in):

chmod -R 777 /Users/bla/bla/bla/lambdaproject/node_modules

This command makes the files readable, writable, and executable by all users, which is apparently what the server executing the Lambda function needs in order to work. Hope this helps!
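If fixing permissions before packaging is not an option, a related workaround is sometimes used (a sketch of my own, not from the answers above, with hypothetical paths): copy the bundled binary into /tmp at runtime, the only writable location in a Lambda container, and set the executable bit there before invoking it:

import os
import shutil
import stat
import subprocess

BIN_SRC = "/var/task/mediainfo"   # hypothetical: binary as packaged with the function
BIN_DST = "/tmp/mediainfo"        # /tmp is the only writable path in Lambda

# Copy once per container and mark the copy executable.
if not os.path.exists(BIN_DST):
    shutil.copyfile(BIN_SRC, BIN_DST)
    os.chmod(BIN_DST, os.stat(BIN_DST).st_mode | stat.S_IEXEC)

# signed_url is assumed to come from the handler, as in the question's code.
xml_output = subprocess.check_output([BIN_DST, "--full", "--output=XML", signed_url])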