Error transferring files from EC2 to S3 using boto

I followed this procedure (link) to upload my mongodump to S3.

Bash script

#!/bin/sh

MONGODB_SHELL='/usr/bin/mongo'
DUMP_UTILITY='/usr/bin/mongodump'
DB_NAME='amicus'

date_now=`date +%Y_%m_%d_%H_%M_%S`
dir_name='db_backup_'${date_now}
file_name='db_backup_'${date_now}'.bz2'

log() {
    echo $1
}

do_cleanup(){
    # remove older backup directories and archives
    rm -rf db_backup_2010*
    log 'cleaning up....'
}

do_backup(){
    log 'snapshotting the db and creating archive' && \
    ${MONGODB_SHELL} admin fsync_lock.js && \
    ${DUMP_UTILITY} -d ${DB_NAME} -o ${dir_name} && tar -jcf $file_name ${dir_name}
    ${MONGODB_SHELL} admin unlock.js && \
    log 'data backed up and created snapshot'
}

save_in_s3(){
    log 'saving the backup archive in amazon S3' && \
    python aws_s3.py set ${file_name} && \
    log 'data backup saved in amazon s3'
}

do_backup && save_in_s3 && do_cleanup
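The fsync_lock.js and unlock.js helpers referenced above are not included in the post. Purely as an illustration of what that lock/unlock step presumably does, here is a hypothetical pymongo sketch (not from the original post; it assumes a local mongod and MongoDB 3.2+, where fsyncUnlock is available as a server command):

# Hypothetical sketch of the lock/unlock step done from Python with pymongo
# instead of the mongo shell scripts. Assumes MongoDB 3.2+ for fsyncUnlock.
from pymongo import MongoClient

client = MongoClient('localhost', 27017)

# Flush writes to disk and block further writes while the dump runs.
client.admin.command('fsync', lock=True)
try:
    pass  # run mongodump / tar here
finally:
    # Release the write lock taken by fsync.
    client.admin.command('fsyncUnlock')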

aws_s3.py

ACCESS_KEY=''
SECRET=''
BUCKET_NAME='s3:///s3.amazonaws.com/database-backup' #note that you need to create this bucket first

from boto.s3.connection import S3Connection
from boto.s3.key import Key

def save_file_in_s3(filename):
    conn = S3Connection(ACCESS_KEY, SECRET)
    bucket = conn.get_bucket(BUCKET_NAME)
    k = Key(bucket)
    k.key = filename
    k.set_contents_from_filename(filename)

def get_file_from_s3(filename):
    conn = S3Connection(ACCESS_KEY, SECRET)
    bucket = conn.get_bucket(BUCKET_NAME)
    k = Key(bucket)
    k.key = filename
    k.get_contents_to_filename(filename)

def list_backup_in_s3():
    conn = S3Connection(ACCESS_KEY, SECRET)
    bucket = conn.get_bucket(BUCKET_NAME)
    for i, key in enumerate(bucket.get_all_keys()):
        print "[%s] %s" % (i, key.name)

def delete_all_backups():
    #FIXME: validate filename exists
    conn = S3Connection(ACCESS_KEY, SECRET)
    bucket = conn.get_bucket(BUCKET_NAME)
    for i, key in enumerate(bucket.get_all_keys()):
        print "deleting %s" % (key.name)
        key.delete()

if __name__ == '__main__':
    import sys
    # NOTE: the tail of this script was truncated in the source; the dispatch
    # below is a reconstruction based on the 'python aws_s3.py set <file>' call
    # in the bash script above.
    if len(sys.argv) < 3:
        print 'Usage: %s <get/set/list/delete> <backup_filename>' % sys.argv[0]
    elif sys.argv[1] == 'set':
        save_file_in_s3(sys.argv[2])
    elif sys.argv[1] == 'get':
        get_file_from_s3(sys.argv[2])
    elif sys.argv[1] == 'list':
        list_backup_in_s3()
    elif sys.argv[1] == 'delete':
        delete_all_backups()
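As a side note, boto 2's get_bucket() expects a bare bucket name rather than an s3:// URL, so the BUCKET_NAME value above is one thing worth double-checking. A minimal upload sketch with a plain bucket name (assuming a bucket called database-backup already exists and credentials are filled in; the object name is hypothetical):

# Minimal boto 2 upload sketch. Assumes the bucket 'database-backup'
# already exists and the credentials below are real.
from boto.s3.connection import S3Connection
from boto.s3.key import Key

conn = S3Connection('ACCESS_KEY', 'SECRET')
bucket = conn.get_bucket('database-backup')   # bare bucket name, not an s3:// URL

key = Key(bucket)
key.key = 'db_backup_example.bz2'             # hypothetical object name
key.set_contents_from_filename('db_backup_example.bz2')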

But I keep getting this error:

Traceback (most recent call last):
  File "aws_s3.py", line 42, in

I did a bit of research and it appears to be some kind of bug in boto. How should I proceed with this?

Best answer

Since I never got any update on how to make it work, I used s3cmd in my bash script instead. I still need to test it with files > 1 GB.

Here is the updated code:

#!/bin/sh

MONGODB_SHELL='/usr/bin/mongo'
DUMP_UTILITY='/usr/bin/mongodump'
DB_NAME='amicus'

date_now=`date +%Y_%m_%d_%H_%M_%S`
dir_name='db_backup_'${date_now}
file_name='db_backup_'${date_now}'.bz2'

log() {
    echo $1
}

do_cleanup(){
    rm -rf db_backup_2010*
    log 'cleaning up....'
}

do_backup(){
    log 'snapshotting the db and creating archive' && \
    ${DUMP_UTILITY} -d ${DB_NAME} -o ${dir_name} && tar -jcf $file_name ${dir_name}
    log 'data backed up and created snapshot'
}

save_in_s3(){
    log 'saving the backup archive in amazon S3' && \
    python aws_s3.py set ${file_name} && \
    s3cmd put ${file_name} s3://YOURBUCKETNAME
    log 'data backup saved in amazon s3'
}

do_backup && save_in_s3 && do_cleanup
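If you would rather keep the upload in boto instead of shelling out to s3cmd, boto 2 also exposes S3's multipart upload API, which is the usual approach for archives in the gigabyte range. A rough sketch (the bucket name, part size, and function name here are assumptions for illustration, not from the original post):

# Hedged sketch: uploading a large backup archive with boto 2's multipart
# upload API instead of a single PUT. Bucket name and part size are assumptions.
import math
import os
from boto.s3.connection import S3Connection

PART_SIZE = 50 * 1024 * 1024  # 50 MB per part

def multipart_upload(bucket_name, filename, access_key, secret):
    conn = S3Connection(access_key, secret)
    bucket = conn.get_bucket(bucket_name)

    mp = bucket.initiate_multipart_upload(os.path.basename(filename))
    try:
        size = os.path.getsize(filename)
        parts = int(math.ceil(size / float(PART_SIZE)))
        with open(filename, 'rb') as fp:
            for part_num in range(1, parts + 1):
                # upload_part_from_file reads at most `size` bytes from fp
                remaining = size - PART_SIZE * (part_num - 1)
                mp.upload_part_from_file(fp, part_num, size=min(PART_SIZE, remaining))
        mp.complete_upload()
    except Exception:
        mp.cancel_upload()
        raise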