I tried to restore a single file or directory from Amazon S3 with duplicity, but I got an error.
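For context, a single-file restore with duplicity generally takes the shape below; the --file-to-restore path and the local target here are placeholders rather than my real values:

export PASSPHRASE=****
export AWS_ACCESS_KEY_ID=****
export AWS_SECRET_ACCESS_KEY=****
# --file-to-restore takes a path relative to the backup root, not an absolute path
duplicity restore --file-to-restore some/dir/file \
    s3+http://********** /tmp/restored-file

The error output is: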
Local and Remote metadata are synchronized, no sync needed.
Last full backup date: none
Traceback (most recent call last):
  File "/usr/bin/duplicity", line 1251, in <module>
    with_tempdir(main)
  File "/usr/bin/duplicity", line 1244, in with_tempdir
    fn()
  File "/usr/bin/duplicity", line 1198, in main
    restore(col_stats)
  File "/usr/bin/duplicity", line 538, in restore
    restore_get_patched_rop_iter(col_stats)):
  File "/usr/bin/duplicity", line 560, in restore_get_patched_rop_iter
    backup_chain = col_stats.get_backup_chain_at_time(time)
  File "/usr/lib/python2.6/dist-packages/duplicity/collections.py", line 934, in get_backup_chain_at_time
    raise CollectionsError("No backup chains found")
CollectionsError: No backup chains found
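For what it's worth, duplicity's collection-status command lists the backup chains it can see at a URL, which should show whether the destination holds any recognizable backup sets at all (the masked URL and S3 options here mirror the backup script below):

# List the backup chains duplicity can find at the destination
duplicity collection-status --s3-european-buckets --s3-use-new-style \
    s3+http://**********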
What am I doing wrong?
Here is how I run the backup:

export PASSPHRASE=****
export AWS_ACCESS_KEY_ID=****
export AWS_SECRET_ACCESS_KEY=****
GPG_KEY=****
BACKUP_SIM_RUN=1
LOGFILE="/var/log/s3-backup.log"
DAILYLOGFILE="/var/log/s3-backup-daily.log"
# The source of your backup
SOURCE=/home/u54433
# The destination
DEST=s3+http://**********
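# Append a timestamped message to the daily log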
trace () {
stamp=`date +%Y-%m-%d_%H:%M:%S`
echo "$stamp: $*" >> ${DAILYLOGFILE}
}
cat /dev/null > ${DAILYLOGFILE}
trace "removing old backups..."
duplicity remove-older-than 2M --force --sign-key=${GPG_KEY} ${DEST} >> ${DAILYLOGFILE} 2>&1
trace "start backup files..."
duplicity --sign-key=${GPG_KEY} --exclude="**/logs" --s3-european-buckets --s3-use-new-style ${SOURCE} ${DEST} >> ${DAILYLOGFILE} 2>&1
cat "$DAILYLOGFILE" >> $LOGFILE
export PASSPHRASE=
export AWS_ACCESS_KEY_ID=
export AWS_SECRET_ACCESS_KEY=
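One thing I am not sure about (an assumption on my part, nothing in the log confirms it): the backup passes --s3-european-buckets and --s3-use-new-style, which change how the bucket name is resolved, so a restore may need the same options to find the chains. A restore mirroring the backup's options would look roughly like this sketch:

# Re-export PASSPHRASE and the AWS keys before running this
# (they are cleared at the end of the backup script above).
# The file path and /tmp/restore-target are placeholders.
duplicity restore --s3-european-buckets --s3-use-new-style \
    --file-to-restore some/dir/file \
    s3+http://********** /tmp/restore-target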