- Amazon S3
- OneDrive
- Google Drive (Google Docs)
- Rackspace Cloud Files
- HubiC
- Backblaze (B2)
- Amazon Cloud Drive (AmzCD)
- Swift / OpenStack
- WebDAV
- SSH (SFTP)
- FTP
- and more
Features
- Duplicati uses AES-256 encryption (or GNU Privacy Guard) to secure all data before it is uploaded.
- Duplicati uploads a full backup initially and stores smaller, incremental updates afterwards to save bandwidth and storage space.
- A scheduler keeps backups up-to-date automatically.
- An integrated updater notifies you when a new release is out.
- Encrypted backup files are transferred to targets like FTP, Cloudfiles, WebDAV, SSH (SFTP), Amazon S3 and others.
- Duplicati allows backing up folders, specific file types such as documents or images, or selections defined by custom filter rules.
- Duplicati is available as an application with an easy-to-use user interface and as a command-line tool.
- Duplicati can make proper backups of opened or locked files using the Volume Snapshot Service (VSS) under Windows or the Logical Volume Manager (LVM) under Linux. This allows Duplicati to back up the Microsoft Outlook PST file while Outlook is running.
- Filters, deletion rules, transfer and bandwidth options, etc. are all configurable.
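As a rough illustration of the command-line tool mentioned above, a backup invocation looks roughly like the sketch below. The backend URL, source folder, and passphrase are placeholder values, not anything from this page, and the command is only echoed so it can be reviewed before running:

```shell
# Placeholder values -- substitute your own backend URL, source folder,
# and passphrase before actually running this.
backup_cmd='mono /usr/lib/duplicati/Duplicati.CommandLine.exe backup ftp://user:pass@example.com/backup /home/user/documents --passphrase=changeme'
# Echoed for review; run the command itself once the values are correct.
echo "$backup_cmd"
```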
Why use Duplicati?
(https://briteming.blogspot.com/2013/09/duplicati.html)
-------
A free cross-platform incremental backup tool: Duplicati installation tutorial

Duplicati is a backup client that can store backups on local storage, cloud storage services, and remote file servers. Backups are protected with AES-256 encryption and compressed automatically, with Zip, 7z, and LZMA2 compression supported. After the initial backup, only changed data is added to the backup files, saving a great deal of time and space. If a backup is interrupted or corrupted midway, Duplicati will try to repair the files as best it can. It ships with a web interface for easier management and runs on Windows, Linux, and Mac. See the introduction above for more details; this post only covers installation on Linux. Supported storage backends:
FTP
OpenStack Object Storage / Swift
S3 Compatible
SFTP (SSH)
WebDAV
Amazon Cloud Drive
Amazon S3
Azure blob
B2 Cloud Storage
Box.com
Dropbox
Google Cloud Storage
Google Drive
HubiC
Jottacloud
Mega.nz
Microsoft Office 365 Groups
Microsoft OneDrive
Microsoft OneDrive for Business
Microsoft SharePoint
OpenStack Simple Storage
Rackspace CloudFiles
Rclone
Sia Decentralized Cloud
1. Installation
#CentOS 7
#Install dependencies
yum install yum-utils -y
rpm --import "http://keyserver.ubuntu.com/pks/lookup?op=get&search=0x3FA7E0328081BFF6A14DA29AA6A19B38D3D831EF"
yum-config-manager --add-repo http://download.mono-project.com/repo/centos7/
yum install mono-devel -y
#Install Duplicati
rpm -ivh https://updates.duplicati.com/beta/duplicati-2.0.3.3-2.0.3.3_beta_20180402.noarch.rpm --nodeps --force
#CentOS 6
#Install dependencies
yum install yum-utils -y
rpm --import "http://keyserver.ubuntu.com/pks/lookup?op=get&search=0x3FA7E0328081BFF6A14DA29AA6A19B38D3D831EF"
yum-config-manager --add-repo http://download.mono-project.com/repo/centos6/
yum install mono-devel -y
#Install Duplicati
rpm -ivh https://updates.duplicati.com/beta/duplicati-2.0.3.3-2.0.3.3_beta_20180402.noarch.rpm --nodeps --force
#Debian 7+ / Ubuntu 14+
#Install Mono
apt-get update
apt-get install mono-devel -y
#Install Duplicati
wget https://updates.duplicati.com/beta/duplicati_2.0.3.3-1_all.deb
dpkg -i duplicati*.deb
#If dpkg reports an error, run this command to fix dependencies and finish the install
apt-get -f install -y
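Once the packages are in place, a quick sanity check can confirm that both Mono and the Duplicati server binary are where the startup commands below expect them (the /usr/lib/duplicati path is the one used throughout this post; verify it on your install):

```shell
# Check that mono is on PATH and the server binary exists at the
# path the startup commands in this post use.
if command -v mono >/dev/null 2>&1; then
  mono_status="mono found at $(command -v mono)"
else
  mono_status="mono not found"
fi
if [ -f /usr/lib/duplicati/Duplicati.Server.exe ]; then
  server_status="Duplicati.Server.exe found"
else
  server_status="Duplicati.Server.exe not found"
fi
echo "$mono_status"
echo "$server_status"
```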
2. Startup
#Access without a password
/usr/bin/mono /usr/lib/duplicati/Duplicati.Server.exe --webservice-interface=any
#Require the password moerats for access; change the last parameter to set a different password
/usr/bin/mono /usr/lib/duplicati/Duplicati.Server.exe --webservice-interface=any --webservice-password=moerats
The program is then reachable at IP:8200. On CentOS you also need to open port 8200 in the firewall, as follows:
#CentOS 6
iptables -I INPUT -p tcp --dport 8200 -j ACCEPT
service iptables save
service iptables restart
#CentOS 7
firewall-cmd --zone=public --add-port=8200/tcp --permanent
firewall-cmd --reload
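After starting the server and opening the firewall, a small check like the following reports whether anything is actually listening on 8200 on the host (this assumes the `ss` tool is available, which it is on most modern Linux systems):

```shell
# Report whether any TCP socket is listening on port 8200.
if ss -tln 2>/dev/null | grep -q ':8200'; then
  port_status="port 8200 is listening"
else
  port_status="port 8200 is not listening"
fi
echo "$port_status"
```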
Auto-start on boot
Finally, we can set up process supervision and auto-start on boot to make things more reliable. To cover every system, two methods are described here: systemctl and Supervisor. CentOS 6, Debian 7, and Ubuntu 14 apparently do not support systemctl, so Supervisor is recommended for those.
1. Using systemctl
Create a systemd unit file with the following command:
#The block below is one single command. First change the password moerats after ExecStart, then paste the whole block into SSH and run it.
echo "[Unit]
Description=Duplicati Process Guardian
[Service]
ExecStart=/usr/bin/mono /usr/lib/duplicati/Duplicati.Server.exe --webservice-interface=any --webservice-password=moerats
Restart=on-failure
RestartSec=15
[Install]
WantedBy=multi-user.target" > /etc/systemd/system/duplicati.service
Enable auto-start: systemctl enable duplicati
Start Duplicati: systemctl start duplicati
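A quick way to confirm the unit took effect on a systemd host (the fallback messages print if the unit is missing or the host has no systemd):

```shell
# Query the unit's state; fall back to a diagnostic if unavailable.
enabled_state=$(systemctl is-enabled duplicati 2>/dev/null || echo "duplicati unit not enabled")
active_state=$(systemctl is-active duplicati 2>/dev/null || echo "duplicati unit not active")
echo "$enabled_state"
echo "$active_state"
```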
2. Using Supervisor
For convenience, Supervisor is installed via pip here.
Install pip:
#CentOS 6.x 32-bit
rpm -ivh http://dl.fedoraproject.org/pub/epel/6/i386/epel-release-6-8.noarch.rpm
yum install -y python-pip
#CentOS 6.x 64-bit
rpm -ivh http://dl.fedoraproject.org/pub/epel/6/x86_64/epel-release-6-8.noarch.rpm
yum install -y python-pip
#CentOS 7.x
yum install -y epel-release
yum install -y python-pip
#If CentOS 7 reports "No package python-pip available", install pip with the following commands
wget https://bootstrap.pypa.io/get-pip.py
python get-pip.py
#Debian/Ubuntu
apt-get -y update
apt-get -y install python-pip
Install Supervisor: pip install supervisor
wget -N -P /etc/ --no-check-certificate https://coding.net/u/cvc/p/supervisor/git/raw/master/supervisord.conf
Add the config section:
#The block below is one single command. First change the password moerats after command=, then paste the whole block into SSH and run it.
echo "[program:duplicati]
user=root
command=/usr/bin/mono /usr/lib/duplicati/Duplicati.Server.exe --webservice-interface=any --webservice-password=moerats
autostart=true
autorestart=true
startsecs=15" >> /etc/supervisord.conf
Add to boot startup: echo "/usr/bin/supervisord -c /etc/supervisord.conf" >> /etc/rc.local
chmod +x /etc/rc.local
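To double-check that both the Supervisor program section and the rc.local boot hook were actually written, something like this works (each count falls back to 0 if the file is missing):

```shell
# Count the duplicati program section and the boot hook line.
conf_check=$(grep -c "program:duplicati" /etc/supervisord.conf 2>/dev/null || true)
boot_check=$(grep -c "supervisord" /etc/rc.local 2>/dev/null || true)
echo "duplicati sections in supervisord.conf: ${conf_check:-0}"
echo "supervisord lines in rc.local: ${boot_check:-0}"
```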
Note that this auto-start method may not work on CentOS 7, Debian 9, or Ubuntu 17+ (we won't go into why here); on those systems just use the systemctl method instead.
For day-to-day usage, refer to the official documentation.
---------------------------------------
Back up your backup software's data: troubleshooting duplicati 2 backup errors
Some of my backups are done with duplicati 2. It had always worked fine, but yesterday at noon a sync suddenly failed with: Found 2 files that are missing from the remote storage, please run repair
It asked for a repair, and I assumed one click on Repair would sort it out, but it was not that simple; it kept erroring:
The backup storage destination is missing data files. You can either enable `--rebuild-missing-dblock-files` or run the purge command to remove these files. The following files are missing: duplicati-e1d7b394080021567b4ba0be011443f62.dblock.zip.aes, duplicati-bb379b58e5942b404bc4c0cb8fb5c0b49.dblock.zip.aes
UserInformationException: The backup storage destination is missing data files. You can either enable `--rebuild-missing-dblock-files` or run the purge command to remove these files. The following files are missing: duplicati-e1d7b394080021567b4ba0be011443f62.dblock.zip.aes, duplicati-bb379b58e5942b404bc4c0cb8fb5c0b49.dblock.zip.aes
So I enabled --rebuild-missing-dblock-files as suggested, but it still failed:
Listing remote folder ...
Failed to perform cleanup for missing file: duplicati-e1d7b394080021567b4ba0be011443f62.dblock.zip.aes, message: Repair not possible, missing 546 blocks.
If you want to continue working with the database, you can use the "list-broken-files" and "purge-broken-files" commands to purge the missing data from the database and the remote storage. => Repair not possible, missing 546 blocks.
If you want to continue working with the database, you can use the "list-broken-files" and "purge-broken-files" commands to purge the missing data from the database and the remote storage.
Failed to perform cleanup for missing file: duplicati-bb379b58e5942b404bc4c0cb8fb5c0b49.dblock.zip.aes, message: Repair not possible, missing 543 blocks.
If you want to continue working with the database, you can use the "list-broken-files" and "purge-broken-files" commands to purge the missing data from the database and the remote storage. => Repair not possible, missing 543 blocks.
If you want to continue working with the database, you can use the "list-broken-files" and "purge-broken-files" commands to purge the missing data from the database and the remote storage.
Return code: 0
Weren't only two files missing at the start? How are there now five-hundred-odd broken blocks?
Next I tried purge-broken-files, feeling my way along, so I may well have done it wrong.
If the filters are not cleared first, it reports an error:
The PurgeBrokenFiles operation failed with: Filters are not supported for this operation => Filters are not supported for this operation
ErrorID: FiltersNotAllowedOnPurgeBrokenFiles
Filters are not supported for this operation
Return code: 100
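For reference, the two commands the error messages point at can be driven from the command line roughly as sketched below. The storage URL and database path are placeholders (use your own backend URL and the job's local database path), and note that purge-broken-files must be invoked without any filter arguments, which is what the FiltersNotAllowedOnPurgeBrokenFiles error above is complaining about. The commands are composed and echoed here for review rather than executed:

```shell
# Placeholders -- replace with your backend URL and the job's database path.
storage_url='ftp://user:pass@example.com/backup'
dbpath="$HOME/.config/Duplicati/backup.sqlite"
# Composed but only echoed; remove the echo to actually run each step.
list_cmd="mono /usr/lib/duplicati/Duplicati.CommandLine.exe list-broken-files $storage_url --dbpath=$dbpath"
purge_cmd="mono /usr/lib/duplicati/Duplicati.CommandLine.exe purge-broken-files $storage_url --dbpath=$dbpath"
echo "$list_cmd"
echo "$purge_cmd"
```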
After clearing the filters I ran it again, and it seemed to work; in the cases I found online, people had basically recovered at this point. But mine errored out again after running for a while:
The PurgeBrokenFiles operation failed with: Unable to start the purge process as there are 17677 orphan file(s) => Unable to start the purge process as there are 17677 orphan file(s)
ErrorID: CannotPurgeWithOrphans
Unable to start the purge process as there are 17677 orphan file(s)
Return code: 100
Scary: more than seventeen thousand orphan files now. After fiddling for a whole afternoon I was out of tricks.
Since the backup holds quite a lot of data, I really did not want to wipe it and rebuild from scratch, and the situation left me frustrated.
After moping for a while it suddenly hit me: I have backups of the entire duplicati 2 database, in several versions, including one taken just before this backup broke.
I tried it immediately: quit duplicati 2 completely, replaced the broken database with the backup copy, deleted the few hundred data volumes uploaded today from the remote storage, and started duplicati 2 again. Nothing looked wrong, so, hands trembling, I ran the backup. Two hours later it had synced perfectly again without a single problem. Delighted, and so glad I had backed up the backup software's own data.
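The habit that saved the day can be scripted. A minimal sketch, assuming the default Linux location for Duplicati's own job databases (~/.config/Duplicati); the server should be stopped first so nothing is copied mid-write, so the systemctl lines are left as comments to be adapted to your setup:

```shell
# Copy Duplicati's local databases to a dated backup directory.
db_dir="$HOME/.config/Duplicati"
backup_dir="$HOME/duplicati-db-backups/$(date +%F)"
mkdir -p "$backup_dir"
# systemctl stop duplicati   # stop the server before copying
if ls "$db_dir"/*.sqlite >/dev/null 2>&1; then
  cp -v "$db_dir"/*.sqlite "$backup_dir"/
  copy_status="databases copied to $backup_dir"
else
  copy_status="no databases found at $db_dir"
fi
echo "$copy_status"
# systemctl start duplicati  # restart the server afterwards
```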
Reflection: to save effort I had dumped over a million files into a single backup job, which was very unreasonable, and it is hard to change now. It would be better to create several jobs and divide things up as needed; at most about 50,000 files and 30 GB per job makes for a more suitable sync.