Many Bitcoin clones use a different coin-minting algorithm, derived from Peercoin, called proof-of-stake. You can stake these coins on almost any low-power system, instead of needing the dedicated hardware / ASICs that many proof-of-work systems require.
The wallet.dat file (the file that holds the keys to your coins) is updated often during the staking process, and as a safety precaution it should be backed up regularly to avoid loss of funds.
Here are some simple scripts that back up your wallet.dat file automatically on Linux-based systems using crontab, so you minimize the risk of loss if you have a major system failure…
Copy the wallet.dat files to a backup archive directory (copy-to-backup.sh):
#!/bin/sh
# 2016-06-09 / Created by dragonfrugal.com
# License: GPL v3 (http://www.gnu.org/licenses/gpl.html)

### SETTINGS ###

USERNAME="LOCAL-SYSTEM-USERNAME-HERE"

# 24-hour time with no colon (19:45 becomes 1945), so the numeric comparison below works
SCRIPTBACKUPTIME="1945"

################

NOW=$(date +"%Y-%m-%d-%H%Mhrs")
HOURMINUTE=$(date +"%H%M")
HOST="$(hostname)"

mkdir -p /home/$USERNAME/Backups/$HOST/Dump/Wallets/Wallets-$NOW

# Crypto wallets (uncomment below to use, or create another similar entry for a different coin)
#cp /home/$USERNAME/.Hyper/wallet.dat /home/$USERNAME/Backups/$HOST/Dump/Wallets/Wallets-$NOW/Hyper-wallet-$NOW.dat
#cp /home/$USERNAME/.HyperStake/wallet.dat /home/$USERNAME/Backups/$HOST/Dump/Wallets/Wallets-$NOW/HyperStake-wallet-$NOW.dat
#cp /home/$USERNAME/.reddcoin/wallet.dat /home/$USERNAME/Backups/$HOST/Dump/Wallets/Wallets-$NOW/Reddcoin-wallet-$NOW.dat

# Back up the entire scripts directory ONCE PER DAY at $SCRIPTBACKUPTIME (set the cron job to include this time)
if [ "$HOURMINUTE" -eq "$SCRIPTBACKUPTIME" ]; then
    cp -avr /home/$USERNAME/Scripts /home/$USERNAME/Backups/$HOST/Dump/Scripts/Scripts-$NOW
fi
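Each commented-out cp line maps one coin's data directory to a dated copy in the dump folder. To watch another coin, duplicate a line inside the script and point it at that wallet's data directory. A hypothetical example (the .examplecoin directory is a placeholder, not a real coin):

# Hypothetical entry -- substitute your coin's actual data directory
cp /home/$USERNAME/.examplecoin/wallet.dat /home/$USERNAME/Backups/$HOST/Dump/Wallets/Wallets-$NOW/Examplecoin-wallet-$NOW.dat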
Bundle the files into tar.gz archives for a cleaner structure (archive-backup.sh):
#!/bin/sh
# Filesystem backup script
#
# Version: 1.1.1
# Author: Reto Hugi (http://hugi.to/blog/)
# License: GPL v3 (http://www.gnu.org/licenses/gpl.html)
#
# This script is based on work and ideas from http://bash.cyberciti.biz/
#
# Dependencies:
# tar
#
# Changelog:
# 2016-06-09 / Modified by dragonfrugal.com
# 2011-08-05 / Version 1.1.1 / fixed comparison syntax
# 2011-03-07 / Version 1.1 / introduced numbering of latest incremental
# 2011-03-06 / Version 1.0 / initial release

### SETTINGS ###

USERNAME="LOCAL-SYSTEM-USERNAME-HERE"

################

NOW=$(date +"%Y-%m-%d-%H%Mhrs")
DAY=$(date +"%u")
DAY2=$(date +"%Y-%m-%d")
HOST="$(hostname)"
BACKUPDIR="/home/$USERNAME/Backups/$HOST/Archived"
INCFILE=$BACKUPDIR/$HOST-tar-inc-backup.dat

# Directories to back up (tar is invoked with -P below, so absolute paths are fine)
DIRS="/home/$USERNAME/Backups/$HOST/Dump"

# Directory where a copy of the "latest" dumps will be stored
LATEST=$BACKUPDIR/latest

# Day of week (1-7, where 1 is Monday) on which a full backup is made
FULLBACKUP="2"

# If cleanup is set to "1", backups older than $OLDERTHAN days will be deleted!
CLEANUP=1
OLDERTHAN=14

### Libraries ###

TAR=$(which tar)
if [ -z "$TAR" ]; then
    echo "Error: tar not found"
    exit 1
fi

CP="$(which cp)"
if [ -z "$CP" ]; then
    echo "Error: cp not found"
    exit 1
fi

### Start backup for file system ###

[ ! -d $BACKUPDIR ] && mkdir -p $BACKUPDIR || :
[ ! -d $LATEST ] && mkdir -p $LATEST || :

if [ $DAY -eq $FULLBACKUP ]; then
    FILE="$HOST-full_$DAY2.tar.gz"
    $TAR -zcPf $BACKUPDIR/$FILE -C / $DIRS
    $CP $BACKUPDIR/$FILE "$LATEST/$HOST-full_latest_$DAY2.tar.gz"
fi

FILE="$HOST-incremental_$NOW.tar.gz"
$TAR -g $INCFILE -zcPf $BACKUPDIR/$FILE -C / $DIRS
$CP $BACKUPDIR/$FILE "$LATEST/$HOST-incremental_latest_$DAY2.tar.gz"

# Remove files older than x days if cleanup is activated
if [ $CLEANUP -eq 1 ]; then
    find $BACKUPDIR/ -name "*.gz" -type f -mtime +$OLDERTHAN -delete
fi
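Restoring works in reverse: extract the most recent full archive first, then each incremental in chronological order. A minimal sketch, assuming GNU tar; the archive names below are examples, not files the script guarantees to exist:

# Extract the full backup, then each incremental in order.
# --listed-incremental=/dev/null tells GNU tar to apply the incremental
# metadata (including recorded deletions) without needing a snapshot file.
tar -zxPf myhost-full_2016-06-07.tar.gz -C / --listed-incremental=/dev/null
tar -zxPf myhost-incremental_2016-06-08-0055hrs.tar.gz -C / --listed-incremental=/dev/null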
FTP the archives to your remote backup location, in case you have a hard drive failure (ftp-backup.sh; needs mail and ncftp to work):
#!/bin/sh
# FTP upload script
#
# Version: 1.0
# Author: Reto Hugi (http://hugi.to/blog/)
# License: GPL v3 (http://www.gnu.org/licenses/gpl.html)
#
# This script is based on work and ideas from http://bash.cyberciti.biz/
#
# Dependencies:
# mail, ncftp
#
# Changelog:
# 2016-06-09 / Modified by dragonfrugal.com
# 2011-03-07 / Version 0.2 / changed mput to put and delete uploaded files
# 2011-03-06 / Version 0.1 / initial release

### SETTINGS ###

EMAILID="youremail@youremailhost.com"

USERNAME="LOCAL-SYSTEM-USERNAME-HERE"

### FTP Setup ###

FTPU="FTP-USERNAME-HERE"
FTPP="FTP-PASSWORD-HERE"
FTPS="FTP-IP-ADDRESS-HERE"
FTPR="ROOT-SUBDIRECTORY-HERE-OR-MAKE-BLANK"

#################

NOW=$(date +"%Y-%m-%d-%H%Mhrs")
# Same date stamp that archive-backup.sh uses in its "latest" file names
DAY2=$(date +"%Y-%m-%d")
HOST="$(hostname)"

### Dump backup using FTP ###

# Delete the remote copies we are about to replace
/usr/bin/ncftpget -c -DD -u"$FTPU" -p"$FTPP" $FTPS /$FTPR/Backups/$HOST/Archived/$HOST-tar-inc-backup.dat
/usr/bin/ncftpget -c -DD -u"$FTPU" -p"$FTPP" $FTPS /$FTPR/Backups/$HOST/Archived/latest/$HOST-incremental_latest_$DAY2.tar.gz

# Upload the whole backup directory structure
/usr/bin/ncftpput -R -u"$FTPU" -p"$FTPP" $FTPS /$FTPR/Backups /home/$USERNAME/Backups/$HOST

### Notify if FTP backup failed ###

if [ "$?" != "0" ]; then
    T=backup.fail
    echo "Date: $(date)" > $T
    echo "Hostname: $HOST" >> $T
    echo "Backup failed" >> $T
    mail -s "BACKUP FAILED" "$EMAILID" < $T
    rm -f $T
fi
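Before trusting the hourly cron run, you can sanity-check the FTP credentials by hand with ncftpls, which ships in the same ncftp package (placeholder values shown, same as the script's settings):

# Quick credentials check -- a directory listing means login works
ncftpls -u "FTP-USERNAME-HERE" -p "FTP-PASSWORD-HERE" ftp://FTP-IP-ADDRESS-HERE/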
Completely refresh the files at your remote FTP backup location once in a while, to purge old backups (ftp-delete.sh):
#!/bin/sh

### SETTINGS ###
### FTP Setup ###

FTPU="FTP-USERNAME-HERE"
FTPP="FTP-PASSWORD-HERE"
FTPS="FTP-IP-ADDRESS-HERE"
FTPR="ROOT-SUBDIRECTORY-HERE-OR-MAKE-BLANK"

#################

HOST="$(hostname)"

# Delete all remote FTP file backups every X days (schedule via cron)
ncftp -u"$FTPU" -p"$FTPP" $FTPS <<EOF
cd /$FTPR/Backups/$HOST/
rm -rf *
quit
EOF
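Since the rm -rf * here is destructive, it can be worth listing what would be removed before the first scheduled run. One way, sketched with ncftpls and the same placeholder settings (recursive listings depend on the FTP server allowing them):

# Dry run: list the remote backup tree the purge would delete
ncftpls -R -u "FTP-USERNAME-HERE" -p "FTP-PASSWORD-HERE" "ftp://FTP-IP-ADDRESS-HERE/ROOT-SUBDIRECTORY-HERE/Backups/"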
Create a directory on your system to hold these scripts (your user home directory is ~/), and enter it:
cd ~/
mkdir Scripts
cd Scripts
Create the .sh files, and make them executable:
nano filename.sh
chmod 755 filename.sh
Set up your cron jobs (run “crontab -e” in a Linux terminal window to edit them, and replace the host / user names with your own):
# Your scripts
5,25,45 * * * * /bin/sh /home/YOUR-LOCAL-SYSTEM-USERNAME-HERE/Scripts/copy-to-backup.sh
55 * * * * /bin/sh /home/YOUR-LOCAL-SYSTEM-USERNAME-HERE/Scripts/archive-backup.sh
0 * * * * /bin/sh /home/YOUR-LOCAL-SYSTEM-USERNAME-HERE/Scripts/ftp-backup.sh

# Delete LOCAL backup DUMPED SCRIPTS directories older than 4 days
@hourly find /home/YOUR-LOCAL-SYSTEM-USERNAME-HERE/Backups/YOUR-LOCAL-HOSTNAME-HERE/Dump/Scripts -maxdepth 1 -type d -mtime +4 -exec rm -rf {} \;

# Purge REMOTE backups 3x per week
0 3 * * 1,3,5 /bin/sh /home/YOUR-LOCAL-SYSTEM-USERNAME-HERE/Scripts/ftp-delete.sh

# Delete LOCAL backup DUMPED WALLETS directories older than 4 days
@hourly find /home/YOUR-LOCAL-SYSTEM-USERNAME-HERE/Backups/YOUR-LOCAL-HOSTNAME-HERE/Dump/Wallets -maxdepth 1 -type d -mtime +4 -exec rm -rf {} \;

# Delete LOCAL backup ARCHIVES older than 4 days
@hourly find /home/YOUR-LOCAL-SYSTEM-USERNAME-HERE/Backups/YOUR-LOCAL-HOSTNAME-HERE/Archived* -mtime +4 -exec rm -rf {} \;
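Cron runs silently, so it is worth tracing one manual run of each script before trusting the schedule; sh -x prints every command as it executes, which makes path or credential mistakes obvious:

# Trace a manual run of each script before relying on cron
sh -x /home/YOUR-LOCAL-SYSTEM-USERNAME-HERE/Scripts/copy-to-backup.sh
sh -x /home/YOUR-LOCAL-SYSTEM-USERNAME-HERE/Scripts/archive-backup.sh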
You should now have automated backups of your wallet.dat files every hour to your remote FTP account. 🙂
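If you ever need the backups, restoring a wallet.dat is the copy step in reverse: stop the wallet completely, put the backed-up file in place, and restart. A minimal sketch, using a hypothetical Reddcoin backup (the file names and date stamp are examples only):

# Shut down the wallet/daemon fully before touching wallet.dat,
# and keep the old file around in case the backup is also damaged
mv ~/.reddcoin/wallet.dat ~/.reddcoin/wallet.dat.broken
cp ~/Backups/YOUR-LOCAL-HOSTNAME-HERE/Dump/Wallets/Wallets-2016-06-09-1945hrs/Reddcoin-wallet-2016-06-09-1945hrs.dat ~/.reddcoin/wallet.dat
# Then restart the wallet and let it rescan / resync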