Recently I had to configure Akeeba Backup Pro to keep its backups remotely in Dropbox. In the process it turned out that Akeeba is only able to litter that Dropbox; the stale archives have to be cleaned up after it by hand. But manual cleanup is not comme il faut, especially with archives of a gigabyte and change. So I had to find a way to get rid of the outdated ones hands-free.

So, the setup: full backups are uploaded to the full folder every three hours, and MySQL dumps to the mysql folder every half hour. That is how the site owner wants it — it is what he paid for his Dropbox Pro for.

The task: delete all the old full archives, keeping just one per day, and all the MySQL backups except today's.

A warning up front: I have been using CentOS for less than a year. It runs in a VM as a test web server (LAMP), I drop into its console once a month at most, and the last batch files I wrote were for good old MS-DOS 6.22. So everything described below was born literally in a couple of hours from scratch and does not shine with elegance.

The first question was how to reach Dropbox from the console at all. After a short search on the Internet I found the wonderful bash script Dropbox-Uploader which, as it turned out, can not only upload files but also run other essential commands, including list and delete. The only snag was that I could not immediately tell whether it can delete files "by list", so I settled on a plain loop (besides, it is easier to debug one file at a time).

dropbox_uploader is simple to install — download it with
curl "https://raw.githubusercontent.com/andreafabrizi/Dropbox-Uploader/master/dropbox_uploader.sh" -o dropbox_uploader.sh

make it readable and executable
chmod +rx dropbox_uploader.sh

then run it
./dropbox_uploader.sh

and configure access to Dropbox, following the prompts on the screen (create a new application on the Dropbox site and feed the script your App key and App secret). Then move the script to /usr/bin so that it is available to everyone and, most importantly, to cron.

On the command list full (recall that "full" is the name of my folder), dropbox_uploader prints roughly the following:
> Listing "/full"... DONE
 [F] 1218610223 site-sitename.com-20150908-000001.jpa
 [F] 1218610223 site-sitename.com-20150908-030001.jpa
 [F] 1218610223 site-sitename.com-20150908-060001.jpa
 [F] 1218610223 site-sitename.com-20150908-090001.jpa
 [F] 1218610223 site-sitename.com-20150908-120001.jpa
 [F] 1218610223 site-sitename.com-20150908-150001.jpa
 [F] 1218610223 site-sitename.com-20150908-180001.jpa
 [F] 1218610223 site-sitename.com-20150908-210001.jpa
 ... and so on ...
 [F] 1218610223 site-sitename.com-20150915-150001.jpa
 [F] 1218610223 site-sitename.com-20150915-180001.jpa
 [F] 1218610223 site-sitename.com-20150915-210001.jpa

We create our own script, clean_dropbox.sh, right away in /usr/bin (why not?), with the same 755 permissions. In it:

1. Create the list of unneeded archives
dropbox_uploader.sh list full '*.jpa' | cut -d ' ' -f4 | grep -v $(date +%Y%m%d) | grep -v -e '-00' | grep -v 'full' > todel.txt

Here the listing of *.jpa files from the full folder is filtered by cut and several greps, and the result is saved to a file.
cut keeps only the file names (-d ' ' sets a space as the delimiter, -f4 takes the fourth field — yes, the fourth, since each line begins with a space).
The first grep discards today's archives from the deletion list, the second spares the ones created within the first hour after midnight (the one-per-day keepers, whose timestamps start with -00), and the third strips the line with the folder name "/full" from the head of the listing.
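To see the filter chain in action, here is a throwaway re-run of it on a hypothetical three-line listing; the only change is that the date grep is hard-coded to pretend "today" is 2015-09-15:

```shell
# Same cut/grep chain as above, on fabricated sample lines (note the
# leading space in each); the date is hard-coded instead of $(date).
out=$(printf '%s\n' \
  ' [F] 1218610223 site-sitename.com-20150908-000001.jpa' \
  ' [F] 1218610223 site-sitename.com-20150908-030001.jpa' \
  ' [F] 1218610223 site-sitename.com-20150915-150001.jpa' \
  | cut -d ' ' -f4 | grep -v 20150915 | grep -v -e '-00' | grep -v 'full')
echo "$out"   # → site-sitename.com-20150908-030001.jpa
```

Today's archive and the midnight one fall out; only the 03:00 archive from an earlier day stays on the deletion list.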

2. Read the resulting file into a variable (a whitespace-separated list of names)
files=$(<todel.txt)

3. Loop over the list and delete each file
for file in $files
do
    dropbox_uploader.sh delete /full/"$file" >> /var/log/cleanup.log
done
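A side note on steps 2 and 3: the variable holds the whole file as one plain string, not a real array; it is the unquoted `$files` in the for loop that splits it back into individual names on whitespace (fine here, since the archive names contain no spaces). A tiny throwaway demo with made-up names:

```shell
# Demo of the read-file-then-loop pattern with made-up names.
printf '%s\n' one.jpa two.jpa three.jpa > todel.txt
files=$(cat todel.txt)    # whole file as a single string
count=0
for file in $files; do    # unquoted: splits on whitespace
  count=$((count + 1))
done
rm todel.txt
echo "$count"             # → 3
```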

Since the script will run under cron, we write the output to a log file, just in case.

That completes the handling of the full archives; the mysql archives remain.
Here everything runs along the beaten track. The file list is built with a simplified set of greps — we do not need the old backups at all, and a day's pile can be tolerated, since fortunately their sizes are quite modest.
dropbox_uploader.sh list mysql '*.sql' | cut -d ' ' -f4 | grep -v $(date +%Y%m%d) | grep -v 'mysql' > todel.txt

The deletion loop is the same, only the path changes:
dropbox_uploader.sh delete /mysql/"$file" >> /var/log/cleanup.log

and finally we remove the spent file list
rm todel.txt
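Putting the pieces together, here is a rough sketch of what the whole /usr/bin/clean_dropbox.sh could look like, with the repeated pipeline factored into a function. The `cleanup` function, the `CLEAN_DROPBOX_LOG` override and the `|| true` guard are my additions for illustration, not part of the original; it still assumes dropbox_uploader.sh sits on the PATH and is already configured.

```shell
#!/bin/bash
# Sketch of /usr/bin/clean_dropbox.sh: the same steps as above, folded
# into a reusable function. cleanup, CLEAN_DROPBOX_LOG and "|| true"
# are hypothetical additions, not from the original post.
LOG=${CLEAN_DROPBOX_LOG:-/var/log/cleanup.log}

cleanup() {
  local folder=$1 spare=$2 file
  # Build the to-delete list: drop today's files, the files matching
  # the "spare" pattern, and the header line containing the folder name.
  dropbox_uploader.sh list "$folder" \
    | cut -d ' ' -f4 \
    | grep -v "$(date +%Y%m%d)" \
    | grep -v -e "$spare" \
    | grep -v "$folder" > todel.txt || true  # grep exits 1 on an empty list

  while read -r file; do
    dropbox_uploader.sh delete "/$folder/$file" >> "$LOG"
  done < todel.txt
  rm -f todel.txt
}

cleanup full  '-00'   # spare today's archives and the nightly -00xxxx ones
cleanup mysql '^$'    # spare only today's dumps ('^$' matches nothing here)
```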

That is actually all; now it is enough to create a cron entry like
50 23 * * * clean_dropbox.sh
Voila!

I understand that this post is almost useless to Linux gurus, but Habr is read by ignoramuses like me too. I hope it will be useful to somebody.

This article is a translation of the original post at habrahabr.ru/post/266993/