Garden Cam Gallery Project


For a while now, I’ve had my Garden Cam up and running, but I wanted a photo gallery that wasn’t a pain to maintain or set up. I also needed it to auto-detect directories and photos. I stumbled across SFPG (Single File Photo Gallery), which does just that!

My only problem was that the webcam only uploads a single image every 5 minutes. My fix? A simple bash script that copies that file into a directory 1 minute after the new image uploads. The image starts out as cam_1.jpg but gets renamed with a date and time stamp for easier sorting by another bash script. Once it’s in the new directory and renamed, it gets sorted:

DATE=$(date +%m%d)
Dir=$(date +%m-%d)

# Move the files
for file in cam-"$DATE"*.jpg; do
    mkdir -p "$Dir"
    mv "$file" "$Dir"
done

I have the sort script run one minute after the file gets moved; that way I don’t have any issues with the sort script running before the image is in place.
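Since I didn’t post the copy script itself, here’s a minimal sketch of what it does (the paths and the exact timestamp format here are my placeholders, not necessarily the real ones):

```shell
#!/bin/bash
# Sketch of the copy/rename step; paths and timestamp format are placeholders.
touch cam_1.jpg                  # stand-in for the webcam's uploaded file
stamp=$(date +%m%d-%H%M)         # e.g. 0415-1305
cp cam_1.jpg "cam-$stamp.jpg"    # keep the original for the next overwrite
```

The renamed copies then match the `cam-"$DATE"*.jpg` glob that the sort script above uses.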

So check it out: Gallery

Something you may have caught onto if you read any of my posts: I do things in a roundabout way, mainly because I don’t want to pay for software to do something I can make happen my own way. (Prime example: my backup and restore scripts.)

If you want to talk to me about my scripts or want to use them, feel free to contact me! (The ways to contact me are over to the right.)

OwnCloud


You’re familiar with the term cloud, right? If not, I’m sure you have heard of Dropbox, Box, Google Drive, etc. These are all cloud services; they store your data on a server that is accessible from anywhere. So, what I’m going to do is create my own using OwnCloud and an Ubuntu Server I have (currently it sits and runs PyTivo… that’s a different blog post 🙂 ). (Edit: Yes, I am aware of Ubuntu Enterprise Cloud Server. I will look at installing and playing with this.)

I was watching TWiT‘s new show “Know How…” and their first episode was about rolling your own cloud a few different ways; they showed the Tonido Plug and the Pogo Plug. Now, Tonido has a software suite you can download and use, but from the sounds of it, you actually allow them to see some of your data (at least that is how it is with the Tonido Plug device). I’m not interested in allowing that to happen, not because I’m doing anything illegal but because I don’t like the idea of people willy-nilly looking at my data.

Now, looking at OwnCloud, you can run the software on your machine (desktop or server; I prefer server) and it will host a web GUI that you can access your data from. OwnCloud also comes with a client you install on your device so you can reach your data on your server. I don’t know if there is a mobile app yet. I mainly want this set up so I can easily access content from my home server without worrying about a low storage cap (Dropbox currently gives me 4 GB and Box gives me 50 GB).

I plan on toying with it, seeing what all it does, and then writing a review on it. Running your own cloud (if you have an ISP that gives you a nice upload speed and doesn’t cap you) is a really neat idea. You don’t have to worry about uploading to a server where God knows who is looking at your data.

The other option is to use software to encrypt your data before you upload it to these cloud services. The reason I dislike that idea is that when I want to run in and grab something quick, I don’t want to have to worry about, “Does this computer/device have the software to let me view this?”

So, I will install and toy with OwnCloud on my sandbox machine and see what I come up with. If you do not have a sandbox machine, you should REALLY invest in one. Mine is basically an old computer I had lying around after an upgrade that I tossed some hardware into. You can also pick up computers pretty cheap on eBay or from a local computer recycler. Free Geek is a good place to look too.

See you on the other side!

-Dan

Bash Scripts.


I run a web server that hosts multiple websites and multiple SQL databases. I decided to use bash scripts to manage my backups and any file restores that are needed. I know the argument will be, “Why bash scripts? There are software options that do the same thing.” The answer is, I couldn’t find one that fit my specific needs, so I decided to write something that did.

I read several forum threads of people trying to accomplish parts of what I wanted, but nobody had meshed everything together. If you are looking for a bash script that does a grandfather-father-son archive rotation of each user directory into its own archive, and does the same thing for databases, then these may work for you. The archives are stored in Daily, Weekly and Monthly sets. The rotation does a daily backup Sunday through Friday. On Saturday a weekly backup is done, giving you four weekly backups a month. The monthly backup is done on the first of the month, rotating two monthly backups based on whether the month is odd or even.
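To make the rotation concrete, here’s a small standalone sketch of just the naming logic (this is my illustration, not part of the scripts below; it uses GNU date’s -d flag so arbitrary dates can be tested):

```shell
#!/bin/bash
# Standalone sketch of the GFS rotation's archive-name suffix logic.
export LC_ALL=C    # make %A produce English day names

suffix_for() {
    local day_num day month_num
    day_num=$(date -d "$1" +%d)
    day=$(date -d "$1" +%A)
    month_num=$(date -d "$1" +%m)
    if [ "$day_num" -eq 1 ]; then
        # First of the month: monthly slot, picked by odd/even month
        if [ $((10#$month_num % 2)) -eq 0 ]; then echo "month2"; else echo "month1"; fi
    elif [ "$day" = "Saturday" ]; then
        # Saturdays get the weekly slot for that week of the month
        if   [ "$day_num" -le 7 ];  then echo "week1"
        elif [ "$day_num" -le 14 ]; then echo "week2"
        elif [ "$day_num" -le 21 ]; then echo "week3"
        else echo "week4"; fi
    else
        echo "$day"    # everything else is a daily slot
    fi
}

suffix_for 2012-05-01   # first of an odd month -> month1
suffix_for 2012-05-05   # a Saturday in days 1-7 -> week1
suffix_for 2012-05-07   # an ordinary Monday -> Monday
```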

I hope you find these as useful as I did.

So, this first script I found on the Ubuntu Server archive page.

This one archives the specified files/directories

#!/bin/bash
####################################
#
# Backup to NFS mount script with
# grandfather-father-son rotation
# of specified directory
#
####################################

# What to backup.
backup_files="/etc /root"

# Where to backup to.
dest="/backup/system"

# Setup variables for the archive filename.
day=$(date +%A)
hostname=$(hostname -s)

# Find which week of the month (1-4) it is.
# (10# forces base-10 so leading zeros like "08" don't parse as octal.)
day_num=$(date +%d)
if (( 10#$day_num <= 7 )); then
    week_file="$hostname-week1.tgz"
elif (( 10#$day_num > 7 && 10#$day_num <= 14 )); then
    week_file="$hostname-week2.tgz"
elif (( 10#$day_num > 14 && 10#$day_num <= 21 )); then
    week_file="$hostname-week3.tgz"
elif (( 10#$day_num > 21 && 10#$day_num < 32 )); then
    week_file="$hostname-week4.tgz"
fi

# Find if the month is odd or even.
month_num=$(date +%m)
month=$(expr $month_num % 2)
if [ $month -eq 0 ]; then
    month_file="$hostname-month2.tgz"
else
    month_file="$hostname-month1.tgz"
fi

# Create archive filename.
# (-eq does a numeric compare, so a leading zero like "01" still matches.)
if [ "$day_num" -eq 1 ]; then
    archive_file=$month_file
elif [ "$day" != "Saturday" ]; then
    archive_file="$hostname-$day.tgz"
else
    archive_file=$week_file
fi

# Print start status message.
echo "Backing up $backup_files to $dest/$archive_file"
date
echo

# Backup the files using tar.
# ($backup_files is deliberately unquoted so each path is passed separately.)
tar czf "$dest/$archive_file" $backup_files

# Print end status message.
echo
echo "Backup finished"
date

# Long listing of files in $dest to check file sizes.
ls -lh $dest/
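Since the rotation keys entirely off the day of the week and day of the month, the script just needs to run once a day; a crontab entry along these lines does it (the time and paths are examples, not my actual setup):

```shell
# m h dom mon dow  command -- run the backup every night at 2:00 AM
0 2 * * * /usr/local/bin/backup-system.sh >> /var/log/backup-system.log 2>&1
```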

 

This one keeps the same concept, except that instead of defining which files/directories to grab, the script grabs all the directories within /home:

#!/bin/bash
####################################
#
# Backup to NFS mount script with
# grandfather-father-son rotation
# of each home directory within
# its own archive
#
####################################

# Where to backup to.
dest="/backup/users"

# Setup variables for the archive filename.
day=$(date +%A)

# Find which week of the month (1-4) it is.
# (10# forces base-10 so leading zeros like "08" don't parse as octal.)
day_num=$(date +%d)
if (( 10#$day_num <= 7 )); then
    week_file="-week1.tgz"
elif (( 10#$day_num > 7 && 10#$day_num <= 14 )); then
    week_file="-week2.tgz"
elif (( 10#$day_num > 14 && 10#$day_num <= 21 )); then
    week_file="-week3.tgz"
elif (( 10#$day_num > 21 && 10#$day_num < 32 )); then
    week_file="-week4.tgz"
fi

# Find if the month is odd or even.
month_num=$(date +%m)
month=$(expr $month_num % 2)
if [ $month -eq 0 ]; then
    month_file="-month2.tgz"
else
    month_file="-month1.tgz"
fi

# Create archive filename.
if [ "$day_num" -eq 1 ]; then
    archive_file=$month_file
elif [ "$day" != "Saturday" ]; then
    archive_file="-$day.tgz"
else
    archive_file=$week_file
fi

# Print start status message.
echo "Backing up. This may take a few minutes."

# Backup each home directory into its own archive using tar.
for folder in $(ls /home); do
    sudo -u "$folder" tar czf "$dest/$folder$archive_file" "/home/$folder"

    # Print end status message.
    echo
    echo "Backup $folder complete."
done

# Long listing of files in $dest to check file sizes.
ls -lh $dest/
echo
echo "Backup is complete"
exit

 

Now, if you have multiple SQL databases on your server, this will benefit you a lot.

#!/bin/bash
####################################
#
# Backup to NFS mount script with
# grandfather-father-son rotation
# of all your SQL Databases.
#
####################################

# Where to backup to.
dest="/backup/sql-backup"

# Setup variables for the archive filename.
day=$(date +%A)

# Find which week of the month (1-4) it is.
# (10# forces base-10 so leading zeros like "08" don't parse as octal.)
day_num=$(date +%d)
if (( 10#$day_num <= 7 )); then
    week_file="-week1.sql.tgz"
elif (( 10#$day_num > 7 && 10#$day_num <= 14 )); then
    week_file="-week2.sql.tgz"
elif (( 10#$day_num > 14 && 10#$day_num <= 21 )); then
    week_file="-week3.sql.tgz"
elif (( 10#$day_num > 21 && 10#$day_num < 32 )); then
    week_file="-week4.sql.tgz"
fi

# Find if the month is odd or even.
month_num=$(date +%m)
month=$(expr $month_num % 2)
if [ $month -eq 0 ]; then
    month_file="-month2.sql.tgz"
else
    month_file="-month1.sql.tgz"
fi

# Create archive filename.
if [ "$day_num" -eq 1 ]; then
    archive_file=$month_file
elif [ "$day" != "Saturday" ]; then
    archive_file="-$day.sql.tgz"
else
    archive_file=$week_file
fi

# Backup the databases.
MYSQL='/usr/bin/mysql'
MYSQLDUMP='/usr/bin/mysqldump'
DUMPOPTS='--opt --hex-blob --skip-extended-insert'

user="CHANGEME"
pass="CHANGEME"

# Get the names of the databases
databases=$($MYSQL -u$user -p$pass --skip-column-names -e 'SHOW DATABASES')

# Write the compressed dump for each database
for db in $databases; do
    filename="$dest/$db$archive_file"
    echo "creating $filename"
    $MYSQLDUMP $DUMPOPTS -u$user -p$pass --databases $db | gzip -9 > "$filename"
done

echo "Backup of SQL Databases Complete"
exit

 

I currently have not written a restore for the SQL backup; if you have phpMyAdmin, you can use its import function.
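If you’d rather restore from the shell than phpMyAdmin, the restore is just the dump pipeline in reverse. Here’s a sketch (the database name, credentials and paths are placeholders; the first two lines only simulate a dump so there’s something to decompress):

```shell
#!/bin/bash
# Simulate a compressed dump so the pipeline can be demonstrated
# (database name, credentials and paths are all placeholders).
mkdir -p backup/sql-backup
echo "CREATE TABLE t (id INT);" | gzip -9 > backup/sql-backup/mydb-Monday.sql.tgz

# With a live server you'd pipe the decompressed dump into mysql:
#   gunzip < backup/sql-backup/mydb-Monday.sql.tgz | mysql -uCHANGEME -pCHANGEME mydb
gunzip < backup/sql-backup/mydb-Monday.sql.tgz
```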

Okay, so we have backed up all the home directories. What if someone needs something specific from a specific backup? I know that this script doesn’t account for failures; I’m working on that.

#!/bin/bash
##############################
#
# Script written by Dan Walker
# for Merval.Org Hosting to
# restore a specific file
# from a specific backup.
#
##############################

# Specify Backup Directory
echo -n "Where are we restoring from? (default is /backup/users): "
while read -e inputline
do
backup_path="$inputline"
# Fall back to the default if nothing was typed
if [ -z "${backup_path}" ]
then
    echo "You didn't type anything"
    backup_path="/backup/users"
    echo "Using $backup_path"
else
    echo "Using custom location: $backup_path"
fi

# Let's ask which user to restore
echo -n "Which user?: "
read -e user

echo -n "What are we restoring? (leave out /home/): "
read -e source

# Now let's figure out which backup to restore
echo -n "Which backup? (Daily, Weekly or Monthly): "
read -e choice
if [ "$choice" = "Daily" ]; then
    echo -n "Which day? (Sunday - Friday): "
    read -e date
    echo "You chose $date"
    echo "Starting restore process. This may take a moment."
    cd "$backup_path"
    # The archives store paths without the leading /, so extract
    # the single member home/$user/$source relative to /
    sudo -u "$user" tar -xzf "$user-$date.tgz" -C / "home/$user/$source"
    echo "Restored /home/$user/$source from the $date backup"
fi

if [ "$choice" = "Weekly" ]; then
    echo -n "Which week? (1-4): "
    read -e week
    echo "You chose the week $week backup"
    echo "Starting restore process. This may take a moment."
    cd "$backup_path"
    sudo -u "$user" tar -xzf "$user-week$week.tgz" -C / "home/$user/$source"
    echo "Restored /home/$user/$source from the week $week backup"
fi

if [ "$choice" = "Monthly" ]; then
    echo -n "Which month? (1 or 2): "
    read -e month
    echo "You chose the month $month backup"
    echo "Starting restore process. This may take a moment."
    cd "$backup_path"
    sudo -u "$user" tar -xzf "$user-month$month.tgz" -C / "home/$user/$source"
    echo "Restored /home/$user/$source from the month $month backup"
fi

exit
done
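The partial-restore trick the script relies on is tar’s ability to extract a single member; since absolute paths are stored without their leading slash, you name the member as `home/user/file` and extract relative to `/`. A self-contained sketch (all the paths here are made up):

```shell
#!/bin/bash
# Create a small archive the same way the backup script does,
# then restore just one file from it (all paths here are made up).
mkdir -p home/alice/docs
echo "hello" > home/alice/docs/note.txt
tar czf alice-Monday.tgz home/alice                  # member paths have no leading /
rm -rf home                                          # "lose" the file
tar -xzf alice-Monday.tgz home/alice/docs/note.txt   # extract only that member
cat home/alice/docs/note.txt                         # -> hello
```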

 

Suggestions are always appreciated!

Thanks for reading!

So.. Merval.Org


So I re-launched merval.org the other day and I really don’t know what I want to do with it. I didn’t own the domain for several years; some guy bought it up the moment I allowed it to expire (I didn’t have the money to renew it at the time). When I did have enough money to renew the domain, I contacted the guy who bought it (or a secretary of the guy… it was weird). He was asking me for 500 dollars to buy the domain back. Had I REALLY generated that much buzz in the prior 3 years I owned the domain? Hardly. I think I had something like 200 – 400 unique hits a month, which isn’t anything to write home about.

My other domain, pdxrevolution.com, gets somewhere in the range of ~500 a month. I don’t know what this guy was thinking trying to sell me a domain I owned for 500 dollars; maybe he saw something in the domain I didn’t? I know there is a company overseas called Merval and they own… wow, never mind, I just checked merval.com and they don’t own the domain anymore. GoDaddy is auctioning the domain off for $100. It expires in June. I wonder how many have bid on it… *looking* hah, 0 people. It has been viewed by 5 people. The domain should have some importance, since MERVAL is a stock exchange in Buenos Aires (http://en.wikipedia.org/wiki/MERVAL, http://en.wikipedia.org/wiki/Buenos_Aires_Stock_Exchange)… Not too sure why they let that domain expire. I’m going to keep my eye on that and see if anyone bids.

Back to Merval.Org. I’m having a hard time figuring out what I want to do with it. I don’t blog that often, and when I actually do blog, I do it here (LiveJournal). I will sit here and drink my coffee and ponder the thought.

Thanks for reading!

Dan