I'm writing my first bash script, which deploys a web project via FTP. The main functionality of the script (besides updating from svn, scheduling an outage, warning logged-in customers, etc.) is simply traversing the directory structure and uploading eligible files. This function lists all the files I'd like to upload:
# Second step: loop through all directories,
# skipping any excluded files or extensions
UploadFolder() {
    for dir in $(find "$1" -type d | grep -v "\.svn"); do
        # note: -maxdepth must precede -type, and the word-splitting
        # here means filenames must not contain spaces
        for file in $(find "$dir" -maxdepth 1 -type f | grep -v "configure\.php"); do
            echo "$file"
        done
    done
}
Simple enough. I know I could do it with a single find command, but I want to handle the folders separately (to minimize the number of mkdirs on the FTP server). I also got a basic FTP upload script working:
ftp -in theftpserver.com <<EOF
quote USER theuser
quote PASS password
binary
mkdir test_upload
cd test_upload
put upload
quit
EOF
This also works well enough. The problem is that I'd have to run that whole ftp command once per file (reconnecting, switching to binary mode, changing to the right folder, and so on). There are two solutions I can see:
1) Somehow intersperse bash commands with the ftp commands. Is there a way to do this?
2) Have the bash script write a text file that is later used as stdin for the ftp command. How would I do this?
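For reference, here's roughly what I have in mind for both options (untested sketch; cat stands in for ftp so nothing actually connects, the server/user names are the placeholders from above, and index.html/main.css are made-up example files):

```shell
#!/bin/sh
# Option 1: a heredoc expands $(...), so the shell can generate
# ftp commands inline within a single session.
# In the real script, `cat` would be `ftp -in theftpserver.com`.
cat <<EOF
quote USER theuser
quote PASS password
binary
$(for f in index.html main.css; do echo "put $f"; done)
quit
EOF

# Option 2: write all the commands to a file first, then redirect
# that file into ftp's stdin in one connection.
BATCH=$(mktemp)
{
    echo "quote USER theuser"
    echo "quote PASS password"
    echo "binary"
    for f in index.html main.css; do
        echo "put $f"
    done
    echo "quit"
} > "$BATCH"
cat "$BATCH"    # would be: ftp -in theftpserver.com < "$BATCH"
```

Either way the session opens once, logs in once, and uploads everything before quitting.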
Thanks,