
[bash] Mixing FTP/bash commands, or Writing to a text file from bash script

Started December 30, 2006 01:51 PM
2 comments, last by Sander 17 years, 9 months ago
I'm writing my first bash script--to deploy a web project via FTP. The main functionality of the script (besides updating from svn, scheduling an outage, warning logged-in customers, etc.) is simply traversing the directory structure and uploading eligible files. This function lists all the files I'd like to upload:

# Second step: loop through all directories,
# skipping any excluded files or extensions
UploadFolder() {
  for dir in $(find "$1" -type d | grep -v "\.svn")
  do
    # note: -maxdepth is a global option and must come before -type
    for file in $(find "$dir" -maxdepth 1 -type f | grep -v "configure\.php")
    do
      echo "$file"
    done
  done
}
Simple enough. I know I could do it with a single find command, but I want to deal with folders separately (to minimize mkdir's on the FTP server). I also got a basic FTP upload script working:

ftp -in theftpserver.com <<EOF
  quote USER theuser
  quote PASS password

  binary
  mkdir test_upload
  cd test_upload
  put upload
  quit
EOF
This works well enough also. The problem is that I'd have to run that whole ftp command for each file (reconnecting, asking for binary, finding the right folder, etc). There are two solutions I can see: 1) Somehow intersperse bash commands in with the ftp commands. Is there a way to do this? 2) Write a text file in the bash script, used later for the stdin on the ftp command. How would I do this? Thanks,
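[Editor's aside, not from the original post: option 1 mostly works already. Because the `<<EOF` delimiter in the script above is unquoted, bash expands variables and `$(...)` command substitutions inside the here-document, so the `put` lines can be generated inline. A rough sketch, with a placeholder file list, and with `cat` standing in for the `ftp` invocation so the generated commands can be inspected:]

```shell
#!/bin/bash
# Hypothetical file list; the real script would build this from UploadFolder.
FILES="index.php style.css"

# Because EOF is unquoted, bash expands $(...) inside the heredoc.
# cat stands in for "ftp -in theftpserver.com" here so the generated
# commands can be inspected; swap cat back in for the real upload.
script=$(cat <<EOF
quote USER theuser
quote PASS password
binary
cd test_upload
$(for f in $FILES; do echo "put $f"; done)
quit
EOF
)
echo "$script"
```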
Version 1: create a file to be piped into the ftp program

# Header
echo 'quote USER theuser
quote PASS password
binary
mkdir test_upload
cd test_upload' > /tmp/$$
# Files
for (...)
do
  echo "put $file" >> /tmp/$$
done
# Footer
echo 'quit' >> /tmp/$$
# Execute
cat /tmp/$$ | ftp -in ...


Version 2: create a file list to be output as an sh variable

# Create the file list
FILES=""
for (...)
do
  FILES="$FILES $file"
done
# Pipe into ftp program (double quotes so $FILES expands;
# mput, not put, since the list holds multiple files)
echo "quote USER theuser
quote PASS password
binary
mkdir test_upload
cd test_upload
mput $FILES
quit" | ftp -in ...


Both are untested, but they should give you the general idea.
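[Editor's aside, not from the original posts: the `for file in $(find ...)` loops above break on filenames containing spaces, because the shell word-splits the command substitution. A whitespace-safe sketch of the same listing, using NUL-separated `find -print0` output and `read -d ''`:]

```shell
#!/bin/bash
# Whitespace-safe variant: find emits NUL-separated paths, and
# read -d '' splits on NUL, so filenames with spaces survive.
# .svn directories are pruned and configure.php is skipped,
# matching the exclusions in the original loops.
UploadFolder() {
  find "$1" -type d -name .svn -prune -o \
       -type f ! -name configure.php -print0 |
  while IFS= read -r -d '' file
  do
    echo "$file"
  done
}
```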
If you're open to alternatives to FTP, I would highly recommend rsync over ssh. It only uploads files that have changed, traverses the directory trees automatically, and lets you specify files to include or exclude. It's very handy, and I use it wherever I can.
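[Editor's aside: a sketch of what that rsync invocation might look like, with placeholder host and paths, plus a local demonstration of the exclude behaviour that copies between two temporary directories instead of over ssh:]

```shell
#!/bin/bash
# Hypothetical deployment command (host and paths are placeholders):
#   rsync -avz -e ssh --exclude '.svn' --exclude 'configure.php' \
#       ./project/ user@theftpserver.com:/var/www/project/
#
# Local demonstration of the same --exclude behaviour:
mkdir -p /tmp/rs_demo/src/.svn /tmp/rs_demo/dst
echo hi > /tmp/rs_demo/src/index.php
echo no > /tmp/rs_demo/src/configure.php
rsync -a --exclude '.svn' --exclude 'configure.php' \
      /tmp/rs_demo/src/ /tmp/rs_demo/dst/
ls /tmp/rs_demo/dst
```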

If I can't use rsync, I make a tarball with just newer files, upload that, and then extract it remotely.
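[Editor's aside: a sketch of that tarball approach, assuming GNU find/tar/touch; the marker file, paths, and host are placeholders, and the ssh step is shown as a comment:]

```shell
#!/bin/bash
# Pack only files modified since the last deploy, marked by the
# mtime of a placeholder file (here "last_deploy").
mkdir -p /tmp/tb_demo && cd /tmp/tb_demo
touch -d '2006-01-01' last_deploy      # marker left by the previous deploy
mkdir -p site
echo new > site/updated.php            # written just now, so newer
touch -d '2005-06-01' site/old.php     # predates the marker

# NUL-separated list of newer files, fed to tar via -T -:
find site -type f -newer last_deploy -print0 |
  tar czf newer.tar.gz --null -T -

# Then upload and unpack remotely, e.g.:
#   ssh user@host 'tar xzf - -C /var/www' < newer.tar.gz
tar tzf newer.tar.gz
```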

Barring that, I more or less do what ToohrVyk suggested.
We're sorry, but you don't have the clearance to read this post. Please exit your browser at this time. (Code 23)
Another vote for rsync here. Lots of webhosts these days support ssh access. If you have that, you can use rsync.

Sander Marechal [Lone Wolves][Hearts for GNOME][E-mail][Forum FAQ]

This topic is closed to new replies.
