Script to Exclude Files That Are Still Being Transferred


 
# 1  
Old 03-29-2016
Script to Exclude Files That Are Still Being Transferred

Hi.

I want to schedule a job that runs against a directory with several files in it.

But there may be a situation where some of the files are still being transferred into the directory at the time the job runs.

So I want to skip those files from being processed.

Two methods come to mind:
1. Process only files whose modification time is older than 15 minutes.
2. Exclude files whose size is still changing (i.e. still being transferred).

But I'm not sure how to go about either method.

So I'd appreciate suggestions from the UNIX experts here on the best way to handle this kind of situation.
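The rough idea I have for method 2 is to compare a file's size at two points in time (this is just an untested sketch on my part; the file name and the 30-second wait are only placeholders):

Code:
# method 2 sketch: check the size now and again after a short wait;
# if the size changed, the file is probably still being transferred
FILE=/path/to/incoming_file             # placeholder name
SIZE1=$(wc -c < "$FILE" | tr -d ' ')
sleep 30                                # arbitrary wait, would need tuning
SIZE2=$(wc -c < "$FILE" | tr -d ' ')

if [ "$SIZE1" -eq "$SIZE2" ]
then
   echo "$FILE looks stable - OK to process"
else
   echo "$FILE is still growing - skip it for now"
fi

For reference, here is my system info: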

Code:
$ uname -a
HP-UX system1 B.11.31 U ia64 0189138652 unlimited-user license

lsof is not available.

Thank you.

Last edited by aimy; 03-30-2016 at 12:22 AM.. Reason: include OS info
# 2  
Old 03-30-2016
You don't mention what OS you are running on.


If your OS has lsof installed, you may be able to do something like:

Code:
if lsof -- "$PROC_FILE" > /dev/null
then
   echo "$PROC_FILE is busy - probably still being transferred"
else
   # code to process the file goes here
fi

# 3  
Old 03-30-2016
Thanks Chubler.

Code:
$ uname -a
HP-UX system1 B.11.31 U ia64 0189138652 unlimited-user license

I tried lsof yesterday. It's not available on my system.

Thanks.
# 4  
Old 03-30-2016
You could use perl to fetch the file's modification time and compare it to the current time like this:

Code:
PROC_FILE="/var/dump/trn_20163002.24"
FILETIME=$(perl -e 'printf "%d",((stat(shift))[9])' "$PROC_FILE")
NOW=$(date +%s)

if ((NOW - FILETIME > 15*60))
then
   : # file is older than 15min - so process it here
fi

Edit: Not sure if perl will be in your PATH; try /usr/contrib/perl if it's not found.
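If you need to run that check over every file in the directory rather than one known file, a sketch along these lines should do (the /var/dump directory is just an example):

Code:
DIR=/var/dump                 # example directory - adjust to suit
NOW=$(date +%s)

for PROC_FILE in "$DIR"/*
do
   [ -f "$PROC_FILE" ] || continue
   # file modification time in epoch seconds
   FILETIME=$(perl -e 'printf "%d",((stat(shift))[9])' "$PROC_FILE")
   if ((NOW - FILETIME > 15*60))
   then
      echo "processing $PROC_FILE"
      # processing commands go here
   fi
done

Files younger than 15 minutes are simply left for the next scheduled run to pick up.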
# 5  
Old 03-30-2016
Yes, you could use 'mtime' to exclude these files but there are many ways of doing this and I'm sure you'll get a number of ideas posted here.

Having done this kind of thing countless times my preferred method is to create a timestamp file at the end of each run:

Code:
date > timestamp

so that the inode of that file holds the timestamp when the last run ended.

Then on the following run I do:

Code:
find . ! -newer timestamp ...............

to select all files NOT newer than that timestamp.

If for example this is a cron job running every 15 minutes, using this method ensures that if a run is missed for some reason (eg, system down), the next run but one will pick up the backlog.

This method creates a 'moving window' that lags behind real time, giving incoming transfers a 15-minute allowance to complete without the risk of selecting a file that is still being written to.

You then overwrite the timestamp file at the end of each successful run ready for next time.

(Of course, you may need a mechanism to prevent the selected list just getting longer and longer on each run by perhaps moving the files transferred elsewhere or deleting the files successfully transferred, but that wasn't what you asked.)

That's my suggestion but without doubt others will do it differently.

Hope that helps. Perhaps it doesn't.
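Putting those pieces together, a skeleton of the whole cycle might look something like this (the /var/dump and /var/dump_done paths are purely examples; adjust to your own layout):

Code:
cd /var/dump || exit 1
DONE=/var/dump_done           # processed files go outside the search tree

# first run only: seed the timestamp file so find has something to compare against
[ -f timestamp ] || date > timestamp

# select files NOT newer than the timestamp left by the previous run
find . -type f ! -name timestamp ! -newer timestamp -print |
while read -r f
do
   echo "processing $f"
   # ... process the file, then move it aside so it is not selected again
   mv "$f" "$DONE"/
done

# record when this run finished, ready for the next run
date > timestamp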

Last edited by hicksd8; 03-30-2016 at 01:38 PM..
# 6  
Old 04-06-2016
Quote:
Originally Posted by Chubler_XL
You could use perl to fetch the file's modification time and compare it to the current time like this:

Code:
PROC_FILE="/var/dump/trn_20163002.24"
FILETIME=$(perl -e 'printf "%d",((stat(shift))[9])' "$PROC_FILE")
NOW=$(date +%s)

if ((NOW - FILETIME > 15*60))
then
   : # file is older than 15min - so process it here
fi

Edit: Not sure if perl will be in your PATH; try /usr/contrib/perl if it's not found.
Thanks so much.

It works as expected! Really appreciate it.

Thanks.

---------- Post updated at 10:26 AM ---------- Previous update was at 10:25 AM ----------

Quote:
Originally Posted by hicksd8
Yes, you could use 'mtime' to exclude these files but there are many ways of doing this and I'm sure you'll get a number of ideas posted here.

Having done this kind of thing countless times my preferred method is to create a timestamp file at the end of each run:

Code:
date > timestamp

so that the inode of that file holds the timestamp when the last run ended.

Then on the following run I do:

Code:
find . ! -newer timestamp ...............

to select all files NOT newer than that timestamp.

If for example this is a cron job running every 15 minutes, using this method ensures that if a run is missed for some reason (eg, system down), the next run but one will pick up the backlog.

This method creates a 'moving window' that lags behind real time, giving incoming transfers a 15-minute allowance to complete without the risk of selecting a file that is still being written to.

You then overwrite the timestamp file at the end of each successful run ready for next time.

(Of course, you may need a mechanism to prevent the selected list just getting longer and longer on each run by perhaps moving the files transferred elsewhere or deleting the files successfully transferred, but that wasn't what you asked.)

That's my suggestion but without doubt others will do it differently.

Hope that helps. Perhaps it doesn't.
Thanks for your input as well.