SFTP - accidentally removing unprocessed records


 
# 1  
Old 10-11-2012
SFTP - accidentally removing unprocessed records

I have an automated sftp process running on a Linux server that connects securely to an insurance company's Windows server (running VanDyke VShell), where the company places work assignments into a directory.

My unattended (background) process runs every minute. Public/private key authentication is in place and working fine.

Its goal is to retrieve any new records from the Assignments directory and, after retrieving them, upload them to a directory called ProcessedAssignments. After sending them back, the process clears the Assignments directory.

This process works perfectly 99.9% of the time. Once in a great while, though, a new record lands in the Assignments directory in the window between my 'get'/'put' and my 'rm *', and the rm deletes it before it was ever processed.

I have searched everywhere for best practices for this type of recurring file transfer, with no results.

I'm trying to find a snippet of code somewhere that will allow me to create a list of the records I got/put, and only remove those records from the Assignments directory.

Code that generates the sftp script from my Basic program:
Code:
SCRIPT = ""
SCRIPT<-1> = "cd /Assignments"                       ;* Change external directory
SCRIPT<-1> = "get * ":CUSTOMER.INBOUND.DIRECTORY     ;* Fetch everything locally
SCRIPT<-1> = "cd /ProcessedAssignments"
SCRIPT<-1> = "put ":CUSTOMER.INBOUND.DIRECTORY:"/* " ;* Post copies back
SCRIPT<-1> = "cd /Assignments"
SCRIPT<-1> = "rm * "                                 ;* Remove everything -- the problem step
SCRIPT<-1> = "quit"
SCRIPT<-1> = "eof"


My problem is with the rm * statement.

So, I need to somehow pass a list back to my script generator after performing the get, or I need to somehow tag the records I 'got' and delete only those.
Any help would be greatly appreciated.

Last edited by jim mcnamara; 10-11-2012 at 09:24 PM.. Reason: code tags please
# 2  
Old 10-11-2012
Is that how you generate code? (Just curious.) Maybe we can do something about customizing how you delete stuff. Looks like mainframe...

If you have keys over there can you ssh?

rm * is dangerous, get * is not helpful in this situation either.

Roughly, the strategy is:
1. sftp to the remote
2. ls the assignments directory to a file
3. use the file to do individual get and rm commands
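The three steps above can be sketched as shell. This is a dry run with the sftp traffic simulated, since there is no server in the sketch; the filenames are made up, and the "sftp> " echo shown is what OpenSSH's sftp typically produces when its input is not a terminal:

```shell
# Step 2's raw listing, as non-interactive sftp typically prints it:
# each command echoed with an "sftp> " prefix, then the filenames.
# (Simulated here; the real version redirects sftp's output to dir.lis.)
cat > dir.lis <<'EOF'
sftp> cd /Assignments
sftp> ls -1
claim001.dat
claim002.dat
EOF

# Keep only the filenames, dropping the echoed command lines.
grep -v '^sftp>' dir.lis > files.lis

# Step 3: one get + rm per file, so a record that arrives after the
# listing was taken is never deleted.  echo stands in for the real
# per-file sftp here-document.
while read -r fname
do
    echo "get/rm: $fname"
done < files.lis
```

The point of the per-file rm is that the delete list is frozen at listing time: anything the insurance company drops in afterwards simply is not on the list.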
# 3  
Old 10-12-2012
Thanks for the quick response. And thanks for fixing my code tags.
# 4  
Old 10-12-2012
The best practice would be for the receiving system to remove or at least move records that have been processed, by itself. After all, only it knows which records have and haven't been processed.
# 5  
Old 10-12-2012
I have tried unsuccessfully to ls to a file. I have searched this forum and can't find my answer.

I am trying to redirect the output of ls to a file that I touched on the unix system, called assignmentslog.txt.

I have tried

Code:
 
sftp> ls .  assignmentslog.txt

and
Code:
 
sftp> ls . 2>&1 | tee assignmentslog.txt

and several other attempts at redirecting the output of the ls to this file, and it always goes to the screen instead.
What am I doing wrong?
# 6  
Old 10-12-2012
Code:
sftp remotesystem.com  <<EOF > dir.lis
 cd ./Assignments
 ls -1
 exit
EOF

dir.lis will now contain the listing of available files. (Redirection has to happen on the local shell's command line, as here -- you can't redirect from inside the sftp> prompt, which is why your earlier attempts went to the screen. Depending on the sftp build, dir.lis may also contain each command echoed with an "sftp> " prefix; strip those with grep -v '^sftp>' before using the list.)

Code:
move_file()
{
   sftp remotesystem.com <<EOF
   cd ./Assignments
   get $1
   rm $1
   cd ./otherdirectory
   put $1
   exit
EOF
}

# Drop any echoed "sftp> " command lines, keeping only filenames.
grep -v '^sftp>' dir.lis |
while read -r fname
do
    move_file "$fname"
done


This is purely on the UNIX side. You will end up with local copies of the files, and copies of the files in ./otherdirectory on the remote box. Note that the closing EOF goes in the leftmost column. The delimiter word can be anything the script doesn't otherwise contain; it is traditional to use EOF. This construct is a here document.
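To make the placement concrete: the function definition, the listing step, and the loop all live in one ordinary shell script file, made executable and run (or invoked from cron) as a unit. A minimal skeleton, with the sftp calls stubbed out as echo/printf so it runs standalone -- the filename and data are illustrative:

```shell
#!/bin/sh
# fetch_assignments.sh -- one ordinary script file holds everything:
# the function definition first, then the commands that use it.

move_file()
{
    # In the real script this body is the sftp here-document shown
    # above (get $1, rm $1, put $1); echo stands in for it here.
    echo "moved $1"
}

# In the real script this line is the sftp listing redirected to dir.lis.
printf 'a.dat\nb.dat\n' > dir.lis

while read -r fname
do
    move_file "$fname"
done < dir.lis
```

Make it executable with chmod +x fetch_assignments.sh and run it as ./fetch_assignments.sh; a crontab entry can then invoke it every minute the same way.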
# 7  
Old 11-15-2012
Where does this code reside?

I see what you are trying to show me, but I don't understand where to put the move_file code and how to execute the while/do loop. Are these embedded in a script, or set up as executables somehow?