+ read f d
+ [[ -n folder1 ]]
+ [[ -n file1 ]]
++ locate file1
+ filepath='/home/aaa/a/file1
/home/do/file1
/home/files/file1
/var/lib/mysql/ib_logfile1'
+ [[ -e /home/aaa/a/file1
/home/do/file1
/home/files/file1
/var/lib/mysql/ib_logfile1 ]] # here the whole multi-line string is tested as one path, so the condition fails
+ read f d
+ [[ -n thisforfile2 ]]
+ [[ -n file2 ]]
++ locate file2
+ filepath='/home/bbb/11/22/file2
/home/data/thisforfile2
/home/files/file2'
+ [[ -e /home/bbb/11/22/file2
/home/data/thisforfile2
/home/files/file2 ]]
+ read f d
+ [[ -n thisfolderforfile3 ]]
+ [[ -n file3 ]]
++ locate file3
+ filepath='/home/data/thisfolderforfile3
/home/ttt/file3
/home/files/file3'
+ [[ -e /home/data/thisfolderforfile3
/home/ttt/file3
/home/files/file3 ]]
+ read f d
+ [[ -n folder4 ]]
+ [[ -n lada4 ]]
++ locate lada4
+ filepath=/home/vv/lada4
+ [[ -e /home/vv/lada4 ]]
+ mkdir -p /home/data/folder4
+ cp /home/vv/lada4 /home/data/folder4
+ read f d
That shows the risk of using the locate command for what you want. It returns every path where the filename matches (concatenated into one string with newlines), and since, on purpose, I am not accepting that return as valid, I am checking whether the result is an actual existing path; only [[ -e /home/vv/lada4 ]] is real.
yup, they are duplicate files in multiple locations.. I intend to do that.. so they should be copied to the specified folder..
You can always loop through the result of locate if you want to accept its result as "duplication", but if the files share the same filename you will just be overwriting them at the destination. I suspect that is not even the case, since a hit could be a partial match or not a file at all:
The point is that you have no guarantee that what locate returns is what you expect.
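A minimal sketch of that per-path filtering, using the exact strings locate returned in the trace above as a stand-in (in real use the variable would come from a locate call, and each surviving path should also pass a `[[ -f ]]` test before copying):

```shell
#!/bin/bash
f=file1
# The raw answer locate gave in the trace: one string, newline-joined
matches='/home/aaa/a/file1
/home/do/file1
/home/files/file1
/var/lib/mysql/ib_logfile1'

keep=()
while IFS= read -r p; do
    # keep only exact basename matches, dropping the partial hit ib_logfile1;
    # a real run would additionally check [[ -f $p ]] before trusting the path
    [[ ${p##*/} == "$f" ]] && keep+=("$p")
done <<< "$matches"

printf '%s\n' "${keep[@]}"
```

This still leaves the overwriting problem untouched: the three surviving paths all end in file1, so copying them to one destination keeps only the last one.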
not working,
locate/find.. it will locate the files anywhere in the system; file1 and file2 could be in different directories..
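One difference worth knowing here: with `-name`, find matches the whole basename, unlike locate's substring match, so a search for file1 does not return ib_logfile1. A small self-contained demo in a throwaway directory (paths are illustrative):

```shell
#!/bin/bash
# find -name matches the entire basename, so the partial match is excluded.
tmp=$(mktemp -d)
mkdir -p "$tmp/a" "$tmp/b"
touch "$tmp/a/file1" "$tmp/b/ib_logfile1"

found=$(find "$tmp" -type f -name file1)
echo "$found"            # only .../a/file1 is printed
rm -rf "$tmp"
```

In a real search you would replace `"$tmp"` with the root of whatever tree the files may live in; find also reads the live filesystem, so it cannot be stale the way a locate database can.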
Quote:
Originally Posted by kenshinhimura
yup, they are duplicate files in multiple locations.. I intend to do that.. so they should be copied to the specified folder..
Your statement / request is not quite clear. You use locate yourself in post #1. Do you want all located files (including the ones whose file names are supersets of the search term) to be copied to the target directory given in your data file? If several files with identical file names exist, they will overwrite each other - which one should survive?
To give you a starting point, you might want to consider / analyse this:
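The original code block was not preserved in this thread. Below is a hedged sketch consistent with the description that follows (all names are illustrative); `find` stands in for `locate` so the example is self-contained and testable, and would be swapped back for `locate "$f"` in real use:

```shell
#!/bin/bash
# Hedged sketch only -- the original snippet was lost from this thread.
# Each hit is kept only if its basename is identical to the name in the
# data file; the destination directory is created if missing.
sandbox=$(mktemp -d)
mkdir -p "$sandbox/vv"
echo demo > "$sandbox/vv/lada4"
printf 'lada4 folder4\n' > "$sandbox/datafile"

while read -r f d; do
    [[ -n $f && -n $d ]] || continue
    [[ -d $sandbox/data/$d ]] || mkdir -p "$sandbox/data/$d"
    # word-splitting for loop: breaks on paths containing white space,
    # which is exactly the caveat mentioned in the description
    for p in $(find "$sandbox" -type f -name "$f"); do
        [[ ${p##*/} == "$f" ]] && cp "$p" "$sandbox/data/$d/"
    done
done < "$sandbox/datafile"
# rm -rf "$sandbox"   # cleanup when done
```

After the run, $sandbox/data/folder4 contains a copy of lada4; like the described original, nothing guards against two identically named hits overwriting each other at the destination.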
It will check the respective file name against the one in the data file and copy only if identical, but will not check for overwriting. Directories are checked for existence and created if non-existent. It will of course depend on the locate DB being up to date, and on the names not containing white space, as white space would confuse the for loop.
Give it a try and comment back.
It's really hard, right? Because that is just an example.. the real data is 100 files..
What I have done in the past is to run the two for loops and copy manually to the specific folder.
Not truly hard. The hard part is for you to recognize how you discern which files need to be copied when you do it manually, and to communicate that in a way that can be translated into an automation script without unexpected results. I hope I have clearly pointed out that accepting the result from locate as-is is not it.
For example, the snippet below might be all right if you understand that locate must never return a matched directory, never a partial match in a file or directory name, and never paths with spaces in them. Otherwise, you MUST accommodate those conditions.
Quote:
Originally Posted by RudiC
To give you a starting point, you might want to consider / analyse this: