Note that newdir will be at the same level as your olddir.
Hello Senhia83/Phaethon,
Yes, the script looks to work as requested, but it is always good practice to use the complete (absolute) path rather than a relative one; otherwise the script can only be run from one specific directory. The following may also help.
NOTE: The paths here are only examples; substitute the exact paths for your own setup.
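A minimal sketch of the point above, using hypothetical paths (/tmp/demo/olddir and /tmp/demo/newdir stand in for your real directories):

```shell
#!/bin/sh
# Hypothetical absolute paths -- substitute your own.
olddir=/tmp/demo/olddir
newdir=/tmp/demo/newdir

mkdir -p "$olddir" "$newdir"
touch "$olddir/sample.txt"

# Because the paths are absolute, this works no matter
# which directory the script is started from.
mv "$olddir/sample.txt" "$newdir/"
```

With relative paths (`mv olddir/sample.txt newdir/`), the same script would fail as soon as it was launched from anywhere else.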
Thanks,
R. Singh
Last edited by RavinderSingh13; 03-05-2015 at 12:46 AM.
I searched the posts for info on this and, while some were in the ballpark, none addressed this specifically. (Also, when I first tried to post this it said I was logged out, so hopefully this is not a duplicate.)
I have a set of files (250 +/-) where I need to delete the first "$x"... (4 Replies)
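One common way to drop the first $x lines from every file in a directory is a loop over `tail -n +K`, which prints from line K onward. A sketch, using three throwaway files in /tmp/headdemo in place of the real 250:

```shell
#!/bin/sh
# Demo setup: three small files stand in for the ~250 real ones.
mkdir -p /tmp/headdemo && cd /tmp/headdemo
for f in a.txt b.txt c.txt; do
    printf 'h1\nh2\ndata\n' > "$f"
done

x=2   # number of leading lines to drop
for f in *.txt; do
    # tail -n +K prints from line K onward; this is POSIX-portable,
    # unlike GNU sed's -i in-place option.
    tail -n +"$((x + 1))" "$f" > "$f.tmp" && mv "$f.tmp" "$f"
done
```

Each file is rewritten via a temp copy, since the shell cannot safely redirect a file onto itself.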
I have written a script which excludes some records from a .csv file and writes them to a separate exclusion file. This works fine. The problem is that I now want to delete those excluded lines from the primary file, but I have not been able to.
I have stored the line number in... (1 Reply)
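If the line numbers to remove are stored in a file, awk's two-file idiom handles this directly: the first pass loads the numbers, the second prints only lines whose number is absent. A sketch with hypothetical filenames (primary.csv, excluded.nums):

```shell
#!/bin/sh
# Demo: remove from primary.csv the line numbers listed in excluded.nums
mkdir -p /tmp/lndemo && cd /tmp/lndemo
printf 'one\ntwo\nthree\nfour\n' > primary.csv
printf '2\n4\n' > excluded.nums

# NR==FNR is true only while reading the first file, so the first
# pass fills the skip[] array; the second pass prints every line
# whose own number (FNR) is not in that array.
awk 'NR==FNR { skip[$1]; next } !(FNR in skip)' \
    excluded.nums primary.csv > primary.new && mv primary.new primary.csv
```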
Hi all. I'm using awk to validate a csv file, but now I've been told to delete any invalid lines from the file, and I'm not sure of the best way to do this.
Would the best approach be to create a temp file, say "csv_temp.tmp", print all the valid records to it, and then delete the old file... (3 Replies)
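The temp-file pattern described above is the standard one. A sketch, assuming for illustration that "valid" means exactly three comma-separated fields:

```shell
#!/bin/sh
# Demo: keep only lines with exactly 3 comma-separated fields,
# writing valid records to a temp file and then replacing the
# original -- the pattern the post describes.
mkdir -p /tmp/csvdemo && cd /tmp/csvdemo
printf 'a,b,c\nbad line\nd,e,f\n' > data.csv

# NF is the field count with -F, setting the separator to a comma;
# the pattern alone acts as a filter (default action is print).
awk -F, 'NF == 3' data.csv > csv_temp.tmp && mv csv_temp.tmp data.csv
```

The `&& mv` matters: the original is only overwritten if awk succeeded, so a failure cannot destroy the input.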
I have a directory full of text data files.
Unfortunately I need to remove the 7th and 8th lines from them all so that I can input them into a GIS application.
I've used an awk script to do one at a time but due to the sheer number of files I need some kind of loop mechanism to automate... (3 Replies)
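A shell loop around `sed '7,8d'` is one way to automate this across a whole directory. A sketch, using two 9-line demo files in place of the real data:

```shell
#!/bin/sh
# Demo: two 9-line files stand in for the real text data files.
mkdir -p /tmp/gisdemo && cd /tmp/gisdemo
for f in one.txt two.txt; do
    seq 1 9 > "$f"
done

for f in *.txt; do
    # '7,8d' deletes lines 7 and 8; write to a temp copy, since
    # plain sed cannot edit a file in place portably.
    sed '7,8d' "$f" > "$f.tmp" && mv "$f.tmp" "$f"
done
```

On GNU sed this could be collapsed to `sed -i '7,8d' *.txt`, but `-i` is not available on every system, so the loop is the safer sketch.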
I need a little help...
I have a file with 3 fields,
Time/Date, IP address, UniqueID
I have a 2nd file of UniqueIDs.
I want to either (which ever is easier):
1. delete entries in file 1 that have a UniqueID in file 2
2. create a new file with the fields from File 1, excluding the... (4 Replies)
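Option 2 above maps cleanly onto awk's two-file idiom: load the UniqueIDs from file 2, then print only the file-1 records whose ID is not among them. A sketch, assuming the UniqueID is the third whitespace-separated field:

```shell
#!/bin/sh
# Demo: file2 holds the UniqueIDs to drop; field 3 of file1 is the ID.
mkdir -p /tmp/iddemo && cd /tmp/iddemo
printf '01:00 10.0.0.1 aaa\n01:05 10.0.0.2 bbb\n01:10 10.0.0.3 ccc\n' > file1
printf 'bbb\n' > file2

# First pass (NR==FNR) collects the unwanted IDs; second pass
# prints only records whose third field is not in that set.
awk 'NR==FNR { bad[$1]; next } !($3 in bad)' file2 file1 > file3
```

This writes a new file rather than editing file1 in place, which also satisfies option 1 via a final `mv file3 file1`.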
Hi All,
I have about 10000 files having names of the same pattern
I am using awk to match the pattern.
Inside the awk block, how can I delete all 10000 files?
Or is there some other way to delete files in bulk?
"rm" command fails with the message that argument list is... (9 Replies)
Hello friends,
I searched in forums for similar threads but what I want is to have a single awk code to perform followings;
I have a big log file going like this;
...
7450494 1724465 -47 003A98B710C0
7450492 1724461 -69 003A98B710C0
7450488 1724459 001DA1915B70 trafo_14:3
7450482... (5 Replies)
hi,
Here is excerpt from my xml file
<!-- The custom module to do the authentication for LDAP
-->
</login-module>
<login-module code="com.nlayers.seneca.security.LdapLogin" flag="sufficient">
<module-option... (1 Reply)
Hi,
I have a set of similar files. I want to delete lines until a certain pattern appears in each file. For a single file the following command can be used, but I want to do it for all the files at once, since their number is in the thousands.
awk '/PATTERN/{i++}i' file (6 Replies)
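The one-liner above prints nothing until the first line matching /PATTERN/ sets i, then prints from that line onward. Wrapping it in a loop applies it to a whole directory. A sketch with two demo files:

```shell
#!/bin/sh
# Demo: apply the post's awk one-liner to every file in a directory,
# deleting each file's lines before the first PATTERN line.
mkdir -p /tmp/patdemo && cd /tmp/patdemo
printf 'junk\nmore junk\nPATTERN\nkeep\n' > f1.txt
printf 'noise\nPATTERN\nalso keep\n'      > f2.txt

for f in *.txt; do
    # i stays 0 (false) until the pattern matches, so earlier
    # lines are suppressed; the match line itself is kept.
    awk '/PATTERN/{i++} i' "$f" > "$f.tmp" && mv "$f.tmp" "$f"
done
```

To drop the PATTERN line as well, the test would be `/PATTERN/{i++; next} i` instead.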
Discussion started by: anurupa777
LEARN ABOUT OPENSOLARIS
comm
comm(1) User Commands comm(1)
NAME
comm - select or reject lines common to two files
SYNOPSIS
comm [-123] file1 file2
DESCRIPTION
The comm utility reads file1 and file2, which must be ordered in the current collating sequence, and produces three text columns as output:
lines only in file1; lines only in file2; and lines in both files.
If the input files were ordered according to the collating sequence of the current locale, the lines written will be in the collating
sequence of the original lines. If not, the results are unspecified.
OPTIONS
The following options are supported:
-1 Suppresses the output column of lines unique to file1.
-2 Suppresses the output column of lines unique to file2.
-3 Suppresses the output column of lines duplicated in file1 and file2.
OPERANDS
The following operands are supported:
file1 A path name of the first file to be compared. If file1 is -, the standard input is used.
file2 A path name of the second file to be compared. If file2 is -, the standard input is used.
USAGE
See largefile(5) for the description of the behavior of comm when encountering files greater than or equal to 2 Gbyte (2^31 bytes).
EXAMPLES
Example 1 Printing a list of utilities specified by files
If file1, file2, and file3 each contain a sorted list of utilities, the command
example% comm -23 file1 file2 | comm -23 - file3
prints a list of utilities in file1 not specified by either of the other files. The entry:
example% comm -12 file1 file2 | comm -12 - file3
prints a list of utilities specified by all three files. And the entry:
example% comm -12 file2 file3 | comm -23 - file1
prints a list of utilities specified by both file2 and file3, but not specified in file1.
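A runnable sketch of Example 1, with three small hand-made lists (comm requires its inputs to be sorted, so the demo files are written in collating order):

```shell
#!/bin/sh
# Demo of Example 1: utilities in file1 not named in file2 or file3.
mkdir -p /tmp/commdemo && cd /tmp/commdemo
printf 'awk\ncomm\nsed\nsort\n' > file1
printf 'comm\nsort\n'           > file2
printf 'sed\n'                  > file3

# -23 keeps only column 1 (lines unique to the first operand);
# "-" makes the second comm read the first one's output.
comm -23 file1 file2 | comm -23 - file3
```

Here the pipeline prints `awk`: it is the only entry of file1 that appears in neither file2 nor file3.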
ENVIRONMENT VARIABLES
See environ(5) for descriptions of the following environment variables that affect the execution of comm: LANG, LC_ALL, LC_COLLATE,
LC_CTYPE, LC_MESSAGES, and NLSPATH.
EXIT STATUS
The following exit values are returned:
0 All input files were successfully output as specified.
>0 An error occurred.
ATTRIBUTES
See attributes(5) for descriptions of the following attributes:
+-----------------------------+-----------------------------+
| ATTRIBUTE TYPE | ATTRIBUTE VALUE |
+-----------------------------+-----------------------------+
|Availability |SUNWesu |
+-----------------------------+-----------------------------+
|CSI |enabled |
+-----------------------------+-----------------------------+
|Interface Stability |Standard |
+-----------------------------+-----------------------------+
SEE ALSO
cmp(1), diff(1), sort(1), uniq(1), attributes(5), environ(5), largefile(5), standards(5)
SunOS 5.11 3 Mar 2004 comm(1)