Full Discussion: remove duplicate
Post 49197 by kazanoova2 on Saturday 27th of March 2004 07:01:08 PM
But the records contain spaces

Thanks Ygor and Optimus_P, but what if a record contains a space?
For example, with a record size of 5, test.txt holds:

ab cdefr tab cdab cdefr tasdfg

i.e. the six records "ab cd", "efr t", "ab cd", "ab cd", "efr t", "asdfg".
The result must be:

ab cdefr tasdfg

I tried:

dd if="$1" of="$1.temp" cbs=5 conv=unblock
sort -u "$1.temp" > "$1.temp1"
cat "$1.temp1" | tr "\n" " " > "$1.new"

Is that right?
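As written, that pipeline has trouble with the sample above: sort -u reorders the records ("asdfg" sorts between "ab cd" and "efr t"), tr "\n" " " inserts a space between records that was never in the data, and dd's conv=unblock deletes trailing blanks from each record, which loses data whenever a record ends in a space. A minimal order-preserving sketch, assuming bash and a record size of 5:

fold -w 5 "$1" > "$1.temp"                  # split into 5-character records; fold keeps trailing blanks
awk '!seen[$0]++' "$1.temp" > "$1.temp1"    # drop duplicate records, keeping the first occurrence in order
tr -d '\n' < "$1.temp1" > "$1.new"          # rejoin the records without inserting any separator

On the sample test.txt this yields "ab cdefr tasdfg", as required.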
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Remove duplicate ???

Hi all, I have an out.log file:

CARR|02/26/2006 10:58:30.107|CDxAcct=1405157051
CARR|02/26/2006 11:11:30.107|CDxAcct=1405157051
CARR|02/26/2006 11:18:30.107|CDxAcct=7659579782
CARR|02/26/2006 11:28:30.107|CDxAcct=9534922327
CARR|02/26/2006 11:38:30.107|CDxAcct=9534922327
CARR|02/26/2006... (3 Replies)
Discussion started by: sabercats
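The preview cuts off before the requirement, but assuming the goal is to keep one line per CDxAcct value, an awk sketch over the |-separated fields would be:

awk -F'|' '!seen[$3]++' out.log    # keep the first line for each CDxAcct value (field 3)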

2. Shell Programming and Scripting

Remove duplicate

Hi all, I have a text file fileA.txt:

DXRV|02/28/2006 11:36:49.049|SAC||||CDxAcct=2420991350
DXRV|02/28/2006 11:37:06.404|SAC||||CDxAcct=6070970034
DXRV|02/28/2006 11:37:25.740|SAC||||CDxAcct=2420991350
DXRV|02/28/2006 11:38:32.633|SAC||||CDxAcct=6070970034
DXRV|02/28/2006... (2 Replies)
Discussion started by: sabercats
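Same idea as above; here the account number is the last |-separated field, so a hedged sketch would be:

awk -F'|' '!seen[$NF]++' fileA.txt    # keep the first line per CDxAcct (last field)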

3. UNIX for Dummies Questions & Answers

Remove duplicate in array

Hi, I have a list of numbers stored in an array as below: 5 7 10 30 30 40 50. Please advise how I could remove the duplicate values in the array. Thanks in advance. (5 Replies)
Discussion started by: Rock
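One way, assuming a bash array and that sorted output is acceptable, is a minimal sketch like:

arr=(5 7 10 30 30 40 50)
uniq_arr=($(printf '%s\n' "${arr[@]}" | sort -nu))    # one element per line, numeric sort, unique
echo "${uniq_arr[@]}"                                 # 5 7 10 30 40 50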

4. Shell Programming and Scripting

remove duplicate

Hi, I am trying to use shell or perl to remove duplicate characters. For example, if I have "I love google" it will become "I love ggle", or even "I loveggle" if also removing the duplicate white space. Thanks, CC (6 Replies)
Discussion started by: ccp
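If "duplicate characters" means runs of the same character, a perl sketch that reproduces the "I love ggle" example by deleting every repeated run is:

echo 'I love google' | perl -pe 's/(.)\1+//g'    # "oo" is a repeated run, so it is removed: "I love ggle"

(tr -s ' ' would instead squeeze runs of spaces down to a single space.)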

5. Shell Programming and Scripting

Help with remove duplicate content

Input file:

data_1 10 US
data_1 2 US
data_1 5 UK
data_2 20 ENGLAND
data_2 12 KOREA
data_3 4 CHINA
.
.
data_60 123 US
data_60 23 UK
data_60 45 US

Desired output file:

data_1 10 US
data_1 5 UK
data_2 20 ENGLAND
data_2 12 KOREA (2 Replies)
Discussion started by: perl_beginner
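The desired output keeps the first row for each (column 1, column 3) pair, so a likely awk sketch (input file name assumed) is:

awk '!seen[$1,$3]++' infile    # first occurrence per key/country pair wins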

6. Shell Programming and Scripting

How to remove duplicate ID's?

Hi, I have a file containing 1000's of duplicate IDs with (upper and lower first character) as below.

i/p:
a411532
A411532
a508661
A508661
c411532
C411532

Requirement: I need to ignore the lowercase IDs and keep only the IDs below.

o/p:
A411532
A508661
C411532 (9 Replies)
Discussion started by: buzzme
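Assuming one ID per line, filtering to uppercase-initial IDs and deduplicating could look like:

grep '^[A-Z]' ids.txt | sort -u    # ids.txt is an assumed file name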

7. Shell Programming and Scripting

Remove duplicate

Hi, I have a pipe separated file repo.psv where I need to remove duplicates based on the 1st column only. Can anyone help with a Unix script?

Input:

15277105||Common Stick|ESHR||Common Stock|CYRO AB
15277105||Common Stick|ESHR||Common Stock|CYRO AB
16111278||Common Stick|ESHR||Common... (12 Replies)
Discussion started by: samrat dutta
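With | as the field separator and the 1st column as the key, a one-liner that keeps the first occurrence of each key is:

awk -F'|' '!seen[$1]++' repo.psv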

8. UNIX for Dummies Questions & Answers

Remove duplicate

Hi, how can I replace || with a space and then remove duplicates from the following text? T111||T222||T444||T222||T555 Thanks in advance (10 Replies)
Discussion started by: tinku981
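A sketch that splits on the | runs, deduplicates in order, and rejoins with spaces:

echo 'T111||T222||T444||T222||T555' |
tr -s '|' '\n' |        # each || separator becomes one newline
awk '!seen[$0]++' |     # drop repeated tokens, keeping the first
paste -sd' ' -          # prints: T111 T222 T444 T555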

9. UNIX for Dummies Questions & Answers

Remove Duplicate Lines

Hi, I need this output. Thanks.

Input:
TAZ
YET
FOO
FOO
VAK
TAZ
BAR

Output:
YET
VAK
BAR (10 Replies)
Discussion started by: tara123
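Here every line that appears more than once is removed entirely, and the output keeps the input order, so a two-pass awk sketch fits:

awk 'NR==FNR {count[$0]++; next} count[$0]==1' file file    # pass 1 counts, pass 2 prints singletons

(sort file | uniq -u gives the same set of lines, but sorted rather than in input order.)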

10. Shell Programming and Scripting

How to remove duplicate lines?

Hi All, I am storing the result in the variable result_text using the code below.

result_text=$(printf "$result_text\t\n$name")

result_text then holds the text below, which has duplicate lines:

file and time for the interval 03:30 - 03:45
file and time for the interval 03:30 - 03:45
... (4 Replies)
Discussion started by: nalu
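Assuming the duplicate lines should be filtered out of the variable itself, a minimal sketch:

result_text=$(printf '%s\n' "$result_text" | awk '!seen[$0]++')    # keep the first occurrence of each line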