Full Discussion: Help in removing duplicates
Shell Programming and Scripting
Post 302612255 by mregine on Sunday 25th of March 2012 11:22:38 AM
Try
$ sort -u abc.txt -o abc.txt
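
sort -u collapses repeated lines, and -o abc.txt lets it write the result back over the input file, which is safe because sort reads all of its input before it opens the output. Be aware that sort -u also reorders the lines. If the original order has to be preserved, a common alternative (a sketch only; the temporary file name is just an example) is the awk first-occurrence idiom:

# keep only the first occurrence of each line, preserving the original order
$ awk '!seen[$0]++' abc.txt > abc.tmp && mv abc.tmp abc.txt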
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Removing duplicates

Hi, I've been trying to remove duplicate lines with similar columns in a fixed-width file and it's not working. I've searched the forum but nothing comes close. I have a sample file: 27147140631203RA CCD * 27147140631203RA PPN * 37147140631207RD AAA 47147140631203RD JNA... (12 Replies)
Discussion started by: giannicello
12 Replies
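
For a fixed-width file like the sample in discussion 1, the usual trick is to key awk's seen-array on a substr() slice of the line rather than on a whitespace-split field. This is only a sketch: the offsets (characters 1-16, matching the leading record key in the sample) and the file names are assumptions, since the real record layout is not shown.

# keep the first line seen for each fixed-width key (here chars 1-16; adjust to the real layout)
$ awk '!seen[substr($0, 1, 16)]++' fixedwidth.txt > deduped.txt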

2. UNIX for Dummies Questions & Answers

removing duplicates and sort -k

Hello experts, I am trying to remove all lines in a csv file where the 2nd column is a duplicate. I am trying to use sort with the key parameter: sort -u -k 2,2 File.csv > Output.csv. File.csv: File Name|Document Name|Document Title|Organization Word Doc 1.doc|Word Document|Sample... (3 Replies)
Discussion started by: orahi001
3 Replies
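
The likely snag in discussion 2 is that sort splits fields on whitespace by default, so on a pipe-delimited file -k2,2 does not select the second column. A sketch, assuming the data really is pipe-delimited as the sample suggests:

# tell sort that the field separator is '|' and compare on field 2 only
$ sort -t'|' -u -k2,2 File.csv > Output.csv

# or, keeping the original line order (and the header line in place):
$ awk -F'|' '!seen[$2]++' File.csv > Output.csv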

3. Shell Programming and Scripting

removing duplicates

Hi, I have a file that is a list of people & their credentials which I receive frequently. The issue is that when I cat this list, duplicate entries exist & are NOT CONSECUTIVE (i.e. uniq may not work here). I'm trying to write a script that will remove duplicate entries; the script can... (5 Replies)
Discussion started by: stevie_velvet
5 Replies
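
uniq only collapses adjacent duplicates, so for non-consecutive repeats like those in discussion 3 the file either has to be sorted first (sort -u) or filtered with awk, which drops every later repeat while leaving the first occurrence where it was. A sketch with a placeholder file name:

# drop every repeat of a line already seen, wherever it appears
$ awk '!seen[$0]++' people.txt > people.deduped.txt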

4. Shell Programming and Scripting

Removing duplicates

Hi, I have a file in the below format: test test (10) to to (25) see see (45) and I need the output in the format: test 10 to 25 see 45. Can someone help me? (6 Replies)
Discussion started by: imdadulla
6 Replies
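
Assuming every line in discussion 4 really looks like the sample (a repeated word followed by a number in parentheses), a sketch that prints the first word and the number with the parentheses stripped; the file names are placeholders:

# print field 1 and field 3 with the surrounding parentheses removed
$ awk '{ gsub(/[()]/, "", $3); print $1, $3 }' infile > outfile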

5. UNIX for Advanced & Expert Users

removing duplicates.

Hi All, in UNIX we have a file where we have to remove the duplicates based on one specific column. Can anybody tell me the command? ex: file1: id,name 1,ww 2,qwq 2,asas 3,asa 4,asas 4,asas o/p: 1,ww 2,qwq 3,asa (7 Replies)
Discussion started by: raju4u
7 Replies
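
In discussion 5 the duplicates are defined by the first comma-separated column, and the expected output appears to keep the first occurrence of each id (the sample output looks truncated, so that reading is an assumption). A sketch under that reading:

# keep the first line seen for each value in column 1
$ awk -F',' '!seen[$1]++' file1 > file1.deduped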

6. Shell Programming and Scripting

Removing duplicates

I have a test file with the following 2 columns: Col 1 | Col 2 T1 | 1 <= remove T5 | 1 T4 | 2 T1 | 3 T3 | 3 T4 | 1 <= remove T1 | 2 <= remove T3 ... (7 Replies)
Discussion started by: gctex
7 Replies
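
In discussion 6 the rows marked "<= remove" (T1|1, T4|1, T1|2) are exactly the ones whose Col 2 value is not the largest seen for their Col 1 key, so the intent appears to be "keep only the row with the maximum Col 2 per Col 1". That reading of a truncated sample is a guess, and the sketch below also assumes the "<= remove" markers are annotations rather than part of the data:

# two passes over the same file: the first remembers the largest Col 2
# value per Col 1 key, the second prints only the rows carrying that maximum
$ awk -F' *[|] *' 'NR==FNR { if ($2+0 > max[$1]) max[$1] = $2+0; next }
                   $2+0 == max[$1]' testfile testfile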

7. Emergency UNIX and Linux Support

Removing all the duplicates

I want to remove all the duplicates in a file; I don't want to keep even a single entry of the duplicates. For the input data: 12345|12|34 12345|13|23 3456|12|90 15670|12|13 12345|10|14 3456|12|13 I need the below data in one file: 15670|12|13 and the below data in another file... (9 Replies)
Discussion started by: pandeesh
9 Replies
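
Discussion 7 wants duplicated keys removed entirely, with lines whose key (column 1, going by the sample output) occurs exactly once going to one file and everything else to another. A two-pass awk sketch; the output file names are made up:

# first pass counts how often each key appears, second pass routes the lines
$ awk -F'|' 'NR==FNR { count[$1]++; next }
             { print > (count[$1] == 1 ? "uniques.txt" : "dups.txt") }' input.txt input.txt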

8. UNIX for Dummies Questions & Answers

Removing duplicates from a file

Hi All, I am merging files coming from 2 different systems; while doing that I am getting duplicate entries in the merged file: I,01,000131,764,2,4.00 I,01,000131,765,2,4.00 I,01,000131,772,2,4.00 I,01,000131,773,2,4.00 I,01,000168,762,2,2.00 I,01,000168,763,2,2.00... (5 Replies)
Discussion started by: Sri3001
5 Replies
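
If the duplicates in discussion 8 are whole repeated lines, the merge and the de-duplication can be done in one step. A sketch with placeholder file names for the two feeds:

# merge both feeds and keep a single copy of each repeated line
$ sort -u system1.txt system2.txt > merged.txt

# or, if the original arrival order matters:
$ cat system1.txt system2.txt | awk '!seen[$0]++' > merged.txt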

9. Shell Programming and Scripting

Removing duplicates except the last occurrence

Hi All, I have a file like below: @DB_FCTS\src\Data\Scripts\Delete_CU_OM_BIL_PRT_STMT_TYP.sql @DB_FCTS\src\Data\Scripts\Delete_CDP_BILL_LBL_MSG.sql @DB_FCTS\src\Data\Scripts\Delete_OM_BIDDR.sql @DB_FCTS\src\Data\Scripts\Insert_CU_OM_LBL_MSG.sql... (11 Replies)
Discussion started by: mechvijays
11 Replies
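
Keeping only the last occurrence, as asked in discussion 9, can be done by reversing the file, keeping the first occurrence, and reversing back. tac is the GNU reverse-cat (BSD systems have tail -r instead), and the file name below is just a placeholder:

# keep only the last occurrence of each line
$ tac scripts.lst | awk '!seen[$0]++' | tac > scripts.deduped.lst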

10. Shell Programming and Scripting

Removing duplicates from new file

I have two files, and I want to remove/delete all the duplicate lines in file2, viz. unix, unix2, unix3. (2 Replies)
Discussion started by: sagar_1986
2 Replies
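
Discussion 10 reads as "delete from file2 every line that also appears in file1" (unix, unix2 and unix3 being the shared lines); that interpretation is an assumption given the short description. A sketch under that reading, matching fixed strings against whole lines:

# drop from file2 any line that is present verbatim in file1
$ grep -F -x -v -f file1 file2 > file2.cleaned

# awk equivalent:
$ awk 'NR==FNR { a[$0]; next } !($0 in a)' file1 file2 > file2.cleaned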