How to identify varying unique fields values from a text file in UNIX?
Post 302992590 by joeyg on Monday 27th of February 2017 02:53:15 PM
Can you supply some examples so we can better understand?
Not 150-byte lines and not 100k lines, but something small enough to give an idea of your goal?

There are commands such as
sort -u
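
A minimal sketch of how sort -u (or awk) could list the distinct values of one field, assuming a whitespace-delimited file and that the column of interest is the third one (the field number and file name are placeholders):

# distinct values of field 3, sorted
awk '{ print $3 }' file.txt | sort -u

# same idea in a single awk pass, keeping the order of first appearance
awk '!seen[$3]++ { print $3 }' file.txt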
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Append tabs at the end of each line in NAWK -- varying fields

Hi, I need some help with appending tabs at the end of each line... The data looks something like this: field1, field2, field3, field4 1 2 3 4 5 I have values in field1 and field2 in the first row and I would like to append tabs for field3 and field4 in the first row... and in... (6 Replies)
Discussion started by: madhunk
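
A rough sketch of one nawk approach to the padding described above, assuming tab-separated data that should always end up with 4 fields (the field count and file name are assumptions):

# pad every record out to 4 tab-separated fields; assigning the missing
# fields makes nawk rebuild the line with OFS (tabs) between them
nawk 'BEGIN { FS = OFS = "\t" } { for (i = NF + 1; i <= 4; i++) $i = ""; print }' data.txt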

2. Shell Programming and Scripting

Extracting records with unique fields from a fixed width txt file

Greetings, I would like to extract records from a fixed width text file that have unique field elements. Data is structured like this: John A Smith NY Mary C Jones WA Adam J Clark PA Mary Jones WA Fieldname / start-end position Firstname 1-10... (8 Replies)
Discussion started by: sitney
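
A common two-pass awk pattern for this kind of filter, with illustrative column positions (only Firstname 1-10 is given above, so the second offset and the file name are hypothetical):

# pass 1 counts each key, pass 2 prints only records whose key occurs exactly once
awk '{ key = substr($0, 1, 10) substr($0, 31, 2) }   # cols 1-10 = firstname, 31-32 = state (assumed)
     NR == FNR { count[key]++; next }
     count[key] == 1' people.txt people.txt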

3. Shell Programming and Scripting

Parse apart strings of comma separated data with varying number of fields

I have a situation where I am reading a text file line by line. Those lines contain comma-separated fields of data; however, each line can vary in the number of fields it contains. What I need to do is parse apart each line and write each field of data found (left to right) into a file.... (7 Replies)
Discussion started by: 2reperry
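
If each field simply needs to land on its own line of an output file, a sketch like this would do it (input.txt and fields.out are placeholder names):

# one field per output line, however many fields a record has
awk -F',' '{ for (i = 1; i <= NF; i++) print $i }' input.txt > fields.out

# or, when no per-field logic is needed, plain tr is enough
tr ',' '\n' < input.txt > fields.out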

4. Shell Programming and Scripting

Getting Unique values in a file

Hi, I have a file like this: Some_String_Here 123 123 123 321 321 321 3432 3221 557 886 321 321 I would like to find only the unique values in the file and get the following output: Some_String_Here 123 321 3432 3221 557 886 I am trying to get this done using awk. Can someone please... (5 Replies)
Discussion started by: Legend986
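
The classic awk one-liner for this prints only the first occurrence of each line and preserves input order, which matches the output asked for above:

awk '!seen[$0]++' file.txt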

5. Shell Programming and Scripting

comparing 2 text files to get unique values??

Hi all, I have a problem comparing 2 text files: the result should contain only the unique (non-repeated) values. For example: file1.txt 1 2 3 4 file2.txt 2 3 So after comparing the above 2 files I should get only 1 and 4 as the output. Please help me out. (7 Replies)
Discussion started by: smarty86
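
Assuming one value per line, either of these prints only the values that occur in exactly one of the two files (1 and 4 for the sample above):

# lines that appear exactly once across both files
sort file1.txt file2.txt | uniq -u

# comm needs sorted input; -3 suppresses the lines common to both files
comm -3 file1.txt file2.txt | tr -d '\t'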

6. Shell Programming and Scripting

Getting required fields from a text file in UNIX

My data is something like what is shown below. Out of this I want the details of the alarms (e.g. 1947147711, 1947147081, ...) and the fields (e.g. sw=tacmwafabb9:shelf=1:slot=5-2:pport=2). Once I have these details separated, I want the count of them excluding duplicates. What is the best possible way... (7 Replies)
Discussion started by: rdhanek
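
A hedged sketch of one way to count the distinct field strings, assuming a grep that supports -Eo (GNU/BSD) and that each field starts with sw= and runs up to the next space (the pattern and the file name are guesses based on the sample):

# pull out each sw=...:pport=... token, de-duplicate, and count
grep -Eo 'sw=[^ ]+' alarms.txt | sort -u | wc -l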

7. Shell Programming and Scripting

Identify high values "ÿ" in a text file using Unix command

I have high values (such as ÿÿÿÿ) in a text file on a UNIX AIX server. I need to identify all the records which have these high values and, if possible, also get the position/column number within the record structure. Is there any UNIX command by which this can be done to: 1.... (5 Replies)
Discussion started by: devina
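
One way to locate the 0xFF byte that ÿ usually represents, sketched with a portable printf escape (the second command assumes gawk; file.txt is a placeholder):

# line numbers of records containing the 0xFF byte
LC_ALL=C grep -n "$(printf '\377')" file.txt

# line number and character position of the first 0xFF on each matching line
LC_ALL=C gawk -v ff="$(printf '\377')" '{ p = index($0, ff); if (p) print NR, p }' file.txt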

8. Shell Programming and Scripting

Compare multiple files, identify common records and combine unique values into one file

Good morning all, I have a problem that is one step beyond a standard awk compare. I would like to compare three files which have several thousand records against a fourth file. All of them have a value in each row that is identical, and one value in each of those rows which may be duplicated... (1 Reply)
Discussion started by: nashton
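
Only the general shape of such a comparison can be sketched here, since the post does not show its data; the key field and file names below are assumptions:

# load the shared key (field 1, assumed) from the reference file, then print
# the rows of the other three files that carry a matching key
awk 'NR == FNR { ref[$1]; next } $1 in ref { print FILENAME ": " $0 }' ref.txt a.txt b.txt c.txt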

9. UNIX for Dummies Questions & Answers

Unique values in a row sum the next column in UNIX

Hi, I would like to ask for your advice regarding my problem. I have this kind of data in file.txt 111111111,20 111111111,50 222222222,70 333333333,40 444444444,10 444444444,20 I need to get this file1.txt 111111111,70 222222222,70 333333333,40 444444444,30 using this code I can... (6 Replies)
Discussion started by: reks
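
For the sample shown, a short awk pass that totals the second column per key produces the requested file1.txt (the order of awk's array traversal is unspecified, hence the sort):

awk -F',' '{ sum[$1] += $2 } END { for (k in sum) print k "," sum[k] }' file.txt | sort -t',' -k1,1 > file1.txt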

10. Shell Programming and Scripting

Print line if values in fields matches number and text

datafile: 2017-03-24 10:26:22.098566|5|'No Route for Sndr:RETEK RMS 00040 /ZZ Appl:PF Func:PD Txn:832 Group Cntr:None ISA CntlNr:None Ver:003050 '|'2'|'PFI'|'-'|'EAI_ED_DeleteAll'|'EAI_ED'|NULL|NULL|NULL|139050594|ActivityLog| 2017-03-27 02:50:02.028706|5|'No Route for... (7 Replies)
Discussion started by: SkySmart
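
If the intent is, for example, to keep records whose second pipe-delimited field is 5 and whose third field mentions a given phrase, an awk filter along these lines would work (the exact fields and pattern are assumptions, since the post is truncated):

awk -F'|' '$2 == 5 && $3 ~ /No Route/' datafile
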
SORT(1)                          General Commands Manual                          SORT(1)

NAME
     sort - sort a file of ASCII lines

SYNOPSIS
     sort [-bcdfimnru] [-tc] [-o name] [+pos1] [-pos2] file ...

OPTIONS
     -b      Skip leading blanks when making comparisons
     -c      Check to see if a file is sorted
     -d      Dictionary order: ignore punctuation
     -f      Fold upper case onto lower case
     -i      Ignore non-ASCII characters
     -m      Merge presorted files
     -n      Numeric sort order
     -o      Next argument is output file
     -r      Reverse the sort order
     -t      Following character is field separator
     -u      Unique mode (delete duplicate lines)

EXAMPLES
     sort -nr file        # Sort keys numerically, reversed
     sort +2 -4 file      # Sort using fields 2 and 3 as key
     sort +2 -t: -o out   # Field separator is :
     sort +.3 -.6         # Characters 3 through 5 form the key

DESCRIPTION
     Sort sorts one or more files. If no files are specified, stdin is sorted. Output is written on standard output, unless -o is
     specified. The options +pos1 -pos2 use only fields pos1 up to but not including pos2 as the sort key, where a field is a
     string of characters delimited by spaces and tabs, unless a different field delimiter is specified with -t. Both pos1 and
     pos2 have the form m.n where m tells the number of fields and n tells the number of characters. Either m or n may be omitted.

SEE ALSO
     comm(1), grep(1), uniq(1).