Special Forums / UNIX and Linux Applications / grep file to find unique instances of username
Post 302487856 by methyl on Thursday 13th of January 2011, 08:32:32 PM
Or:
Code:
awk -F '/' '{print $2}' SystemOut.log | sort | uniq > /tmp/uniq_users.txt
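On most modern systems, sort -u collapses duplicates in a single step, so the pipeline above can be shortened (a minor variant, not from the original thread; it assumes the same SystemOut.log layout):
Code:
awk -F '/' '{print $2}' SystemOut.log | sort -u > /tmp/uniq_users.txt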

 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Find, make and move file based on username

Hi there, I'm new to UNIX (just 3 months in). I found that my new box contains a lot of files and directories in /home/box/. I've tried searching for scripts in this forum and found many of them, but I don't know how to combine them into one script, even using pipes. My tasks are: 1) to scan user... (5 Replies)
Discussion started by: Helmi
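A minimal sketch of the kind of command such a task usually starts from, assuming the goal is to locate files under /home/box owned by a particular user (the username helmi is hypothetical):
Code:
u=helmi                          # hypothetical username
find /home/box -user "$u" -type f -print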

2. Shell Programming and Scripting

How to replace all string instances found by find+grep

Hello all, I'm performing a find + grep operation that looks like this: find . -name "*.dsp" | xargs grep -on Project.lib | grep -v ':0' and I'd like to add to this one-liner the ability to replace the string "Project.lib", which is found (more than once per file), with "Example.lib". How can I do... (0 Replies)
Discussion started by: umen
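A hedged sketch of the in-place replacement being asked about, assuming GNU sed (the -i flag is a GNU extension; BSD sed wants -i ''):
Code:
find . -name "*.dsp" -exec sed -i 's/Project\.lib/Example.lib/g' {} +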

3. Shell Programming and Scripting

To find the username in /etc/passwd file

Hi, I need a shell script to list only the usernames in the /etc/passwd file. Regards, Siva (7 Replies)
Discussion started by: gsiva
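Since /etc/passwd is colon-delimited with the username in the first field, the usual answer is a one-liner (a sketch, not necessarily the thread's accepted reply):
Code:
cut -d: -f1 /etc/passwd
# or equivalently:
awk -F: '{print $1}' /etc/passwd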

4. Shell Programming and Scripting

Grep to find single instances of each ERROR type

I have a file that contains multiple instances of the same ERROR. Below is the content of the file ERROR_FILE.txt: Archiver6.log:2009-05-25 17:58:44,385 ERROR - CleanLPDataMessage: Missing Intervals: 2 Archiver6.log:2009-05-25 18:27:36,056 ERROR - CleanLPDataMessage: Missing Intervals: 5... (5 Replies)
Discussion started by: ali560045
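One common approach (a sketch, assuming each line starts with "file:date time" fields and may end with a varying count): strip the changing parts, then de-duplicate what remains:
Code:
# Drop the leading file/date and time fields and any trailing number,
# then keep one instance of each remaining error message.
cut -d' ' -f3- ERROR_FILE.txt | sed 's/ [0-9]*$//' | sort -u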

5. Shell Programming and Scripting

Grep with multiple instances of same pattern

Hi, this is the text file I'm trying to grep: Apple Location Greenland Rdsds dsds fdfd ddsads http Received Return Immediately Received End. My grep command: grep --only-matching 'Location.*Received' Because the keyword Received appears twice, the grep command will stop at the last... (0 Replies)
Discussion started by: spywarebox

6. Shell Programming and Scripting

Grep with multiple instances of same pattern

Hi, this is the text file I'm trying to grep: Apple Location Greenland Rdsds dsds fdfd ddsads http Received Return Immediately Received End. My grep command: grep --only-matching 'Location.*Received' Because the keyword Received appears twice, the grep command will stop at the last... (3 Replies)
Discussion started by: spywarebox
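The greedy .* matches up to the last Received. A sketch of the usual fix, assuming GNU grep built with PCRE support (-P) and both keywords on one line; the lazy quantifier .*? stops at the first Received instead:
Code:
grep -oP 'Location.*?Received' file.txt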

7. UNIX for Dummies Questions & Answers

Find all the unique file extensions

Hi, how can I find the unique list of file extensions in a folder and its subfolders? E.g. MAIN/ contains a.txt b.txt a.clas a.java b.class a.txt.112 c.12.ram.jar, and I just need to get the output below, irrespective of whether a file is in the folder or a subfolder: txt clas java (5 Replies)
Discussion started by: reldb
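A minimal sketch that treats everything after the last dot as the extension (so a.txt.112 yields 112; adjust the sed pattern if that is not wanted):
Code:
find MAIN/ -type f -name '*.*' | sed 's/.*\.//' | sort -u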

8. Shell Programming and Scripting

List unique values and count instances in .csv file

I need to take the second column of a .csv file and count the number of instances of each unique value in that same second column. I'd like the output to be value,count sorted by most instances. Thanks for any guidance! Data example: 317476,317756,0 816063,318861,0 313123,319091,0... (4 Replies)
Discussion started by: batcho
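A classic pipeline for this (a sketch; file.csv stands in for the real file name): extract column two, count each distinct value, sort by count, and rewrite as value,count:
Code:
cut -d, -f2 file.csv | sort | uniq -c | sort -rn | awk '{print $2","$1}'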

9. UNIX for Dummies Questions & Answers

Grep to find matching pattern and return unique values

Request: grep to find a given matching pattern and return unique values, eliminating the duplicates. I have to retrieve the unique folders from file contents like: /app/oracle/build_lib/pkg320.0_20120927 /app/oracle/build_lib/pkg320.0_20121004_prof... (5 Replies)
Discussion started by: Siva SQL
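The generic shape of the answer (a sketch; the path pattern and input.txt are illustrative): grep out the component of interest, then de-duplicate:
Code:
grep -o '/app/oracle/build_lib/[^/]*' input.txt | sort -u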

10. Shell Programming and Scripting

Using grep and a parameter file to return unique values

Hello Everyone! I have updated the first post so that my intentions are easier to understand, and also attached sample files (post #18). I have over 500 text files in a directory. Over 1 GB of data. The data in those files is organised in lines: My intention is to return one line per... (23 Replies)
Discussion started by: clippertm
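For many patterns against many large files, the usual shape is grep with a pattern file plus sort -u (a sketch with hypothetical file names; -F treats the patterns as fixed strings, -h suppresses file-name prefixes):
Code:
grep -hFf patterns.txt *.txt | sort -u > matches.txt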
JOIN(1)                          General Commands Manual                         JOIN(1)

NAME
       join - relational database operator

SYNOPSIS
       join [ options ] file1 file2

DESCRIPTION
       Join forms, on the standard output, a join of the two relations specified by
       the lines of file1 and file2. If file1 is `-', the standard input is used.

       File1 and file2 must be sorted in increasing ASCII collating sequence on the
       fields on which they are to be joined, normally the first in each line.

       There is one line in the output for each pair of lines in file1 and file2
       that have identical join fields. The output line normally consists of the
       common field, then the rest of the line from file1, then the rest of the
       line from file2.

       Fields are normally separated by blank, tab or newline. In this case,
       multiple separators count as one, and leading separators are discarded.

       These options are recognized:

       -an     In addition to the normal output, produce a line for each unpairable
               line in file n, where n is 1 or 2.

       -e s    Replace empty output fields by string s.

       -jn m   Join on the mth field of file n. If n is missing, use the mth field
               in each file.

       -o list Each output line comprises the fields specified in list, each
               element of which has the form n.m, where n is a file number and m is
               a field number.

       -tc     Use character c as a separator (tab character). Every appearance of
               c in a line is significant.

SEE ALSO
       sort(1), comm(1), awk(1)

BUGS
       With default field separation, the collating sequence is that of sort -b;
       with -t, the sequence is that of a plain sort. The conventions of join,
       sort, comm, uniq, look and awk(1) are wildly incongruous.

7th Edition                          April 29, 1985                              JOIN(1)