Splitting files using awk and reading filename value from input data

 
# 1  
Old 06-23-2016

I have a process that requires me to read data from huge log files and find the most recent entry on a per-user basis. The number of users may fluctuate wildly month to month, so I can't hard-code names or a set number of variables to capture the data, and the files are large, so I don't want to read them several times.

The entries of interest contain a particular string, so I can extract just those from the overall log file, and I have a way to split the output into separate files on a per-user basis. My plan is then to read the last line of each file created with tail -1, with the filename giving me the user account in question.

My boss, however, worries that a false-positive match for my expression (by chance or by malice) might try to overwrite a critical file.


My data has a syslog-type date in it, which means doing a sort -u is proving tricky too. I've got this far with splitting the data out to files under /tmp/logs as splitlog.rbatte1 or similar, but if field 11 were ever */../../etc/passwd then potentially I would be in trouble.

The date is the first three fields and 'as far as I am aware' a valid user name would be in field 11, but ........

A simplified part of the code would be:-
Code:
grep "Active transaction started" /var/log/qapplog | awk "{print \$1, \$2, \$3, \$11> \"/tmp/logs/splitlog.\"\$11}"
for userfile in /tmp/logs/splitlog.*
do
   lastrecord=$(tail -1 "$userfile")
   printf "User %s last record is %s\n" "$userfile" "$lastrecord"
   .... whatever else here ....
done

I have considered adding tr -d "\/" to strip out the characters, but now that it's been raised, I'm concerned that there may be other things I'm not considering.

Is there a better way to work here, perhaps with awk getting the equivalent of basename "$11", or variable substitution in the shell such as "${11##*/}"?
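Something along these lines is what I had in mind, though I haven't tested either against the real log, so treat them purely as sketches:
Code:
# awk: strip anything up to the last "/" from field 11 before using it as a file name
awk '/Active transaction started/ {u = $11; sub(/.*\//, "", u); print $1, $2, $3, u > ("/tmp/logs/splitlog." u)}' /var/log/qapplog

# shell: a positional parameter past $9 needs the braces, e.g. after a "set -- $record"
# user=${11##*/}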


Any suggestions welcome. Perhaps there is a better design overall for finding the last entry on a per-user basis. The log is thankfully written in time order, so the last entry in the file for each user name is already the last by time.

Kind regards,
Robin
# 2  
Old 06-23-2016
How about posting a few lines from the input file? Some with, and some without the string of interest.

Not sure if I understood the requirement correctly, but, taking my syslog for a sample, wouldn't this fulfill your needs (redirect printout to a "user" file in /tmp/log if need be):
Code:
awk '{split ($5, U, /[[]/); T[U[1]] = $1 " " $2 " " $3} END {for (t in T) print t, T[t]}' /var/log/syslog
systemd-tmpfiles Jun 23 08:47:54
anacron Jun 23 08:46:57
dhclient: Jun 23 18:28:50
dbus Jun 23 12:03:34
CRON Jun 23 21:17:01
kernel: Jun 23 21:18:13
mtp-probe: Jun 23 11:40:12

# 3  
Old 06-25-2016
I think you've got it just right, but it will take me a bit to decipher your code!


Many thanks,
Robin
# 4  
Old 06-25-2016
Just add your "Active transaction started" as an awk pattern, and it should fly, shouldn't it? You can even refine the pattern to fulfill your boss' proclivities.
# 5  
Old 06-25-2016
Quote:
Originally Posted by RudiC
How about posting a few lines from the input file? Some with, and some without the string of interest.

Not sure if I understood the requirement correctly, but, taking my syslog for a sample, wouldn't this fulfill your needs (redirect printout to a "user" file in /tmp/log if need be):
Code:
awk '{split ($5, U, /[[]/); T[U[1]] = $1 " " $2 " " $3} END {for (t in T) print t, T[t]}' /var/log/syslog
systemd-tmpfiles Jun 23 08:47:54
anacron Jun 23 08:46:57
dhclient: Jun 23 18:28:50
dbus Jun 23 12:03:34
CRON Jun 23 21:17:01
kernel: Jun 23 21:18:13
mtp-probe: Jun 23 11:40:12

Quote:
Originally Posted by rbatte1
I think you've got it just right, but it will take me a bit to decipher your code!


Many thanks,
Robin
Hello, rbatte1.

Without a representative input or output example from you, I do think RudiC has shown you a valid indication of what is possible. Definitely, you do not need to grep for it.
If it is of any help in deciphering what has been shown, allow me to break it into pieces.
Essentially, there are two pieces that process each line, based on the format of /var/log/syslog:
Code:
split ($5, U, /[[]/);       # divides field 5 into smaller parts ("[" is the separator) and captures the pieces in an array named U
T[U[1]] = $1 " " $2 " " $3  # stores fields 1-3 (the date) in array T, indexed by the first piece of that split

Once all lines have been processed, display the result:
Code:
END {for (t in T) print t, T[t]} # iterate over the index and display what's recorded.

Since this is done for each line, you'll get multiple hits.
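To make the split() concrete, here is a tiny made-up example (the syslog-style line is invented purely for illustration):
Code:
echo 'Jun 23 08:47:54 myhost sshd[1234]: Accepted password for bob' |
awk '{split ($5, U, /[[]/); print U[1]}'
# prints: sshd -- everything in field 5 up to the first "["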
If you want to narrow it down to a particular criterion, let's say "Active transaction started":
Code:
awk '/Active transaction started/ {split ($5, U, /[[]/); T[U[1]] = $1 " " $2 " " $3} END {for (t in T) print t, T[t]}' /var/log/syslog

This limits the index to only those lines that contain the match.

If you were to post a representative example, I am quite sure we could provide other alternatives, as well.
Also, you might be interested in an open source project named ELK, which does an amazing job of dealing with all sorts of information in logs.

# 6  
Old 06-27-2016
A compromise, perhaps, bolting several things together.

From the suggestions given and further digging, I've ended up with:-
Code:
awk "/Active transaction started/ && /${userregex}/ {print \$1, \$2, \$3 \$11 \
> \"/tmp/logs/splitlog.\"gensub(/[^[:alnum:]._-]/, \"\", \"g\", \$11)}" /var/log/qapplog

for userfile in /tmp/logs/splitlog.*
do
   lastrecord=$(tail -1 "$userfile")
   printf "User %s last record is %s\n" "$userfile" "$lastrecord"
   .... whatever else here ....
done

To explain it a bit more:-
  • The ${userregex} is a regular expression for all the users we are interested in, so we can exclude testing messages which sadly get written to the same log.

  • The text output is just for checking we've got it; there would be further processing (and that's all fine). The goal is to find the last logged entry for each user, so the printf line is just for debugging.

  • We replaced the grep with the two awk expressions (both must be satisfied); see this thread.

  • The output file(s) are redirected to /tmp/logs/splitlog. with the result of the gensub on field 11 appended. The gensub removes all characters from field 11 that are not alphanumeric, full-stop (period in American English), underscore or hyphen, those all being acceptable characters to build a filename from and sensibly allowed in user account names. We could possibly have allowed apostrophes too, but these have been excluded.
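
A tiny illustration of what the gensub does to a hostile value (gensub is GNU awk specific, which is what we are running here):
Code:
echo '*/../../etc/passwd' | awk '{print gensub(/[^[:alnum:]._-]/, "", "g", $1)}'
# prints: ....etcpasswd -- no slashes or asterisks survive, so no path can be built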

It seems to work for me in testing, but I'd appreciate another few sets of eyes to validate I'm not doing something daft and leaving a gaping hole somewhere.


Robin
# 7  
Old 06-27-2016
You won't get a user name from your for loop but a full path to a splitlog.<modified user name> file, and you run multiple processes to get at the last entries.

I'm not sure I understand why you abstain from packing as much processing as possible into the awk script, piping its output (= last entry per user) into a while loop.

And maybe the "further processing" could, to a large extent, be included in the awk as well?
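
A rough sketch of that idea (untested; it borrows the "Active transaction started" marker, field 11 and the gensub from your earlier posts):
Code:
awk '/Active transaction started/ {
        u = gensub(/[^[:alnum:]._-]/, "", "g", $11)   # sanitised user name
        last[u] = $1 " " $2 " " $3                    # overwrite, so only the latest record per user survives
     }
     END {for (u in last) print u, last[u]}' /var/log/qapplog |
while read -r user lastrecord
do
    printf "User %s last record is %s\n" "$user" "$lastrecord"
    # .... whatever else here ....
done

No temporary files, no per-file tail processes, and what you print is the sanitised user name rather than a path.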
