Shell Programming and Scripting: In Perl script, need to read the data from one file and generate multiple files based on the data
Posted by Sanjeev G on Tuesday, 13th of March 2018, 11:42:20 AM (post 303014485)
Take backup of old logs and generate new logs

Hi Aia,

Thanks for the reply. I appreciate your help with the issue below.

I am using the code below. I don't want the output appended to the existing logs when I run the Perl script a second time; I just want to back up the old logs with today's date and generate new ones. What do I need to change in the script below?

1) If a log file already exists, the script should move it aside with today's date as an extension and then generate a new one.
2) Only the first log file comes out correctly; for the remaining log files, the "Extract" header and the contents of input.txt are not added to the output file.

(Rough sketches of what I have in mind for both points are included below, after the script and the sample output.)

Script:

Code:
my $last_name = '';
my $writeout;

while (<>) {
    # Lines look like "#ext1#test1.tale2 drop": $1 is the log name, $2 is the payload.
    if (/^#(\w+)#(.+)$/) {
        # Switch output to "$1.log" (append mode) whenever the log name changes.
        if ($last_name ne $1) {
            close $writeout if $writeout;
            open($writeout, '>>', "$1.log");
        }
        $last_name = $1;

        # $. == 1 is true only while the very first input line is being read,
        # so this block runs at most once per run.
        if ($. == 1) {
            print $writeout "Extract $1 \n";
            my $filename = "input.txt";
            open(my $ip, "<", $filename) || die("Can't open file input.txt");
            while (<$ip>) {
                next if (/^$/);
                print $writeout "$_";
            }
            close $ip;
        }

        print $writeout "$2\n" if $writeout;
    }
}
close $writeout or die;
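
For point 2, this is roughly what I have in mind, as a rough and not fully tested sketch: write the "Extract" header and the copy of input.txt at the moment each log file is first opened, instead of only while the very first input line is read ($. == 1 is true only once per run). The %header_done hash is my own addition and not part of the original script.

Code:
#!/usr/bin/perl
use strict;
use warnings;

my $last_name = '';
my $writeout;
my %header_done;    # which logs already received the header during this run

while (<>) {
    next unless /^#(\w+)#(.+)$/;
    my ($name, $rest) = ($1, $2);

    if ($last_name ne $name) {
        close $writeout if $writeout;
        open($writeout, '>>', "$name.log") or die "Cannot open $name.log: $!";

        unless ($header_done{$name}++) {
            # Write the header and a copy of input.txt once per log file.
            print $writeout "Extract $name\n";
            open(my $ip, '<', 'input.txt') or die "Can't open file input.txt: $!";
            while (my $line = <$ip>) {
                next if $line =~ /^$/;    # skip empty lines, as in the original
                print $writeout $line;
            }
            close $ip;
        }
        $last_name = $name;
    }
    print $writeout "$rest\n";
}
close $writeout if $writeout;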

Input files:
Code:
more sanj.txt

#ext1#test1.tale2 drop
#ext1#test11.tale21 drop
#ext1#test123.tale21 drop
#ext2#test1.tale21 drop
#ext2#test12.tale21 drop
#ext3#test11.tale21 drop
#ext3#test123.tale21 drop
#ext4#test1.tale21 drop
#ext4#test124.tale21 drop
#ext1#test1.tale2 drop




Code:
more input.txt

1.1.1.1
2.2.2.2


Code:
ls ext[0-9]*.log | while read f; do printf "file: $f\n--------------\n";cat $f; echo; done


file: ext1.log
--------------
Extract ext1
1.1.1.1
2.2.2.2
test1.tale2 drop
test11.tale21 drop
test123.tale21 drop
test1.tale2 drop

file: ext2.log
--------------
test1.tale21 drop
test12.tale21 drop

file: ext3.log
--------------
test11.tale21 drop
test123.tale21 drop

file: ext4.log
--------------
test1.tale21 drop
test124.tale21 drop

Thanks,
G sanjeev Kumar

Last edited by Sanjeev G; 03-14-2018 at 02:24 AM. Reason: adding in corrected format.
 
