Full Discussion: parse files and delete
Post 302150789 by pimentelgg on Wednesday 12th of December 2007, 05:06:08 PM
Create CSV from 'find' results

For the CSV file it should have one entry per line looking like this:

pwd/filename,YYYYMMDDhhmmss,OK

The date stamp should be from the file properties and OK will be a constant.

find . -name 'W*' gives me the proper list of files, and date +%Y%m%d%H%M%S is the proper date format, but I don't know how to combine them properly.
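
For illustration, here is one rough, untested sketch of how the two might be combined (it assumes the timestamp wanted is the file's modification time, and that date supports -r for reading it from a file, as GNU coreutils date does; on other systems stat or perl would be needed instead):

Code:
#!/bin/sh
# Sketch: write one CSV line per matching file, in the form
#   /full/path/to/file,YYYYMMDDhhmmss,OK
# "date -r FILE" (GNU coreutils) prints FILE's modification time.
find "$PWD" -name 'W*' | while IFS= read -r f; do
    ts=$(date -r "$f" +%Y%m%d%H%M%S)
    printf '%s,%s,OK\n' "$f" "$ts"
done > files.csv

Starting find from "$PWD" makes each line begin with the full path, as in the pwd/filename example above.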

Please let me know if you'll be able to help.

Thanks!

Last edited by pimentelgg; 12-19-2007 at 01:27 PM. Reason: better explanation, no responses received for initial question
 

9 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

How to parse 2 files simultaneously

Say I have 2 files of 2 rows of 3 columns each. file1: cat catdata1 catdata2 dog dogdata1 dogdata2 file2: cat catdata3 catdata4 dog dogdata3 dogdata4 and I need to combine both files so that it appears like: cat catdata1 catdata2 catdata3 catdata4 dog dogdata1 dogdata2 dogdata3... (8 Replies)
Discussion started by: Awanka
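For the request in the thread above, join(1) is one possible fit (a sketch only; it assumes both files are sorted on the first column, as in the sample data, and that fields are whitespace-separated):

Code:
# Merge lines that share the same key (column 1) of file1 and file2, e.g.
#   cat catdata1 catdata2 catdata3 catdata4
#   dog dogdata1 dogdata2 dogdata3 dogdata4
join file1 file2 > combined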

2. Shell Programming and Scripting

[bash] Parse files

Hi, I've 2 folders A and B; they have files with the same names but different content. I mean A contains ---------> aa.txt, bb.txt, cc.txt and B contains ---------> aa.txt, bb.txt, cc.txt, but aa.txt in A has different content from aa.txt in B. I'd like to parse the same-named files in... (7 Replies)
Discussion started by: Dedalus

3. Shell Programming and Scripting

Parse and delete lines efficiently

Hi, I have a set of options in the form of key value in a file. I need to find a particular value of 'a' and delete all lines till the next 'a' keyword. Ex: a bbb c ddd e fff g hhh a sss c ggg e xxx f sss a ddd d sss r sss g hhh (5 Replies)
Discussion started by: TDUser
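For the request in the thread above, a short awk sketch (untested; it assumes the block to delete is the one whose 'a' value is sss, and that a block runs from an 'a' line up to, but not including, the next 'a' line):

Code:
# Stop printing at "a sss" and resume at the next line whose key is "a".
awk '$1 == "a" { skip = ($2 == "sss") } !skip' file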

4. Shell Programming and Scripting

Parse a File ColumnWise & Delete the Rows

Hi, I have a CSV file that has around 50K records in it. I have to delete all the duplicate rows from the file except one, depending upon the data in column4. Let's say there are 3 rows for 'column4data' in the file. I have to retain only the line which has the smallest date value in column2.... (3 Replies)
Discussion started by: Sheel
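For the request in the thread above, one possible sketch: sort so that, within each column4 group, the smallest column2 date comes first, then keep only the first row of each group. It assumes comma-separated fields and a column2 date format that orders correctly as plain text (e.g. YYYYMMDD):

Code:
# Keep, for each column4 value, the row with the smallest column2 date.
sort -t, -k4,4 -k2,2 file.csv | awk -F, '!seen[$4]++' > deduped.csv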

5. Shell Programming and Scripting

Parse 2 or more files into one.

Hi, I have a really simple question...I think. I want to be able to parse two or more files into one by reading the first record from each file into a new file, then going back to the first file and reading the second record from each file into the new file, and so on. I am new to using awk and am... (5 Replies)
Discussion started by: qray2011
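For the request in the thread above, if each record is a single line, paste with a newline delimiter interleaves the files record by record (a sketch; the file names are placeholders, and more files can be listed):

Code:
# Record 1 from each file, then record 2 from each file, and so on.
paste -d '\n' file1 file2 > merged.txt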

6. Shell Programming and Scripting

Parse tab delimited file, check condition and delete row

I am fairly new to programming and am trying to resolve this problem. I have a file like this: CHROM POS REF ALT 10_sample.bam 11_sample.bam 12_sample.bam 13_sample.bam 14_sample.bam 15_sample.bam 16_sample.bam tg93 77 T C T T T T T tg93 79 ... (4 Replies)
Discussion started by: empyrean

7. Shell Programming and Scripting

awk to parse huge files

Hello All, I have a situation as below: (1) Read a source file (a single file with 1.2 million rows in it). (2) Read destination files one by one and replace the content (a few fields in it) with the corresponding matching field from the source file. I tried as below: (please note I am not... (4 Replies)
Discussion started by: panyam

8. Shell Programming and Scripting

Script needed to delete the list of files in a directory based on last created & delete them

Hi, My directory structure is as below: dir1, dir2, dir3. I have the list of files to be deleted in the path below. /staging/retain_for_2years/Cleanup/log $ ls -lrt total 0 drwxr-xr-x 2 nobody nobody 256 Mar 01 16:15 01-MAR-2015_SPDBS2 drwxr-xr-x 2 root ... (2 Replies)
Discussion started by: prasadn

9. Shell Programming and Scripting

Parse log files

Hi all, We have a sample log like .... test.log:2015.03.17 06:16:24 >> ABC.generateMethod() MethodAException while processing Request! DataForm: Header --- dtd: template.dtd, titleName: berger, requestId: 1503170032131, documentName: invoice123, hostName: acme.net, userName: userABC... (10 Replies)
Discussion started by: tandrei
clfmerge(1)							     logtools							       clfmerge(1)

NAME
clfmerge - merge Common-Log Format web logs based on time-stamps

SYNOPSIS
clfmerge [--help | -h] [-b size] [-d] [file names]

DESCRIPTION
The clfmerge program is designed to avoid using sort to merge multiple web log files. Web logs for big sites consist of multiple files in the >100M size range from a number of machines. For such files it is not practical to use a program such as gnusort to merge the files because the data is not always entirely in order (so the merge option of gnusort doesn't work so well), but it is not in random order (so doing a complete sort would be a waste). Also the date field that is being sorted on is not particularly easy to specify for gnusort (I have seen it done but it was messy). This program is designed to simply and quickly sort multiple large log files with no need for temporary storage space or overly large buffers in memory (the memory footprint is generally only a few megs).

OVERVIEW
It will take a number (from 0 to n) of file-names on the command line, it will open them for reading and read CLF format web log data from them all. Lines which don't appear to be in CLF format (NB they aren't parsed fully, only minimal parsing to determine the date is performed) will be rejected and displayed on standard-error.

If zero files are specified then there will be no error, it will just silently output nothing; this is for scripts which use the find command to find log files and which can't be counted on to find any log files, it saves doing an extra check in your shell scripts.

If one file is specified then the data will be read into a 1000 line buffer and it will be removed from the buffer (and displayed on standard output) in date order. This is to handle the case of web servers which date entries on the connection time but write them to the log at completion time and thus generate log files that aren't in order (Netscape web server does this - I haven't checked what other web servers do).

If more than one file is specified then a line will be read from each file, and the file that had the earliest time stamp will be read from until it returns a time stamp later than one of the other files. Then the file with the earlier time stamp will be read. With multiple files the buffer size is 1000 lines or 100 * the number of files (whichever is larger). When the buffer becomes full the first line will be removed and displayed on standard output.

OPTIONS
-b buffer-size
       Specify the buffer-size to use; if 0 is specified then it means to disable the sliding-window sorting of the data, which improves the speed.

-d     Set domain-name mangling to on. This means that if a line starts with www.company.com as the name of the site that was requested, then that would be removed from the start of the line and the GET / would be changed to GET http://www.company.com/, which allows programs like Webalizer to produce good graphs for large hosting sites. Also it will make the domain name lower case.

EXIT STATUS
0      No errors
1      Bad parameters
2      Can't open one of the specified files
3      Can't write to output

AUTHOR
This program, its manual page, and the Debian package were written by Russell Coker <russell@coker.com.au>.

SEE ALSO
clfsplit(1), clfdomainsplit(1)
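
A hypothetical invocation based on the synopsis above (the log file names are placeholders): merge the access logs of two front-end servers into one time-ordered stream, with domain-name mangling enabled.

Code:
# clfmerge writes the merged log to standard output; redirect to keep it.
clfmerge -d web1/access.log web2/access.log > merged.log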