Full Discussion: Parsing a large log
Post 302199984 by era on Wednesday 28th of May 2008 07:43:20 AM
And show us the regex you are using. A simple grep would take a few seconds at most unless your disks are very slow; it reads the file linearly, so you can't get much faster performance than that. If cat is too slow, there really isn't much hope of making this fast enough, other than replacing the disk or managing the file in a different way (split it into smaller chunks? import it into a DBMS?).
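A quick way to check the I/O-bound point above, sketched with a hypothetical file name (big.log), a hypothetical pattern (ERROR), and an arbitrary 500 MB chunk size; if the plain read and the grep take about the same time, the regex is not the bottleneck:

Code:
# Baseline: how long does a plain linear read of the whole log take?
time cat big.log > /dev/null

# The actual search; if this is close to the cat time, grep is already
# disk-bound and a cleverer regex will not help much.
time grep -c 'ERROR' big.log

# One way to "manage the file in a different way": break it into
# 500 MB pieces so later searches can target a narrower slice.
split -b 500m big.log big.log.part.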
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Splitting a large log file

Okay, absolute newbie here... I'm on a Mac trying to split an almost 2 Gig log file on a Unix box into manageable chunks for my web-based log analysis tool. What do I need to do, what programs do I need to do it? All and any help appreciated/needed :-) Cheers (8 Replies)
Discussion started by: simmonet
8 Replies
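Not the thread's answer, just a minimal sketch of one way to chunk a log with the standard split(1); the file name access.log and the 100,000-line chunk size are assumptions to adjust for whatever the analysis tool can swallow:

Code:
# Break the big log into 100,000-line pieces named access_part_aa,
# access_part_ab, ... in the current directory.
split -l 100000 access.log access_part_
ls access_part_*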

2. Shell Programming and Scripting

Problem with parsing a large file

Hi all, following is the sample file and following is the output desired; that is, the last entry of each unique first field is required. My solution is as follows. However, the original file has around a million entries and around 100,000 unique first fields, so this soln.... (6 Replies)
Discussion started by: gauravgoel
6 Replies
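One common single-pass approach to this kind of "keep the last record per key" job, sketched with a hypothetical file name (data.txt) and assuming whitespace-separated fields:

Code:
# Remember the last full line seen for each unique first field, then
# print the survivors. One pass, so it copes with millions of lines;
# note the output order is not the input order.
awk '{ last[$1] = $0 } END { for (k in last) print last[k] }' data.txt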

3. Shell Programming and Scripting

Cutting a large log file in to smaller ones

I have a very large (150 megs) IRC log file from 2000-2001 which I want to cut down to individual daily log files. I have a very basic knowledge of the cat, sed and grep commands. The log file is time stamped and each day in the large log file begins with a "Session Start" string like so: ... (11 Replies)
Discussion started by: MrTangent
11 Replies
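A minimal sketch of the "new file per session" idea, assuming each day in the log really does begin with a line starting "Session Start"; irc.log and the day_001.log output names are placeholders:

Code:
# Open a fresh output file each time a "Session Start" line appears;
# anything before the first such line is discarded.
awk '/^Session Start/ { if (out) close(out); out = sprintf("day_%03d.log", ++n) }
     out { print > out }' irc.log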

4. Shell Programming and Scripting

parsing large CDR XML file

Dear friend, in the file attached, how can I parse the data to be like a normal table? :D (3 Replies)
Discussion started by: saifsafaa
3 Replies

5. Red Hat

Help for capturing a URL from a line from large log file

Can someone please help me: how do I find a URL in the lines of a log file and write all the output to a new file? For example, the log file has entries similar to: 39.155.67.5 - - "GET /abc/login?service=http://161.120.36.39/CORPHR/TMA2007/default.asp HTTP/1.1" 401 3218 54.155.63.9 - - "GET... (2 Replies)
Discussion started by: rockf1bull
2 Replies
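A sketch of one way to do it with GNU grep (present on Red Hat); access.log and urls.txt are placeholder names:

Code:
# -Eo prints only the part of each line that matches, so this pulls out
# every http:// URL and writes the list to a separate file.
grep -Eo 'http://[^" ]+' access.log > urls.txt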

6. Shell Programming and Scripting

Help needed for parsing large XML with awk.

My XML structure looks like: <?xml version="1.0" encoding="UTF-8"?> <SearchRepository> <SearchItems> <SearchItem> ... </SearchItem> <SearchItem> ... ... (1 Reply)
Discussion started by: jasonjustice
1 Replies
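Awk is line-oriented, so one hedged approach is to make each SearchItem block a single record by using the closing tag as the record separator (multi-character RS needs gawk); repository.xml is a placeholder name, and a real XML parser is the safer choice for anything complicated:

Code:
# Count the SearchItem blocks; each record now holds one whole item,
# so further field extraction can be done per record.
gawk 'BEGIN { RS = "</SearchItem>" } /<SearchItem>/ { count++ }
      END { print count, "items" }' repository.xml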

7. UNIX for Dummies Questions & Answers

I need to isolate a date in a large log file

I wrote head -n1 example.log to grab the first line of the log, but I need to isolate just the date, which is 08/May/2012:09:52:52. I also need to do the reverse of this, which would be tail... http://i.imgur.com/Lp1eBD0.png Thanks in advance (4 Replies)
Discussion started by: spookydll
4 Replies
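A sketch using the example.log name from the question; the regex assumes the Apache-style dd/Mon/yyyy:hh:mm:ss stamp shown:

Code:
# First and last timestamps in the log; -Eo keeps only the matching part.
head -n 1 example.log | grep -Eo '[0-9]{2}/[A-Za-z]{3}/[0-9]{4}(:[0-9]{2}){3}'
tail -n 1 example.log | grep -Eo '[0-9]{2}/[A-Za-z]{3}/[0-9]{4}(:[0-9]{2}){3}'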

8. Shell Programming and Scripting

Log parsing

I have a directory with daily logs that have records like this: Date: 04/17/13 Time: 09:29:15 IP: 123.123.123.123 URL: usr/local/file1 and I want to only count how many times each file was accessed (e.g. file1 in that example above), and I want to also look in all the logs in the current... (3 Replies)
Discussion started by: Jaymz
3 Replies
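A sketch that only assumes each access record carries a literal "URL:" label followed by the path; the *.log glob stands in for however the daily logs are actually named:

Code:
# Tally how often each URL appears across every log in the directory,
# then list them busiest-first.
awk '{ for (i = 1; i < NF; i++) if ($i == "URL:") count[$(i+1)]++ }
     END { for (u in count) print count[u], u }' *.log | sort -rn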

9. Shell Programming and Scripting

Parsing large files in Solaris 11

I have a 1.2G file that contains no newline characters. This is essentially a log file with each entry being exactly 78 bits long. The basic format is /DATE/USER/MISC/. The single uniform thing about the file is that the 8th character is always ":". I worked with smaller files of the same... (8 Replies)
Discussion started by: os2mac
8 Replies
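A sketch that reads the "78 bits" in the post as 78 bytes per record (an assumption) and uses a placeholder file name; once newlines are restored, the usual line tools apply:

Code:
# Insert a newline after every 78 bytes so each record becomes a line.
fold -b -w 78 huge.log > huge_records.log
# Sanity check: the 8th character of every record should be ":".
cut -c8 huge_records.log | sort -u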

10. Shell Programming and Scripting

Parsing a subset of data from a large matrix

I have a large matrix of the following format, and it is tab delimited: ch-ab1-20 ch-bb2-23 ch-ab1-34 ch-ab1-24 er-cc1-45 bv-cc1-78 ch-ab1-20 0 2 3 4 5 6 ch-bb2-23 3 0 5 ... (6 Replies)
Discussion started by: Kanja
6 Replies
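A sketch of pulling a row subset out of the tab-delimited matrix; matrix.txt and the ch-ab1 pattern are placeholders for the real file and the labels of interest:

Code:
# Keep the header line plus every row whose label starts with "ch-ab1".
awk -F'\t' 'NR == 1 || $1 ~ /^ch-ab1/' matrix.txt > subset.txt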
vgchgid(1M)

NAME
vgchgid - modify the Volume Group ID (VGID) on a given set of physical devices

SYNOPSIS
vgchgid PhysicalVolumePath [PhysicalVolumePath] ...

DESCRIPTION
The vgchgid command is designed to change the LVM Volume Group ID (VGID) on a supplied set of disks. vgchgid will work with any type of storage, but it is primarily targeted at disk arrays that are able to create "snapshots" or "clones" of mirrored LUNs. vgchgid accepts a set of raw physical devices and ensures that they all belong to the same volume group before altering the VGID (see the sections below). The same VGID is set on all the disks; note that for multi-PV volume groups, all the physical volumes should be supplied in a single invocation of the command.

Options
vgchgid recognizes the following options and arguments:

PhysicalVolumePath    The raw device path name of a physical volume.

Background
Some storage subsystems have a feature which allows a user to split off a set of mirror copies of physical storage (termed Business Copy volumes, or BCVs), much as LVM splits off mirror copies of logical volumes. As the result of the "split", the split-off devices will have the same VGID as the original disks. vgchgid is needed to modify the VGID on the BCV devices. Once the VGID has been altered, the BCV disks can be imported into a new volume group using vgimport(1M).

WARNINGS
Once the VGID has been changed, the original VGID is lost until a disk device is re-mirrored with the original devices.

If vgchgid is used on a subset of the disk devices (for example, two out of four), the two groups of disk devices cannot be imported into the same volume group, since they have different VGIDs on them. The solution is to re-mirror all four of the disk devices, re-run vgchgid on all four BCV devices at the same time, and then use vgimport(1M) to import them into the same new volume group.

If a disk is newly added to an existing volume group and no subsequent LVM operation has been performed that alters the LVM structures (in other words, an operation which performs an automated vgcfgbackup(1M)), then it is possible that a subsequent vgchgid will fail, reporting that the disk does not belong to the volume group. This may be overcome by performing a structure-changing operation on the volume group.

It is the system administrator's responsibility to make sure that the devices provided on the command line are all Business Copy volumes of the existing standard physical volumes and are in the ready state and writable. Mixing the standard and BC volumes in the same volume group can cause data corruption.

RETURN VALUE
vgchgid returns the following values:

0    VGID was modified with no error.
1    VGID was not modified.

EXAMPLES
An example showing how vgchgid might be used (a command sketch with hypothetical names follows these steps):

1. The system administrator uses the disk array's own commands (they differ between EMC Symmetrix and XP disk arrays) to create the Business Continuity (BCV or BC) copy. Three BCV disks are created.
2. Change the VGID on the BCV disks.
3. Make a new volume group for the BCV disks. This step can be skipped, as the group file will be created automatically; if the file is created manually it will have different major and minor numbers (see lvm(7)).
4. Import the BCV disks into the new volume group.
5. Activate the new volume group.
6. Back up the new volume group's LVM data structures.
7. Mount the associated logical volumes.
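A sketch of steps 2 through 7, using hypothetical device paths (/dev/rdsk/c5t0d1 and friends), a hypothetical volume group name (/dev/vgbcv), and a made-up mount point; substitute the real BCV devices reported by the array:

Code:
# 2. Change the VGID on all three BCV disks in one invocation.
vgchgid /dev/rdsk/c5t0d1 /dev/rdsk/c5t0d2 /dev/rdsk/c5t0d3
# 3. Create the volume group directory and group file (optional; see step 3).
mkdir /dev/vgbcv
mknod /dev/vgbcv/group c 64 0x020000
# 4. Import the BCV disks into the new volume group.
vgimport /dev/vgbcv /dev/dsk/c5t0d1 /dev/dsk/c5t0d2 /dev/dsk/c5t0d3
# 5. Activate the new volume group.
vgchange -a y /dev/vgbcv
# 6. Back up the new volume group's LVM data structures.
vgcfgbackup /dev/vgbcv
# 7. Mount the associated logical volumes (lvol1 is a stand-in name).
mount /dev/vgbcv/lvol1 /bcvdata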
SEE ALSO
vgimport(1M), vgscan(1M), vgcfgbackup(1M).