awk help to make my work faster
Post 302271357 by matrixmadhan, Thursday 25 December 2008, 02:48 AM
0.6 M lines doesn't seem like a big number (I guess).
Load everything into memory as a map of

line_no => line_contents

then do a lookup with line_no and you're done.
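
A minimal awk sketch of that idea, assuming the line numbers you want sit one per line in lookups.txt and the big file is data.txt (both file names are just placeholders):

Code:
# Pass 1 (data.txt): build the in-memory map  line_no => line_contents.
# Pass 2 (lookups.txt): print the stored line for each requested number.
awk 'NR == FNR { line[FNR] = $0; next }
     ($1 in line) { print line[$1] }' data.txt lookups.txt

Each file is read exactly once, so the big file is not re-scanned for every lookup.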
 

csplit(1)                       User Commands                        csplit(1)

NAME
     csplit - split files based on context

SYNOPSIS
     csplit [-ks] [-f prefix] [-n number] file arg1... argn

DESCRIPTION
     The csplit utility reads the file named by the file operand, writes all
     or part of that file into other files as directed by the arg operands,
     and writes the sizes of the files.

OPTIONS
     The following options are supported:

     -f prefix      Names the created files prefix00, prefix01, ..., prefixn.
                    The default is xx00 ... xxn. If the prefix argument would
                    create a file name exceeding 14 bytes, an error results.
                    In that case, csplit exits with a diagnostic message and
                    no files are created.

     -k             Leaves previously created files intact. By default,
                    csplit removes created files if an error occurs.

     -n number      Uses number decimal digits to form filenames for the file
                    pieces. The default is 2.

     -s             Suppresses the output of file size messages.

OPERANDS
     The following operands are supported:

     file           The path name of a text file to be split. If file is -,
                    the standard input will be used.

     The operands arg1 ... argn can be a combination of the following:

     /rexp/[offset]
                    Create a file using the content of the lines from the
                    current line up to, but not including, the line that
                    results from the evaluation of the regular expression
                    with offset, if any, applied. The regular expression rexp
                    must follow the rules for basic regular expressions.
                    Regular expressions can include the use of '\/' and
                    '\%'. These forms must be properly quoted with single
                    quotes, since '\' is special to the shell. The optional
                    offset must be a positive or negative integer value
                    representing a number of lines. The integer value must be
                    preceded by + or -. If the selection of lines from an
                    offset expression of this type would create a file with
                    zero lines, or one with greater than the number of lines
                    left in the input file, the results are unspecified.
                    After the section is created, the current line will be
                    set to the line that results from the evaluation of the
                    regular expression with any offset applied. The pattern
                    match of rexp always is applied from the current line to
                    the end of the file.

     %rexp%[offset]
                    This operand is the same as /rexp/[offset], except that
                    no file will be created for the selected section of the
                    input file.

     line_no        Create a file from the current line up to (but not
                    including) the line number line_no. Lines in the file
                    will be numbered starting at one. The current line
                    becomes line_no.

     {num}          Repeat operand. This operand can follow any of the
                    operands described previously. If it follows a rexp type
                    operand, that operand will be applied num more times. If
                    it follows a line_no operand, the file will be split
                    every line_no lines, num times, from that point.

     An error will be reported if an operand does not reference a line
     between the current position and the end of the file.

USAGE
     See largefile(5) for the description of the behavior of csplit when
     encountering files greater than or equal to 2 Gbyte (2**31 bytes).

EXAMPLES
     Example 1: Splitting and combining files

     This example creates four files, cobol00...cobol03.

       example% csplit -f cobol filename '/procedure division/' /par5./ /par16./

     After editing the "split" files, they can be recombined as follows:

       example% cat cobol0[0-3] > filename

     Note: This example overwrites the original file.

     Example 2: Splitting a file into equal parts

     This example splits the file at every 100 lines, up to 10,000 lines. The
     -k option causes the created files to be retained if there are less
     than 10,000 lines; however, an error message would still be printed.

       example% csplit -k filename 100 {99}

     Example 3: Creating a file for separate C routines

     If prog.c follows the normal C coding convention (the last line of a
     routine consists only of a } in the first character position), this
     example creates a file for each separate C routine (up to 21) in prog.c.

       example% csplit -k prog.c '%main(%' '/^}/+1' {20}

ENVIRONMENT VARIABLES
     See environ(5) for descriptions of the following environment variables
     that affect the execution of csplit: LANG, LC_ALL, LC_COLLATE, LC_CTYPE,
     LC_MESSAGES, and NLSPATH.

EXIT STATUS
     The following exit values are returned:

     0     Successful completion.
     >0    An error occurred.

ATTRIBUTES
     See attributes(5) for descriptions of the following attributes:

     +-----------------------------+-----------------------------+
     |       ATTRIBUTE TYPE        |       ATTRIBUTE VALUE       |
     +-----------------------------+-----------------------------+
     |Availability                 |SUNWesu                      |
     |CSI                          |Enabled                      |
     |Interface Stability          |Standard                     |
     +-----------------------------+-----------------------------+

SEE ALSO
     sed(1), split(1), attributes(5), environ(5), largefile(5), standards(5)

DIAGNOSTICS
     The diagnostic messages are self-explanatory, except for the following:

     arg - out of range
                    The given argument did not reference a line between the
                    current position and the end of the file.

SunOS 5.10                        4 Dec 2003                         csplit(1)
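
To tie the line_no operands above back to the kind of job in this thread (pulling a block of lines out of a big file), a small sketch; the file name bigfile.txt is just a placeholder:

Code:
# chunk00 = lines 1-999, chunk01 = lines 1000-1999, chunk02 = the rest
example% csplit -s -f chunk bigfile.txt 1000 2000

Adding -k would keep the pieces already written even if bigfile.txt turns out to have fewer than 2000 lines.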