Improve script - slow process with big files
Post 302990400 by stomp, 01-26-2017, 03:17 AM
Hi jiam,

thanks for not pasting the files directly into this forum; that would have been far too much text.

However, having to download an archive and install an unpacking program before I can even read the files is inconvenient. May I politely ask you to use a pasting service instead? Maybe this one:

New paste • Fedora Project Pastebin

Now, regarding the script:

Even though I'm not well versed in csh, these are my recommendations:
  • You have a lot of external awk calls in your loop. That is one reason your program is so slow. For example:
Code:
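        # ten separate awk processes are spawned here for every single loop iteration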
        set lineinfo = `cat info_records.list|head -$i|tail -1`
        set tap = `echo $lineinfo|awk '{print $1;}'`
        set rec = `echo $lineinfo|awk '{print $2;}'`
        set lin = `echo $lineinfo|awk '{print $3;}'`
        set pnt = `echo $lineinfo|awk '{print $4;}'`
        set spx = `echo $lineinfo|awk '{print $5;}'`
        set spy = `echo $lineinfo|awk '{print $6;}'`
        set spz = `echo $lineinfo|awk '{print $7;}'`
        set tim = `echo $lineinfo|awk '{print $8;}'`
        set lfr = `echo $lineinfo|awk '{print $9;}'`
        set lto = `echo $lineinfo|awk '{print $10;}'`

If you combine them into a single awk program, it will be a lot faster; see the sketch below.
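
In fact, for this particular snippet you do not need awk at all: csh can split the line into a word list by itself. A minimal, untested sketch (assuming $i is the loop counter over info_records.list, as in your script):

Code:
        # fetch line $i once; the parentheses make csh split it into a word list
        set lineinfo = ( `sed -n "${i}p" info_records.list` )
        # pick the fields by index instead of starting ten awk processes
        set tap = $lineinfo[1]
        set rec = $lineinfo[2]
        set lin = $lineinfo[3]
        set pnt = $lineinfo[4]
        set spx = $lineinfo[5]
        set spy = $lineinfo[6]
        set spz = $lineinfo[7]
        set tim = $lineinfo[8]
        set lfr = $lineinfo[9]
        set lto = $lineinfo[10]

The bigger win, though, would be to drop the line-by-line loop entirely (head/tail re-reads the file on every iteration) and let one awk program read info_records.list once; whether that is feasible depends on what the rest of your loop does with the fields.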

Regards,
stomp
