UNIX for Dummies Questions & Answers: grep -f taking long time for big file. Post 302410605 by jim mcnamara on Tuesday 6th of April 2010 10:43:43 AM
Or read the man page on comm - it is designed to find the lines common to two sorted files.
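
For example, a minimal sketch (the file names here are hypothetical, and note that comm requires both inputs to be sorted):

    # comm needs sorted input, so sort both files first
    sort patterns.txt > patterns.sorted
    sort big_file.txt > big_file.sorted

    # -12 suppresses the lines unique to each file,
    # leaving only the lines common to both
    comm -12 patterns.sorted big_file.sorted

Because comm makes a single linear merge pass over the two sorted files, it stays fast even when both files are large.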
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

fetchmail taking long time to fetchmail...

Hi peeps, we have around 60 users. The mail retrieval interval is set to 300 sec, but it's taking around 1 hour to deliver mail. I am using Debian Sarge 3.1. Any clues? And how will it be affected if I decrease the interval? My machine has one P4 3.0 GHz processor and 1 GB RAM. The home... (2 Replies)
Discussion started by: squid04

2. Red Hat

login process taking a long time

I'm having a bit of a login performance issue and wondering if anyone has any ideas where I might look. Here's the scenario: Linux Red Hat ES 4 update 5. Regardless of where I log in from (ssh or the text console), after providing the password the system seems to pause for between 30... (4 Replies)
Discussion started by: retlaw

3. Shell Programming and Scripting

For Loop Taking Too Long

I'm new to UNIX scripting. Please help. I have about 10,000 files in the $ROOTDIR/scp/inbox/string1 directory to compare with the 50 files in the /$ROOTDIR/output/tma/pnt/bad/string1/ directory, and it takes about 2 hours plus to complete the for loop. Is there a better way to re-write the... (5 Replies)
Discussion started by: hanie123
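
An aside on the above: if the goal is just to find which filenames appear in both directories, one comm run over two sorted listings avoids the nested loop entirely. A minimal sketch, adapting the paths quoted in the post:

    # List each directory once, sort the names, and intersect the lists;
    # a single comm replaces the 10,000 x 50 pairwise comparisons.
    ls "$ROOTDIR/scp/inbox/string1"          | sort > inbox.list
    ls "$ROOTDIR/output/tma/pnt/bad/string1" | sort > bad.list
    comm -12 inbox.list bad.list             # names present in both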

4. Shell Programming and Scripting

<AIX> Problem in purge script taking a very long time (18.30 hrs) to complete

Hi, I have here a script which is used to purge older files/directories based on a defined purge period. The script consists of 45 find commands, where each command needs to traverse more than a million directories; a single find command therefore takes around 22-25 mins... (7 Replies)
Discussion started by: sravicha
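
A common fix for that pattern, sketched on the assumption that the 45 commands differ only in their name and age tests: OR the tests together so the tree is walked once instead of 45 times. The path, patterns, and ages below are illustrative placeholders:

    # One traversal instead of 45: combine the per-type tests with -o.
    # On an older find without '-exec ... +', use '-exec rm -f {} \;'.
    find /data -type f \( -name '*.log' -mtime +30 \
                       -o -name '*.tmp' -mtime +7 \) -exec rm -f {} +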

5. UNIX for Dummies Questions & Answers

Job is taking a long time

Hi, we have 20 jobs scheduled, and one of them is taking a long time and not completing. If we don't terminate it, it runs indefinitely, although the job's normal completion time is 5 minutes. The job deletes some records from the table, plus two insert statements and one select... (7 Replies)
Discussion started by: ajaykumarkona

6. Solaris

How to find the bottleneck when the system takes a long time in gzip

Dear All, OS = Solaris 5.10, hardware = Sun Fire T2000 with a 1 GHz quad core. We have Oracle Applications 11i with a 10g database. Whenever I try to take a cold backup of the 55 GB database, it takes a long time to finish. As the application is down, nobody is using the server at all... (8 Replies)
Discussion started by: yoojamu
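
For what it's worth, gzip is single-threaded, and a 1 GHz Niagara core is slow at compression, so a cold backup like this is usually CPU-bound on one thread rather than disk-bound. A quick way to confirm while the backup runs, using standard Solaris 10 tools:

    prstat -mL 5    # per-thread CPU usage: look for gzip pinned near 100%
    iostat -xn 5    # disk side: low %b means the disks are not the bottleneck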

7. UNIX for Dummies Questions & Answers

ls is taking a long time to list

Hi, all the data is kept on a NetApp using NFS. Some directories are fast when doing ls, but a few of them are slow; after a few tries they become fast, then after a few minutes they become slow again. Can you advise what's going on? The one directory I am most interested in is giving... (3 Replies)
Discussion started by: samnyc
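
A hedged way to narrow that down: plain ls has to sort the directory, and ls -l (or color output) additionally stats every entry, which over NFS turns into per-file attribute traffic; that traffic is cheap only while the client's attribute cache is warm, which would also explain the fast/slow cycling. The mount path below is hypothetical:

    time ls -f /mnt/netapp/bigdir > /dev/null   # -f: no sorting, no per-file stat
    time ls -l /mnt/netapp/bigdir > /dev/null   # -l: one attribute lookup per entry

If the first stays fast while the second is slow, the cost is the attribute lookups rather than reading the directory itself.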

8. Shell Programming and Scripting

While loop problem taking too long

while read myhosts
do
    while read discovered
    do
        echo "$discovered"
    done < $LOGFILE | grep -Pi "|" | egrep... (7 Replies)
Discussion started by: SkySmart

9. Shell Programming and Scripting

rm -rf is taking very long, will it time out?

I have so many (hundreds of thousands of) files and directories within this one specific directory that my "rm -rf" command to delete them has been taking forever. I did this via SSH; my question is: if my SSH connection times out before rm -rf finishes, will it continue to delete all of those... (5 Replies)
Discussion started by: phpchick
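
The short answer to that one: a foreground process receives SIGHUP when its SSH session drops, so the rm would normally be killed mid-run rather than continue. The usual precaution, sketched with a placeholder path:

    # Detach the delete from the terminal so a dropped session cannot kill it
    nohup rm -rf /path/to/bigdir > /tmp/rm.log 2>&1 &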

10. Red Hat

du -sh command taking a long time to calculate big directory sizes

Hi, my Linux server has been taking a long time to calculate large directory sizes. I am accessing the server through ssh and running commands like "du -sh *" and "du -sh * | sort -n | grep G". Please guide me to a fast way to find the big directories under the / partition. Thanks (8 Replies)
Discussion started by: Nats
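
A hedged sketch of a faster way to rank directories than grepping human-readable du output: keep the sizes numeric and let sort do the work. -x stops du from crossing filesystem boundaries, which matters when scanning /:

    # KB totals for each top-level directory on the root filesystem,
    # largest last; with a recent coreutils, 'du -xh --max-depth=1 / | sort -h'
    # gives the same ranking in human-readable form.
    du -xk --max-depth=1 / 2>/dev/null | sort -n
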
CHOPTEST(8)                  System Manager's Manual                 CHOPTEST(8)

NAME
       choptest - HylaFAX page chopping test program

SYNOPSIS
       /usr/sbin/choptest [ options ] input.tif

DESCRIPTION
       choptest is a program for testing the page chopping support in the
       HylaFAX software (specifically, in the faxq(8) program). choptest
       analyzes a TIFF/F (TIFF Class F) file with 1-D MH- or 2-D MR-encoded
       data and reports what the automatic page chopping logic would do if the
       file were submitted for transmission. Options are provided for
       controlling whether choptest checks the last page or all pages of the
       document, and what whitespace threshold to use in deciding if a page
       should be chopped.

OPTIONS
       -a        Chop all pages in the document. This is equivalent to setting
                 the PageChop configuration parameter to ``all''; c.f.
                 hylafax-config(5). By default only the last page of the
                 document is considered.

       -t inches Set the minimum whitespace threshold that must be present on
                 a page for it to be chopped. This is equivalent to the
                 PageChopThreshold configuration parameter; c.f.
                 hylafax-config(5). By default choptest requires that at least
                 3 inches of trailing whitespace be present.

EXAMPLES
       The following shows a multi-page, high-resolution document. Each page
       has insufficient whitespace for it to be chopped.

           hyla% ./choptest -a ~/tiff/pics/faxix.tif
           Chop pages with >=3" of white space at the bottom.
           Don't chop, found 67 rows, need 588 rows
           Don't chop, found 67 rows, need 588 rows
           Don't chop, found 67 rows, need 588 rows
           Don't chop, found 53 rows, need 588 rows
           Don't chop, found 91 rows, need 588 rows
           Don't chop, found 99 rows, need 588 rows
           Don't chop, found 47 rows, need 588 rows

SEE ALSO
       faxq(8), hylafax-config(5)

                               October 3, 1995                       CHOPTEST(8)