Help optimizing sort of large files
Post 302926061 by kogorman3 on Thursday 20th of November 2014 11:42:02 PM
I'm not going to load a database, because the results of the sort will be used just once, and as a practical matter may be passed through a pipe without ever hitting the filesystem. For these tests there's an output file, but that's only for the testing itself, and to make the results more generally relevant to anyone else who might read this.

My sort times are generally within a factor of 2 of the cost of copying the file to temp and then to the output, so thrashing and computation are not the bottleneck. I'm not going to write my own sort, or any part of one, because it would take too long to get right, and I'm not going to chain multiple invocations of sort(1) because the extra disk I/O would clearly eat any benefits.
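For context, the copy baseline I'm comparing against is just something like this (file names hypothetical):

    # Cost of writing the data twice: once to temp, once to the output.
    # An external sort has to pay at least this much in I/O.
    time cp big.dat /tmp/big.tmp
    time cp /tmp/big.tmp baseline.dat

    # The actual sort, with temporaries on the same disk:
    time sort -T /tmp big.dat > sorted.dat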

I've re-run my timing scripts on the small test file, and some comments I made earlier have to be corrected. The differences between runs are not that alarming after all, and are easily explained by differences in other competing activities on the same machine. My speedups so far are more modest than I thought, but --parallel=4 really does give me 20%, and there's about another 20% available from jiggering parameters.
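If anyone wants to reproduce the comparison, it was along these lines (test file name hypothetical; the flags are standard GNU sort options):

    # Let sort pick its own thread count:
    time sort -T /tmp test.dat > /dev/null

    # Pin it to 4 threads; this is the ~20% win on my box:
    time sort --parallel=4 -T /tmp test.dat > /dev/null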

Running after a fresh boot, I noticed some things that surprised me, though perhaps they should not have. By the time testing is done, the kernel has filled 64GB of memory, mostly with "cached" blocks, and has swapped out a little over 3 MB of memory. I presume it's swapping idle daemons. So these results will not scale up for files large enough to do a complete cache wipe.
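For anyone repeating the measurements, the cache fill is easy to watch, and on Linux you can force a cold-cache run between tests (root needed for the drop):

    free -h    # "cached" grows as sort reads the file; swap used stays at a few MB

    # Start the next run cold: flush dirty pages, then drop the page cache.
    sync
    echo 3 | sudo tee /proc/sys/vm/drop_caches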

I've pretty much determined that the main thing to avoid is more than one merge pass over the temporaries. I think it's time to try a few runs on my TB-sized files, because I know those were doing at least 2 extra passes with the default parameters, and it took forever. I believe the defaults are 4 GB buffers (1/8 of real memory) and 16-way merges. The buffers seem to carry a lot of overhead, so the temp files come out smaller than you might expect: on a 1 TB file, that works out to roughly 500 temp files of about 2 GB each, and 3 levels of merge. The question is: given the choice, is it better to use a bigger buffer, or a wider merge? I'm betting 4 GB buffers are too big, but I need to do some testing.
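To make the arithmetic concrete: with ~2 GB temporaries, 1 TB of input yields about 512 temp files, and a 16-way merge takes 3 passes (512 -> 32 -> 2 -> 1), i.e. 2 passes more than the single merge I want. Widening the merge so one pass covers all the temporaries should avoid the rereads entirely. A sketch of what I plan to try (sizes and paths are guesses, not measured values):

    # Option A: keep the big buffer, widen the merge so one pass suffices.
    # ~512 temporaries need --batch-size >= 512 and enough file descriptors.
    ulimit -n 1024
    time sort -S 4G --batch-size=512 -T /fast/tmp big.dat > sorted.dat

    # Option B: smaller buffers (more, smaller temp files), still one merge pass,
    # trading more temporaries for cheaper in-memory runs.
    ulimit -n 4096
    time sort -S 1G --batch-size=2048 -T /fast/tmp big.dat > sorted.dat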
 
