UNIX for Advanced & Expert Users: Help optimizing sort of large files
Post 302925546 by kogorman3, Monday 17 November 2014, 09:19 PM
Quote:
Originally Posted by Corona688
Not really.

If your files can fit in memory, either way you do it -- cache or sort buffers -- it will be held in RAM. Yes, too many files at once is bad since the disk has to seek them individually.
I'm testing with my idea of a small file: 13 GB. My targets are more like TB-sized. These are the cases that motivate optimizing GNU sort. For modest-sized files, I wouldn't bother.
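For reference, this is roughly how GNU sort's external-sort knobs fit together; the sizes, thread count, temp path, and file names below are placeholders to experiment with, not measured recommendations:

    # All of these are documented GNU coreutils sort options; only the values
    # are made up for illustration.
    #   -S / --buffer-size     main-memory buffer before runs spill to temp files
    #   --parallel=N           threads used for the in-memory sorting passes
    #   -T DIR                 directory that receives the temporary run files
    #   --batch-size=N         how many temp files are merged in a single pass
    #   --compress-program=P   compress temp files (any compressor on PATH)
    # LC_ALL=C forces byte-wise comparisons instead of locale collation,
    # which is usually much faster for plain ASCII data.
    LC_ALL=C sort -S 16G --parallel=8 -T /fast/tmp \
        --batch-size=64 --compress-program=lzop \
        -o big.sorted big.input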

Quote:
I don't think this has as much to do with the size of the temp files as with their number. Doing a lot of seeking on a non-SSD disk makes its performance really bad. I once measured a disk's random read performance with caching disabled -- hopping from one sector to another, reading and then moving on, gets you fifty kilobytes per second on a disk good for 100 MB/s. Using all available RAM is about as bad as turning off caching, by the way. Worse, actually -- memory pressure will start pushing out useful things, making programs wait pointlessly for bits of their own code to be paged back in when needed.
Size and number are linked: if the temp files are bigger, there are fewer of them. So I would love for the cache to hold all of the data from the files being merged. I don't know how to arrange that, but I'm hoping my data will guide me to it indirectly.
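As a rough back-of-envelope, with purely hypothetical numbers and assuming roughly one temporary run per buffer-full of input:

    # Hypothetical sizes: a ~13 TiB input sorted with a 16 GiB buffer.
    input_gb=13312        # input size in GiB
    buffer_gb=16          # sort -S 16G
    runs=$(( (input_gb + buffer_gb - 1) / buffer_gb ))
    echo "roughly $runs temporary run files"    # -> roughly 832
    # If --batch-size is at least that large, the merge finishes in a single
    # pass over the temp files; with a smaller batch size sort needs extra
    # merge passes, rewriting intermediate data to disk each time.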

Quote:
Cache could also explain the differing times your runs take. If there are significant parts of your file left in cache, that could speed up the next run.
True. I do not do anything to clear the caches before each run, but the tests are done by scripts that always do them in the same order, so I would expect more uniformity than I've seen so far.
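One option would be to force cold-cache conditions before every timed run. A minimal sketch, assuming Linux and root access (the file names and sizes are the same placeholders as above):

    # Flush dirty pages, then drop the page cache, dentries and inodes so
    # each timed run starts from the same cold-cache state.
    sync
    echo 3 | sudo tee /proc/sys/vm/drop_caches > /dev/null
    /usr/bin/time -v sort -S 16G -T /fast/tmp -o big.sorted big.input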
 
