Full Discussion: a problem with large files
Post 302436294 by dr.house on Saturday 10th of July 2010 02:08:02 PM
In my experience, processing a pile of smaller files takes much less time than working with a single fat one, so you may want to split both source files into chunks, process the chunks (possibly in parallel), and finally concatenate the results. A minimal sketch follows.
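Assuming a line-oriented text file and a hypothetical per-chunk filter called process_chunk, the split/process/merge cycle looks roughly like this (GNU split syntax):

  split -l 1000000 source.txt chunk_        # cut the input into one-million-line pieces
  for f in chunk_*; do
      process_chunk "$f" > "$f.out" &       # work on each piece in the background
  done
  wait                                      # block until all background jobs have finished
  cat chunk_*.out > result.txt              # stitch the results back together, in order
  rm -f chunk_*                             # clean up the intermediate pieces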
 

9 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Problem in processing a very large file.

Hi Friends, I am getting an error while processing a very large file using sqlloader. The file is larger than 2 GB, so I now need to change the compiler to 64-bit so that the file can be processed. Is there any command for the same? Thanks in advance. (1 Reply)
Discussion started by: Rohini Vijay
1 Replies
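For the compiler question above, the usual approach for one's own code is to build with large-file support so that off_t is 64 bits; getconf prints the platform-specific flags. A minimal sketch (the program name is hypothetical; a vendor binary such as sqlloader would instead need a large-file-aware release):

  cc $(getconf LFS_CFLAGS) -o myloader myloader.c $(getconf LFS_LDFLAGS)   # e.g. adds -D_FILE_OFFSET_BITS=64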

2. Shell Programming and Scripting

problem with 0 byte and large files

How can I remove all zero-byte files in a particular directory, and also files that are larger than 1 GB? Please let me know. (3 Replies)
Discussion started by: dsravan
3 Replies
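A minimal sketch for the question above, assuming GNU find (-maxdepth, -empty, the G size suffix, and -delete are extensions; the directory name is hypothetical):

  find /some/dir -maxdepth 1 -type f \( -empty -o -size +1G \) -print    # review the matches first
  find /some/dir -maxdepth 1 -type f \( -empty -o -size +1G \) -delete   # then remove them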

3. UNIX for Advanced & Expert Users

Large file FTP problem

We are experiencing a problem with a lengthy data transfer by FTP through a firewall. Since there are two ports in use on an FTP transfer (data and control), one sits idle while the other is transferring data. The idle port (control) will get timed out, and the data transfer won't know that it's... (3 Replies)
Discussion started by: rprajendran
3 Replies

4. UNIX for Dummies Questions & Answers

Problem using find with prune on large number of files

Hi all; I'm having a problem when I want to list a large number of files in the current directory using find together with the prune option. First I used this command, but it lists all the files, including those in subdirectories: find . -name "*.dat" | xargs ls -ltr Then I modified the command... (2 Replies)
Discussion started by: ashikin_8119
2 Replies
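The classic prune idiom for the problem above keeps find from descending into subdirectories (this sketch assumes file names without whitespace, since xargs splits on it):

  find . \( -type d ! -name . -prune \) -o -name '*.dat' -print | xargs ls -ltr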

5. UNIX for Dummies Questions & Answers

Large file problem

I have a large file, around 570 GB, that I want to copy to tape. However, my tape drive will load only up to 500 GB, and I don't have enough space on disk to compress the file before copying it to tape. Can I compress and tar to tape in one command, without writing a compressed disk file? Any suggestions... (8 Replies)
Discussion started by: iancrozier
8 Replies
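Compressing straight to tape in a single pipeline avoids the intermediate disk file; a sketch with a hypothetical tape device name (it only helps if the data compresses below the tape's capacity, and some drives may need dd in the pipeline to set a block size):

  tar cf - bigfile | gzip -c > /dev/rmt/0     # compress on the fly, write to tape
  gzip -dc < /dev/rmt/0 | tar xf -            # read it back later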

6. UNIX for Dummies Questions & Answers

Large Problem with nautilus

Hi, I am a torrent maniac and I use Transmission. All was well, but Nautilus began to show problems while I was running Transmission, and the situation became worse and worse. Now, when I boot, I can hardly open a Nautilus window and browse my files: it will "stack" in seconds for sure! I... (2 Replies)
Discussion started by: hakermania
2 Replies

7. Shell Programming and Scripting

Divide large data files into smaller files

Hello everyone! I have 2 types of files in the following format: 1) *.fa >1234 ...some text... >2345 ...some text... >3456 ...some text... . . . . 2) *.info >1234 (7 Replies)
Discussion started by: ad23
7 Replies
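For the *.fa format shown above (records introduced by a ">" header), a minimal awk sketch that starts a new output file every 1000 records; the chunk size and file names are arbitrary, and the input is assumed to begin with a header line:

  awk '/^>/ && n++ % 1000 == 0 { if (out) close(out); out = sprintf("part_%04d.fa", ++p) }
       { print > out }' input.fa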

8. Solaris

How to safely copy full filesystems with large files (10Gb files)

Hello everyone. I need some help copying a filesystem. The situation is this: I have an Oracle DB mounted on /u01 and need to copy it to /u02. /u01 is 500 GB and /u02 is 300 GB; the space used on /u01 is 187 GB. This is running on Solaris 9, and both filesystems are UFS. I have tried to do it using:... (14 Replies)
Discussion started by: dragonov7
14 Replies
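The classic Solaris idiom for copying a whole UFS filesystem while preserving ownership, permissions, and sparse files is ufsdump piped into ufsrestore; a sketch assuming /u02 is already mounted and empty:

  ufsdump 0f - /u01 | (cd /u02 && ufsrestore rf -)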

9. Shell Programming and Scripting

A Large Percent Problem

Hello everyone, I have two matrices of the same size. I need to recalculate the numbers in matrix A according to the percentages in matrix B. It is like: matrix A is 10.00 20.00 30.00 40.00 60.00 70.00 80.00 90.00 20.00 30.00 80.00 50.00 and matrix B is 00.08 00.05 ... (2 Replies)
Discussion started by: miriammiriam
2 Replies
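The snippet above doesn't show the exact formula, so this is only a sketch under an assumed interpretation: scale each element of A by the matching percentage in B, i.e. A*(1+B), with both matrices in whitespace-separated files (the file names are hypothetical):

  paste matrixA.txt matrixB.txt | awk '{
      n = NF / 2                                     # first n fields: the A row; last n: the B row
      for (i = 1; i <= n; i++)
          printf "%.2f%s", $i * (1 + $(i + n)), (i < n ? " " : "\n")
  }'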
ppmquantall(1)						      General Commands Manual						    ppmquantall(1)

NAME
       ppmquantall - run ppmquant on a bunch of files all at once, so they share a common colormap

SYNOPSIS
       ppmquantall [-ext extension] ncolors ppmfile ...

DESCRIPTION
       Takes a bunch of portable pixmaps as input. Chooses ncolors colors to best represent all of the images, maps the existing colors to the new ones, and overwrites the input files with the new quantized versions. If you don't want to overwrite your input files, use the -ext option. The output files are then named the same as the input files, plus a period and the extension text you specify.

       Verbose explanation: Let's say you've got a dozen pixmaps that you want to display on the screen all at the same time. Your screen can only display 256 different colors, but the pixmaps have a total of a thousand or so different colors. For a single pixmap you solve this problem with ppmquant; this script solves it for multiple pixmaps. All it does is concatenate them together into one big pixmap, run ppmquant on that, and then split it up into little pixmaps again. (Note that another way to solve this problem is to pre-select a set of colors and then use ppmquant's -map option to separately quantize each pixmap to that set.)

SEE ALSO
       ppmquant(1), ppm(5)

AUTHOR
       Copyright (C) 1991 by Jef Poskanzer.

27 July 1990                                                                                                                        ppmquantall(1)
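The -map alternative mentioned in the description can be sketched as follows (the file names are hypothetical, and pnmcat -topbottom requires all images to have the same width):

  pnmcat -topbottom img1.ppm img2.ppm img3.ppm > all.ppm   # stack everything into one big pixmap
  ppmquant 256 all.ppm > map.ppm                           # derive one shared 256-color map
  for f in img1.ppm img2.ppm img3.ppm; do
      ppmquant -map map.ppm "$f" > "${f%.ppm}.q.ppm"       # quantize each image to the shared map
  done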