Shell Programming and Scripting: Fastest way calculating directory
Post 302868511 by rufino, 28 October 2013, 06:10 AM
Thanks for the response, but it is not working here; it is still slow.
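For reference, a minimal sketch of the usual single-pass alternatives for sizing a directory tree (not from this thread); the path below is a placeholder.

# du walks the tree once and sums it in a single pass, which is usually
# faster than stat-ing every file individually from a script
du -sk /path/to/dir

# if the directory is its own filesystem, df answers instantly from the
# mount's own accounting instead of walking any files
df -k /path/to/dir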
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

fastest copy command

Which is the fastest command in HP-UX to copy an entire disk to DAT tapes, or even disk to disk? Thanks. (0 Replies)
Discussion started by: vascobrito
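A rough sketch of one common approach (not from that thread), assuming a raw disk device and a DAT tape drive; the device names and block size are placeholders.

# stream the whole disk image to tape with a large block size
dd if=/dev/rdsk/c0t0d0 of=/dev/rmt/0m bs=1024k

# a disk-to-disk clone works the same way, writing to another raw device
dd if=/dev/rdsk/c0t0d0 of=/dev/rdsk/c1t0d0 bs=1024k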

2. Shell Programming and Scripting

Scripts for calculating size and remaining space of a directory automatically.

I would like to create a script that calculates the size and remaining space of a directory automatically every 24 hours, then sends an email report to the admin. POSIX shell and Perl are preferred. Can anyone help, please? (1 Reply)
Discussion started by: leonall
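A minimal sketch of such a report script, assuming mailx is available; the directory, recipient address, and script path are placeholders, and the every-24-hours part comes from cron.

#!/bin/sh
# report the size of a directory and the free space on its filesystem
DIR=/path/to/dir                 # placeholder
ADMIN=admin@example.com          # placeholder
{
    echo "Disk report for $DIR on $(hostname) at $(date)"
    echo "Used by directory:"
    du -sk "$DIR"
    echo "Remaining space on its filesystem:"
    df -k "$DIR"
} | mailx -s "Daily disk report: $DIR" "$ADMIN"

# crontab entry to run it once a day at 06:00 (placeholder path):
# 0 6 * * * /usr/local/bin/disk_report.sh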

3. Shell Programming and Scripting

how to delete/remove directory in fastest way

Hello, I need help removing a directory. The directory is not empty; it contains several subdirectories and files. The total number of files in one directory is 12,24,446. rm -rf doesn't work: it prompts for every file. I want to delete without prompting and... (6 Replies)
Discussion started by: getdpg
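A sketch of the usual fix, assuming the prompts come from an "rm -i" alias, which is a common default on interactive shells; the path is a placeholder.

# bypass a shell alias such as "alias rm='rm -i'" that forces the prompts
\rm -rf /path/to/dir

# the same thing from a script, sidestepping aliases and functions entirely
command rm -rf /path/to/dir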

4. UNIX for Advanced & Expert Users

Fastest way for searching the file

I want to search for a file in the fastest manner possible. Presently I am using the 'find' command, but it takes around 15 minutes. Is there any other method that would be faster? (3 Replies)
Discussion started by: vaibhavbhat
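A sketch of two common speed-ups (not from that thread): a prebuilt index via locate where it is available, or simply narrowing find's starting directory. The file name and path are placeholders.

# refresh the index once (often done nightly from cron), then query it
updatedb
locate report.txt

# without locate, starting find closer to where the file lives helps a lot
find /var/app/logs -name 'report.txt' -print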

5. AIX

Fastest way to count big amount of files in sub directory

Hi, we want to count all the files in a directory. The directory contains many folders, so it takes a long time to count; it has already been running for several minutes and is still not done. The command we use to count is find . -type f | wc -l. Just wondering if there is any other... (9 Replies)
Discussion started by: ngaisteve1
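One sketch of splitting the work (not that thread's answer): run a separate count per top-level subdirectory in parallel and add the pieces. Directory names are assumed to contain no newlines, and /tmp/count.$$ is just a scratch file.

#!/bin/sh
: > /tmp/count.$$
# one background find per top-level subdirectory
for d in ./*/ ; do
    ( find "$d" -type f | wc -l >> /tmp/count.$$ ) &
done
wait
# files sitting directly in the top directory (AIX find has no -maxdepth)
find . ! -name . -prune -type f | wc -l >> /tmp/count.$$
awk '{ total += $1 } END { print total }' /tmp/count.$$
rm -f /tmp/count.$$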

6. Solaris

The FASTEST copy method?

Hi experts, I've been asked if there is a fast way to duplicate a file (10 GB) and zip it at the same time. The zipped file would then be FTP'd; management is asking for this. Maybe there is a better method altogether? Any ideas? cp will not cut it. Thanks in advance, Harley (1 Reply)
Discussion started by: Harleyrci
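A sketch of doing the copy and the compression in a single pass; the file names, paths, and remote host are placeholders, and the ssh variant simply replaces the FTP step.

# compress straight into the duplicate, so one read of the 10 GB file
# covers both the copy and the zip
gzip -c /data/bigfile.dat > /staging/bigfile.dat.gz

# or skip the intermediate file and stream it to the remote side directly
gzip -c /data/bigfile.dat | ssh user@remotehost 'cat > /incoming/bigfile.dat.gz'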

7. Shell Programming and Scripting

calculating column summation in a directory of flat files

Hello gurus, I need your kind help to solve the issue below. I have a directory of flat files and have to calculate the sum of certain columns from each flat file. Say, for flat file 302 I need the summation of the 2nd and 3rd columns; for flat file 303 I need the summation of the 5th and... (2 Replies)
Discussion started by: Pratik4891
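A sketch of the per-file summation with awk; the field separator is assumed to be a comma, and the column numbers change per flat file as described above.

# sum columns 2 and 3 of flat file "302"
awk -F',' '{ s2 += $2; s3 += $3 } END { printf "%s: col2=%s col3=%s\n", FILENAME, s2, s3 }' 302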

8. Shell Programming and Scripting

Calculating the epoch time from standard time using awk and calculating the duration

Hi all, I have the following timestamp data in 2 columns:
Date TimeStamp (with milliseconds)
05/23/2012 08:30:11.250
05/23/2012 08:30:15.500
05/23/2012 08:31:15.500
... etc.
From this data I need the following output:
0.00 (row1 - row1 in seconds)
4.25 (row2 - row1 in... (5 Replies)
Discussion started by: ks_reddy
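A sketch using gawk's mktime() (not necessarily what that thread settled on); timestamps.txt is a placeholder and is assumed to hold only the data rows.

# split the milliseconds off, convert the rest to epoch seconds,
# and print each row's offset from the first row
gawk '{
    split($1, d, "/"); split($2, t, "[:.]")
    epoch = mktime(d[3] " " d[1] " " d[2] " " t[1] " " t[2] " " t[3]) + t[4] / 1000
    if (NR == 1) first = epoch
    printf "%.2f\n", epoch - first
}' timestamps.txt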

9. Shell Programming and Scripting

How to Calculating space used in GB for any particular directory in UNIX?

How do I calculate the space used, in GB, by a particular directory in UNIX? Currently I am using df -h, which gives me the space for each mount point:
ldndyn1:/vol/v01/dyn/sbcexp/dyn 1.1T 999G 29G 98% /sbcimp/dyn
but I need it for a particular directory inside that... (3 Replies)
Discussion started by: RahulJoshi
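A sketch of the usual answer: du reports usage for a directory tree, unlike df, which reports the whole mount point; the path below is a placeholder.

# human-readable total for one directory tree
du -sh /sbcimp/dyn/somedir

# or a portable version, converted to GB
du -sk /sbcimp/dyn/somedir | awk '{ printf "%.2f GB\n", $1 / 1024 / 1024 }'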

10. Shell Programming and Scripting

Check fastest server and using it

Hello, we have uploaded some data to 15 servers in the USA, Asia, ... I am considering adding a new feature: the script would detect the download speed between localhost and each destination and use the fastest server. I have cut this part from a script which has this feature; it downloads an xx MB file from every source and... (0 Replies)
Discussion started by: nimafire
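A rough sketch of the speed test itself, assuming curl is installed; the mirror host names and the test file path are placeholders.

#!/bin/sh
# time a small test download from each mirror and keep the quickest one
best=""; best_time=999999
for host in us1.example.com asia1.example.com eu1.example.com ; do
    t=$(curl -s -o /dev/null -w '%{time_total}' "http://$host/testfile.bin")
    echo "$host: ${t}s"
    better=$(awk -v a="$t" -v b="$best_time" 'BEGIN { print (a < b) ? 1 : 0 }')
    [ "$better" -eq 1 ] && { best=$host; best_time=$t; }
done
echo "fastest server: $best"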
MKCFM(1)                     General Commands Manual                    MKCFM(1)

NAME
       mkcfm - create summaries of font metric files in CID font directories

SYNOPSIS
       mkcfm [CID-font-directory-name]

DESCRIPTION
       There is usually only one CID font directory on the X font path. It is
       usually called /usr/X11R6/lib/X11/fonts/CID. If you do not specify an
       argument, mkcfm will try to go through the subdirectories of that
       directory and create one summary of font metric files for each CIDFont
       (character descriptions) file and each CMap (character maps) file it
       finds. The summaries of font metric files are put in the existing CFM
       subdirectory. The CFM subdirectories are created when CID-keyed fonts
       are installed.

       If you specify a CID font directory as an argument, mkcfm will try to
       go through the subdirectories of that directory and create one summary
       of font metric files for each CIDFont file and each CMap file it finds.

       mkcfm will calculate the summaries of the font metric files stored in
       AFM subdirectories of the CID font directory. Those summaries are
       needed by the rasterizer of CID-keyed fonts to speed up the response
       to X font calls. If those files do not exist, the CID rasterizer will
       have to go through usually large font metric files and calculate the
       summaries itself each time the font is called. You will notice a
       substantial wait on a call to a large CID-keyed font.

FILES
       .afm files
              Each CID-keyed font file is supposed to have a font metric file
              (.afm file). mkcfm creates summary files (.cfm files) of those
              font metric files. mkcfm should be run whenever a change is
              made to the files stored in the subdirectories of the CID font
              directory. For example, it should be run when new CID fonts are
              installed.

       .cfm files
              Summaries of font metric (.afm) files created by mkcfm.

SEE ALSO
       The rasterizer for CID-keyed fonts in the directory xc/lib/font/Type1.

CID Fonts Version 1.0 Release 1.0                                      MKCFM(1)
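A minimal usage sketch based on the synopsis above:

# regenerate the .cfm summaries after installing or changing CID-keyed fonts
mkcfm /usr/X11R6/lib/X11/fonts/CID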