Unix File System performance with large directories
Post 48557 by malcom on Wednesday 10th of March 2004, 07:00:28 AM
Hi Dirk,

If you want to tune your filesystem, the most important question is: what size are the files stored on it?

Based on the answer, you can adjust the block size and similar parameters.
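
For example, here is a rough sketch for getting that answer on an existing directory tree and then acting on it. The /data path is only an example, the -printf option needs GNU find, and the right block-size flag depends on which filesystem you create, so check its mkfs man page:

Code:
find /data -type f -printf '%s\n' 2>/dev/null | awk '
    { total += $1; n++
      if      ($1 < 4096)    small++      # under one 4 KiB block
      else if ($1 < 1048576) medium++     # 4 KiB up to 1 MiB
      else                   large++      # 1 MiB and bigger
    }
    END {
      if (n) printf "files: %d  average size: %.0f bytes\n", n, total / n
      printf "small: %d  medium: %d  large: %d\n", small, medium, large
    }'

If most of the space is in large files, a bigger block size reduces per-block overhead; if the tree is mostly tiny files, a large block size just wastes space. On an ext2/ext3-style filesystem that would mean recreating it with something like "mke2fs -b 4096 /dev/XXX" (illustrative only; it destroys whatever is on the device).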

Regards
Malcom
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

List directories on a file system

How can I see the list of directories mounted on a filesystem? For example, to show the list of directories mounted on / for AIX. (8 Replies)
Discussion started by: yls177

2. AIX

Why is my file system cache so large

Hi, I have a filesystem cache which is around 20G in size and I'm a bit perplexed as to what is in it. I'm running Sybase on the machine with the db on raw volumes and a tempdb on a ramdisk. My understanding is that raw volumes are not cached, and I assumed that the ramdisk is not either. Am... (1 Reply)
Discussion started by: mgibbons

3. UNIX for Advanced & Expert Users

Suggestions required for UNIX system performance on a certain task

Dear all, on my UNIX server there is an Apache web log file. The rate of logging to this file is very high. I want to extract a particular user's log entries from this file at run time: as soon as such an entry is logged to the file, I want to redirect it into another file. I want to... (4 Replies)
Discussion started by: zing_foru

4. Shell Programming and Scripting

Performance issue in UNIX while generating .dat file from large text file

Hello gurus, we are facing a performance issue in UNIX. If someone has faced this kind of issue in the past, please share your suggestions. Problem definition: a few of the load processes of our Finance application run into this issue in UNIX when they use a shell script with the below... (19 Replies)
Discussion started by: KRAMA

5. Programming

Question about empty directories in a UNIX system

How is it possible for a directory to be empty and still have a size greater than 0 bytes? I made a shell script that shows info about all files/directories, and this is what came up; the last field is the size, and here it is showing 1024. In the for loop I did something like for h in * .*; do ... (4 Replies)
Discussion started by: omega666

6. Red Hat

Empty directory, large size and performance

Hi, I have a directory that I use as a working directory for a program. At the end of the procedure its contents are deleted, yet when I do an ls -l the directory still appears to take up some space. After a little research, I've seen on another board of this forum that it's not really taking... (5 Replies)
Discussion started by: bdx (see the sketch after this list)

7. Red Hat

GFS file system performance is very slow

Hi all, I am running Red Hat Linux 5.3 (Tikanga) with a GFS file system and it is very slow, even for executing the ls -ls command; see below, where it takes 2 minutes 12 seconds. Please help me fix the issue. $ sudo time ls -la BadFiles | wc -l 0.01user 0.26system... (3 Replies)
Discussion started by: susindram

8. Shell Programming and Scripting

Performance issue in grepping large files

I have around 300 files (*.rdf, *.fmb, *.pll, *.ctl, *.sh, *.sql, *.prog) which are of large size. Around 8000 keywords (which will be in the file $keywordfile) need to be searched inside those files. If a keyword is found in a file, I have to insert the filename, extension, category, keyword, occurrence... (8 Replies)
Discussion started by: millan (see the grep sketch after this list)

9. HP-UX

Test cases for file system mount/umount performance in HP

Hi folks, could anyone please suggest scenarios for testing file system mount/umount performance on HP-UX? Thanks in advance, Vaishey (5 Replies)
Discussion started by: Vaishey

10. Shell Programming and Scripting

Bash script search, improve performance with large files

Hello, for several of our scripts we are using awk to search for patterns in files, using data from other files. This works almost perfectly, except that it takes ages to run on larger files. I am wondering if there is a way to speed up this process, or whether there is something else that is quicker with the... (15 Replies)
Discussion started by: SDohmen
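
On items 5 and 6 above (empty directories that still report a size of 1024 bytes or more): on ext2/ext3-style filesystems a directory's size is allocated in whole blocks, and once the directory has grown to hold many entries it normally keeps those blocks even after the entries are deleted. A rough way to see this on a scratch directory (the path and file count are just examples):

Code:
mkdir /tmp/scratchdir
ls -ld /tmp/scratchdir                 # freshly created: one block, e.g. 1024 or 4096 bytes
i=1
while [ $i -le 20000 ]; do             # many entries make the directory itself grow
    : > /tmp/scratchdir/f$i
    i=$((i + 1))
done
ls -ld /tmp/scratchdir                 # now considerably larger
find /tmp/scratchdir -type f -delete   # remove every entry again (GNU/BSD find)
ls -ld /tmp/scratchdir                 # usually still the larger size: the blocks are kept

Recreating the directory releases the space; on ext3/ext4 an offline e2fsck -D run can also re-optimize directories (check the e2fsck man page before relying on it).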
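
On items 8 and 10 (searching thousands of keywords in many large files): this is not the solution worked out in those threads, just a pattern that often helps, namely handing all the keywords to a single grep invocation instead of looping over them one by one. The file list is illustrative, $keywordfile is the keyword file mentioned in item 8, and -o/-H are GNU grep options:

Code:
# One pass per file: -F fixed strings, -f read all patterns from a file,
# -o print each match on its own line, -H prefix it with the file name.
grep -oHF -f "$keywordfile" *.rdf *.fmb *.pll *.ctl *.sh *.sql *.prog |
    sort | uniq -c                     # occurrence count per "filename:keyword"

From that count of "filename:keyword" pairs it is straightforward to fill in the extension and category columns with awk in a second pass.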
File::HomeDir::Unix(3pm) - User Contributed Perl Documentation

NAME
    File::HomeDir::Unix - Find your home and other directories on legacy Unix

SYNOPSIS
    use File::HomeDir;

    # Find directories for the current user
    $home    = File::HomeDir->my_home;      # /home/mylogin
    $desktop = File::HomeDir->my_desktop;   # All of these will...
    $docs    = File::HomeDir->my_documents; # ...default to home...
    $music   = File::HomeDir->my_music;     # ...directory
    $pics    = File::HomeDir->my_pictures;  #
    $videos  = File::HomeDir->my_videos;    #
    $data    = File::HomeDir->my_data;      #

DESCRIPTION
    This module provides implementations for determining common user directories. In normal usage this module will always be used via File::HomeDir.

SUPPORT
    See the support section of the main File::HomeDir module.

AUTHORS
    Adam Kennedy <adamk@cpan.org>
    Sean M. Burke <sburke@cpan.org>

SEE ALSO
    File::HomeDir, File::HomeDir::Win32 (legacy)

COPYRIGHT
    Copyright 2005 - 2011 Adam Kennedy. Some parts copyright 2000 Sean M. Burke. This program is free software; you can redistribute it and/or modify it under the same terms as Perl itself. The full text of the license can be found in the LICENSE file included with this module.