I haven't had time to run a test on Linux yet, but I just finished one on my Windows XP desktop machine (NTFS). I'm not sure how valuable this test is, but the results are interesting. Please share your thoughts on it.
Opening and closing 100 randomly selected files 100,000 times, from directories containing different numbers of files (relative times):
Obviously, some caching is going on: if you open the same files over and over (and the set of files is small enough), it doesn't seem to matter how many files the directory contains.
This caching suggests the performance hit above would be larger if I had opened more than 100 distinct files. Another way to run this test would be to read every single file in random order.
Maybe I should have used the same 1,000,000 files in each test case and simply distributed them differently (100 files per directory, 1,000 files per directory, etc.). But then other variables would have affected the results, such as how I distributed them: path depth, number of directories, and so on.
Details
I used a script to create files with random names in 10+3-character format (a 10-character name plus a 3-character extension). I copied the files from the "100 directory" to the other directories, then added additional files. The files were almost empty (72 bytes each).
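The original file-creation script isn't shown here, but a minimal Python sketch of the same idea might look like this (the function names and the 10+3 name format interpretation are mine, not from the original):

```python
import os
import random
import string

def random_name():
    # 10 random lowercase characters, a dot, then a 3-character extension ("10+3")
    stem = "".join(random.choices(string.ascii_lowercase, k=10))
    ext = "".join(random.choices(string.ascii_lowercase, k=3))
    return f"{stem}.{ext}"

def create_files(directory, count, size=72):
    # Create `count` nearly empty files (72 bytes, matching the test files)
    os.makedirs(directory, exist_ok=True)
    payload = b"x" * size
    for _ in range(count):
        with open(os.path.join(directory, random_name()), "wb") as f:
            f.write(payload)
```

With random names this long, collisions are astronomically unlikely, so no uniqueness check is needed for directory sizes in the millions.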
Then I ran a Python script that opened and closed randomly selected files (from the 100 files above) in each directory. The source code is:
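The original benchmark script isn't reproduced above; a minimal sketch of an equivalent open/close benchmark (hypothetical function and parameter names, not the original code) could look like:

```python
import os
import random
import time

def benchmark_opens(directory, sample_size=100, iterations=100_000):
    # Pick a fixed sample of files once, then repeatedly open and close
    # random members of that sample, timing the whole loop.
    files = [os.path.join(directory, n) for n in os.listdir(directory)]
    sample = random.sample(files, min(sample_size, len(files)))
    start = time.perf_counter()
    for _ in range(iterations):
        with open(random.choice(sample), "rb"):
            pass  # open and immediately close
    return time.perf_counter() - start
```

Running this against directories of different sizes and comparing the returned times would give the kind of relative numbers described above.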
And the results:
Machine Specifications
I ran the tests on my old desktop DELL Optiplex 280 with a Pentium 4 CPU (2.8 GHz), 2 GB DDR2 SDRAM and 80 GB Serial ATA-150, 7200 rpm hard drive (cache size unknown).
I'm using Windows XP SP3 with NTFS. Before running the tests, I shut down all anti-virus, indexing, and update services, along with most other programs.
The hard drive was defragmented after creating the small files and before running the tests. I also rebooted before running the tests.