Is it possible to find the seek rate of the find command in Solaris?
Hello,
I am running some performance-based tests on Solaris, and I was wondering how fast the "seeking" rate of Solaris is, i.e. how fast Solaris can get information about files with the "find" command. Does anyone know what 'find' command I could run to traverse my system and measure the rate of files scanned per hour?
Long version:
Solaris has an inode cache. As long as the file in question is in the inode cache, there is very little overhead in calling stat(), which is what find does internally; see the man pages for ftw() or nftw().
Once you stat() a file, its inode gets cached if it was not already there. Eventually the cache fills up and older inodes are evicted.
Bottom line: when you hit the inode cache you stay in the kernel and never touch the disk, so you are not testing disk I/O.
In other words, timing find does not measure what you think it measures.
I/O is hard to test as a one-off operation
Why? Inode caching, disk controller types, disk speeds (rotational latency), I/O request queue length, and file data caching all affect how fast or slow you can access a file's data and metadata on disk.
Modern systems with fast disks and no competing traffic can usually read the first few hundred blocks of an uncached file in something under ~10 milliseconds.
Short answer: don't use find; use nftw(), open(), and read() in a simple piece of C.
If you use the shell, remember that most commands open files, lots of files, over and over again. Not all commands do this, but most do.
Try it: trace a simple command and note how many files are opened just to run it.
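The original post doesn't show the exact command; one typical way to do this on Solaris is to trace a trivial command with truss and count the open(2) calls (on Linux, strace plays the same role). The choice of `date` is arbitrary:

```shell
# Count how many open() calls it takes just to run `date`.
# Solaris: truss traces system calls; -t restricts the trace to open().
# (Linux equivalent: strace -e trace=open,openat date)
truss -t open date 2>&1 | grep -c 'open('
```

Even a command this small typically opens the shared libraries, locale files, and timezone data it needs before doing any real work.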
You avoid most of this extra file activity by using one piece of code that tries to open every file on the disk and read one block.
To actually test seek times accurately you need something like driver-level code that commands the drive to seek all over the platter. Some disk vendors ship benchmarking or controller test code that does this; you have to run it as root against an unmounted disk. See if you can find such code for your disks.