Quote:
Originally Posted by
Annihilannic
What OS revision (cat /etc/release)
qmmm1#cat /etc/release
Solaris 9 s9_58shwpl3 SPARC
Copyright 2002 Sun Microsystems, Inc. All Rights Reserved.
Use is subject to license terms.
Assembled 15 April 2002
Quote:
Originally Posted by
Annihilannic
and architecture (uname -a) are we dealing with here?
qmmm1#uname -a
SunOS qmmm1 5.9 Generic_112233-04 sun4u sparc SUNW,Sun-Blade-100
Quote:
Originally Posted by
Annihilannic
What type of filesystem is /users (mount -v | grep users)? If it is NFS mounted or automounted, have you checked the behaviour on the server where it really resides?
mount -v | grep users gives nothing; mount -v doesn't display any details about /users. df doesn't show a /users filesystem either, though it does show / which is (/dev/dsk/c0t0d0s0).
Here is the output from
mount -v:
qmmm1#mount -v
/dev/dsk/c0t0d0s0 on / type ufs read/write/setuid/intr/largefiles/logging/xattr/onerror=panic/dev=2200000 on Tue Oct 28 07:07:07 2008
/dev/dsk/c0t0d0s5 on /usr type ufs read/write/setuid/intr/largefiles/logging/xattr/onerror=panic/dev=2200005 on Tue Oct 28 07:07:08 2008
/proc on /proc type proc read/write/setuid/dev=3b40000 on Tue Oct 28 07:07:05 2008
mnttab on /etc/mnttab type mntfs read/write/setuid/dev=3c00000 on Tue Oct 28 07:07:05 2008
fd on /dev/fd type fd read/write/setuid/dev=3c40000 on Tue Oct 28 07:07:09 2008
/dev/dsk/c0t0d0s3 on /var type ufs read/write/setuid/intr/largefiles/logging/xattr/onerror=panic/dev=2200003 on Tue Oct 28 07:07:17 2008
swap on /var/run type tmpfs read/write/setuid/xattr/dev=1 on Tue Oct 28 07:07:17 2008
/dev/dsk/c0t0d0s4 on /opt type ufs read/write/setuid/intr/largefiles/logging/xattr/onerror=panic/dev=2200004 on Tue Oct 28 07:07:20 2008
swap on /tmp type tmpfs read/write/setuid/xattr/dev=2 on Tue Oct 28 07:07:20 2008
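Since /users doesn't appear in the mount -v output, one way to find out which filesystem actually holds it is to run df on the path itself: df reports the filesystem containing the path even when the path is not a separate mount point. A quick sketch (substitute /users, or your $HOME under it, for /tmp):

```shell
# df on a path reports the filesystem that contains that path,
# even when the path itself is not its own mount point.
# /tmp stands in here for /users or $HOME on the affected box.
df -k /tmp
```

On this system that should show /users living on one of the ufs slices above (most likely /), which would at least rule out NFS/automount as the cause.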
Also: I have since been able to create other files larger than 1024 bytes. I did this by creating short dummy files, then cat'ing them with a wildcard and redirecting the output to another dummy file (which then contained the contents of all the others). I had to repeat this iteratively to build up the size of the largest dummy file. I know this doesn't help much; I was trying everything I could think of, so I can't really reproduce exactly what I did. I am now at the stage where I seem to have hit a hard limit on my $HOME folder, as I have filled it with dummy files that I can later bastardize for my needs, as I did with .bash_history. This problem is weird, but I've got a strong feeling I came across it years ago... I just can't remember the cause!
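For what it's worth, the iterative trick described above looks roughly like this (a sketch with made-up filenames under /tmp, not the exact commands I ran):

```shell
# Grow a file past a size limit by concatenating dummy files.
mkdir -p /tmp/dummy_demo && cd /tmp/dummy_demo

# Seed a few small dummy files (300 bytes each, no newline).
for i in 1 2 3 4; do
    printf 'x%.0s' $(seq 1 300) > part$i.txt
done

# First pass: cat the parts together into a bigger file.
cat part*.txt > big.txt            # 4 x 300 = 1200 bytes

# Next pass: feed the big file back in to grow it further.
# Note the output name does NOT match the input glob, otherwise
# cat would read its own output and loop.
cat part*.txt big.txt > big2.txt   # 1200 + 1200 = 2400 bytes

wc -c big2.txt
```

The important detail is keeping the redirection target out of the wildcard's reach; repeating the last step doubles-plus the file each time.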