Operating Systems > SCO: Unable to dump due to limited space?
Post 100398 by Mac Tire on Monday 27th of February 2006 09:18:40 AM
Thanks RTM. The version is SCO Open Server Release 5.
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Unable to catch the output after core dump and bus error

I have a weird situation in which the binary dumps core and gives a bus error. But before dumping core and throwing the bus error, it gives some output; unfortunately I can't grep the output before the core dump. db2bfd -b test.bnd maxSect 15 Bus Error (core dumped) But if I do ... (4 Replies)
Discussion started by: rakeshou
4 Replies
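A side note (not from the thread itself): one minimal way to keep whatever the binary prints before it crashes is to redirect both streams to a file and grep that file afterwards; db2bfd and test.bnd are just the poster's example, and db2bfd.out is a made-up name.
$ db2bfd -b test.bnd > db2bfd.out 2>&1   # capture stdout and stderr in a file, even if the run ends in a core dump
$ grep maxSect db2bfd.out                # search the saved output after the crash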

2. HP-UX

HPVM Unable to create more guests due to lack of RAM

Hi All, There are a few threads regarding this subject of being unable to create more guests due to lack of RAM, so I am aware how the sum works: add 8.5% to whatever is allocated, be that the host or guest. But I'm not sure if I have a hardware issue with memory or whether what I see is correct, as I am... (3 Replies)
Discussion started by: EricF
3 Replies

3. Solaris

Solaris file system unable to use available space

Hi, The Solaris file system /u01 shows available space as 100 GB and used space as 6 GB. The problem is that when I am trying to install some software or copy some files into this file system /u01, I am unable to copy or install due to lack of space. Of course the software... (31 Replies)
Discussion started by: iris1
31 Replies
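A hedged first check for "space shown free but writes still fail" is inode exhaustion rather than block exhaustion; on Solaris the two views can be compared with df (the /u01 mount point is the poster's):
$ df -k /u01     # block usage for the file system
$ df -o i /u01   # inode usage (UFS); a full inode table also produces "no space" errors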

4. Red Hat

Unable to free space due to inode in use by database

Hi, I am having a similar issue: the filesystem shows 100% even after deleting the files. I understood the issue after going through this chain, but I cannot restart the processes since they belong to an Oracle database. Is there a way, such as mounting the filesystem with specific options, to avoid this issue? How... (0 Replies)
Discussion started by: prashant185
0 Replies
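The usual cause is that the database still holds the deleted files open, so their blocks are not released until the descriptors are closed. A minimal check on Linux, assuming lsof is installed:
$ lsof +L1   # list open files with link count 0, i.e. deleted but still held open
# once the owning process closes (or truncates) such a file, the space is returned without a full restart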

5. OS X (Apple)

Compiling fails due to space in path to home folder

I seem to have issues compiling software and I think I've narrowed it down to something having to do with having a space in the path name to my Home folder (which contains "Macintosh HD"). The reason I think this is shown here: $ echo $HOME /Volumes/Macintosh HD/Users/Tom $ cd $HOME -sh:... (7 Replies)
Discussion started by: tdgrant1
7 Replies
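Independent of the compiler itself, the shell-level workaround is to quote the variable so the space in "Macintosh HD" is not treated as a word break; the configure line below is only a hypothetical illustration:
$ cd "$HOME"                           # quoted: the space stays part of the path
$ ./configure --prefix="$HOME/local"   # quote $HOME anywhere it is expanded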

6. Shell Programming and Scripting

Unable to read the first space of a record in while loop

I have a loop like while read i do echo "$i" . . . done < tms.txt The tms.txt contains data like 2008-02-03 00:00:00 <space>00:00:00 . . . 2010-02-03 10:54:32 (2 Replies)
Discussion started by: machomaddy
2 Replies
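A sketch of the usual fix: clear IFS for the read so leading whitespace is preserved, and add -r so backslashes pass through unchanged (tms.txt as in the post):
while IFS= read -r i
do
    echo "$i"    # leading spaces in each record are kept
done < tms.txt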

7. Red Hat

Unable to copy files due to many files in directory

I have a directory with some billions of files inside. I tried to copy some files for a specific date, but it would not respond for a long time and did not give any result. I tried everything with the find command and also with xargs; even this command find . -mtime -2 -print | xargs ls -d did not... (2 Replies)
Discussion started by: before4
2 Replies
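With that many entries, anything that expands or sorts the whole directory tends to stall; one hedged approach is to let find select by date and copy match by match, avoiding a huge argument list (the destination /target/dir is hypothetical):
$ find . -maxdepth 1 -type f -mtime -2 -exec cp -p {} /target/dir/ \;   # copies one matched file per exec, no sorting or globbing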

8. HP-UX

Unable to create a tar file due to link

Hi, I am trying to tar a directory structure but am unable to do so due to a symbolic link. Please help. indomt@behpux $ tar -cvf test.tar /home/indomt a /home/indomt symbolic link to /dxdv/03/ap1dm1 Thanks (1 Reply)
Discussion started by: nag_sathi
1 Replies
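Here tar archives the symbolic link itself instead of what it points to; a minimal sketch of two ways around that on HP-UX (paths as in the post):
$ tar -cvhf test.tar /home/indomt     # h: follow symbolic links and archive the files they point to
$ tar -cvf test.tar /dxdv/03/ap1dm1   # or archive the link target directly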

9. HP-UX

Unable to get full FS space after mounting

Hi, I am unable to get the full FS space: /home is 100% utilized, and after deleting unwanted files it is still 100%. After checking the du -sk * | sort -n output and converting it to MB, the total size comes out to be only 351 MB, whereas the lvol is 3 GB. I don't know where all the space... (2 Replies)
Discussion started by: Kits
2 Replies
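When du under the mount point accounts for far less than the file system reports, the usual suspects are deleted-but-still-open files, or data written to /home before the lvol was mounted over it; a hedged way to check the first on HP-UX:
$ fuser -cu /home   # list processes (with owners) that have files open on the /home file system
# a process holding a large deleted file keeps the space allocated until it is stopped or restarted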

10. Ubuntu

Unable to add space

My / (root) directory has file system /dev/sda1 with 19G of space. I want to add some more space to the /home directory but am unable to do it; when running the command below I get the following message: $ sudo mkfs -t ext4 /dev/sda2 mke2fs 1.42.9 (4-Feb-2014) mkfs.ext4: inode_size (128) * inodes_count (0) too... (4 Replies)
Discussion started by: megh
4 Replies
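The mke2fs complaint about inodes_count (0) usually means /dev/sda2 has no usable size, i.e. the partition does not exist yet or the kernel has not re-read the partition table; a hedged order of operations:
$ lsblk /dev/sda                # confirm sda2 exists and check its size
$ sudo fdisk /dev/sda           # if it does not, create the partition first (then partprobe or reboot)
$ sudo mkfs -t ext4 /dev/sda2   # only then build the ext4 file system and mount it for /home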
bdf(1M)

NAME
       bdf - report number of free disk blocks (Berkeley version)

SYNOPSIS
       bdf [-b] [-i] [-l] [-s] [-t type] [filesystem|file] ...

DESCRIPTION
       The bdf command displays the amount of free disk space available either
       on the specified filesystem or on the file system in which the
       specified file is contained.  If no file system is specified, the free
       space on all of the normally mounted file systems is printed.  The
       reported numbers are in kilobytes.

   Options
       bdf recognizes the following options:

       -b       Display information regarding file system swapping.

       -i       Report the number of used and free inodes.

       -l       Display information for local file systems only (for example,
                HFS and CDFS file systems).

       -s       Do not sync the file system data on the disk before reporting
                the usage.  Note that the data reported may not be up to date.

       -t type  Report on the file systems of a given type.

RETURN VALUE
       bdf returns 0 on success (able to get status on all file systems), or
       returns 1 on failure (unable to get status on one or more file
       systems).

WARNINGS
       If file system names are too long, the output for a given entry is
       displayed on two lines.

       The bdf command does not account for any disk space reserved for swap
       space, or used for the HFS boot block (8 KB, 1 per file system), HFS
       superblocks (8 KB each, 1 per disk cylinder), HFS cylinder group blocks
       (1 KB - 8 KB each, 1 per cylinder group), and inodes (currently 128
       bytes reserved for each inode).  Non-HFS file systems may have other
       items not accounted for by this command.

AUTHOR
       bdf was developed by the University of California, Berkeley.

FILES
       /etc/fstab     Static information about the file systems.
       /etc/mnttab    Mounted file system table.
       /dev/dsk/*     File system devices.

SEE ALSO
       df(1M), fstab(4), mnttab(4).
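For quick reference, a few illustrative invocations built from the synopsis and options above (the type argument is whatever file system types the machine actually supports):
$ bdf            # free space, in kilobytes, on all normally mounted file systems
$ bdf -i /home   # include used and free inode counts for /home's file system
$ bdf -l         # restrict the report to local file systems
$ bdf -t nfs     # report only file systems of the named type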