Problem with creating big files


 
# 1  
Old 11-06-2007
Problem with creating big files

Hi...
I have a very weird problem with my RedHat 4 Update 4 server...
Every time I create a file bigger than my physical memory, the server kills the process/session that creates the file, and in the "messages" file I see this error:
"Oct 21 15:22:22 optidev kernel: Out of Memory: Killed process 14300 (sshd)."
The problem occurs only when I'm creating the file on an NFS file system; on a local file system it's fine...

Any thoughts?

Thanks, and sorry for my poor English.
Eliraz.
# 2  
Old 11-06-2007
1. What program are you using to create the file?

2. Does the NFS server impose limits on user file space?
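
Two quick generic checks on the client, if you want to rule those out (standard commands, nothing specific to your setup):

ulimit -f      # per-process file size limit in this shell ("unlimited" means no cap)
quota -v       # user quotas, including on NFS mounts (needs rquotad running on the server)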
# 3  
Old 11-06-2007
Also check whether the file system exported from the NFS server can handle large files. You can do that by checking the mount like this:

mount | grep "/path_to_filesystem"

This will show the mount options. If "largefiles" isn't among them, there's your problem.
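
For example, a mount line with the option present might look like this (a made-up line, just to show where the option would appear):

server:/export on /mnt/data type vxfs (rw,largefiles)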
# 4  
Old 11-06-2007
Quote:
Originally Posted by porter
1. What program are you using to create the file?

2. Does the NFS server impose limits on user file space?
1. Basically I'm trying to create the file using sqlplus (it's an Oracle data file), but I have also tried "dd" and it happened again...

2. The NFS server does not impose any limits and the file system is 100 GB; also, the process killing comes from the OS, not from the NFS server...
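
For reference, the dd test was something like this (path and sizes are examples; the box has 4 GB of RAM):

# write a 5 GB file to the NFS mount, i.e. bigger than physical memory
dd if=/dev/zero of=/db/optidev/testfile bs=1M count=5120

# in a second session, watch memory while it runs
watch -n 1 'free -m; grep -i dirty /proc/meminfo'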
# 5  
Old 11-06-2007
Quote:
Originally Posted by blowtorch
Also check whether the file system exported from the NFS server can handle large files. You can do that by checking the mount like this:

mount | grep "/path_to_filesystem"

This will show the mount options. If "largefiles" isn't among them, there's your problem.
This is what I get:
nfserver4:/fs_db_optidev on /db/optidev type NFS (rw,retry=1000,addr=10.57.8.189)

but "largefiles" is an option that handle files larger than 2GB... my file is reaching 4 GB (the size of physical memory) and then crashes, so i don't think this is the problem...
# 6  
Old 11-06-2007
You need to increase the swap size on the NFS server.

Reason: NFS copies files from the client into RAM and then to the file system on the server.

Now if the RAM is out of space and the swap cannot store any more either, killing the process that is trying to write is the usual behavior.
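
To check, and if needed grow, swap on a Linux server (a generic sketch; the size is an example):

swapon -s                                       # list active swap devices
free -m                                         # current RAM and swap usage

# add a 2 GB swap file
dd if=/dev/zero of=/swapfile bs=1M count=2048
mkswap /swapfile
swapon /swapfile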

~Sage
# 7  
Old 11-06-2007
Quote:
Originally Posted by s4g3
You need to increase the swap size on the NFS server.

Reason: NFS copies files from the client into RAM and then to the file system on the server.

Now if the RAM is out of space and the swap cannot store any more either, killing the process that is trying to write is the usual behavior.

~Sage
The NFS server is an EMC Celerra... I don't think that's the cause... this machine is built for serving large files over NFS...
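
If buffering like that is happening anywhere, it would be in the client's page cache; one way to watch it while the file is being written (standard 2.6 kernel interfaces, just a diagnostic sketch):

grep -i dirty /proc/meminfo                         # written-but-unflushed data currently cached
sysctl vm.dirty_ratio vm.dirty_background_ratio     # thresholds that trigger writeback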