Problem transferring files bigger than 1GB


 
# 1  
Old 01-04-2009

Hi folks,

I have a big problem and need help from your experience/knowledge.

I previously installed FreeBSD 7.0-RELEASE on my storage/backup file server, and for some reason I cannot transfer any file bigger than 1GB to it. Whenever I transfer such a file to the FreeBSD file server, the system hits a kernel panic and reboots itself. So I waited, and two days ago I finally tried the new FreeBSD 7.1-RELEASE and tested it by transferring a file larger than 1GB; this time the system just froze, so I had to reboot it.

Does anyone know what is going on? Is there a size limit on large file transfers? What can I do to resolve this? FreeBSD is my file and backup server, and if I cannot transfer large files I guess I will have to change to Ubuntu and hope it does not have the same problem.
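
In case it helps, this is the kind of thing that triggers it; the hostname and paths here are just examples, not my real setup:

    # make a roughly 1.5GB test file, then copy it to the FreeBSD box
    dd if=/dev/zero of=/tmp/testfile bs=1m count=1500
    scp /tmp/testfile user@freebsd-box:/storage/testfile

Files under 1GB go through without any trouble.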

My system info:
P4 640 3.2 GHz, 2GB memory, 80GB HD for FreeBSD, 640GB storage.

One more thing: with all this rebooting, the hard drive is starting to develop a few bad sectors now too.

PLEASE help me,

Thanks, folks!
# 2  
Old 01-04-2009
This sounds more like a hardware problem than a software one. When transferring the file you probably hit a bad sector, and BSD (for some reason) panics. I have to admit it's pretty unusual for it to panic at that, but it's possible. How old is your hard disk, and what's the output of fsck?
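
A low-risk way to check is to boot single-user (or unmount the filesystem) and run fsck in no-write mode; if you have the sysutils/smartmontools port installed, smartctl is worth a look as well. The device names below are only examples, take yours from the output of mount:

    # read-only filesystem check, won't modify anything on disk
    fsck -n /dev/ad4s1d

    # SMART health report for the drive itself
    smartctl -a /dev/ad4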
# 3  
Old 01-04-2009
pludi, thanks for your reply.

I have a Western Digital 640GB drive, about 6 months old; it is an OEM unit. I ran a surface test on it a couple of times when I first got it and it seemed fine. But since I keep encountering this kernel panic problem, with the system rebooting itself under the 7.0 release, both of my hard drives now seem to have some corrupted spots on them.

I had no problem transferring files for the first couple of months of using the drive; the trouble only started recently, within the last 3 months at most. So I also think it is the hard drive, but before I go purchase a new one I just wanted to ask around and see if anyone has had the same issue and what they did to resolve it. Money is tight these days.
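
Before I spend money on a new one, I figure a full read pass is a cheap way to confirm the bad sectors, since read errors should show up in the kernel log. The device name below is just an example; I would substitute the real one:

    # read the whole disk once, keep going past any bad spots
    dd if=/dev/ad4 of=/dev/null bs=1m conv=noerror

    # then check for ATA read errors logged against the drive
    grep ad4 /var/log/messages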
 