Content Management System for uploading large files


 
# 1  
Old 02-22-2009

Hi everybody,

I am currently trying to develop a simple content management system: an internal website where my users can upload large files onto the server. The site is password protected and my users won't be trying to hack into the system, so security is a non-factor (at least for this post).

At the moment I have a PHP/HTML page that allows users to upload files. As these files can get quite large, up to 3 GB in size (the transfer rate is ~4 MB a second), I was wondering what the best approach would be. Ideally I would like a progress bar to show the status of the transfer. I don't know of too many options, but as my users are laboratory staff with minimal computer skills, I would like a solution that is web based.
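For illustration, a minimal sketch of that kind of upload form and handler might look like the following (the "datafile" field name and the /srv/uploads destination are placeholders, not details from this thread):

<!-- upload.html: form that POSTs the file to the handler below -->
<form action="upload.php" method="post" enctype="multipart/form-data">
    <input type="file" name="datafile">
    <input type="submit" value="Upload">
</form>

<?php
// upload.php: receives the POSTed file and moves it to its final location.
// /srv/uploads is an assumed destination directory; adjust to the real path.
$targetDir = '/srv/uploads';

if (isset($_FILES['datafile']) && $_FILES['datafile']['error'] === UPLOAD_ERR_OK) {
    $name = basename($_FILES['datafile']['name']);
    // move_uploaded_file() moves PHP's temporary upload file into place
    if (move_uploaded_file($_FILES['datafile']['tmp_name'], "$targetDir/$name")) {
        echo "Upload of " . htmlspecialchars($name) . " complete.";
    } else {
        echo "Could not move the uploaded file.";
    }
} else {
    echo "Upload failed or no file was submitted.";
}
?>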

Thanks for your time,

Dave
# 2  
Old 03-05-2009
3 GB? Whew. Whatever system you choose, whether it be MediaWiki, Tiki, or one of the thousands of others, you'll need to modify the HTTP server and PHP configuration to allow such large uploads. For Apache, this means making sure you have this in your virtual host/global config:

LimitRequestBody 0

For PHP, see this page
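For reference, the php.ini directives that usually need raising for multi-gigabyte uploads are roughly these (the values are only illustrative, and on 32-bit PHP builds settings above 2 GB may not be honored):

; php.ini -- illustrative settings for ~3 GB uploads
upload_max_filesize = 3100M   ; largest single file PHP will accept
post_max_size       = 3200M   ; must be at least as large as upload_max_filesize
max_input_time      = 3600    ; seconds allowed to receive and parse the request
max_execution_time  = 3600    ; seconds the handler script itself may run
memory_limit        = 256M    ; uploads are spooled to a temp file, not held in RAM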
# 3  
Old 03-06-2009
In one of our systems we sometimes upload huge data files. We use a Java program written by someone here; it runs for a few hours to finish the job.
# 4  
Old 03-20-2009
Hey otheus and Yogest,

Thanks for the replies. Sorry for the delayed response, but I didn't get an email notification that my thread had been answered.

I shall have a look at the Apache and PHP settings when I get around to doing this.

Since I only know Perl, I will do some research to see whether I can write a script for this purpose.

Thanks again.

Dave