10-18-2008
Actually, I was thinking even PHP was not necessary, but since that is my core expertise, I thought I'd cover where it would be useful. Perl is more regex-centric, so it seems to suffice for most large-dataset processing, but if anyone is kind enough to explain the power of Python, that would be great too!
7 More Discussions You Might Find Interesting
1. UNIX for Dummies Questions & Answers
I have a zipped file that is ~ 10GB. I tried tarring it off to a tape, but I receive:
tar: <filename> too large to archive. Use E function modifier.
The file is stored on a UFS mount, so I was unable to use ufsdump.
What other options do I have? (I don't have a local file system large... (3 Replies)
Discussion started by: FredSmith
2. Solaris
I've a few datasets in my zfs pool which have been exported to the non-global zones, and I want to copy the data on those datasets/file systems to my datasets in a new pool mounted on the global zone. How can I do that? (2 Replies)
Discussion started by: fugitive
3. Programming
Hi All,
I am just curious, not programming anything of my own. I know there are libraries like gmp which do all such things. But I really need to know HOW they do all such things, i.e., working with extremely large, unimaginable numbers which are beyond the integer limit. They can do add,... (1 Reply)
Discussion started by: shoaibjameel123
4. Shell Programming and Scripting
Hi All,
I have a very large single record file.
abc;date||bcd;efg|......... pqr;stu||record_count;date
when I do wc -l on this file it gives me "0" records, because of the missing line feed.
my problem is there is an extra pipe that is coming at the end of this record
like... (6 Replies)
Discussion started by: Gurkamal83
5. Solaris
Hi,
We have a file which is about 756 MB in size, and vi/vim do not work when we try to edit it. I'm looking for any editor (OK if it's NOT free) which has the ability to open/edit a file of 1+ GB seamlessly. The OS is SUN Solaris 10 (Sparc).
Thanks in Advance
Maverick (13 Replies)
Discussion started by: maverick_here
6. UNIX for Dummies Questions & Answers
Hi,
I need a command/script that shows who opened my dataset. Consider a situation where a user opened the dataset a few days back; the command/script should list his/her id.
I don't want auditing on my dataset; I only need the list of users who are using my dataset.
Thank you. (10 Replies)
Discussion started by: subbarao12
7. Programming
Hello, I wrote code that generates an image and writes its contents to a Targa file. Due to modifications that I wish to make, I decided to copy the value of each pixel as it is calculated to a dynamically allocated array before writing it to a file. The problem is that now all I see is a big... (2 Replies)
Discussion started by: colt
LEARN ABOUT PHP
fann_train_on_data
FANN_TRAIN_ON_DATA(3)
fann_train_on_data - Trains on an entire dataset for a period of time
SYNOPSIS
bool fann_train_on_data (resource $ann, resource $data, int $max_epochs, int $epochs_between_reports, float $desired_error)
DESCRIPTION
Trains on an entire dataset for a period of time.
This training uses the training algorithm chosen by fann_set_training_algorithm(3) and the parameters set for these training algorithms.
PARAMETERS
o $ann
- Neural network resource.
o $data
- Neural network training data resource.
o $max_epochs
- The maximum number of epochs the training should continue.
o $epochs_between_reports
- The number of epochs between calls to the callback function. A value of zero means that the user function is not called.
o $desired_error
- The desired fann_get_MSE(3) or fann_get_bit_fail(3), depending on the stop function chosen by fann_set_train_stop_function(3).
RETURN VALUES
Returns TRUE on success, or FALSE otherwise.
SEE ALSO
fann_train_on_file(3), fann_train_epoch(3), fann_get_bit_fail(3), fann_get_MSE(3), fann_set_train_stop_function(3), fann_set_training_algorithm(3), fann_set_callback(3).
PHP Documentation Group FANN_TRAIN_ON_DATA(3)
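A minimal sketch of how fann_train_on_data(3) fits into a training run, assuming the PECL FANN extension is installed; the network shape and the training file name "xor.data" are illustrative assumptions, not from the man page:

```php
<?php
// Sketch only: requires the PECL FANN extension (pecl install fann).
if (!extension_loaded('fann')) {
    exit("FANN extension not loaded\n");
}

// A small fully connected network: 3 layers (2 inputs, 3 hidden, 1 output).
$ann = fann_create_standard(3, 2, 3, 1);
fann_set_activation_function_hidden($ann, FANN_SIGMOID_SYMMETRIC);
fann_set_activation_function_output($ann, FANN_SIGMOID_SYMMETRIC);

// "xor.data" is a hypothetical training file in FANN's text data format.
$data = fann_read_train_from_file("xor.data");

// Train for at most 500000 epochs, reporting every 1000 epochs,
// and stop early once the MSE drops below the desired error of 0.001.
if (fann_train_on_data($ann, $data, 500000, 1000, 0.001)) {
    fann_save($ann, "xor_float.net");
}

fann_destroy_train($data);
fann_destroy($ann);
```

fann_train_on_data($ann, $data, ...) returns TRUE on success, so the save is guarded on it; the training algorithm itself is whatever was last set with fann_set_training_algorithm(3).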