Java - is it worth learning?

Post 41412 by frustrated1 on Monday 6th of October 2003 06:58:12 AM

I have the opportunity of learning basic/intermediate Java for 600 Euro. Is this worth learning?
 

8 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Is learning Unix worth it?

Hello. I am a comp sci major and am forced to take an intro to Unix class. So far I am loving it. I was wondering: is it useful to learn more on my own? Will it have any use for me when I get a job after school is done? Does the same apply to Perl, sed and awk? (5 Replies)
Discussion started by: smiledk1

2. UNIX for Dummies Questions & Answers

Is Unix Worth it?

I have been wanting to get much deeper into the world of computers for quite some time. I know a lot of C++ and plenty of website programming, and decided that the next step should be Unix. But here's the thing - I know nothing about Unix. I installed it and everything, but it just seemed like... (3 Replies)
Discussion started by: GuyWithAPen

3. Shell Programming and Scripting

How do i get only last 5 minute worth of data

I have a text file called 'tomcat_temp_out'. I want to get only the last 5 minutes' worth of data from this file and redirect that data into another file. Could you please help me work on this? (2 Replies)
Discussion started by: shivanete
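A rough sketch of one answer, assuming each line of tomcat_temp_out begins with a sortable "YYYY-MM-DD HH:MM:SS" timestamp (the real log format would need checking) and that GNU date is available:

    # cutoff timestamp for "5 minutes ago" (GNU date syntax assumed)
    cutoff=$(date -d '5 minutes ago' '+%Y-%m-%d %H:%M:%S')
    # keep only lines whose leading 19-character timestamp sorts at or after the cutoff
    awk -v cutoff="$cutoff" 'substr($0, 1, 19) >= cutoff' tomcat_temp_out > tomcat_last5min

Because ISO-style timestamps sort lexically, a plain string comparison in awk is enough; other timestamp formats would have to be converted to a comparable form first.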

4. UNIX for Dummies Questions & Answers

What is this system worth?

So my family is cleaning out our house and my dad stumbled on a Unix system with Unigraphix installed on it, and he remembers that it was 1 of 6 computers in a set that he used at a tool and die machine shop where he worked. He said that the computer by itself with the monitor was $20,000! I was... (8 Replies)
Discussion started by: NVOtosReborn

5. What is on Your Mind?

Is M.Sc (FOSS) worth doing?

Recently, while reading a Linux magazine, I understood that FOSS (free or open source software) is gaining momentum. And in my home town there is a reputed university which offers an online M.Sc program on FOSS. The course covers: INTRODUCTION TO COMPUTING, PHILOSOPHY AND PRACTICE OF FOSS,... (4 Replies)
Discussion started by: Arun_Linux

6. Linux

Are /home partitions worth it?

I'm new to the Linux world and whilst I've been learning the ropes, I've read some conflicting opinions regarding the creation of separate partitions for /home and other directories during OS install. Some say that having these directories in separate partitions allows you to reinstall without... (12 Replies)
Discussion started by: maerlyngb
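For illustration, a minimal sketch of the separate-/home approach (run as root; the device name /dev/sdb1 is hypothetical, and in practice a UUID in fstab is safer than a raw device name):

    mkfs.ext4 /dev/sdb1                  # put a filesystem on the spare partition
    mount /dev/sdb1 /mnt                 # mount it temporarily
    rsync -a /home/ /mnt/                # copy home data, preserving ownership and permissions
    umount /mnt
    echo '/dev/sdb1  /home  ext4  defaults  0  2' >> /etc/fstab
    mount /home                          # /home now lives on its own partition

With this layout, a later reinstall can reformat the root partition and simply remount the same /home partition, leaving user data untouched.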

7. Programming

Is C worth the effort?

Hello guys, I have a little question. I am thinking about learning C or C++ because I'm very interested in low-level programming. And because I love Unix too, I thought C would be the better choice, since most of it is done in C. Or should I learn C++, because C++ has all these nice features like OOP and... (9 Replies)
Discussion started by: dryPants

8. What is on Your Mind?

Are certifications worth it?

I have just been on a Red Hat SA 3 training course (4 days) and sat the EX200 (RHCSA) and EX300 (RHCE) exams. The daft thing was that politics meant I wasn't allowed to take courses SA 1 or 2, so I learnt about stuff I would never use (SELinux; iSCSI; NFS Kerberos encrypted with user-specific access... (22 Replies)
Discussion started by: rbatte1
VW(1)								   User Commands							     VW(1)

NAME
       vw - Vowpal Wabbit -- fast online learning tool

DESCRIPTION
       VW options:

       -h [ --help ]                          Look here: http://hunch.net/~vw/ and click on Tutorial.
       --active_learning                      active learning mode
       --active_simulation                    active learning simulation mode
       --active_mellowness arg (=8)           active learning mellowness parameter c_0. Default 8
       --adaptive                             use adaptive, individual learning rates.
       --exact_adaptive_norm                  use a more expensive exact norm for adaptive learning rates.
       -a [ --audit ]                         print weights of features
       -b [ --bit_precision ] arg             number of bits in the feature table
       --bfgs                                 use bfgs optimization
       -c [ --cache ]                         Use a cache. The default is <data>.cache
       --cache_file arg                       The location(s) of cache_file.
       --compressed                           use gzip format whenever possible. If a cache file is being created,
                                              this option creates a compressed cache file. A mixture of raw-text
                                              and compressed inputs is supported with autodetection.
       --conjugate_gradient                   use conjugate gradient based optimization
       --nonormalize                          Do not normalize online updates
       --l1 arg (=0)                          l_1 lambda
       --l2 arg (=0)                          l_2 lambda
       -d [ --data ] arg                      Example set
       --daemon                               persistent daemon mode on port 26542
       --num_children arg (=10)               number of children for persistent daemon mode
       --pid_file arg                         Write pid file in persistent daemon mode
       --decay_learning_rate arg (=1)         Set decay factor for learning_rate between passes
       --input_feature_regularizer arg        Per-feature regularization input file
       -f [ --final_regressor ] arg           Final regressor
       --readable_model arg                   Output human-readable final regressor
       --hash arg                             how to hash the features. Available options: strings, all
       --hessian_on                           use second derivative in line search
       --version                              Version information
       --ignore arg                           ignore namespaces beginning with character <arg>
       --initial_weight arg (=0)              Set all weights to an initial value of <arg>.
       -i [ --initial_regressor ] arg         Initial regressor(s)
       --initial_pass_length arg (=18446744073709551615)
                                              initial number of examples per pass
       --initial_t arg (=1)                   initial t value
       --lda arg                              Run lda with <int> topics
       --lda_alpha arg (=0.100000001)         Prior on sparsity of per-document topic weights
       --lda_rho arg (=0.100000001)           Prior on sparsity of topic distributions
       --lda_D arg (=10000)                   Number of documents
       --minibatch arg (=1)                   Minibatch size, for LDA
       --span_server arg                      Location of server for setting up spanning tree
       --min_prediction arg                   Smallest prediction to output
       --max_prediction arg                   Largest prediction to output
       --mem arg (=15)                        memory in bfgs
       --noconstant                           Don't add a constant feature
       --noop                                 do no learning
       --output_feature_regularizer_binary arg
                                              Per-feature regularization output file
       --output_feature_regularizer_text arg  Per-feature regularization output file, in text
       --port arg                             port to listen on
       --power_t arg (=0.5)                   t power value
       -l [ --learning_rate ] arg (=10)       Set learning rate
       --passes arg (=1)                      Number of training passes
       --termination arg (=0.00100000005)     Termination threshold
       -p [ --predictions ] arg               File to output predictions to
       -q [ --quadratic ] arg                 Create and use quadratic features
       --quiet                                Don't output diagnostics
       --rank arg (=0)                        rank for matrix factorization
       --random_weights arg                   make initial weights random
       -r [ --raw_predictions ] arg           File to output unnormalized predictions to
       --save_per_pass                        Save the model after every pass over data
       --sendto arg                           send examples to <host>
       -t [ --testonly ]                      Ignore label information and just test
       --loss_function arg (=squared)         Specify the loss function to be used; squared by default. Currently
                                              available ones are squared, classic, hinge, logistic and quantile.
       --quantile_tau arg (=0.5)              Parameter tau associated with quantile loss. Defaults to 0.5
       --unique_id arg (=0)                   unique id used for cluster parallel jobs
       --total arg (=1)                       total number of nodes used in cluster parallel job
       --node arg (=0)                        node number in cluster parallel job
       --sort_features                        turn this on to disregard the order in which features have been
                                              defined. This will lead to smaller cache sizes
       --ngram arg                            Generate N-grams
       --skips arg                            Generate skips in N-grams. This, in conjunction with the ngram tag,
                                              can be used to generate generalized n-skip-k-grams.

vw 6.1                                                      June 2012                                                      VW(1)
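To make the option list concrete, here is a hypothetical train-and-test run that uses only flags documented above (train.dat, test.dat, model.vw and predictions.txt are made-up file names):

    # train for 5 passes with logistic loss; -c creates a cache,
    # which vw is assumed to need when running multiple passes
    vw -d train.dat -c --passes 5 --loss_function logistic -f model.vw
    # test only: load the saved regressor and write predictions
    vw -t -i model.vw -d test.dat -p predictions.txt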