iTALC promotes learning on a classroom network


 

Mon, 03 Mar 2008 19:00:00 GMT
iTALC, or Intelligent Teaching and Learning with Computers, is a didactic tool designed to assist teachers. Despite its name, the tool itself isn't a learning environment; it's meant to let teachers control their students' computers in a computer-driven classroom setting. Thanks to its powerful remote desktop control features, simple setup, and zero cost, it's a potential remote assistance tool for any type of network.


Source...

8 More Discussions You Might Find Interesting

1. IP Networking

I would like to monitor network traffic for a computer on my network

My son does homework on a school laptop. I was thinking about setting up a gateway on my home network, so that I can monitor web traffic and know if he is doing his homework without standing over his shoulder. Ideally I would like to use the Raspberry Pi Model B that I already have. However, I... (15 Replies)
Discussion started by: gandolf989
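A lightweight way to implement the gateway idea in the first discussion is to let the Raspberry Pi serve DNS with dnsmasq and read its query log: every domain a device looks up then appears there. The sketch below assumes dnsmasq's standard query-log line format, and the function name `list_domains` is invented for the example:

```shell
# Summarize which domains have been looked up, given a dnsmasq query log on
# stdin. Assumes the standard log line shape:
#   Mar  3 10:00:01 dnsmasq[910]: query[A] example.com from 192.168.1.20
list_domains() {
    awk '$5 ~ /^query\[/ { print $6 }' | sort -u
}
```

On the Pi this could be fed from the dnsmasq log file (the path varies by distribution), with dnsmasq started with its `log-queries` option enabled.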

2. UNIX for Dummies Questions & Answers

Setting up a home network for learning Linux

I am working on learning Linux, and somebody suggested setting up Linux on a separate computer, and searching for answers to whatever may be needed, on a different computer plugged in to the Internet. I have a Windows 7 PC, plugged in to a cable modem, and an old notebook, Compaq Presario R3000... (5 Replies)
Discussion started by: AdultFoundry

3. Red Hat

Network becomes slow and returns to normal speed only after a network restart

Hi, I have 2 machines in a production environment: 1. a Red Hat machine for the application, 2. a DB machine (Oracle). The application does a lot of small reads and writes to and from the DB machine. The problem is that after a few hours the network from the application to the DB becomes very slow and... (4 Replies)
Discussion started by: moshesa

4. Shell Programming and Scripting

I should not post classroom questions here

Hi, please help me. I have a file which contains 60 59 52 45 43 40 70 69 62. Which bash script can pass over every line, each time remembering the previous number, and give me the first match where the next number is greater than the previous one? In this example, 70. Please help, thank you. edit... (1 Reply)
Discussion started by: Hayko
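Reading the fourth question as "scan the numbers in order and print the first one that is greater than the number before it", a one-line awk filter does the job. The function name `first_rise` is invented here:

```shell
# Print the first number on stdin (one per line) that is greater than the
# number immediately before it, then stop reading.
first_rise() {
    awk 'NR > 1 && $1 > prev { print $1; exit } { prev = $1 }'
}

printf '%s\n' 60 59 52 45 43 40 70 69 62 | first_rise   # prints 70
```

On the poster's data the scan passes 60 59 52 45 43 40 (each smaller than its predecessor) and stops at 70, the first rise.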

5. Solaris

No network cable, but network interface is UP and RUNNING

I have one Netra 240. After changing the main board and system configuration card reader, the network is not accessible any more; the network interfaces are always UP and RUNNING even when no cable is connected to them. I tried to restart and plumb/unplumb with no luck. ifconfig -a... (7 Replies)
Discussion started by: samer.odeh

6. UNIX and Linux Applications

Access to network interface (Mac-network)

Hi, I'm an Italian student. For my thesis I am developing a gateway using the 6LoWPAN protocol. For that I must access the network interface to develop my own stack based on the 802.15.4 standard. Can you help me? I need an explanation of how to do that. (0 Replies)
Discussion started by: berny88

7. Solaris

Configure zones to have a different network interface and network

I need to configure a zone to use a different interface (bge2) than the global zone, connected to a completely different network switch, and to use its own defaultrouter and hosts file. Is it possible? If so, how? Thanks (9 Replies)
Discussion started by: skamal4u
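The seventh question (a zone with its own interface, router, and hosts file) maps to an exclusive-IP zone. A config sketch, assuming Solaris 10 8/07 or later zonecfg syntax; the zone name "myzone" is invented for the example:

```shell
# Config sketch (not a complete zone definition): give zone "myzone" exclusive
# use of bge2 so the zone plumbs the NIC itself and keeps its own IP stack.
zonecfg -z myzone <<'EOF'
set ip-type=exclusive
add net
set physical=bge2
end
verify
commit
EOF
# With ip-type=exclusive, the zone maintains its own routing (its own
# /etc/defaultrouter) and its own /etc/hosts, independent of the global zone.
```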

8. IP Networking

SSH server is reachable from the local network but not from another network

Hello, I have an Ubuntu SSH server that I can access from any of my computers, but only if they are on the same wireless network as the server. I tested this by tethering my Samsung Blackjack to my Windows partition and installing OpenSSH on Windows; it works when Windows is on the wireless but no... (1 Reply)
Discussion started by: old noob
VW(1)								   User Commands							     VW(1)

NAME
       vw - Vowpal Wabbit -- fast online learning tool

DESCRIPTION
       VW options:

       -h [ --help ]                    Look here: http://hunch.net/~vw/ and click on Tutorial.
       --active_learning                active learning mode
       --active_simulation              active learning simulation mode
       --active_mellowness arg (=8)     active learning mellowness parameter c_0. Default 8
       --adaptive                       use adaptive, individual learning rates.
       --exact_adaptive_norm            use a more expensive exact norm for adaptive learning rates.
       -a [ --audit ]                   print weights of features
       -b [ --bit_precision ] arg       number of bits in the feature table
       --bfgs                           use bfgs optimization
       -c [ --cache ]                   Use a cache. The default is <data>.cache
       --cache_file arg                 The location(s) of cache_file.
       --compressed                     use gzip format whenever possible. If a cache file is being
                                        created, this option creates a compressed cache file. A
                                        mixture of raw-text & compressed inputs are supported with
                                        autodetection.
       --conjugate_gradient             use conjugate gradient based optimization
       --nonormalize                    Do not normalize online updates
       --l1 arg (=0)                    l_1 lambda
       --l2 arg (=0)                    l_2 lambda
       -d [ --data ] arg                Example Set
       --daemon                         persistent daemon mode on port 26542
       --num_children arg (=10)         number of children for persistent daemon mode
       --pid_file arg                   Write pid file in persistent daemon mode
       --decay_learning_rate arg (=1)   Set Decay factor for learning_rate between passes
       --input_feature_regularizer arg  Per feature regularization input file
       -f [ --final_regressor ] arg     Final regressor
       --readable_model arg             Output human-readable final regressor
       --hash arg                       how to hash the features. Available options: strings, all
       --hessian_on                     use second derivative in line search
       --version                        Version information
       --ignore arg                     ignore namespaces beginning with character <arg>
       --initial_weight arg (=0)        Set all weights to an initial value of 1.
       -i [ --initial_regressor ] arg   Initial regressor(s)
       --initial_pass_length arg (=18446744073709551615)
                                        initial number of examples per pass
       --initial_t arg (=1)             initial t value
       --lda arg                        Run lda with <int> topics
       --lda_alpha arg (=0.100000001)   Prior on sparsity of per-document topic weights
       --lda_rho arg (=0.100000001)     Prior on sparsity of topic distributions
       --lda_D arg (=10000)             Number of documents
       --minibatch arg (=1)             Minibatch size, for LDA
       --span_server arg                Location of server for setting up spanning tree
       --min_prediction arg             Smallest prediction to output
       --max_prediction arg             Largest prediction to output
       --mem arg (=15)                  memory in bfgs
       --noconstant                     Don't add a constant feature
       --noop                           do no learning
       --output_feature_regularizer_binary arg
                                        Per feature regularization output file
       --output_feature_regularizer_text arg
                                        Per feature regularization output file, in text
       --port arg                       port to listen on
       --power_t arg (=0.5)             t power value
       -l [ --learning_rate ] arg (=10) Set Learning Rate
       --passes arg (=1)                Number of Training Passes
       --termination arg (=0.00100000005)
                                        Termination threshold
       -p [ --predictions ] arg         File to output predictions to
       -q [ --quadratic ] arg           Create and use quadratic features
       --quiet                          Don't output diagnostics
       --rank arg (=0)                  rank for matrix factorization.
       --random_weights arg             make initial weights random
       -r [ --raw_predictions ] arg     File to output unnormalized predictions to
       --save_per_pass                  Save the model after every pass over data
       --sendto arg                     send examples to <host>
       -t [ --testonly ]                Ignore label information and just test
       --loss_function arg (=squared)   Specify the loss function to be used, uses squared by
                                        default. Currently available ones are squared, classic,
                                        hinge, logistic and quantile.
       --quantile_tau arg (=0.5)        Parameter tau associated with Quantile loss. Defaults to 0.5
       --unique_id arg (=0)             unique id used for cluster parallel jobs
       --total arg (=1)                 total number of nodes used in cluster parallel job
       --node arg (=0)                  node number in cluster parallel job
       --sort_features                  turn this on to disregard order in which features have been
                                        defined. This will lead to smaller cache sizes
       --ngram arg                      Generate N grams
       --skips arg                      Generate skips in N grams. This in conjunction with the
                                        ngram tag can be used to generate generalized n-skip-k-gram.

vw 6.1                              June 2012                             VW(1)
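To connect the option list above to practice, here is a small sketch of vw's plain-text input format and a typical training invocation. The feature names and file names are invented for the example, and the vw command itself is shown commented since the tool may not be installed:

```shell
# Build a tiny training file in vw's native format:
#   <label> | <name>:<value> <name>:<value> ...
cat > train.vw <<'EOF'
1 | height:0.9 weight:0.7
-1 | height:0.2 weight:0.3
1 | height:0.8 weight:0.6
EOF
# A typical run using options from the list above: -d names the data,
# -c builds a cache for repeated passes, --passes 2 trains twice over the
# data, and -f saves the final regressor.
# vw -d train.vw -c --passes 2 -f model.vw
```

The pipe separates the label from a (here unnamed) feature namespace; see the tutorial linked under --help for the full format.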