Help in job submission!


 
# 1  
Old 05-27-2009

I am attempting to submit a lot of jobs to Condor, which farms them out across many computers, putting a single job on each. I'm trying to use the same program (AutoDock) to analyze different input files (.dpf files), where each input file has a different name but the same extension. The submit file points to an executable, but I cannot figure out how to get it to recognize the different .dpf input files and send each one out as its own job. Does this make sense? The command the submit file runs is:

autodock4 -p name.dpf -l name.dlg

I've tried a few different things but I cannot get this to work. Your help is appreciated. Thanks.
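
One common way to handle this (a sketch, not from the thread itself: the autodock4 path and the vanilla universe are assumptions, and you may need file-transfer settings if your pool has no shared filesystem) is to generate a submit description with one arguments/queue pair per .dpf file:

#!/bin/sh
# Sketch: write a Condor submit file with one job per .dpf in the current
# directory, then hand it to condor_submit. Adjust the executable path for your pool.
sub=autodock.sub

cat > "$sub" <<'EOF'
universe   = vanilla
executable = /usr/local/bin/autodock4
log        = autodock.log
EOF

for f in *.dpf; do
    base=${f%.dpf}
    cat >> "$sub" <<EOF
arguments  = -p $base.dpf -l $base.dlg
output     = $base.out
error      = $base.err
queue
EOF
done

condor_submit "$sub"

Each queue statement submits one job using the arguments line above it, so every .dpf file gets its own AutoDock run and its own .dlg output file.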
 

4 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Multiple variables using awk and for loop for web form submission

Hi, my goal is to fill an HTML form and submit it. What I have managed to do: 1. a curl command to fill up the form and submit; 2. a file which has the input. curl command: curl -v -b cookie.txt -d __CSRFToken__=dc23d5da47953b3b390ec68d972af10380908b14 -d do=create -d a=open -d...
Discussion started by: zorrox (10 Replies)

2. Shell Programming and Scripting

Request for help with SGE submission script manipulation

Hi, I have the following SGE submission (HPC calculation) script, which is just a Bash script:
#!/bin/bash -l
#$ -S /bin/bash
#$ -l h_rt=1:00:0
#$ -l mem=4G
#$ -N XXX
#$ -pe qlc 24
#$ -P XXX
#$ -wd /home/uccaxxx/Scratch/222PdT/3vac/c0001/
mpirun -m $TMPDIR/machines -np $NSLOTS...
Discussion started by: crunchgargoyle (2 Replies)

3. Shell Programming and Scripting

Script to Start a Job after finding the Old job completed

Hi experts, I need advice on a script to schedule 12 jobs (SAS code executed in the background). Algorithm: 1. The script checks the first job. 2. Finds the first job is done; invokes the second job. 3. Finds the second job is done; invokes the third job. .. Request you to please assist. (3 Replies)
Discussion started by: Jerald Nathan (see the sketch after this list)

4. Solaris

killing a unix job after the job process gets completed

Hi, thanks in advance. I need to kill a UNIX background job after that job's process completes. I can kill a job by giving the following UNIX command: kill -9 processid. How do I kill the job after the current process run gets completed? Appreciate your valuable help. Thanks... (7 Replies)
Discussion started by: dtazv (see the sketch after this list)
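
Both of the last two discussions boil down to acting only after a background job has finished. A minimal sketch (the job names and commands are placeholders, not taken from those threads):

#!/bin/sh
# Sketch: start jobs one at a time, waiting until each background process
# has completed before moving on to the next one (or cleaning it up).
# ./job1.sh, ./job2.sh, ./job3.sh are placeholder commands.
for job in ./job1.sh ./job2.sh ./job3.sh; do
    "$job" &                # launch the job in the background
    pid=$!                  # remember its process id
    wait "$pid"             # block until that process completes
    status=$?
    echo "$job finished with exit status $status"
    [ "$status" -eq 0 ] || { echo "Stopping: $job failed" >&2; exit 1; }
done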
CD-HIT-PARA.PL(1)                          User Commands                          CD-HIT-PARA.PL(1)

NAME
       cd-hit-para.pl - divide a big clustering job into pieces to run cd-hit or cd-hit-est jobs

SYNOPSIS
       cd-hit-para.pl options

DESCRIPTION
       This script divides a big clustering job into pieces and submits the jobs to remote
       computers over a network to run them in parallel. After all the jobs have finished, the
       script merges the clustering results as if you had run a single cd-hit or cd-hit-est. You
       can also use it to divide big jobs on a single computer if your computer does not have
       enough RAM (with the -L option).

       Requirements:
       1. When you run this script over a network, the directory where you run the scripts and
          the input files must be available on all the remote hosts with an identical path.
       2. If you choose "ssh" to submit jobs, you must have passwordless ssh to every remote
          host; see the ssh manual for how to set up passwordless ssh.
       3. I suggest using a queuing system instead of ssh; PBS and SGE are currently supported.
       4. cd-hit, cd-hit-2d, cd-hit-est, cd-hit-est-2d, cd-hit-div and cd-hit-div.pl must be in
          the same directory as this script.

   Options
       -i     input filename in fasta format, required
       -o     output filename, required
       --P    program, "cd-hit" or "cd-hit-est", default "cd-hit"
       --B    filename of a list of hosts, required unless the -Q or -L option is supplied
       --L    number of cpus on the local computer, default 0; when you are not running it over
              a cluster, you can use this option to divide a big clustering job into small
              pieces; I suggest you just use "--L 1" unless you have enough RAM for each cpu
       --S    number of segments to split the input DB into, default 64
       --Q    number of jobs to submit to the queuing system, default 0; by default, the program
              uses ssh mode to submit remote jobs
       --T    type of queuing system, "PBS" and "SGE" are supported, default PBS
       --R    restart file, used after a crash of a run
       -h     print this help

       More cd-hit/cd-hit-est options can be specified on the command line.

       Questions, bugs: contact Weizhong Li at liwz@sdsc.edu

cd-hit-para.pl 4.6-2012-04-25                  April 2012                         CD-HIT-PARA.PL(1)
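
As a quick illustration of the options above (the input/output filenames and the job count are placeholders, not from the man page):

# Sketch: split a large FASTA clustering job into 64 segments and submit
# 20 jobs to an SGE queue; all_seqs.fasta / clustered.fasta are placeholders.
cd-hit-para.pl -i all_seqs.fasta -o clustered.fasta --P cd-hit --S 64 --Q 20 --T SGE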
This script divide a big clustering job into pieces and submit jobs to remote computers over a network to make it parallel. After all the jobs finished, the script merge the clustering results as if you just run a single cd-hit or cd-hit-est. You can also use it to divide big jobs on a single computer if your computer does not have enough RAM (with -L option). Requirements: 1 When run this script over a network, the directory where you run the scripts and the input files must be available on all the remote hosts with identical path. 2 If you choose "ssh" to submit jobs, you have to have passwordless ssh to any remote host, see ssh manual to know how to set up passwordless ssh. 3 I suggest to use queuing system instead of ssh, I currently support PBS and SGE 4 cd-hit cd-hit-2d cd-hit-est cd-hit-est-2d cd-hit-div cd-hit-div.pl must be in same directory where this script is in. Options -i input filename in fasta format, required -o output filename, required --P program, "cd-hit" or "cd-hit-est", default "cd-hit" --B filename of list of hosts, requred unless -Q or -L option is supplied --L number of cpus on local computer, default 0 when you are not running it over a cluster, you can use this option to divide a big clustering jobs into small pieces, I suggest you just use "--L 1" unless you have enough RAM for each cpu --S Number of segments to split input DB into, default 64 --Q number of jobs to submit to queue queuing system, default 0 by default, the program use ssh mode to submit remote jobs --T type of queuing system, "PBS", "SGE" are supported, default PBS --R restart file, used after a crash of run -h print this help More cd-hit/cd-hit-est options can be speicified in command line Questions, bugs, contact Weizhong Li at liwz@sdsc.edu cd-hit-para.pl 4.6-2012-04-25 April 2012 CD-HIT-PARA.PL(1)