Writing a Perl Script that processes multiple files
Post 302561912 by Corona688 on Wednesday 5th of October 2011 01:43:07 PM
Code:
# Loop over chromosome files 1..22, skipping any that can't be opened.
for($N=1; $N<=22; $N++)
{
        $name=sprintf("/path/to/250.1chr%d.ped", $N);
        if(!open(FILE, "<${name}"))
        {
                print STDERR "Couldn't open ${name}\n";
                next;   # Perl's loop-control keyword is 'next', not C's 'continue'
        }

        ...

        close(FILE);
}
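If the set of files ever changes, the same loop can be driven by a glob instead of a hard-coded count. This is only a sketch along the same lines, assuming the /path/to/250.1chr*.ped naming from the example above, and using a lexical filehandle with three-argument open:

Code:
# Sketch: same idea, but glob the file names and use a lexical filehandle.
for my $name (glob "/path/to/250.1chr*.ped") {
    open(my $fh, '<', $name) or do {
        print STDERR "Couldn't open $name: $!\n";
        next;
    };
    while (my $line = <$fh>) {
        # ... process $line here ...
    }
    close($fh);
}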


10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Combining Multiple files in one in a perl script

All, I want to combine multiple files into one file, something like what we do on the command line as follows -> cat file1 file2 file3 > Main_File. Can something like this be done in a perl script very efficiently? Thanks, Rahul. (1 Reply)
Discussion started by: rahulrathod
1 Replies
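A minimal Perl sketch of the cat-style concatenation asked about above (the file names are placeholders taken from the question):

Code:
# Concatenate several input files into one output file, cat-style.
open(my $out, '>', 'Main_File') or die "Can't write Main_File: $!";
for my $file ('file1', 'file2', 'file3') {
    open(my $in, '<', $file) or die "Can't read $file: $!";
    print $out $_ while <$in>;
    close($in);
}
close($out);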

2. Shell Programming and Scripting

Multiple processes writing on the same file simultaneously

Hi All, I have encountered a problem, please help me. I have a script in which multiple processes are writing to the same file. How should I stop this? Either a lock mechanism can be implemented, or we can write to different files and then concatenate them. What would be a better... (1 Reply)
Discussion started by: Sayantan
1 Replies
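One common answer is an advisory lock with flock, so that writers take turns instead of interleaving. A hedged sketch, assuming each process appends short records to a shared log file (the file name is made up):

Code:
use Fcntl qw(:flock);

# Each writer appends under an exclusive advisory lock, then releases it.
open(my $log, '>>', 'shared.log') or die "Can't open shared.log: $!";
flock($log, LOCK_EX) or die "Can't lock shared.log: $!";
print $log "message from process $$\n";
flock($log, LOCK_UN);
close($log);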

3. Shell Programming and Scripting

perl script on multiple files

I have a script that runs on one file at a time, like this: $> perl myscript.pl filename > output. How can I run it on >6000 files and have the output sent to a slightly modified file name, e.g. $> perl myscript 6000files > output6000files.new extension? Thanks in anticipation. (4 Replies)
Discussion started by: aritakum
4 Replies
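A small Perl driver loop can handle this; a sketch only, assuming myscript.pl takes the input file as an argument and the new output name is just the input name plus .new:

Code:
# Run myscript.pl on every .txt file, writing each result to <name>.new.
# (Assumes file names contain no spaces or shell metacharacters.)
for my $file (glob '*.txt') {
    system("perl myscript.pl $file > $file.new") == 0
        or warn "myscript.pl failed on $file\n";
}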

4. Shell Programming and Scripting

Run perl script on files in multiple directories

Hi, I want to run a Perl script on multiple files, with same name ("Data.txt") but in different directories (eg : 2010_06_09_A/Data.txt, 2010_06_09_B/Data.txt). I know how to run this perl script on files in the same directory like: for $i in *.txt do perl myscript.pl $i > $i.new... (8 Replies)
Discussion started by: ad23
8 Replies
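A sketch of one way to do it from Perl, globbing the dated directories named in the question and writing each result next to its input (the directory pattern is an assumption based on the examples given):

Code:
# Run myscript.pl on Data.txt inside every 2010_06_09_* directory.
for my $dir (glob '2010_06_09_*') {
    my $file = "$dir/Data.txt";
    next unless -f $file;
    system("perl myscript.pl $file > $file.new") == 0
        or warn "myscript.pl failed on $file\n";
}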

5. UNIX for Dummies Questions & Answers

Writing a for loop that processes multiple input files

I would like to write a for loop that does the following: I have a file called X.txt and other files called 1.txt,2.txt, .....,1000.txt. I want to substitute the 6th column of the file X.txt with 1.txt and store the output as X.1. Then I want to do the same with X.txt and 2.txt and store the... (1 Reply)
Discussion started by: evelibertine
1 Replies
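A hedged Perl sketch of that substitution, assuming each N.txt holds one replacement value per line, aligned row-for-row with X.txt:

Code:
# Replace column 6 of X.txt with the matching line of N.txt, for N = 1..1000.
for my $n (1 .. 1000) {
    open(my $x,   '<', 'X.txt')  or die "Can't read X.txt: $!";
    open(my $col, '<', "$n.txt") or die "Can't read $n.txt: $!";
    open(my $out, '>', "X.$n")   or die "Can't write X.$n: $!";
    while (my $line = <$x>) {
        my @fields = split ' ', $line;
        my $repl   = <$col>;
        chomp $repl;
        $fields[5] = $repl;              # 6th column, zero-based index 5
        print $out join(' ', @fields), "\n";
    }
    close($x);
    close($col);
    close($out);
}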

6. UNIX for Dummies Questions & Answers

Writing a loop to manipulate a script and store it in multiple output files

I have a script where the 9th line looks like this: $filename=sprintf("250.1chr%d.ped", $N); I want to modify this script 1000 times, changing 250.1chr%d.ped to 250.2chr%d.ped, 250.3chr%d.ped and so on, all the way to 250.1000chr%d.ped, and store each output in files called ... (4 Replies)
Discussion started by: evelibertine
4 Replies
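A sketch of generating the 1000 variants with a one-line substitution (the script name myscript.pl and the numbered output names are assumptions, not taken from the thread):

Code:
# Write 1000 copies of myscript.pl, changing 250.1chr%d.ped to 250.<N>chr%d.ped in each.
for my $n (1 .. 1000) {
    open(my $in,  '<', 'myscript.pl')    or die "Can't read myscript.pl: $!";
    open(my $out, '>', "myscript.$n.pl") or die "Can't write myscript.$n.pl: $!";
    while (my $line = <$in>) {
        $line =~ s/250\.1chr/250.${n}chr/;
        print $out $line;
    }
    close($in);
    close($out);
}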

7. UNIX for Dummies Questions & Answers

Writing a loop to process multiple input files by a shell script

I have multiple input files that I want to manipulate using a shell script. The files are called 250.1 through 250.1000 but I only want the script to manipulate 250.300 through 250.1000. Before I was using the following script to manipulate the text files: for i in 250.*; do || awk... (4 Replies)
Discussion started by: evelibertine
4 Replies
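In Perl the same selection can be made by generating the names from the numeric range rather than filtering a glob; a sketch, with the per-line work left as a placeholder:

Code:
# Process only 250.300 through 250.1000, skipping any missing files.
for my $n (300 .. 1000) {
    my $file = "250.$n";
    next unless -f $file;
    open(my $fh, '<', $file) or do { warn "Can't open $file: $!\n"; next };
    while (my $line = <$fh>) {
        # ... per-line processing goes here ...
    }
    close($fh);
}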

8. Shell Programming and Scripting

How can I do one liner import multiple custom .pm files in my perl script?

I am new to Perl and I want to ask one question. I have around 50 custom packages which I am using in my Perl script. I want to import all the .pm packages in my Perl script in an easy way. Right now I have to import each package individually. So is there any way to do so? Right now I am doing it like: ... (1 Reply)
Discussion started by: Navrattan Bansa
1 Replies
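There is no single built-in that pulls in a whole directory of modules, but a short loop over the .pm files can come close. A sketch only, assuming the packages live directly in ./lib and each package name matches its file name:

Code:
use lib './lib';

# Load every .pm file found in ./lib at compile time.
BEGIN {
    for my $pm (glob './lib/*.pm') {
        (my $module = $pm) =~ s{^\./lib/|\.pm$}{}g;   # ./lib/Foo.pm -> Foo
        eval "use $module";     # string eval so the module name can be a variable
        die "Failed to load $module: $@" if $@;
    }
}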

9. UNIX for Dummies Questions & Answers

Writing a script to print the number of lines in multiple files

Hi I have 1000 files labelled data1.txt through data1000.txt. I want to write a script that prints out the number of lines in each txt file and outputs it in the following format: Column 1: number of data file (1 through 1000) Column 2: number of lines in the text file Thanks! (2 Replies)
Discussion started by: evelibertine
2 Replies
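A sketch of that two-column report in Perl (the data file names are taken from the question):

Code:
# Print "<file number> <line count>" for data1.txt .. data1000.txt.
for my $n (1 .. 1000) {
    open(my $fh, '<', "data$n.txt") or do { warn "Can't open data$n.txt: $!\n"; next };
    my $lines = 0;
    $lines++ while <$fh>;
    close($fh);
    print "$n $lines\n";
}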

10. Shell Programming and Scripting

How to run perl script on multiple files of two directories?

Hi, I have 100 files under directory A labelled 1.txt, 2.txt ... 100.txt (made-up names), and 1 file under directory B labelled name.txt. How can I run the same perl script on the 100 files together with file name.txt? I want to run: perl script.pl A/1.txt B/name.txt, perl script.pl A/2.txt B/name.txt, ....... perl... (3 Replies)
Discussion started by: grace_shen
3 Replies
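A driver loop matching the pattern in the question (script.pl and the directory names A and B come straight from it; the sketch assumes file names contain no spaces):

Code:
# Run script.pl on each A/<n>.txt paired with the fixed B/name.txt.
for my $n (1 .. 100) {
    my $file = "A/$n.txt";
    next unless -f $file;
    system("perl script.pl $file B/name.txt") == 0
        or warn "script.pl failed on $file\n";
}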