UNIX for Dummies Questions & Answers
Splitting a file and sending results to another file
Post 302455268 by ladyAnne on Tuesday 21st of September 2010 07:52:38 AM
The reason I want to do this within the script is that I have hundreds of files to run. Eventually I want it to run in a loop over one directory of files and store each output in another directory of resulting files.
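
A minimal sketch of that kind of loop, assuming the split logic lives in a separate script (split_one.sh is a placeholder name) that takes one input file and writes its result to stdout; the directory names are placeholders too:

#!/bin/sh
# Run the splitting script once per input file and collect each result
# in its own file in a separate output directory.
indir=/path/to/input_files
outdir=/path/to/result_files
mkdir -p "$outdir"

for f in "$indir"/*; do
    [ -f "$f" ] || continue                    # skip anything that is not a regular file
    base=$(basename "$f")
    ./split_one.sh "$f" > "$outdir/$base.out"  # one result file per input file
done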
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Splitting a file based on records in another file

All, We receive a file with a large number of records (the count can vary) and we have to split it into two files based on another file. e.g. File1: UHDR 2008112 "25187","00000022","00",21-APR-1991,"" ,"D",-000000519,+0000000000,"C", ,+000000000,+000000000,000000000,"2","" ,21-APR-1991... (7 Replies)
Discussion started by: er_ashu
7 Replies
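
A hedged sketch for the kind of split described in thread 1 (the delimiter and key column are assumptions, since the excerpt is truncated): load the keys from the first file into an awk array, then route each record of the data file to one of two outputs.

# Hypothetical sketch: keys.txt holds one key per line; data.txt is comma-delimited
# with the key in field 1. Records whose key appears in keys.txt go to matched.txt,
# everything else to rest.txt.
awk -F',' 'NR==FNR { keys[$1]; next }
           $1 in keys { print > "matched.txt"; next }
           { print > "rest.txt" }' keys.txt data.txt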

2. Shell Programming and Scripting

File splitting, naming file according to internal field

Hi All, I have a rather strange set of requirements that I'm hoping someone here could help me with. We receive a file that is actually a concatenation of 4 files (don't believe this would change, but ideally the solution would handle n files). The super-file looks like:... (7 Replies)
Discussion started by: Leedor
7 Replies
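
A generic way to approach thread 2 (a sketch only; the record separator and the position of the name field are assumptions) is to let awk open a new output file whenever it sees a header line:

# Hypothetical sketch: every sub-file starts with a line "HEADER <name> ...";
# each section is written to <name>.dat until the next header appears.
awk '$1 == "HEADER" { if (out) close(out); out = $2 ".dat" }
     out != ""      { print > out }' super_file.txt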

3. Shell Programming and Scripting

Splitting a file in to multiple files and passing each individual file to a command

I have an input file with contents like: MainFile.dat: 12247689|7896|77698080 16768900|hh78|78959390 12247689|7896|77698080 16768900|hh78|78959390 12247689|7896|77698080 16768900|hh78|78959390 12247689|7896|77698080 16768900|hh78|78959390 12247689|7896|77698080 16768900|hh78|78959390 ... (4 Replies)
Discussion started by: rkrish
4 Replies
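
For the "split, then run a command on every piece" pattern in thread 3, split(1) plus a loop is usually enough (a sketch; the chunk size and the downstream command are placeholders):

# Split MainFile.dat into 1000-line chunks named chunk_aa, chunk_ab, ...
split -l 1000 MainFile.dat chunk_

# Run some command on each chunk (my_command is a placeholder)
for part in chunk_*; do
    my_command "$part"
done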

4. UNIX for Dummies Questions & Answers

Pipe binary file matches grep results to file

I am using grep to match a pattern, but the output is strange. $ grep -r -o "pattern" * Gives me: Binary file foo1 matches Binary file foo2 matches Binary file foo3 matches To find the lines before/after, I then have to use the following on each file: $ strings foo1 | grep -A1 -B1... (0 Replies)
Discussion started by: chipperuga
0 Replies
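
For thread 4: when grep only reports "Binary file ... matches", the -a option (GNU and BSD grep) tells it to treat the file as text, so the context options work in a single pass instead of a per-file strings run (pattern and paths are placeholders):

# -a: process binary files as if they were text, so matching lines and their
# -B1/-A1 context are printed instead of "Binary file ... matches".
grep -r -a -B1 -A1 "pattern" . > results.txt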

5. UNIX for Dummies Questions & Answers

Extracting data from one file, based on another file (splitting)

Dear All, I have two files but want to extract data from one based on the other... can you please help me? file 1: David Tom Ellen and file 2: David|0010|testnamez|resultsz David|0004|testnamex|resultsx Tom|0010|testnamez|resultsz Tom|0004|testnamex|resultsx Ellen|0010|testnamez|resultsz... (12 Replies)
Discussion started by: A-V
12 Replies
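
For thread 5, the excerpt suggests pipe-delimited records with the name in field 1; under that assumption, awk can read the name list first and write each matching person's records to their own file (a sketch, not the thread's accepted answer):

# Keep only records of file2 whose first field appears in file1,
# writing each name's records to <name>.txt.
awk -F'|' 'NR==FNR { names[$1]; next }
           $1 in names { print > ($1 ".txt") }' file1 file2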

6. Shell Programming and Scripting

Splitting XML file on basis of line number into multiple file

Hi All, I have an XML file of more than half a million lines and want to split it into four files in such a way that the top 7 lines are present at the top of each file and the bottom line is present at the bottom of each file. The actual records start from the 8th line and each record contains 15 lines... (14 Replies)
Discussion started by: ajju
14 Replies
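
A rough way to do what thread 6 describes with standard tools (a sketch; the 7-line header, 1-line footer and 15-line records come from the description, everything else is assumed): peel off header and footer, split the body on a multiple of the record length, then glue header and footer onto every piece.

infile=big.xml
head -n 7 "$infile" > header.xml
tail -n 1 "$infile" > footer.xml
# body = everything between the header and the last line
sed -e '1,7d' -e '$d' "$infile" > body.xml

# chunk size: a multiple of 15 chosen so the body yields roughly four pieces
split -l 150000 body.xml part_

for p in part_*; do
    cat header.xml "$p" footer.xml > "split_$p.xml"
done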

7. Shell Programming and Scripting

Execution of loop :Splitting a single file into multiple .dat file

hdr=$(cut -c1 $path$file|head -1) #extract header "H" trl=$(cut -c|path$file|tail -1) #extract trailer "T" SplitFile=$(cut -c 50-250 $path 1$newfile |sed '$/ *$//' head -1') # to trim white space and extract table name If; then # start loop if it is a header While read I #read file Do... (4 Replies)
Discussion started by: SwagatikaP1
4 Replies
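
The script quoted in thread 7 is truncated and partly garbled; purely as a hedged sketch of the general idea (the column positions, file names and record layout are all assumptions), header and trailer records can be detected from their first character and the data routed by table name:

# Hypothetical sketch: records starting with H are headers, T are trailers,
# everything else is data; the table name is assumed to sit in columns 50-250
# of the header record.
while IFS= read -r line; do
    case $line in
        H*) table=$(printf '%s\n' "$line" | cut -c50-250 | sed 's/ *$//') ;;
        T*) : ;;                                          # trailer: nothing to do here
        *)  printf '%s\n' "$line" >> "${table:-unknown}.dat" ;;
    esac
done < "$path$file"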

8. Shell Programming and Scripting

Splitting a text file into smaller files with awk, how to create a different name for each new file

Hello, I have some large text files that look like, putrescine Mrv1583 01041713302D 6 5 0 0 0 0 999 V2000 2.0928 -0.2063 0.0000 N 0 0 0 0 0 0 0 0 0 0 0 0 5.6650 0.2063 0.0000 N 0 0 0 0 0 0 0 0 0 0 0 0 3.5217 ... (3 Replies)
Discussion started by: LMHmedchem
3 Replies
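
The excerpt in thread 8 looks like SD/MOL-format records, which end with a $$$$ line and carry the molecule name on their first line; that is an assumption, but under it awk can split and name each record in one pass:

# Hypothetical sketch: each record runs up to a "$$$$" line; its first line is
# used as the output file name (sanitised so it contains no awkward characters).
awk 'BEGIN { new = 1 }
     new { name = $0; gsub(/[^A-Za-z0-9._-]/, "_", name); out = name ".sdf"; new = 0 }
     { print > out }
     /^\$\$\$\$$/ { close(out); new = 1 }' bigfile.sdf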

9. UNIX for Beginners Questions & Answers

Splitting the file based on two fields - Fixed length file

Hi, I have a scenario where I need to split a file based on two field values. The file is a fixed-length file. ex: AA0998703000000000000190510095350019500010005101980301 K 0998703000000000000190510095351019500020005101480 ... (4 Replies)
Discussion started by: saj
4 Replies
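
For a fixed-length file like the one in thread 9, the two fields can be pulled out with substr() and combined into an output file name (a sketch; the offsets and lengths below are invented, since the real layout is not shown in full):

# Hypothetical sketch: field A in columns 1-2, field B in columns 3-9 (made-up
# positions); each distinct A/B pair gets its own output file.
awk '{ a = substr($0, 1, 2); b = substr($0, 3, 7)
       print > ("out_" a "_" b ".txt") }' fixed_length.dat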

10. Shell Programming and Scripting

Create a text file and a pdf file from Linux command results.

Hello. The task: using multiple commands like gdisk -l $SOME_DISK >> $SOME_FILE I generate a text file. For readability I must insert page breaks. When the program is finished I want to convert the final text file to a pdf file, so that when finished I have two files: one text file and one pdf... (1 Reply)
Discussion started by: jcdole
1 Replies
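
One hedged way to get both outputs for thread 10: use the ASCII form-feed character as the page break while the text report is being built, then convert with enscript and ps2pdf (assumes both are installed; any other text-to-PDF tool would do, and the variable values are placeholders).

SOME_DISK=/dev/sda              # placeholder, as in the original post
SOME_FILE=/tmp/report.txt
: > "$SOME_FILE"

gdisk -l "$SOME_DISK" >> "$SOME_FILE"
printf '\f' >> "$SOME_FILE"     # form feed = page break for printing/PDF tools
# ... further commands appended the same way ...

# text -> PostScript -> PDF
enscript -B -o "${SOME_FILE%.txt}.ps" "$SOME_FILE"
ps2pdf "${SOME_FILE%.txt}.ps" "${SOME_FILE%.txt}.pdf"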
FAILLOCK(8)                         Linux-PAM Manual                         FAILLOCK(8)

NAME
       faillock - Tool for displaying and modifying the authentication failure record files

SYNOPSIS
       faillock [--dir /path/to/tally-directory] [--user username] [--reset]

DESCRIPTION
       The pam_faillock.so module maintains a list of failed authentication attempts per
       user during a specified interval and locks the account in case there were more than
       deny consecutive failed authentications. It stores the failure records into per-user
       files in the tally directory.

       The faillock command is an application which can be used to examine and modify the
       contents of the tally files. It can display the recent failed authentication
       attempts of the username or clear the tally files of all or individual usernames.

OPTIONS
       --dir /path/to/tally-directory
              The directory where the user files with the failure records are kept. The
              default is /var/run/faillock.

       --user username
              The user whose failure records should be displayed or cleared.

       --reset
              Instead of displaying the user's failure records, clear them.

FILES
       /var/run/faillock/*
              the files logging the authentication failures for users

SEE ALSO
       pam_faillock(8), pam(8)

AUTHOR
       faillock was written by Tomas Mraz.

Linux-PAM Manual                          06/17/2014                         FAILLOCK(8)
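
Usage follows directly from the options above; for example, to inspect and then clear the failure records for one account (the username is illustrative):

# show recent failed authentication attempts for user alice
faillock --user alice

# clear alice's tally so the account is no longer locked
faillock --user alice --reset

# same, but against a non-default tally directory
faillock --dir /var/run/faillock --user alice --reset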