Shell Programming and Scripting
Split content in a file at a specific interval based on the delimited values using a UNIX command
Post 303004295 by apmcd47, 09-29-2017
Quote:
Originally Posted by rdrtx1
Code:
rm -f outfile_*                               # remove output files from any previous run

file_count=1
line_count=1

outfile="outfile_$file_count"

while read line
do
   echo "$line" >> $outfile                   # append the current line to the current output file
   (( line_count = line_count + 1 ))
   # once the counter reaches 5, a line containing only ";" (optional surrounding
   # spaces) switches output to the next file and resets the counter
   [[ $line_count -ge 5 ]] && { echo "$line" | grep -q "^ *; *$" && {
         (( file_count = file_count + 1 ))
         (( line_count = 1 ))
      }
      outfile="outfile_$file_count"
   }
done < input_file

This looks like it should work to me. Have you tested it? I don't think you need the echo "$line" | grep pipeline, though; the shell can do that test itself. If you are using ksh93 or bash 4 you could write:
Code:
[[ "${line}" =~ ";" ]] && [[ "${line_count}" -ge 5 ]] && {

Andrew
 

Perl::Metrics::Simple(3pm)        User Contributed Perl Documentation        Perl::Metrics::Simple(3pm)

NAME
    Perl::Metrics::Simple - Count packages, subs, lines, etc. of many files.

SYNOPSIS
    use Perl::Metrics::Simple;
    my $analyzer = Perl::Metrics::Simple->new;
    my $analysis = $analyzer->analyze_files(@paths, @refs_to_file_contents);
    $file_count    = $analysis->file_count;
    $package_count = $analysis->package_count;
    $sub_count     = $analysis->sub_count;
    $lines         = $analysis->lines;
    $main_stats    = $analysis->main_stats;
    $file_stats    = $analysis->file_stats;

VERSION
    This is VERSION 0.12

DESCRIPTION
    Perl::Metrics::Simple provides just enough methods to run static analysis of one or many
    Perl files and obtain a few metrics: packages, subroutines, lines of code, and an
    approximation of cyclomatic (McCabe) complexity for the subroutines and the "main" portion
    of the code. Perl::Metrics::Simple is far simpler than Perl::Metrics.

    Installs a script called countperl.

USAGE
    See the countperl script (included with this distribution) for a simple example of usage.
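    For example, a typical command-line invocation might look like this (illustrative paths;
    countperl must be on your PATH):

        # report metrics for everything under lib/ plus a single script
        countperl lib/ bin/myscript.pl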
CLASS METHODS
    new
        Takes no arguments and returns a new Perl::Metrics::Simple object.

    is_perl_file
        Takes a path and returns true if the target is a Perl file.

OBJECT METHODS
    analyze_files( @paths, @refs_to_file_contents )
        Takes an array of files and/or directory paths, and/or SCALAR refs to file contents,
        and returns a Perl::Metrics::Simple::Analysis object.

    find_files( @directories_and_files )
        Uses list_perl_files to find all the readable Perl files and returns a reference to a
        (possibly empty) list of paths.

    list_perl_files
        Takes a list of one or more paths and returns an alphabetically sorted list of only
        the Perl files. Uses is_perl_file, so it may throw an exception if a file is
        unreadable.

    is_perl_file($path)
        Takes a path to a file and returns true if the file appears to be a Perl file,
        otherwise returns false. If the file name does not match any of
        @Perl::Metrics::Simple::PERL_FILE_SUFFIXES, the file is opened for reading and the
        first line is examined for a Perl 'shebang' line. An exception is thrown if the file
        cannot be opened in this case.

    should_be_skipped($path)
        Returns true if the path should be skipped when looking for Perl files. Currently
        skips .svn, CVS, and _darcs directories.

BUGS AND LIMITATIONS
    See: http://rt.cpan.org/NoAuth/Bugs.html?Dist=Perl-Metrics-Simple

SUPPORT
    Via CPAN:

    Discussion Forum
        http://www.cpanforum.com/dist/Perl-Metrics-Simple

    Bug Reports
        http://rt.cpan.org/NoAuth/Bugs.html?Dist=Perl-Metrics-Simple

AUTHOR
    Matisse Enzer
    CPAN ID: MATISSE
    Eigenstate Consulting, LLC
    matisse@eigenstate.net
    http://www.eigenstate.net/

LICENSE AND COPYRIGHT
    Copyright (c) 2006-2009 by Eigenstate Consulting, LLC.

    This program is free software; you can redistribute it and/or modify it under the same
    terms as Perl itself. The full text of the license can be found in the LICENSE file
    included with this module.

SEE ALSO
    The countperl script included with this distribution.
    PPI
    Perl::Critic
    Perl::Metrics
    http://en.wikipedia.org/wiki/Cyclomatic_complexity

perl v5.10.1                              2010-05-13                      Perl::Metrics::Simple(3pm)