Need to Preprocess a text file and convert into csv
Post 302954856 by Don Cragun on Friday, 11 September 2015, 02:29:39 PM
The following seems to do what you want:
Code:
awk -v nof=123 -v OFS=, '
{	# Print the ID from field 1, then build nof 0/1 output fields after it.
	printf("%s%s", $1, OFS)
	f = 1
	# Each remaining input field names the next output column that gets a 1.
	for(i = 2; i <= NF; i++) {
		# Emit 0s for the columns before the named column...
		while(f < $i + 0 && f <= nof)
			printf("0%s", f++ < nof ? OFS : ORS)
		# ...then a 1 for the named column itself.
		if(f == $i + 0 && f <= nof)
			printf("1%s", f++ < nof ? OFS : ORS)
	}
	# Pad any columns that remain with 0s.
	while(f++ <= nof)
		printf("0%s", f <= nof ? OFS : ORS)
}' file
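
For illustration only (the questioner's actual input file is not quoted in this post), assume each input line holds an identifier followed by an ascending list of column numbers; with a small nof the result is easy to check:
Code:
$ cat file                 # hypothetical sample input
id1 2 5
$ awk -v nof=5 -v OFS=, '...same script as above...' file
id1,0,1,0,0,1

Each of the nof output columns after the identifier is 1 if its number appears on the input line and 0 otherwise.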

 

10 More Discussions You Might Find Interesting

1. UNIX for Advanced & Expert Users

how to convert text/csv to excel

Hello all, I have a SQL report with 50 columns and 1000 rows in a file (txt/csv). Is there any way to move it to Excel in ksh? Thanks, Sateesh
Discussion started by: kotasateesh (7 Replies)

2. Programming

convert text file to csv

Hi all, I have a select query that gives me output in the following way: a SYSTYPE header followed by the columns Success, Failures, Total and RFT, with rows like TYP 1 0 ...
Discussion started by: sais (3 Replies)

3. Programming

awk script to convert a text file into csv format

Hi, thanks for allowing me to start a discussion. I am collecting USB usage details for all users and converting them into CSV files so that I can export them into a database. The input text file is as follows: USB History Dump by nabiy (c)2008 (1) --- Kingston DataTraveler 130 USB...
Discussion started by: certteam (2 Replies)

4. Shell Programming and Scripting

Perl program to convert PDF to text/CSV

Please suggest ways to easily convert PDF to text in Perl only, on Windows (no other tools can be downloaded). Here is what I have been doing: using the module CAM::PDF to extract data. But it shows everything in a messy format, and this module is the only one working with the pdf...
Discussion started by: chakrapani (0 Replies)

5. Shell Programming and Scripting

Convert text to CSV

Hi gurus, I need urgent help to convert a flat log file into CSV format to load into a database. The log looks like: a=1 b=2 c=3 a=4 b=5 c=6. Only the values on the right side of = should go into the CSV, and a new line should start each time an "a" field is received (see the sketch after this list).
Discussion started by: sandipjee (8 Replies)

6. Shell Programming and Scripting

Awk to convert a text file to CSV file with some string manipulation

Hi, I have a simple text file with contents as below: 12345678900 971,76 4234560890 22345678900 5971,72 5234560990 32345678900 71,12 6234560190. The new CSV file should look like: Column1;Column2;Column3;Column4;Column5 123456;78900;971,76;423456;0890...
Discussion started by: FreddyDaKing (9 Replies)

7. Shell Programming and Scripting

Trying to extract from a text file and convert to CSV

I want to extract the IP address, system ID and engine IDs from this file (marked in red) and put them in a CSV, e.g. 1.1.1.1, SYSTEMID, 000012345678981123548912. I get these files by running an expect script from Solaris. Here is the text file output of my expect script: working on 1.1.1.1 SNMP...
Discussion started by: pbshillong (5 Replies)

8. Shell Programming and Scripting

How to convert excel file to csv file or text file?

Hi all, I need to find a way to convert an Excel file into CSV or a text file with a Linux command. The reason is I have hundreds of files to convert. Another complication is that I need to delete the first 5 lines of the Excel file before conversion. So, for instance, input.xls description of...
Discussion started by: johnkim0806 (6 Replies)

9. Shell Programming and Scripting

Read csv file, convert the data and make one text file in UNIX shell scripting

I have input data that looks like this, which is part of a CSV file: 7,1265,76548,"0102:04" 8,1266,76545,"0112:04". The output data should look like this and will be part of a text file: 7|1265000 |7654899 |A| 8|12660000 |76545999 |B| The logic behind the...
Discussion started by: RJG (6 Replies)

10. Shell Programming and Scripting

Convert text to csv

Hi, can somebody post an idea on how to convert these 5-line records into one line or a tab-delimited format to be imported into a database? Here is the text file format: Description: Description1 Link: https://www.google.com Date: June 2, 2018 Time: 00:07:44 Age: 1 days ago Description:...
Discussion started by: lxdorney (2 Replies)
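
For discussion 5 above (the a=1 / b=2 / c=3 log), here is a minimal awk sketch of the transformation it describes, assuming each name=value pair sits on its own line (the thread's actual input layout and accepted answer are not shown here, and logfile is a hypothetical file name):
Code:
awk -F= '
$1 == "a" { if (out != "") print out; out = $2; next }  # start a new CSV record on each "a" field
          { out = out "," $2 }                          # append the value of any other field
END       { if (out != "") print out }                  # flush the last record
' logfile

With the sample values quoted in that discussion, this prints 1,2,3 and 4,5,6 on separate lines.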