Full Discussion: Text processing of file
Post 302576783 by ahamed101, Shell Programming and Scripting, 11-26-2011 09:11 AM
Try this...
Code:
awk '{
        # Normalize the label in the first field: "-1" becomes 0, "+1" becomes 1
        if (!sub(/-1/, 0)) sub(/\+1/, 1)
        printf "%s", $1
        # Remaining fields are sparse "column:value" pairs; fill missing columns with 0
        for (i = 2; i <= NF; i++) {
                split($i, arr, ":")
                for (j = last + 1; j < arr[1]; j++) printf " 0"
                printf " %s", arr[2]
                last = arr[1]
        }
        # Pad the row out to 123 value columns
        for (i = last + 1; i <= 123; i++) printf " 0"
        last = 0; printf "\n"
}' input_file
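
For illustration only (the original input file is not quoted in this post, so the sample below is an assumption): the script expects libsvm-style sparse lines, i.e. a +1/-1 label followed by ascending column:value pairs. It rewrites the label to 1 or 0 and expands the pairs into a dense row of 123 columns, writing 0 for every column that is not listed:
Code:
# hypothetical input line
+1 2:5 5:9 120:3

# output: the label first, then 123 dense columns
# (the long run of zeros between columns 6 and 119 is elided here)
1 0 5 0 0 9 0 ... 0 3 0 0 0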

--ahamed
