Shell Programming and Scripting: Insert a text from a specific row into a specific column using SED or AWK
Post 302333483 by rakeshawasthi on Monday 13th of July 2009 07:42:23 AM
Try:
Code:
awk '/LOCATION/ {
    # remember the first character of the last field on each LOCATION line
    s = "LOCATION " substr($NF, 1, 1)
}
/DRI/ {
    # prefix every DRI line with the most recently stored LOCATION value
    print s " " $0
}' file
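With a hypothetical input (the LOCATION and DRI line layouts below are only an assumption, since the original data is not quoted in this post), the script behaves like this:
Code:
$ cat file
LOCATION 5A
DRI 100 200
DRI 300 400
$ awk '/LOCATION/ {s = "LOCATION " substr($NF, 1, 1)} /DRI/ {print s " " $0}' file
LOCATION 5 DRI 100 200
LOCATION 5 DRI 300 400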

 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Can sed be used to insert data at specific column?

I'm trying to use sed to insert data at a specific column, let's say my data looks like this: 0553 1828 0552 1829 0550 1829 0549 1830 0548 1831 what I want is this: timein 0553 timeout 1828 timein 0552 timeout 1829 timein 0550 timeout 1829 timein 0549 timeout 1830 timein 0548... (5 Replies)
Discussion started by: mswartz
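For discussion 1, assuming each input line really holds exactly two space-separated times (infile is a placeholder for the data file), a minimal sketch would be:
Code:
# sed: wrap the two fields of every line with the labels
sed 's/^\([0-9]\{1,\}\)  *\([0-9]\{1,\}\)$/timein \1 timeout \2/' infile
# or the same idea in awk
awk '{print "timein", $1, "timeout", $2}' infile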

2. UNIX for Dummies Questions & Answers

AWK Command to find text in specific column

I'm new to scripting and would appreciate any help. I have a list of over 20 words in File1 that I need to find in columns 10-15 of File2. I need the entire row of File2 that the File1 list matches. I originally used a grep command which works, but provides File1 results that can be found... (3 Replies)
Discussion started by: Chillin
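For discussion 2, one possible approach, assuming "columns 10-15" means whitespace-separated fields 10 through 15 (the excerpt leaves that open):
Code:
# load the search words from File1, then print any File2 row whose
# fields 10 through 15 contain one of them
awk 'NR == FNR {words[$1] = 1; next}
     {for (i = 10; i <= 15 && i <= NF; i++) if ($i in words) {print; next}}' File1 File2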

3. Shell Programming and Scripting

Assigning a specific format to a specific column in a text file using awk and printf

Hi, I have the following text file: 8 T1mapping_flip02 ok 128 108 30 1 665000-000008-000001.dcm 9 T1mapping_flip05 ok 128 108 30 1 665000-000009-000001.dcm 10 T1mapping_flip10 ok 128 108 30 1 665000-000010-000001.dcm 11 T1mapping_flip15 ok 128 108 30... (2 Replies)
Discussion started by: goodbenito
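The excerpt for discussion 3 cuts off before stating the target format, so this only sketches the general pattern of reformatting one field with sprintf; the %03d format is a placeholder:
Code:
# zero-pad the first column to three digits (placeholder format);
# note that reassigning $1 makes awk rebuild the line with single spaces
awk '{$1 = sprintf("%03d", $1)} 1' infile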

4. UNIX for Dummies Questions & Answers

Use sed to replace but only in a specific column of the text file

Hi, I would like to use sed to replace NA to x ('s/NA/x/g'), but only in the 5th column of the space delimited text file, nowhere else. How do I go about doing that? Thanks! (1 Reply)
Discussion started by: evelibertine
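For discussion 4, awk is the easier tool because it can address the 5th field directly; a sketch:
Code:
# apply s/NA/x/g to field 5 only; when a substitution is made,
# awk rebuilds the line with single spaces between fields
awk '{gsub(/NA/, "x", $5)} 1' infile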

5. UNIX for Dummies Questions & Answers

awk: convert column to row in a specific way

Hi all! I have this kind of output: a1|b1|c1|d1|e1 a2|b2|c2 a3|b3|c3|d3 I would like to transpose columns d and e (when they exist) in column c, and under the row where they come from. Then copying the beginning of the row. In order to obtain: a1|b1|c1 a1|b1|d1 a1|b1|e1 a2|b2|c2... (1 Reply)
Discussion started by: lucasvs
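For discussion 5, assuming the data really is pipe-delimited as shown, each field from the third onward can be printed on its own line together with the first two fields:
Code:
awk -F'|' -v OFS='|' '{for (i = 3; i <= NF; i++) print $1, $2, $i}' infile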

6. Shell Programming and Scripting

Print unique names in each row of a specific column using awk

Is it possible to remove redundant names in the 4th column? input cqWE 100 200 singapore;singapore AZO 300 400 brazil;america;germany;ireland;germany .... .... output cqWE 100 200 singapore AZO 300 400 brazil;america;germany;ireland (4 Replies)
Discussion started by: quincyjones
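For discussion 6, a sketch that splits the 4th field on ";" and keeps only the first occurrence of each name, preserving order:
Code:
awk '{
    split("", seen)                      # reset the lookup table for each line
    n = split($4, a, ";"); out = ""
    for (i = 1; i <= n; i++)
        if (!(a[i] in seen)) {seen[a[i]] = 1; out = out (out ? ";" : "") a[i]}
    $4 = out
    print
}' infile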

7. Shell Programming and Scripting

Insert text into specific column

I have the following data: Dec 24 11:31:10 0000008b 9911662486 Answered Price SGD 0.003 PERIOD: 0 m 6 s Dec 24 11:21:42 00000086 9911662486 Answered Price SGD 0.001 PERIOD: 0 m 2 s Dec 20 15:34:28 00000004 9911662486 Answered Price SGD 0.007 PERIOD: 0 m 12 s Dec 20 18:42:30 0000017b... (6 Replies)
Discussion started by: alegnagrp
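The excerpt for discussion 7 cuts off before it says what text goes where, so the following is only a generic pattern for prefixing a chosen field with a fixed string; both the word and the field number are placeholders:
Code:
# put the placeholder word TAG in front of field 9 of every line
awk '{$9 = "TAG " $9; print}' infile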

8. Shell Programming and Scripting

awk or sed to find specific column from different files

Hi everybody, I have a folder with many files: Files with 8 columns: X 123 A B C D E F And files with 7 columns: X1234 A B C D E F I am trying to find a way to extract the 5th column when the files have eight columns, or the 4th column when the files have... (3 Replies)
Discussion started by: Tzole
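For discussion 8, the field count itself can select the column, so one pass over all files in the folder is enough; a sketch:
Code:
# 8-column lines: take field 5; 7-column lines: take field 4
awk 'NF == 8 {print $5} NF == 7 {print $4}' *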

9. Shell Programming and Scripting

sed or awk to remove specific column to one range

I need to remove specific column to one range source file 3 1 000123456 2 2 000123569 3 3 000123564 12 000123156 15 000125648 128 000125648 Output required 3 000123456 2 000123569 3 000123564 12 000123156 15 000125648 128 000125648 (6 Replies)
Discussion started by: ranjancom2000
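For discussion 9, the required output suggests the middle field should be dropped only from lines that actually have three fields; a sketch under that reading:
Code:
awk 'NF == 3 {print $1, $3; next} {print}' infile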

10. Shell Programming and Scripting

Using awk to change a specific column and in a specific row

I am trying to change the number in bold to 2400 01,000300032,193631306,190619,0640,1,80,,2/ 02,193631306,000300032,1,190618,0640,CAD,2/ I'm not sure if sed or awk is the answer. I was going to use sed and do a character count up to that point, but that column directly before 0640 might... (8 Replies)
Discussion started by: juggernautjoee
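For discussion 10, the bold highlighting is lost in plain text, so the sketch below simply assumes the target is the 5th comma-separated field of the record whose first field is 01:
Code:
# change field 5 to 2400 on the record that starts with 01
awk 'BEGIN {FS = OFS = ","} $1 == "01" {$5 = "2400"} {print}' infile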
Net::DRI::Logging::Null(3pm)          User Contributed Perl Documentation          Net::DRI::Logging::Null(3pm)

NAME
    Net::DRI::Logging::Null - Null Logging Operations for Net::DRI

VERSION
    This documentation refers to Net::DRI::Logging::Null version 1.01

SYNOPSIS
    See Net::DRI::Logging

DESCRIPTION
    This is the default logging class used by Net::DRI if nothing else is specified. It discards everything (no logging at all).

EXAMPLES
    $dri->new({..., logging => 'null' ,...});

    If not provided during "new()", this is the default behaviour.

SUBROUTINES/METHODS
    All mandated by superclass Net::DRI::Logging.

DIAGNOSTICS
    None.

CONFIGURATION AND ENVIRONMENT
    None.

DEPENDENCIES
    This module has to be used inside the Net::DRI framework and needs the following component: Net::DRI::Logging

INCOMPATIBILITIES
    None.

BUGS AND LIMITATIONS
    No known bugs. Please report problems to the author (see below) or use the CPAN RT system. Patches are welcome.

SUPPORT
    For now, support questions should be sent to: <netdri@dotandco.com>

    Please also see the SUPPORT file in the distribution.

SEE ALSO
    <http://www.dotandco.com/services/software/Net-DRI/>

AUTHOR
    Patrick Mevzek, <netdri@dotandco.com>

LICENSE AND COPYRIGHT
    Copyright (c) 2009 Patrick Mevzek <netdri@dotandco.com>. All rights reserved.

    This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. See the LICENSE file that comes with this distribution for more details.

perl v5.10.1                                2010-03-25                                Net::DRI::Logging::Null(3pm)