Problem with awk script
Post 302445850 by Man83Nagesh on Tuesday 17th of August 2010, 04:04 AM
Hi rdcwayx,

I'm trying to do the same thing, but my data set is huge, with many columns and records. Is it really feasible to create a new file every time and rename it?

Instead, is there another way to replace the description field in tmp1.csv with the code from spec.csv directly?

I'm relatively new to shell scripting.
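
For reference, here is a minimal sketch of the usual awk idiom for this kind of lookup-and-replace. It assumes spec.csv holds key,code pairs in its first two columns, that the key sits in column 1 of tmp1.csv and the description to overwrite sits in column 3, and that both files are comma-separated; the field numbers and the separator are assumptions, so adjust them to the real layout. awk cannot rewrite a file while it is still reading it, so the write-then-rename step is hard to avoid, but it can be hidden in a single command:

Code:
awk -F, -v OFS=, '
    NR == FNR { code[$1] = $2; next }   # first file (spec.csv): remember key -> code
    $1 in code { $3 = code[$1] }        # second file (tmp1.csv): overwrite the description field
    { print }
' spec.csv tmp1.csv > tmp1.csv.new && mv tmp1.csv.new tmp1.csv

Even for a large file this is a single pass over tmp1.csv, and because mv replaces the original in one step there is no need to keep a growing collection of differently named output files.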
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

awk script Problem

I wrote an awk script but it doesn't work as expected. The input file is attached. My awk script: /^.......*EXEC CICS /,/END-EXEC/ { if ( $0 ~ / LINK / ) { tsflag=1 } if ( $0 ~ /EXEC CICS/ && tsflag == 1 ) ... (6 Replies)
Discussion started by: pbsrinivas
6 Replies

2. Shell Programming and Scripting

Problem with one awk script

Hi, I have a file with contents like this: file1 ##################### kite kshitij jolly admire in the wing and tell me the secret behind opus 123 and the right of the track ######################### I have to write an awk script to substitute some values with others... (6 Replies)
Discussion started by: kshitij
6 Replies

3. Shell Programming and Scripting

Problem with awk script

Hi, can anyone help me with this problem? File1 ######################### HOLI 123 AND ONE TWO THREE AMITABH SAMSUNG POLI AND TWO SENSE CRYING WING PPIN TBFLAG I B AND OROLE TB_HOT=" DCT" TB_CAT=" CAT" TC_NOT=" AND" +PIN TB=" HOT" TB_GATE=" KOT" TB_LATE=" MAT" TC=LOT MAT DAT SAT... (5 Replies)
Discussion started by: kshitij
5 Replies

4. Shell Programming and Scripting

Problem with an AWK Script

Hi, I have contents in my file like this: file1 ########################## pin (PIN1) { direction : input ; capacitance : 121 ; max_transition : 231 ; } pin (PIN2) { direction : input ; capacitance : 124 ; max_transition : 421 ;... (8 Replies)
Discussion started by: kshitij
8 Replies

5. Shell Programming and Scripting

awk script problem

Hi All, I have the following input data: That I'd like to look like this ($2 is the column I'd like it to appear in) where the entries are grouped by date: The code I have at present is: awk 'BEGIN {} { dt = $1 if (dt == dt_prev) { pp = $3 ... (7 Replies)
Discussion started by: pondlife
7 Replies

6. Shell Programming and Scripting

Problem with an awk Script

Hello. First, yes, I searched the forum and Google and read many tutorials, but I still have a problem with my script. I'm struggling because I haven't worked with regular expressions before and have never had anything to do with shell scripts; I am a complete newbie in this area. I have... (8 Replies)
Discussion started by: Crashvogel
8 Replies

7. Shell Programming and Scripting

Awk script Problem

Hi, I have two files, FILE1 and FILE2, as shown below. I need to search for each element of Column 1 of FILE1 in FILE2 and globally replace it with the corresponding element of Column 2 of FILE2. For example, and1, which is the first element of Col 1 of FILE1, should be... (4 Replies)
Discussion started by: jaita
4 Replies

8. Shell Programming and Scripting

problem with awk script

Hi, I have two files. file1 :> Code: val="10" port="localhost:8080" httpadd="http:\\192.168.0.239" file2 :> Code: val=${val} val="pdssx" port=${port} port="1324" httpadd=${httpadd} httpadd="raamraav" fileloc=${fileloc} file3 (or file2) should have the following... (1 Reply)
Discussion started by: nitin.pathak
1 Replies

9. Shell Programming and Scripting

Awk Script Problem

Can someone please explain to me what is wrong with this awk script? echo 74 85 | awk '{ if ( $1 > $2 ) PRESULTS = ( $1 - $2 ); print $0,"=>","P"PRESULTS ; else if ( $1 > $2 ) NRESULTS = ( $2 - $1... (3 Replies)
Discussion started by: SkySmart
3 Replies
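
The excerpt above is cut off, but the visible part already shows the usual pitfall: in awk, an if or else branch takes exactly one statement unless it is wrapped in braces, so the extra statement before the else detaches the else from its if. A hedged sketch of the braced form follows; the $1 < $2 test in the second branch is an assumption about the intended logic:

Code:
echo 74 85 | awk '{
    if ($1 > $2) {                   # first field larger: report a positive difference
        PRESULTS = $1 - $2
        print $0, "=>", "P" PRESULTS
    } else if ($1 < $2) {            # assumed intent: second field larger
        NRESULTS = $2 - $1
        print $0, "=>", "N" NRESULTS
    }
}'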

10. Shell Programming and Scripting

awk script problem

Hello guys, I have the following problem. I'm trying to copy the content of one file and paste it into all .txt files in a directory, but at line 15. My script pastes the content at the first line, not line 15. I'm confused about how to do this. Thank you in advance for your help! This is my script: ARGS=2 ... (9 Replies)
Discussion started by: r00ty
9 Replies
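
For the line-15 problem described above, one common approach is sed's r command, which copies a file into the output after an addressed line. A minimal sketch, assuming the text to insert lives in a file named insert.txt (the name is made up) and that each .txt file has at least 14 lines:

Code:
# Append the contents of insert.txt after line 14 of every .txt file,
# so the inserted text starts at line 15; write to a temp file, then rename.
for f in *.txt; do
    sed '14r insert.txt' "$f" > "$f.new" && mv "$f.new" "$f"
done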
BB-CSVINFO.CGI(1)					      General Commands Manual						 BB-CSVINFO.CGI(1)

NAME
    bb-csvinfo.cgi - CGI program to show host information from a CSV file

SYNOPSIS
    bb-csvinfo.cgi

DESCRIPTION
    bb-csvinfo.cgi is invoked as a CGI script via the bb-csvinfo.sh CGI wrapper. Based on the parameters it receives, it searches a
    comma-separated file for the matching host, and presents the information found as a table.

    bb-csvinfo.cgi is passed a QUERY_STRING environment variable with the following parameters:

    key         (string to search for, typically the hostname)
    column      (column number to search - default 0)
    db          (name of the CSV database file in $BBHOME/etc/, default hostinfo.csv)
    delimiter   (delimiter character for columns, default semi-colon)

    CSV files are easily created from e.g. spreadsheets, by exporting them in CSV format. You should have one host per line, with the
    first line containing the column headings. Despite their name, the default delimiter for CSV files is the semi-colon - if you need
    a different delimiter, invoke bb-csvinfo.cgi with "delimiter=<character>" in the query string.

Example usage
    This example shows how you can use the bb-csvinfo CGI. It assumes you have a CSV-formatted file with information about the hosts
    stored as $BBHOME/etc/hostinfo.csv, and that the hostname is in the first column of the file.

Use with the bbgen --docurl
    The --docurl option to bbgen(1) sets up all of the hostnames on your Xymon webpages to act as links to a CGI script. To invoke the
    bb-csvinfo CGI script, run bbgen with the option

        --docurl=/cgi-bin/bb-csvinfo.sh?db=hostinfo.csv&key=%s

SEE ALSO
    bb-hosts(5), hobbitserver.cfg(5), bbgen(1)

Xymon Version 4.2.3: 4 Feb 2009                                                                          BB-CSVINFO.CGI(1)
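
For illustration only, here is a sketch of what such a setup might look like. The file contents and hostnames below are invented, and the default column (0) and default semi-colon delimiter are assumed:

# Hypothetical $BBHOME/etc/hostinfo.csv - one host per line, first line holds the column headings
hostname;location;contact;os
web01;Rack 12, DC-A;ops@example.com;Linux
db01;Rack 14, DC-A;dba@example.com;Solaris

# With bbgen run as:  bbgen --docurl='/cgi-bin/bb-csvinfo.sh?db=hostinfo.csv&key=%s'
# clicking the hostname "web01" on a Xymon page requests roughly
#   /cgi-bin/bb-csvinfo.sh?db=hostinfo.csv&key=web01
# and bb-csvinfo.cgi renders the matching row (the web01 line) as a table.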