Hi all,
I appreciate the enormous amount of knowledge that flows in this forum.
I am an average UNIX user. I have many files with lines like the below. I have separated each line with space for ease of reading. I need to replace the first occurrence of "/00" with null on those lines that have... (6 Replies)
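A minimal sketch of the substitution asked for above, assuming "null" means an empty string: sed replaces only the first match on each line when the `g` flag is omitted.

```shell
# Delete only the first "/00" on each line; sed substitutes the
# first match per line by default (no /g flag).
printf '%s\n' 'abc/00def/00ghi' | sed 's|/00||'
# To restrict the edit to lines matching some condition, prefix an
# address, e.g.:  sed '/pattern/ s|/00||'
```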
Hi,
I have a Line input for awk as follows
DROP MATERIALIZED VIEW MCR.COMM_STACK;
CREATE MATERIALIZED VIEW "MCR"."COMM_STACK"
ON PREBUILT TABLE WITHOUT REDUCED PRECISION
USING INDEX
REFRESH FAST ON DEMAND START WITH sysdate+0 NEXT SYSDATE + 7
WITH PRIMARY KEY USING DEFAULT... (3 Replies)
Hi,
I have a folder which contains multiple config.xml files and one input file, Please see the below format.
Config Files format looks like :-
Code:
<application name="SAMPLE-ARCHIVE">
<NVPairs name="Global Variables">
<NameValuePair>
... (0 Replies)
Given this row:
|lastname1|middlename1|firstname1|lastname2|middlename2|firstname2
produce this result:
|lastname|middlename|firstname
where the resultant names are based on the presence of the #2 names above. I.e., if a #2 name is passed (usually will be null,) use that - otherwise... (8 Replies)
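One way to sketch this in awk, assuming a leading pipe (so `$1` is empty with `-F'|'`) and six name fields: prefer each #2 name when it is non-empty, otherwise fall back to the #1 name.

```shell
# Input:  |ln1|mn1|fn1|ln2|mn2|fn2
# Output: |lastname|middlename|firstname  (prefer the #2 names when present)
printf '%s\n' '|smith|j|john|jones||jane' |
awk -F'|' '{
    ln = ($5 != "") ? $5 : $2   # lastname:  #2 if set, else #1
    mn = ($6 != "") ? $6 : $3   # middlename
    fn = ($7 != "") ? $7 : $4   # firstname
    print "|" ln "|" mn "|" fn
}'
# prints: |jones|j|jane
```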
I'm looking for an awk or (preferably) sed solution to search a pipe delimited file for any occurrence of an email address that does not include a designated domain, and replace the email address with a blank. E.g.
hello|smith@designateddomain.com|jones@anotherdomain.edu|1234|
turns into:
... (2 Replies)
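An awk sketch of this (awk handles the per-field test more naturally than sed): any field containing "@" that does not end in the designated domain is blanked. The domain name here is taken from the example line above.

```shell
# Blank every field that looks like an e-mail address but is not in
# the designated domain (designateddomain.com, per the example).
printf '%s\n' 'hello|smith@designateddomain.com|jones@anotherdomain.edu|1234|' |
awk -F'|' -v OFS='|' '{
    for (i = 1; i <= NF; i++)
        if ($i ~ /@/ && $i !~ /@designateddomain\.com$/)
            $i = ""          # assigning a field rebuilds $0 with OFS
    print
}'
# prints: hello|smith@designateddomain.com||1234|
```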
I have a environment property file which contains:
Input file:
value1 = url1
value2 = url2
value3 = url3 and so on.
I need to search all *.xml files under directory for value1 and replace it with url1.
Same thing I have to do for all values mentioned in input file. I need script in unix bash... (7 Replies)
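A bash sketch under these assumptions: the property file uses "key = value" lines, and every token named on the left should be replaced by its URL in each *.xml file. The file and directory names below are illustrative, and the demo setup exists only to make the example self-contained.

```shell
# Demo setup (illustrative names): one property file, one xml file.
mkdir -p demo
printf 'value1 = url1\nvalue2 = url2\n' > demo/input.txt
printf '<cfg><a>value1</a><b>value2</b></cfg>\n' > demo/config.xml

# For each "key = url" pair, replace key with url in every xml file.
# sed -i edits in place (GNU sed; on BSD/macOS use -i '').
while IFS='=' read -r key url; do
    key=$(echo "$key" | tr -d ' ')   # strip surrounding blanks
    url=$(echo "$url" | tr -d ' ')
    [ -n "$key" ] && sed -i "s|$key|$url|g" demo/*.xml
done < demo/input.txt

cat demo/config.xml
# prints: <cfg><a>url1</a><b>url2</b></cfg>
```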
Hi Guys,
I am having below file which holds data like this
file.txt
name,id,flag
apple,1,Y
apple,2,N
mango,1,Y
mango,2,Y
I need to read the above file and frame a query like this
hive -s -e "create apple_view as select 1 from main_table;"
hive -s -e "create mango_view as select... (11 Replies)
Discussion started by: rohit_shinez
11 Replies
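The query-framing step above can be sketched in awk. The truncated post leaves the exact select-list ambiguous, so this sketch assumes one command per distinct name, taking the first id whose flag is Y; "main_table" and the "<name>_view" scheme come from the post's own example.

```shell
# Demo input mirroring the post's file.txt
printf 'name,id,flag\napple,1,Y\napple,2,N\nmango,1,Y\nmango,2,Y\n' > file.txt

# Skip the header; emit one hive command per distinct name whose
# flag is Y, using the first matching id (an assumption).
awk -F',' 'NR > 1 && $3 == "Y" && !seen[$1]++ {
    printf "hive -s -e \"create %s_view as select %s from main_table;\"\n", $1, $2
}' file.txt
# prints:
# hive -s -e "create apple_view as select 1 from main_table;"
# hive -s -e "create mango_view as select 1 from main_table;"
```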
LEARN ABOUT DEBIAN
hierarch28
hierarch28(1) General Commands Manual hierarch28(1)
NAME
hierarch28 - header conversion from ESO to standard FITS
SYNOPSIS
hierarch28 [options] <FITS> [table]
DESCRIPTION
hierarch28 will convert keyword names in a FITS header to new names, using a user-provided ASCII conversion table. It is especially aimed
at removing non-standard FITS features, such as the HIERARCH ESO keyword names.
hierarch28 can also perform a translation to the IRAF convention on the following four keywords: 'RA', 'DEC', 'UT' and 'LST'. IRAF requires
these keywords to contain the string representation of their values, e.g.
RA = ' 09:45:14.594'
DEC = '-33:47:09.420'
UT = ' 01:17:21.950'
LST = ' 08:19:59.688'
The ESO standard (see http://archive.eso.org/dicb) defines these keywords as floating point values with the units degrees for RA/DEC and
elapsed seconds since midnight for UT/LST.
In order to have this translation performed, add
RA = RA
DEC = DEC
UT = UT
LST = LST
to the conversion table.
OPTIONS
-g Generate default translation tables.
FILES
hierarch28 expects a conversion table as input. The default name for this table is table.conv in the current working directory; give
another name as the last argument on the command line. This ASCII file contains a list of keywords to replace, in the following
format:
#
# Comment lines start with a hash '#'
# Blank lines are ignored
#
OLDKEYWORD1 = NEWKEYWORD1
OLDKEYWORD2 = NEWKEYWORD2
etc.
Input keywords are character strings; they may contain blanks. Example:
HIERARCH ESO DET DIT = DETDIT
One important restriction is that the new keyword name may not be longer than the original one. If it is, the program reports that it
cannot perform the search and replace.
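A sketch of a minimal conversion table following the format documented above (the keyword names are illustrative). With this table.conv in the current working directory, running "hierarch28 image.fits" would apply the renaming, in place, to image.fits.

```shell
# Write a minimal table.conv: one ESO keyword renamed, plus the
# RA = RA entry that requests the IRAF translation described above.
cat > table.conv <<'EOF'
#
# Comment lines start with a hash '#'
#
HIERARCH ESO DET DIT = DETDIT
RA = RA
EOF
```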
IMPORTANT
This program achieves maximal speed because it modifies the input file directly, in place. Be aware that running hierarch28 on a file
will modify its contents in an irreversible way!
29 May 2000 hierarch28(1)