I am currently using the sed and awk commands to filter a file that has multiple sets of data in different columns. An example of part of the file I am filtering is as follows;
When I filter the file I get the following result;
The sed and awk commands I am using are as follows;
I am trying to figure out how to filter the data so that, for example, instead of getting;
I would like to get;
Could I use the sed command twice, so that I would get;
first, and then use the sed command to remove the "TRAIN_" part to get;
This is only a suggestion; a much better method could probably be used.
Unfortunately I am new to Unix, so I am only just getting used to all the commands.
If I have made anything unclear, please let me know and I will try to explain the problem better.
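A minimal sketch of the two-pass idea above, assuming the token to remove is literally "TRAIN_" (the real file contents are not shown here, so the sample line and pattern are guesses based on the thread):

```shell
# Second pass only: strip a literal "TRAIN_" from each line.
# The input line is invented for illustration.
printf 'R3_TRAIN_COACH_90.12\n' | sed 's/TRAIN_//'
# prints: R3_COACH_90.12
```

The same `sed 's/…//'` form can be chained after the first filter with a pipe, which is usually simpler than running sed twice over temporary files.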
Good afternoon Sirs, I would like to ask your help regarding this scenario using sed and awk.
********************************************************
Host: CDRMSAPPS1
Operating System: Linux 2.6.9-42.ELsmp
Machine Type: UNIX
Host Type: Client
Version: 5.1... (2 Replies)
The problem I have is that I have 23,000 records I need to sort through to pull out the LEN: XXXX XX XX XX XX and NCOS: XXX entries, so I can insert them into a database. But some of my records include TYPE: ISDN, THE DN IS UNASSIGNED, or INVALID entries in between some records, and I would like... (2 Replies)
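One hedged sketch for this kind of extraction: keep only the LEN: and NCOS: lines and drop everything else, including the TYPE: ISDN / unassigned / invalid noise in between. The exact record layout and the file name (`switch.txt`) are assumptions, not taken from the post:

```shell
# Print only lines that start with "LEN:" or "NCOS:"; all other
# record lines (TYPE: ISDN, THE DN IS UNASSIGNED, INVALID, ...) drop out.
awk '/^LEN:/ || /^NCOS:/' switch.txt
```

The filtered pairs can then be post-processed (e.g. with `paste - -`) into one row per record before loading into the database.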
Hello folks,
I have 2 files: one (file1) contains the DDL for a view, and file2 contains the view definition/alias columns.
I want to merge the 2 into a third file using awk/sed as follows:
cheers !
FILE1
-----
PROMPT FIRST_VIEW
CREATE OR REPLACE FORCE VIEW FIRST_VIEW
AS
SELECT... (2 Replies)
Hi All,
I have a huge trade file with millions of trades. I need to remove duplicate records (e.g. I have the following records):
30/10/2009,tradeId1,..,..
26/10/2009,tradeId1,..,..,,
30/10/2009,tradeId2,..
In the above case I need to filter duplicate records, and I should get the following output.... (2 Replies)
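A common awk idiom for this kind of de-duplication, as a sketch: it keeps the first record seen per trade ID, and assumes the ID is the second comma-separated field. The thread does not say which duplicate should survive (e.g. the latest date), so that choice is an assumption:

```shell
# Print a record only the first time its trade ID (field 2) is seen;
# later records with the same ID are silently dropped.
awk -F, '!seen[$2]++' trades.csv
```

Because the lookup is a single hash per line, this stays fast even on files with millions of records and needs no prior sort.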
Hi,
I am new to Unix and would greatly appreciate some help.
I have a file containing multiple colums containing different sets of data e.g.
File 1:
John Ireland 27_December_69
Mary England 13_March_55
Mike France 02_June_80
I am currently using the awk... (10 Replies)
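The question above is cut off, but as a generic hedged illustration of column filtering on the sample rows (which columns to keep is just an assumption):

```shell
# Print the name and date-of-birth columns from whitespace-separated rows.
printf 'John Ireland 27_December_69\nMary England 13_March_55\n' \
    | awk '{ print $1, $3 }'
# prints:
# John 27_December_69
# Mary 13_March_55
```

Swapping the field numbers in `print` selects any other combination of columns without touching sed.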
Hi,
I am currently filtering a file that has multiple sets of data. An example of some of the data is as follows;
Sat Oct 2 07:42:45 2010 01:33:46 R1_CAR_12.34
Sun Oct 3 13:09:53 2010 00:02:34 R2_BUS_56.78
Sun Oct 3 21:11:39 2010 00:43:21 R3_TRAIN_COACH_90.12
Mon Oct 4... (1 Reply)
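A hedged sketch for pulling fields out of samples like those above; the field positions are inferred from the one-line examples and may not match the real file:

```shell
# Going by the sample rows, field 6 is the elapsed time and
# field 7 is the route label (e.g. R1_CAR_12.34).
awk '{ print $6, $7 }' times.log
```

From there, a further `sed 's/TRAIN_//'` or an awk `split($7, parts, "_")` can break the label into its components.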
Experts Good day,
I want to filter multiple lines of the same error on the same day down to only one error per day, keeping the first line from the log.
Here is the file:
May 26 11:29:19 cmihpx02 vmunix: NFS write failed for server cmiauxe1: error 5 (RPC: Timed out)
May 26 11:29:19 cmihpx02 vmunix: NFS... (4 Replies)
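One sketch of the "first line per day per message" idea: it assumes syslog-style lines where the third field is an HH:MM:SS timestamp, and keys each line on its content with the time blanked out, so repeats of the same error on the same day collapse to the first occurrence:

```shell
# Build a key from the line with the HH:MM:SS time removed, then
# print only the first line seen for each (date + message) key.
awk '{ k = $0; sub(/ [0-9][0-9]:[0-9][0-9]:[0-9][0-9] /, " ", k) } !seen[k]++' syslog.log
```

Because the first matching line is the one printed, the surviving entry keeps its original timestamp, as asked.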
Hello,
Does anyone know an easy way to filter this type of file? I want to get everything that has score (column 2) 100.00 and get rid of duplicates (for example gi|332198263|gb|EGK18963.1| below), so I guess uniq can be used for this?
gi|3379182634|gb|EGK18561.1| 100.00... (6 Replies)
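A hedged sketch combining both steps in one awk pass (it assumes whitespace-separated columns with the identifier in column 1 and the score in column 2, which matches the sample line above):

```shell
# Keep rows whose score (field 2) is exactly 100.00, and print each
# identifier (field 1) only the first time it appears.
awk '$2 == "100.00" && !seen[$1]++' hits.txt
```

This avoids a separate `sort | uniq` step, though `uniq` works too if the file is already grouped by identifier.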