Shell Programming and Scripting: SFTP - accidentally removing unprocessed records
Post 302714797 by jeffvansan, Friday 12th of October 2012, 05:08:13 PM
Thanks for the quick response. And thanks for fixing my code tags.
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Removing spaces between records

Hi, I have an XML file which has spaces between different records.... The current file has many lines like this... I want to delete all the spaces between > and <, if there are only spaces between them. input file <xyzr> <abc>1234</xyzr> <aaa> <bbb> ayz mnz</bbb> <sen>KEA... (6 Replies)
Discussion started by: thanuman
6 Replies
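
A minimal sed sketch for this kind of request, assuming the whitespace to remove always sits between a ">" and the following "<" on the same line (input.xml and output.xml are placeholder names):

Code:
  # collapse spaces/tabs that appear between ">" and the next "<"
  sed 's/>[[:space:]]*</></g' input.xml > output.xml

This works line by line; whitespace that spans line breaks, or entirely blank lines between records, would need an extra step such as sed '/^[[:space:]]*$/d'.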

2. Linux

Need awk script for removing duplicate records

I have a huge txt file containing millions of trade records, e.g. Trade.txt (the first 8 lines in the file are header info) COB_DATE,TRADE_ID,SOURCE_SYSTEM_TRADE_ID,TRADE_GROUP_ID, TRADE_TYPE,DEALER_NAME,EXTERNAL_COUNTERPARTY_ID, EXTERNAL_COUNTERPARTY_NAME,DB_COUNTERPARTY_ID,... (6 Replies)
Discussion started by: nmumbarkar
6 Replies
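
One hedged awk sketch, assuming the goal is to keep the 8 header lines untouched and drop exact duplicate data lines (Trade.txt as in the post; the output name is a placeholder):

Code:
  # pass the 8 header lines through, then print each remaining line only the first time it is seen
  awk 'NR <= 8 { print; next } !seen[$0]++' Trade.txt > Trade.dedup.txt

If duplicates should instead be judged on a single key such as TRADE_ID (field 2 in the header shown, purely an assumption), the test becomes !seen[$2]++ with -F, added.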

3. Shell Programming and Scripting

Removing bad records from a text file

Hi, I have a requirement where I need to remove a few bad records (by bad records I mean email IDs appear in the 1st field, where a numeric value is expected) from a text file delimited by ",". file1.txt --------- 1234,,DAVID,MAX abc@email.com,,JOHN,SMITH 234,,ROBERT,SEN I need to remove... (3 Replies)
Discussion started by: naveen_sangam
3 Replies
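
A small awk sketch under the assumption that a "good" record is one whose first comma-separated field is purely numeric (file names are placeholders):

Code:
  # keep only the rows whose 1st field consists of digits
  awk -F, '$1 ~ /^[0-9]+$/' file1.txt > file1.clean.txt

On the sample data this keeps the 1234 and 234 rows and drops the abc@email.com row.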

4. Shell Programming and Scripting

Removing duplicate records from 2 files

Can anyone help me with removing duplicate records from 2 separate files in UNIX? Please find the sample records for both files: cat Monday.dat 3FAHP0JA1AR319226MOHMED ATEK 966504453742 SAU2010DE 3LNHL2GC6AR636361HEA DEUK CHOI 821057314531 KOR2010LE 3MEHM0JG7AR652083MUTLAB NAL-NAFISAH... (4 Replies)
Discussion started by: zooby
4 Replies
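
Two hedged sketches, since the excerpt does not say which file should win; Tuesday.dat is an invented name for the second file:

Code:
  # records that do not appear in both files (assumes neither file contains internal duplicates)
  sort Monday.dat Tuesday.dat | uniq -u > unique_records.dat

  # or: drop from the second file any record already present in the first
  awk 'NR == FNR { seen[$0]; next } !($0 in seen)' Monday.dat Tuesday.dat > Tuesday.new.dat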

5. Linux

Need awk script for removing duplicate records

I have a log file containing traffic lines: 2011-05-21 15:11:50.356599 TCP (6), length: 52) 10.10.10.1.3020 > 10.10.10.254.50404: 2011-05-21 15:11:50.652739 TCP (6), length: 52) 10.10.10.254.50404 > 10.10.10.1.3020: 2011-05-21 15:11:50.652558 TCP (6), length: 89) 10.10.10.1.3020 >... (1 Reply)
Discussion started by: Rastamed
1 Replies
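
A very rough awk sketch, assuming a "duplicate" means the same source > destination pair regardless of timestamp, and that those addresses are the third-from-last and last whitespace-separated fields (both are assumptions; traffic.log is a placeholder):

Code:
  # keep the first line seen for each src/dst pair
  awk '!seen[$(NF-2), $NF]++' traffic.log > traffic.dedup.log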

6. Shell Programming and Scripting

Removing duplicate records in a file based on single column

Hi, I want to remove duplicate records including the first line based on column1. For example inputfile(filer.txt): ------------- 1,3000,5000 1,4000,6000 2,4000,600 2,5000,700 3,60000,4000 4,7000,7777 5,999,8888 expected output: ---------------- 3,60000,4000 4,7000,7777... (5 Replies)
Discussion started by: G.K.K
5 Replies
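
A sketch that matches the expected output shown (keys that occur more than once are dropped entirely, first occurrence included); it simply reads the same file twice:

Code:
  # 1st pass: count each column-1 key; 2nd pass: print only rows whose key occurs exactly once
  awk -F, 'NR == FNR { count[$1]++; next } count[$1] == 1' filer.txt filer.txt > filtered.txt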

7. Shell Programming and Scripting

Removing non matching records

Hi all, I have a file with records that should have "Page" as the first column. Some of the records have other junk characters as the first column, so please help me remove the rows that do not have "Page" as the first column. Thanks, Baski (2 Replies)
Discussion started by: baskivs
2 Replies
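
A one-line awk sketch, assuming the fields are whitespace-separated and the literal word Page is wanted in column 1 (infile/outfile are placeholders):

Code:
  # print only the rows whose first field is exactly "Page"
  awk '$1 == "Page"' infile > outfile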

8. Shell Programming and Scripting

removing duplicate records comparing 2 csv files

Hi All, I want to remove rows from File1.csv by comparing a column/field against File2.csv. If both columns match, then I want that row to be deleted from File1 using a shell script (awk). Here is an example of what I need. File1.csv: RAJAK,ACTIVE,1 VIJAY,ACTIVE,2 TAHA,ACTIVE,3... (6 Replies)
Discussion started by: rajak.net
6 Replies
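
A hedged awk sketch, assuming the key being compared is the first comma-separated field in both files (the excerpt does not pin the column down):

Code:
  # load the keys from File2.csv, then keep only File1.csv rows whose 1st field is not among them
  awk -F, 'NR == FNR { keys[$1]; next } !($1 in keys)' File2.csv File1.csv > File1.new.csv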

9. Shell Programming and Scripting

Removing duplicate records in a file based on single column explanation

I was reading this thread. It looks like a simpler way to say this is to keep only unique lines based on field or column 1. https://www.unix.com/shell-programming-scripting/165717-removing-duplicate-records-file-based-single-column.html Can someone explain this command please? How are there no... (5 Replies)
Discussion started by: cokedude
5 Replies
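
The command in question is not quoted in the excerpt, but the classic awk per-column duplicate filter it probably refers to looks like this (annotated; inputfile.txt is a placeholder, and note this variant keeps the first occurrence of each key):

Code:
  # seen[$1]    : array element indexed by the first field
  # seen[$1]++  : evaluates to the current count (0 the first time), then increments it
  # !seen[$1]++ : true only on the first occurrence of a key, so only that line is printed
  awk -F, '!seen[$1]++' inputfile.txt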

10. Shell Programming and Scripting

Removing specific records from files when duplicate key

Hello, I have been trying to remove a row from a file which has the same first three columns as another row. I have tried lots of different combinations of suggestions on this forum but can't get it exactly right. What I have is: 900 - 1000 = 0 900 - 1000 = 2562 1000 - 1100 = 0 1000 - 1100... (7 Replies)
Discussion started by: tinytimmay
7 Replies
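
A minimal awk sketch, assuming the first three whitespace-separated fields form the key and that keeping the first row seen for each key is acceptable (infile/outfile are placeholders):

Code:
  # treat fields 1-3 as the key; print only the first row carrying each key
  awk '!seen[$1, $2, $3]++' infile > outfile

If the row with the non-zero value after "=" should win instead, the rule has to change, e.g. a two-pass version that remembers the largest $NF per key.
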
HTTP::Proxy::BodyFilter::htmltext(3pm)          User Contributed Perl Documentation          HTTP::Proxy::BodyFilter::htmltext(3pm)

NAME
       HTTP::Proxy::BodyFilter::htmltext - A filter to transmogrify HTML text

SYNOPSIS
           use HTTP::Proxy::BodyFilter::tags;
           use HTTP::Proxy::BodyFilter::htmltext;

           # could it be any simpler?
           $proxy->push_filter(
               mime     => 'text/html',
               response => HTTP::Proxy::BodyFilter::tags->new,
               response => HTTP::Proxy::BodyFilter::htmltext->new(
                   sub { tr/a-zA-Z/n-za-mN-ZA-M/ }
               )
           );

DESCRIPTION
       HTTP::Proxy::BodyFilter::htmltext is a filter spawner that calls the callback of your choice on any HTML text
       (outside "<script>" and "<style>" tags, and entities). The subroutine should modify the content of $_ as it
       sees fit. Simple, and terribly efficient.

METHODS
       The filter defines the following methods, called automatically:

       init()
           Ensures that the filter is initialised with a CODE reference.

       begin()
           Per-page parser initialisation.

       filter()
           A simple HTML parser that runs the given callback on the text contained in the HTML data. Please look at
           HTTP::Proxy::BodyFilter::htmlparser if you need something more elaborate.

SEE ALSO
       HTTP::Proxy, HTTP::Proxy::BodyFilter, HTTP::Proxy::BodyFilter::htmlparser.

AUTHOR
       Philippe "BooK" Bruhat, <book@cpan.org>.

COPYRIGHT
       Copyright 2003-2005, Philippe Bruhat.

LICENSE
       This module is free software; you can redistribute it or modify it under the same terms as Perl itself.

perl v5.12.4                                             2011-07-03                  HTTP::Proxy::BodyFilter::htmltext(3pm)