Shell Programming and Scripting: Help with removal of numericals in a file
Post 302520216 by ctsgnb on Friday 6th of May 2011 06:27:47 AM
They shouldn't.
Double check the command you entered (check that you didn't forget any comma ",")
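For reference, stripping numerals from a file is usually done along the lines of the sketch below; the OP's exact command isn't quoted in this post, so the file name data.txt is only a placeholder:

    # Remove every digit from data.txt (file name assumed for illustration)
    sed 's/[0-9]//g' data.txt > data_nodigits.txt

    # Equivalent using tr and a POSIX character class
    tr -d '[:digit:]' < data.txt > data_nodigits.txt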
 

10 More Discussions You Might Find Interesting

1. Solaris

UNIX File removal without confirmation

I need to permanently remove about 3 GB of temp folders, containing log files and simulation files, from my disk. If I use the "rm -rf <filename>" command it asks for confirmation for accessing each folder and for removing every file, and giving yes to every message in the terminal window is very... (2 Replies)
Discussion started by: rajharvijay
2 Replies
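The per-file prompts usually come from an rm='rm -i' alias rather than from rm itself; a minimal sketch of working around that (the path below is a placeholder):

    # Bypass a possible rm='rm -i' alias for this one invocation
    \rm -rf /path/to/tempdir

    # Or answer "y" to every prompt if the alias cannot be bypassed
    yes | rm -r /path/to/tempdir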

2. Shell Programming and Scripting

Removal of Duplicate Entries from the file

I have a file which consists of 1000 entries. Out of those 1000 entries I have 500 duplicate entries. I want to remove the first duplicate entry (i.e. the entire line) in the file. An example of the file is shown below: 8244100010143276|MARISOL CARO||MORALES|HSD768|CARR 430 KM 1.7 ... (1 Reply)
Discussion started by: ravi_rn
1 Replies
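A common sketch for dropping repeated lines while keeping the original order; the file names are placeholders, and which copy to keep depends on what is meant by "first duplicate":

    # Keep only the first occurrence of each line, preserving order
    awk '!seen[$0]++' file > file.dedup

    # Keep the last occurrence instead (tac is GNU coreutils)
    tac file | awk '!seen[$0]++' | tac > file.dedup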

3. Shell Programming and Scripting

Removal of carriage returns from a comma delimited file

Hi, I have a file which has carriage returns in one of the fields, so a single record ends up spread across multiple lines. I want to combine all those lines of that record back into one line. Eg: Input: Id, Name, Location, Comments, Dept 2, John, US, I am from US. I... (5 Replies)
Discussion started by: mahish20
5 Replies
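One way to rejoin such split records is to keep appending lines until the expected number of comma-separated fields is reached; the sketch below assumes five fields per record (Id, Name, Location, Comments, Dept) and placeholder file names:

    # Accumulate input lines until a record has at least 5 comma-separated
    # fields, then print it as one line (field count of 5 is an assumption;
    # literal CR characters, if present, can be stripped first with tr -d '\r')
    awk '{
        rec = (rec == "" ? $0 : rec " " $0)
        if (split(rec, f, ",") >= 5) { print rec; rec = "" }
    }' input.csv > output.csv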

4. Shell Programming and Scripting

Removal of file extension question

All, I know that this will remove a file extension from a file name, but reading the documentation on how it works confuses me. ${filename%.*} Can anyone explain what exactly is going on here? Is filename the pattern, and does % say to cut anything that starts with "."? Also, can I run this... (4 Replies)
Discussion started by: markdjones82
4 Replies
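For context, in ${filename%.*} the parameter is filename, the pattern is ".*" (a dot followed by anything), and % strips the shortest match of that pattern from the end of the value; a quick illustration with an assumed example value:

    # ${var%pattern} removes the SHORTEST match of pattern from the end of $var
    filename="report.2011.txt"
    echo "${filename%.*}"     # -> report.2011   (shortest ".*" suffix removed)
    echo "${filename%%.*}"    # -> report        (%% removes the longest match)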

5. Shell Programming and Scripting

Help with removal of blank spaces in a file

Hello.. I have a text file. I want to remove all the blank spaces (except tabs) from the file.. I tried using the sed command shown below: sed 's/ //g' file1 But the problem with the above command is that it also eliminates the tabs between the columns.. For example, if the contents... (7 Replies)
Discussion started by: abk07
7 Replies
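A sketch that deletes only literal space characters and leaves tabs alone (file names are placeholders); if tabs are still disappearing, the pattern being typed probably contains a tab or a [[:space:]] class rather than a single space:

    # tr deletes only the space character; tabs are untouched
    tr -d ' ' < file1 > file1.nospace

    # sed equivalent, matching a literal space only
    sed 's/ //g' file1 > file1.nospace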

6. Shell Programming and Scripting

Removal of HTML ASCII Codes from file

Hi all, I have a file with extended ASCII codes in the description which need to be removed. List of extended ASCII codes: "Œ", "œ", "Š", "š", "Ÿ", "ƒ", "-", "-", "‘", "'", "‚", "“", "”", "„", "†", "‡", "•", "...", "‰", "€", "™" Sample data: Test Details-HAVE BEEN PUBLISHED... (1 Reply)
Discussion started by: btt3165
1 Replies
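Rather than listing each offending character, one hedged approach is to drop every byte outside the printable ASCII range; the file names below are placeholders:

    # Keep only tab, newline, carriage return and printable ASCII (0x20-0x7E)
    LC_ALL=C tr -cd '\11\12\15\40-\176' < input.txt > output.txt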

7. Solaris

Removal of zip file permanently

Hi Everyone, I see a peculiar thing happening on my server. I have a zipped file created long back as a normal user and I am trying to remove it now. When I tried to remove it as that particular user, I was not able to do that. So I logged in as the root user and removed it successfully. But it... (8 Replies)
Discussion started by: Sricharan21
8 Replies

8. Shell Programming and Scripting

String removal from file

Dear all, from the input file shown below I need the output file shown below. I am using the code below but it did not work. I/p file BSCBCH1 EXAL-1-4 WO* SMPS MAINS FAIL BSCBCH1 EXAL-1-5 WO* SMPS RECTIFIER FAIL BSCBCH1 EXAL-1-6 WO* SMPS MAJOR ALARM BSCBCH2 EXAL-1-10 WO* ... (5 Replies)
Discussion started by: jaydeep_sadaria
5 Replies
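The desired output isn't quoted in this summary; as one hedged sketch, this removes the literal "WO* " token from every line (file names assumed):

    # Delete the literal string "WO* " wherever it appears
    # (the * must be escaped so sed treats it literally)
    sed 's/WO\* //g' input.txt > output.txt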

9. Shell Programming and Scripting

Honey, I broke awk! (duplicate line removal in 30M line 3.7GB csv file)

I have a script that builds a database: ~30 million lines, a ~3.7 GB .csv file. After multiple optimizations it takes about 62 min to bring in and parse all the files, and it used to take 10 min to remove duplicates until I was requested to add another column. I am using the highly optimized awk code: awk... (34 Replies)
Discussion started by: Michael Stora
34 Replies
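When the awk hash of seen lines no longer fits comfortably in memory, an external sort is the usual fallback; a sketch with placeholder file names and tunables:

    # In-memory dedup that preserves order (every unique line is held in RAM)
    awk '!seen[$0]++' big.csv > big.dedup.csv

    # Disk-backed alternative with bounded memory; output is sorted rather
    # than in original order (buffer size and temp dir are assumptions)
    LC_ALL=C sort -u -S 2G -T /tmp big.csv > big.dedup.csv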

10. Shell Programming and Scripting

Removing strings from a CSV file using removal strings provided in another file

What I need is to remove, from each line of Location_file.txt, the text matching any of the entries in Remove_location.txt. Location_file.txt FlowPrePaid, h3nmg1cm2,Jamaica_MTAImageFileFlowPrePaid,h0nmg1cm1, Flow_BeatTest,FlowRockTest FlowNewTest,FlowNewTest,h0nmg1cm1 PartiallySubscribed,... (3 Replies)
Discussion started by: ketanraut
3 Replies
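One way to do this is to turn the removal list into a sed script on the fly; the sketch assumes the entries in Remove_location.txt are one per line, non-empty, and free of sed metacharacters:

    # Build one "s|entry||g" command per removal entry, then apply them all
    sed 's/.*/s|&||g/' Remove_location.txt > remove.sed
    sed -f remove.sed Location_file.txt > Location_file.cleaned.txt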