Shell Programming and Scripting: How to delete lines from a file in PERL? Post 302562758 by durden_tyler on Saturday 8th of October 2011 04:42:46 AM
Quote:
Originally Posted by vanitham
...
I have 500 MB of file.
I want to retain first line and last line of the file.
...How can i do it in PERL?
...
For a file of that size, the shell's head/tail commands should be much faster than opening the file and iterating through it in Perl.
I'd do something like -

Code:
(head -1 myfile >myfile.tmp; tail -1 myfile >>myfile.tmp) && mv myfile.tmp myfile

tyler_durden
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

delete all lines in file

How can I delete all lines in a file using "vi"? (6 Replies)
Discussion started by: strok

2. Shell Programming and Scripting

delete lines from a file.

I have a file which has about 500K records and I need to delete about 50 records from it. I know the line numbers and am using sed '13456,13457,......d' filename > newfile. It does not seem to be working. Any help will be greatly appreciated. (5 Replies)
Discussion started by: oracle8
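For the record, the comma in that sed address is the likely culprit: sed separates multiple delete commands with semicolons (or repeated -e), while a comma denotes a range. A minimal sketch on a toy file:

```shell
# Each "Nd" deletes one line; join the commands with ";" rather than ",".
# A comma form such as '2,4d' would delete the whole RANGE of lines 2-4.
printf 'a\nb\nc\nd\ne\n' > records        # sample stand-in file
sed '2d;4d' records > records.new         # drop lines 2 and 4 only
cat records.new
```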

3. Shell Programming and Scripting

Delete few lines from a file.

Hi there, I need some help. I have a file with 10,000 lines and want to delete the first 50,000 lines through a script. What would the command be? Thanks, Satadru (3 Replies)
Discussion started by: Satadru

4. UNIX for Dummies Questions & Answers

delete lines in a file

I've got a file like this: Grid-ref= 443, 229 167 169 204 233 290 309 308 326 300 251 194 161 148 189 228 251 296 329 331 338 308 263 219 179 178 203 215 252 277 319 327 335 312 264 196 149 120 172 226 253 297 329 323 322 305 242 203 136 ... (20 Replies)
Discussion started by: su_in99

5. Shell Programming and Scripting

How to delete lines in a file that have duplicates, or derive the lines that appear once

Input: a b b c d d I need: a c I know how to get the lines that have duplicates: sort file | uniq -d But I need the opposite of this. I have searched the forum and other places as well, but have found solutions for everything except this variant of the problem. (3 Replies)
Discussion started by: necroman08
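The opposite of uniq -d is uniq -u, which prints only the lines that occur exactly once in sorted input; a minimal sketch with the poster's sample data:

```shell
# uniq -d prints the duplicated lines; uniq -u prints lines seen once.
printf 'a\nb\nb\nc\nd\nd\n' > input
sort input | uniq -u > once.txt           # leaves a and c
cat once.txt
```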

6. UNIX for Dummies Questions & Answers

How to get only the required lines & delete the rest of the lines in a file

Hi, I have a file which contains huge data, a.dat: PDE 1990 1 9 18 51 28.90 24.7500 95.2800 118.0 6.1 0.0 BURMA event name: 010990D time shift: 7.3000 half duration: 5.0000 latitude: 24.4200 longitude: 94.9500 depth: 129.6000 Mrr: ... (7 Replies)
Discussion started by: reva

7. UNIX for Advanced & Expert Users

In a huge file, Delete duplicate lines leaving unique lines

Hi All, I have a very huge file (4GB) which has duplicate lines. I want to delete the duplicate lines, leaving only unique lines. sort, uniq, and awk '!x++' are not working as they run out of buffer space. I don't know if this works: I want to read each line of the file in a for loop, and want to... (16 Replies)
Discussion started by: krishnix
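One possible workaround, sketched below on a toy file: sort can spill to disk, so its -S (memory cap) and -T (temp directory) options sidestep the buffer-space problem, at the cost of reordering the output (unlike awk '!x++', which keeps first occurrences in order). The flag values here are illustrative, not tuned:

```shell
# sort -u de-duplicates; -S caps the in-memory buffer and -T picks where
# temporary runs spill, so a 4 GB file need not fit in RAM.
printf 'x\ny\nx\nz\n' > huge.txt          # tiny stand-in for the 4 GB file
sort -S 64M -T "${TMPDIR:-/tmp}" -u huge.txt > unique.txt
cat unique.txt
```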

8. Shell Programming and Scripting

Delete lines in an array using Perl

I have an array @check which contains text. I want to open the array and delete lines starting from a word called "check1" till "check2". For eg:- check1 Use descriptive titles when posting. For example, do not post questions with subjects like "Help Me!", "Urgent!!" or "Doubt".... (0 Replies)
Discussion started by: rajkrishna89

9. Shell Programming and Scripting

Delete Lines : after pattern1 and between pattern2 and pattern3 using awk/sed/perl

Hi, I need to delete lines from a file which are after pattern1 and between pattern2 and pattern3, as below: aaaaaaaa bbbbbbbb pattern1 cdededed ddededed pattern2 fefefefe <-----Delete this line efefefef <-----Delete this line pattern3 adsffdsd huaserew Please can you suggest... (6 Replies)
Discussion started by: vk2012
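A minimal awk sketch of one reading of that request (not from the thread): once pattern1 has been seen, suppress the lines strictly between pattern2 and pattern3, keeping the marker lines themselves:

```shell
# Reconstruct the poster's sample input, one field per line.
printf '%s\n' aaaaaaaa bbbbbbbb pattern1 cdededed ddededed \
              pattern2 fefefefe efefefef pattern3 adsffdsd huaserew > sample
# p1: pattern1 has been seen; skip: currently inside a pattern2..pattern3
# block. pattern2 and pattern3 are printed; the lines between them are not.
awk '/pattern1/ { p1 = 1 }
     p1 && /pattern2/ { skip = 1; print; next }
     skip && /pattern3/ { skip = 0 }
     !skip' sample > cleaned
cat cleaned
```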

10. Shell Programming and Scripting

Delete 40 lines after every 24 lines from a file

Hello, I have a file of more than 10000 lines. I want to delete 40 lines after every 20 lines, e.g. from a huge file I want to delete lines 34 - 74, then 94 - 134, and so on. Please let me know how I can do it. Best regards, (11 Replies)
Discussion started by: nehashine
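The title says 24 and the body says 20; assuming the body's cycle (keep 20 lines, delete the next 40), a modulo test on the line number in awk handles any file length. A sketch on generated data:

```shell
# Keep the first 20 lines of every 60-line block, i.e. delete 40 lines
# after every 20 kept. Assumes the 20/40 cycle from the message body.
seq 1 180 > big                       # 180 numbered lines = 3 full cycles
awk '(NR - 1) % 60 < 20' big > kept   # keeps 1-20, 61-80, 121-140
wc -l < kept                          # 60 lines survive
```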