Deleting duplicated chunks in a file using awk/sed
Hi all,
I always appreciate the help I get from this site.
I would like to delete duplicated chunks of strings on the same row.
One chunk is comprised of four lines such as:
path name
starting point
ending point
voltage number
I would like to delete a chunk from a row when its "ending point" duplicates that of an earlier chunk on the same row.
For example, the ending points of the first and second chunks are the same in the first row, so I keep only the first chunk and remove the second.
In the second row, the ending points of the first and third chunks are the same, and again only the first chunk is kept.
input.txt:
Expected_output.txt:
The number of columns can be up to 20 in a file.
I actually posted the same question on another website and somebody replied, but the suggestion did not work correctly. Any help is appreciated.
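Since the input and expected output files were not reproduced above, here is a minimal sketch under stated assumptions: each line ("row") holds chunks side by side as groups of four whitespace-separated fields (path name, starting point, ending point, voltage number), and the ending point is the third field of each group. The function name `dedupe_chunks` and the sample data are hypothetical.

```shell
dedupe_chunks() {        # $1: input file; keeps the first chunk per ending point on each line
    awk '{
        out = ""
        split("", seen)                     # reset the per-line lookup table
        for (i = 1; i + 3 <= NF; i += 4) {
            end = $(i + 2)                  # ending point = 3rd field of the chunk
            if (!(end in seen)) {
                seen[end] = 1
                out = out (out == "" ? "" : " ") $i " " $(i+1) " " $(i+2) " " $(i+3)
            }
        }
        print out
    }' "$1"
}
```

Usage would be `dedupe_chunks input.txt > Expected_output.txt`. Because the loop walks fields four at a time, it handles any number of chunks per row (up to the 20 columns mentioned) without changes.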
I cannot figure this one out, so I turn to unix.com for help. I have a file in which some lines contain consecutive duplicate columns, like the following:
adb abc abc asd adfj
123 123 123 345
234 444 444 444 444 444 23
and the output I want is
adb abc asd adfj
123 345... (5 Replies)
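One way to get that output is to compare each column with the one before it and drop repeats; a sketch (the function name `squeeze_dups` is made up):

```shell
squeeze_dups() {         # stdin filter: drop a column when it repeats the previous column
    awk '{
        out = $1
        for (i = 2; i <= NF; i++)
            if ($i != $(i - 1)) out = out " " $i
        print out
    }'
}
```

Note this removes only *consecutive* duplicates, as the sample data implies: `234 444 444 444 444 444 23` becomes `234 444 23`, but a value repeated later in the line would be kept.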
hi there,
i have a text file like that one:
I'd like to delete the second block, including its Start and End lines!
Does anyone have an idea?
Thanks for your help,
Roland (4 Replies)
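The sample file did not survive in the excerpt, so the following is a sketch assuming blocks are delimited by lines beginning with `Start` and `End`; a counter skips everything from the second `Start` through its matching `End`:

```shell
drop_second_block() {    # stdin filter: remove the 2nd Start..End block, delimiters included
    awk '
        /^Start/           { blk++ }
        blk != 2           { print }
        /^End/ && blk == 2 { blk++ }   # past the 2nd End, resume printing
    '
}
```

Changing the `2` lets the same trick delete the Nth block instead.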
cat file.txt
fvnuiehuewf
ruevhxncvkjrh
zxjvurhfuwe
jkhvBEGINvfnvf
ijrgioe
Trying to delete a line that has the pattern "BEGIN"
cat sedtest
filename=file.txt
pattern=BEGIN
sed "/^$pattern/d" "$filename" (9 Replies)
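The script above deletes nothing because `^` anchors the match to the start of the line, and in `jkhvBEGINvfnvf` the pattern occurs mid-line. Dropping the anchor deletes any line containing the pattern; wrapped in a small helper (the name `delete_matching` is made up):

```shell
delete_matching() {      # $1: pattern, $2: file
    sed "/$1/d" "$2"     # no ^ anchor, so the pattern may occur anywhere in the line
}
```

Usage: `delete_matching BEGIN file.txt`. Keep the `^` only if you really mean "lines that start with the pattern".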
Input:
:: gstreamer
:: xine-lib
:: xine-lib-extras
Output should be:
gstreamer xine-lib xine-lib-extras
How can it be done with sed or perl? (12 Replies)
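One awk approach: strip the leading `:: ` from each line and print the remainders space-separated on one line (the function name `join_names` is made up):

```shell
join_names() {           # stdin filter: strip a leading ":: " and join all lines with spaces
    awk '{
        sub(/^:: */, "")
        printf "%s%s", (NR > 1 ? " " : ""), $0
    }
    END { print "" }'
}
```

An equivalent sed/paste pipeline would be `sed 's/^:: //' | paste -sd' ' -`, though `paste -s` option handling varies between implementations, so the awk form is more portable.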
Hello
I have a unique problem of needing to delete large files slowly off of an XSan.
I was wondering if there is a script I could use to delete 100gb chunks of files and folders that get placed into a watch folder, slowly, so as not to disrupt the other users. I would like to use Automator in... (0 Replies)
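A minimal sketch of the "delete slowly" idea: remove files one at a time with a pause between deletions so the SAN is never saturated. The watch-folder path and the pause length are assumptions, and filenames containing newlines are not handled:

```shell
slow_delete() {          # $1: watch folder, $2: seconds to pause between deletions
    find "$1" -type f |
    while IFS= read -r f; do
        rm -f -- "$f"
        sleep "$2"
    done
}
```

Usage might be `slow_delete /Volumes/XSan/watch 5`, run from cron or launchd; empty directories could be swept afterwards with `find "$1" -type d -empty -delete`.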
hi,
Here is excerpt from my xml file
<!-- The custom module to do the authentication for LDAP
-->
</login-module>
<login-module code="com.nlayers.seneca.security.LdapLogin" flag="sufficient">
<module-option... (1 Reply)
First time poster, but the forum has saved my bacon more times than... Lots.
Anyway, I have a text file, and wanted to use Awk (or any other sensible program) to print out overlapping sections of arbitrary length. To describe by example, for the file
1
2
3
4
5
etc...
I want the output... (3 Replies)
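The desired output was cut off above, so here is a sketch under stated assumptions: print windows of `win` consecutive lines whose starting points advance by `step` lines, so consecutive windows overlap when `step < win`. The `--` separator, the function name, and both parameters are assumptions:

```shell
windows() {              # $1: window length, $2: step between window starts (stdin filter)
    awk -v win="$1" -v step="$2" '
        { buf[NR] = $0 }                    # slurp the file into an array
        END {
            for (s = 1; s <= NR; s += step) {
                for (i = s; i < s + win && i <= NR; i++) print buf[i]
                print "--"                  # separator between windows
            }
        }
    '
}
```

For the 1..5 sample, `windows 3 2` prints the sections 1-3, 3-5, and the final partial window 5.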
Hi gurus,
I want to split a main file into 20 files with 2500 lines in each file. My main file contains 2500*20 lines in total. I wrote the following awk, but it breaks with an error.
awk '{ for (i = 1; i <= 20; i++) { starts=2500*$i-1; ends=2500*$i; NR>=starts && NR<=ends {f=My$i".txt"; print >> f;... (10 Replies)
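The quoted awk fails because `My$i".txt"` mixes shell-style expansion into awk (awk builds strings by concatenation, not `$var`), and a pattern like `NR>=starts && NR<=ends { ... }` cannot sit inside another action block. There is also no need to loop over all 20 ranges for every line; one counter suffices. A working sketch (function name and prefix argument are made up):

```shell
split_chunks() {         # $1: input file, $2: lines per piece, $3: output-file prefix
    awk -v n="$2" -v p="$3" '
        (NR - 1) % n == 0 { close(f); f = p (++c) ".txt" }   # start a new piece
        { print > f }
    ' "$1"
}
```

Usage: `split_chunks main.txt 2500 My` produces My1.txt through My20.txt. The `close(f)` matters when many pieces are written, since awk implementations limit simultaneously open files. (GNU `split -l 2500` does the same job if the fixed output names are acceptable.)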
Dear all,
I always appreciate your help.
I would like to delete lines containing duplicated strings in the second column.
test.txt
658 invert_d2e_q_reg_0_/Qalu_ecl_zlow_e 0.825692
659 invert_d2e_q_reg_0_/Qalu_byp_rd_data_e 0.825692
660 invert_d2e_q_reg_0_/Qalu_byp_rd_data_e 0.825692... (1 Reply)
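Keeping only the first line for each value of the second column is a one-liner with awk's classic seen-array idiom; `!seen[$2]++` is true only the first time a given `$2` appears:

```shell
dedupe_col2() {          # $1: file; keep only the first line for each column-2 value
    awk '!seen[$2]++' "$1"
}
```

For the sample above, line 659 is kept and line 660 (same second column) is dropped.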
Discussion started by: jypark22
1 Replies
pack_fopen_chunk
pack_fopen_chunk(3alleg4) Allegro manual
NAME
pack_fopen_chunk - Opens a sub-chunk of a file. Allegro game programming library.
SYNOPSIS
#include <allegro.h>
PACKFILE *pack_fopen_chunk(PACKFILE *f, int pack);
DESCRIPTION
Opens a sub-chunk of a file. Chunks are primarily intended for use by the datafile code, but they may also be useful for your own file routines. A chunk provides a logical view of part of a file, which can be compressed as an individual entity and will automatically insert and check length counts to prevent reading past the end of the chunk. The PACKFILE parameter is a previously opened file, and `pack' is a boolean parameter which will turn compression on for the sub-chunk if it is non-zero. Example:
PACKFILE *output = pack_fopen("out.raw", "w!");
...
/* Create a sub-chunk with compression. */
output = pack_fopen_chunk(output, 1);
if (!output)
abort_on_error("Error saving data!");
/* Write some data to the sub-chunk. */
...
/* Close the sub-chunk, recovering parent file. */
output = pack_fclose_chunk(output);
The data written to the chunk will be prefixed with two length counts (32-bit, a.k.a. big-endian). For uncompressed chunks these will both
be set to the size of the data in the chunk. For compressed chunks (created by setting the `pack' flag), the first length will be the raw
size of the chunk, and the second will be the negative size of the uncompressed data.
To read the chunk, use the following code:
PACKFILE *input = pack_fopen("out.raw", "rp");
...
input = pack_fopen_chunk(input, 1);
/* Read data from the sub-chunk and close it. */
...
input = pack_fclose_chunk(input);
This sequence will read the length counts created when the chunk was written, and automatically decompress the contents of the chunk if it
was compressed. The length will also be used to prevent reading past the end of the chunk (Allegro will return EOF if you attempt this),
and to automatically skip past any unread chunk data when you call pack_fclose_chunk().
Chunks can be nested inside each other by making repeated calls to pack_fopen_chunk(). When writing a file, the compression status is
inherited from the parent file, so you only need to set the pack flag if the parent is not compressed but you want to pack the chunk data.
If the parent file is already open in packed mode, setting the pack flag will result in data being compressed twice: once as it is written
to the chunk, and again as the chunk passes it on to the parent file.
RETURN VALUE
Returns a pointer to the sub-chunked PACKFILE, or NULL if there was some error (e.g. you are using a custom PACKFILE vtable).
SEE ALSO
pack_fclose_chunk(3alleg4), pack_fopen(3alleg4)
Allegro version 4.4.2