Shell Programming and Scripting: Remove duplicated records and update last line record counts. Post 303032051 by RudiC on Sunday 10th of March 2019, 06:45:15 AM
On top of what Don Cragun said, the last approach would not account for "duplicate duplicates".


Illogical nonsense... please disregard.

Last edited by RudiC; 03-10-2019 at 08:33 AM.
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

remove duplicated xml record in a file under unix

Hi, if I have a file in XML format, I would like to remove duplicated records and save the result to a new file. Is it possible to write a script to do it? (8 Replies)
Discussion started by: happyv
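A minimal sketch, assuming each XML record sits on a single line (multi-line records would need a proper record separator); the sample file and tag names are hypothetical:

```shell
# Sample input: one XML record per line (hypothetical format).
printf '<rec id="1"/>\n<rec id="2"/>\n<rec id="1"/>\n' > in.xml

# Keep the first occurrence of every record, drop later duplicates.
awk '!seen[$0]++' in.xml > out.xml
cat out.xml
```

The `!seen[$0]++` idiom prints a line only the first time it is seen, without requiring sorted input.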

2. Shell Programming and Scripting

remove duplicated columns

Hi all, I have a file containing multiple columns; the file is sorted by col2 and col3. I want to remove a line if its col2 and col3 are the same as in another line. Example fileA:
AA BB CC DD
CC XX CC DD
BB CC ZZ FF
DD FF HH HH
The output is:
AA BB CC DD
BB CC ZZ FF... (6 Replies)
Discussion started by: kamel.seg
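A sketch of the usual awk approach, keying the seen-set on columns 2 and 3 only; the sample data below is my own, not the poster's:

```shell
# Hypothetical sample: four whitespace-separated columns per line.
cat > fileA <<'EOF'
AA BB CC DD
EE BB CC FF
BB CC ZZ FF
EOF

# Print a line only the first time its (col2, col3) pair appears.
awk '!seen[$2, $3]++' fileA
```

The comma in `seen[$2, $3]` joins the two fields with awk's SUBSEP, so distinct pairs cannot collide.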

3. Shell Programming and Scripting

Help to Add and Remove Records only from first line/last line

Hi, I need help with a probably very simple issue, but somehow I am not getting it. I am not able to establish a sed or awk command which adds to the first line of a text and removes the "," only from the last line. The file looks as follows:
TABLE1,
TABLE2,
.
.
.
TABLE99,... (4 Replies)
Discussion started by: enjoy
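One way to do both edits in a single sed pass, using the `1` and `$` addresses for the first and last line; the "CREATE LIST" prefix is a placeholder I made up:

```shell
# Hypothetical input: a comma-terminated list of table names.
cat > tables.txt <<'EOF'
TABLE1,
TABLE2,
TABLE99,
EOF

# Prepend text to the first line; strip the trailing comma
# from the last line only.
sed -e '1s/^/CREATE LIST /' -e '$s/,$//' tables.txt
```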

4. Shell Programming and Scripting

Sending e-mail of record counts in 3 or more files

I am trying to load data into 3 tables simultaneously (which is working fine). Then when loaded, it should count the total number of records in all the 3 input files and send an e-mail to the user. The script is working fine, as far as loading all the 3 input files into the database tables, but... (3 Replies)
Discussion started by: msrahman
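The counting step can be sketched as below; the file names are stand-ins, and the actual mail delivery depends on the site's MTA setup, so it is only shown as a comment:

```shell
# Hypothetical stand-ins for the three input files.
printf 'a\nb\n'    > load1.dat
printf 'c\n'       > load2.dat
printf 'd\ne\nf\n' > load3.dat

# Total record count across all three files.
total=$(cat load1.dat load2.dat load3.dat | wc -l)
echo "Loaded $total records"

# With a configured MTA, the summary could be mailed, e.g.:
#   echo "Loaded $total records" | mailx -s "Load summary" user@example.com
```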

5. Shell Programming and Scripting

Split a single record to multiple records & add folder name to each line

Hi Gurus, I need to split a single record in the file (asdf) into multiple records based on the number of bytes (44 characters), so every record will have 44 characters. All the records should be in the same file; to each of these lines I need to add the folder (<date>) name. I have a dir. in which... (20 Replies)
Discussion started by: ram2581
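A sketch using `fold` for the fixed-width split and `sed` for the suffix; the input data and the folder name are hypothetical:

```shell
# Hypothetical single-record input: one line of 88 'x' characters.
printf '%088d\n' 0 | tr '0' 'x' > asdf

dir=20190310   # placeholder for the <date> folder name

# Cut the record into fixed 44-character lines, then append the
# folder name to each line.
fold -w 44 asdf | sed "s/\$/ $dir/"
```

`fold -w 44` breaks on byte/column width regardless of word boundaries, which matches the fixed-record requirement here.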

6. UNIX for Dummies Questions & Answers

Hardcoding & Record counts in a file

Hi, I have a huge comma-delimited file, and I have to prepend the following four lines to the start of the file through a shell script:
FILE NAME = TEST_LOAD
DATETIME = CURRENT DATE TIME
LOAD DATE = CURRENT DATE
RECORD COUNT = TOTAL RECORDS IN FILE
Source data: 1,2,3,4,5,6,7... (7 Replies)
Discussion started by: shruthidwh
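A sketch of the header-prepend, counting records first and then writing header plus data in one group redirection; the sample data and output file name are my own:

```shell
# Hypothetical source data.
printf '1,2,3\n4,5,6\n' > source.csv

records=$(wc -l < source.csv)
{
  echo "FILE NAME = TEST_LOAD"
  echo "DATETIME = $(date '+%Y-%m-%d %H:%M:%S')"
  echo "LOAD DATE = $(date '+%Y-%m-%d')"
  echo "RECORD COUNT = $records"
  cat source.csv
} > TEST_LOAD.csv
```

Counting before the group redirection matters: the record count has to be known before the header is written.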

7. Shell Programming and Scripting

New file should store all the 7 existing filenames and their record counts and ftp th

Hi, I need help with the concern below. There is a script that has 7 existing files (in a path, say, usr/appl/temp/file1.txt) and I need to create one new blank file, say "file_count.txt", in the same script itself. Then the new file <file_count.txt> should store all the 7 filenames and... (1 Reply)
Discussion started by: pr293
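The filename-plus-count file can be built with a simple loop over `wc -l`; the directory and files below are stand-ins for the poster's seven, and the FTP transfer is site-specific so it is omitted:

```shell
# Hypothetical stand-ins for the data files.
mkdir -p tmpdir
printf 'a\n'    > tmpdir/file1.txt
printf 'a\nb\n' > tmpdir/file2.txt

# Record each filename with its line count in file_count.txt.
: > tmpdir/file_count.txt
for f in tmpdir/file[0-9].txt; do
  printf '%s %s\n' "$f" "$(wc -l < "$f")" >> tmpdir/file_count.txt
done
cat tmpdir/file_count.txt
# The FTP step depends on the remote host and credentials; not shown.
```

Using `wc -l < "$f"` rather than `wc -l "$f"` keeps the filename out of wc's output, so the loop controls the format.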

8. Shell Programming and Scripting

How to Remove the new line character inbetween a record

I have a file in which a single record spans multiple lines, e.g.:
File 1
====
14|\n leave request \n accepted|Yes|
15|\n leave request not \n acccepted|No|
I wanted to remove the '\n' characters. I used the below code (found somewhere in this forum): perl -e 'while (<>) { if... (1 Reply)
Discussion started by: machomaddy
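An awk sketch of the line-joining idea, under the assumption (mine, based on the sample) that each logical record contains exactly three '|' delimiters; the test data below is simplified:

```shell
# Hypothetical input: each record broken across two physical lines.
cat > records.txt <<'EOF'
14|leave request
 accepted|Yes|
15|leave request not
 accepted|No|
EOF

# Accumulate physical lines; a record is complete once the buffer
# holds three '|' delimiters (an assumption about this data).
awk '{ buf = buf $0 } gsub(/\|/, "|", buf) >= 3 { print buf; buf = "" }' records.txt
```

`gsub(/\|/, "|", buf)` replaces every pipe with itself, so it changes nothing but returns the pipe count, which serves as the end-of-record test.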

9. Shell Programming and Scripting

How to remove duplicated lines?

Hi, if I have a file like this:
Query=1
a
a
b
c
c
c
d
Query=2
b
b
b
c
c
e
. . . (7 Replies)
Discussion started by: the_simpsons
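If the goal is to remove duplicates only within each "Query=" block (my reading of the sample), the seen-set can be cleared at every block header; `split("", seen)` is the portable way to empty an awk array:

```shell
cat > queries.txt <<'EOF'
Query=1
a
a
b
c
c
c
d
Query=2
b
b
b
c
c
e
EOF

# Reset the seen-set at every "Query=" header so duplicates are
# removed per block, not across the whole file.
awk '/^Query=/ { split("", seen); print; next } !seen[$0]++' queries.txt
```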

10. Shell Programming and Scripting

Join files, omit duplicated records from one file

Hello, I have 2 files, e.g.:
more file1 file2
::::::::::::::
file1
::::::::::::::
1 fromfile1
2 fromfile1
3 fromfile1
4 fromfile1
5 fromfile1
6 fromfile1
7 fromfile1
::::::::::::::
file2
::::::::::::::
3 fromfile2
5 fromfile2 (4 Replies)
Discussion started by: CHoggarth
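A sketch of the two-pass awk idiom, assuming the first column is the join key and file2's version wins when a key appears in both files (one plausible reading of the question):

```shell
cat > file1 <<'EOF'
1 fromfile1
2 fromfile1
3 fromfile1
4 fromfile1
5 fromfile1
EOF
cat > file2 <<'EOF'
3 fromfile2
5 fromfile2
EOF

# First pass (NR==FNR) reads file2: remember and print its keys.
# Second pass prints only file1 lines whose key was not in file2.
awk 'NR==FNR { seen[$1]; print; next } !($1 in seen)' file2 file1
```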
hardlink(1)						      General Commands Manual						       hardlink(1)

NAME
       hardlink - Consolidate duplicate files via hardlinks

SYNOPSIS
       hardlink [-c] [-n] [-v] [-vv] [-h] directory1 [ directory2 ... ]

DESCRIPTION
       This manual page documents hardlink, a program which consolidates duplicate files in one or more directories using hardlinks.

       hardlink traverses one or more directories searching for duplicate files. When it finds duplicate files, it uses one of them as the master. It then removes all other duplicates and places a hardlink for each one pointing to the master file. This allows for conservation of disk space where multiple directories on a single filesystem contain many duplicate files. Since hard links can only span a single filesystem, hardlink is only useful when all directories specified are on the same filesystem.

OPTIONS
       -c     Compare only the contents of the files being considered for consolidation. Disregards permission, ownership and other differences.

       -f     Force hardlinking across file systems.

       -n     Do not perform the consolidation; only print what would be changed.

       -v     Print summary after hardlinking.

       -vv    Print every hardlinked file and bytes saved. Also print summary after hardlinking.

       -h     Show help.

AUTHOR
       hardlink was written by Jakub Jelinek <jakub@redhat.com>. Man page written by Brian Long. Man page updated by Jindrich Novy <jnovy@redhat.com>.

BUGS
       hardlink assumes that its target directory trees do not change from under it. If a directory tree does change, this may result in hardlink accessing files and/or directories outside of the intended directory tree. Thus, you must avoid running hardlink on potentially changing directory trees, and especially on directory trees under control of another user.
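The hardlink utility may not be installed everywhere, but its per-duplicate effect, replacing a copy with a hard link to a master file, can be reproduced by hand with `ln` and verified by checking that both names share one inode (the `-ef` test is a common shell extension, not strict POSIX):

```shell
# Two duplicate files (stand-ins for what hardlink would find).
printf 'same content\n' > master.txt
printf 'same content\n' > copy.txt

# What hardlink does for each duplicate: replace the copy with a
# hard link pointing at the master.
ln -f master.txt copy.txt

# Both names now refer to the same inode on the same filesystem.
[ master.txt -ef copy.txt ] && echo "consolidated"
```

Running `hardlink -n -vv directory` first is the safe way to preview what the real tool would change without modifying anything.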
Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.