Finding duplicates then copying, almost there, maybe?
Post 302582428 by balajesuri in UNIX for Dummies Questions & Answers, 16 December 2011
Primitive, but works.

bash code:
#! /bin/bash

# For each regular file in folderA, look for a file with the same name
# in folderB and copy it into folderC. Globs are used instead of `ls`
# so that names containing spaces survive word splitting.
for x in /path/folderA/*
do
    [ -f "$x" ] || continue
    x=${x##*/}
    for y in /path/folderB/*
    do
        y=${y##*/}
        [ "$x" = "$y" ] && cp "/path/folderB/$y" /path/folderC/
    done
done
 
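For large folders the nested loop does one comparison per pair of names. A sketch of a faster variant using comm(1) on sorted listings; this assumes the filenames contain no newlines:

bash code:
# comm -12 prints only the names present in both listings.
comm -12 <(ls /path/folderA | sort) <(ls /path/folderB | sort) |
while IFS= read -r f; do
    [ -f "/path/folderA/$f" ] && cp "/path/folderB/$f" /path/folderC/
done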

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

finding duplicates with perl

I have a huge file (over 30 MB) that I am processing with perl. I am pulling out a list of filenames and placing them in an array called @reports. I am fine up to this point. What I then want to do is go through the array and find any duplicates. If there is a duplicate, output it to the screen.... (3 Replies)
Discussion started by: dangral
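For a pure shell take on the same question, a minimal sketch, assuming the filenames sit one per line in a file (reports.txt is a placeholder name):

bash code:
# Sort first so duplicates are adjacent; uniq -d then prints each
# line that occurs more than once, shown once.
sort reports.txt | uniq -d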

2. UNIX for Dummies Questions & Answers

finding and copying files !

Hi, I have a question relating to finding and copying files. I need to find the .pdf files from the specified directory, which has subdirectories too. I only need the .pdf files, not the directories, and need to copy those files into my current directory. copy files from :... (5 Replies)
Discussion started by: bregoty
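A minimal sketch of the usual find(1) answer, with /path/to/source standing in for the poster's directory:

bash code:
# Copy every regular *.pdf file found at any depth into the current
# directory; the directories themselves are not copied.
find /path/to/source -type f -name '*.pdf' -exec cp {} . \;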

3. Shell Programming and Scripting

finding duplicates in columns and removing lines

I am trying to figure out how to scan a file like so:
1 ralphs office","555-555-5555","ralph@mail.com","www.ralph.com
2 margies office","555-555-5555","ralph@mail.com","www.ralph.com
3 kims office","555-555-5555","kims@mail.com","www.ralph.com
4 tims... (17 Replies)
Discussion started by: totus
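Keeping only the first line per key value is a one-liner in awk. A sketch assuming the fields are separated by "," as in the sample, which makes field 3 the e-mail address:

bash code:
# Print a line only the first time its third field is seen.
awk -F'","' '!seen[$3]++' file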

4. UNIX for Dummies Questions & Answers

Finding and Copying Email

I have to create a bash script that will find Feedback emails and copy them to a labFeedback folder in my mail directory. I have an idea in my head on what commands can be used for this (find obviously among them). However, I have no idea where to start. I'm not sure what info needs to be given,... (1 Reply)
Discussion started by: Joesgrrrl
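One possible starting point, assuming one message per file under ~/mail and that the word Feedback appears in the Subject header (both are assumptions about the poster's setup):

bash code:
mkdir -p ~/mail/labFeedback
# grep -l prints only the names of the files whose Subject matches.
grep -l '^Subject:.*Feedback' ~/mail/* 2>/dev/null |
while IFS= read -r msg; do
    cp "$msg" ~/mail/labFeedback/
done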

5. Shell Programming and Scripting

Finding duplicates from positioned substring across lines

I have millions of records, each containing exactly 50 characters, and have to check the uniqueness of a 4-character substring of the 50 characters (position known in advance) and report if any duplicates are found. E.g. data... AAAA00000000000000XXXX0000 0000000000... up to 50 chars... (2 Replies)
Discussion started by: gapprasath
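awk can do this in a single pass. A sketch assuming the key occupies character positions 19-22, which is where the XXXX sits in the sample record, with data.txt standing in for the input file:

bash code:
# Print every record whose 4-character key has already been seen.
awk '{ key = substr($0, 19, 4); if (seen[key]++) print }' data.txt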

6. Shell Programming and Scripting

Help finding non duplicates

I am currently creating a script to find filenames that are listed only once in an input file (i.e., find non-duplicates). I then want to report those single files in another file. Here is the function that I have so far: function dups_filenames { file2="" file1="" file="" dn="" ch="" pn="" ... (6 Replies)
Discussion started by: chipblah84
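If the goal is simply names that occur exactly once, sort plus uniq -u covers it. A sketch assuming one filename per line (input.txt and singles.txt are placeholder names):

bash code:
# uniq -u keeps only the lines that are not repeated.
sort input.txt | uniq -u > singles.txt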

7. Shell Programming and Scripting

finding duplicates in csv based on key columns

Hi team, I have a 20-column CSV file. I want to find the duplicates in that file based on column1, column10, column4, column6, column8 and column2. If those columns have the same values, then it is a duplicate record. Can someone help me find the duplicates? Thanks in advance. ... (2 Replies)
Discussion started by: baskivs
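A sketch of the awk approach, building a composite key from the six named columns (file.csv is a placeholder, and the fields are assumed to contain no embedded commas):

bash code:
# A row is a duplicate when its key columns have all been seen
# together before; print those rows.
awk -F',' '{
    key = $1 FS $2 FS $4 FS $6 FS $8 FS $10
    if (seen[key]++) print
}' file.csv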

8. Shell Programming and Scripting

Finding duplicates in a file excluding specific pattern

I have a unix file like below:
>newuser
newuser
<hello
hello
newone
I want to find the unique values in the file (excluding <, >), so that the output should be:
>newuser
<hello
newone
Can anybody tell me what command produces this new file? (7 Replies)
Discussion started by: shiva2985
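One reading of the requirement: strip a leading < or > to get the bare name, then keep only the first line seen for each name. A sketch:

bash code:
# The first occurrence (which carries the < or > marker in the
# sample) is printed; later bare repeats are dropped.
awk '{ name = $0; sub(/^[<>]/, "", name); if (!seen[name]++) print }' file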

9. Shell Programming and Scripting

UNIX scripting for finding duplicates and null records in pk columns

Hi, I have a requirement. For example: I have a text file with the pipe symbol (|) as delimiter and 4 columns a, b, c, d. Here a and b are primary key columns. I want to process that file to find the duplicates and null values in the primary key columns (a, b). I want to write the unique records in which... (5 Replies)
Discussion started by: praveenraj.1991
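A sketch that flags both conditions in one pass, assuming | as the delimiter and columns 1 and 2 as the key (data.txt is a placeholder):

bash code:
awk -F'|' '{
    # Empty key column: report it and move on.
    if ($1 == "" || $2 == "") { print "NULL KEY: " $0; next }
    key = $1 FS $2
    # Repeated key: report the duplicate row.
    if (seen[key]++) print "DUPLICATE: " $0
}' data.txt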

10. Shell Programming and Scripting

Copying files from one directory to another, renaming duplicates.

Below is the script I have, but I would like it simplified while still doing the same job. I need a script to copy files, not directories or sub-directories, into an existing or new directory. If files have the same name but different extensions, for example 01.doc and 01.pdf, then only copy the .doc file. ... (1 Reply)
Discussion started by: Gilljambo
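A compact sketch of just the copy rule, with src/ and dest/ as stand-ins for the real directories: a .pdf is skipped whenever a .doc with the same basename exists.

bash code:
mkdir -p dest
for f in src/*; do
    [ -f "$f" ] || continue            # files only, no directories
    case $f in
        *.pdf) [ -f "${f%.pdf}.doc" ] && continue ;;
    esac
    cp "$f" dest/
done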
Geometry::Primitive::Polygon(3pm)        User Contributed Perl Documentation        Geometry::Primitive::Polygon(3pm)

NAME
    Geometry::Primitive::Polygon - Closed shape with an arbitrary number of points.

DESCRIPTION
    Geometry::Primitive::Polygon represents a two-dimensional figure bounded by a series of points that form a closed path.

SYNOPSIS
    use Geometry::Primitive::Polygon;

    my $poly = Geometry::Primitive::Polygon->new;
    $poly->add_point($point1);
    $poly->add_point($point2);
    $poly->add_point($point3);
    # No need to close the path; it is handled automatically.

ATTRIBUTES
    points
        Set/Get the arrayref of points that make up this polygon.

METHODS
    new
        Creates a new Geometry::Primitive::Polygon.

    area
        Returns the area of this polygon. Assumes it is non-self-intersecting.

    add_point
        Adds a point to this polygon.

    clear_points
        Clears all points from this polygon.

    point_count
        Returns the number of points that bound this polygon.

    get_point
        Returns the point at the specified offset.

    point_end
        Gets the end point. Provided for the Shape role.

    point_start
        Gets the start point. Provided for the Shape role.

    scale ($amount)
        Scales this polygon by the supplied amount.

AUTHOR
    Cory Watson <gphat@cpan.org>

COPYRIGHT AND LICENSE
    You can redistribute and/or modify this code under the same terms as Perl itself.

perl v5.10.1                                2010-01-10                      Geometry::Primitive::Polygon(3pm)