Shell Programming and Scripting: Shell Script - find, recursively, all files that are duplicated
Post 302360336 by danmero on Thursday 8th of October 2009 04:20:18 PM
Check if you have fdupes in your OS.
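If fdupes isn't available, a rough equivalent can be pieced together from find, md5sum, sort and uniq. This is a minimal sketch, not a drop-in replacement: it assumes GNU uniq (for --all-repeated) and an md5sum command (on BSD systems md5 -r gives similar output), and the scan directory is passed as an argument.

#!/bin/sh
# List groups of files with identical content under a directory tree.
# Checksum every file, sort so equal checksums sit next to each other,
# then keep only the repeated groups.
DIR=${1:-.}    # directory to scan; defaults to the current directory

find "$DIR" -type f -exec md5sum {} + | sort | uniq -w32 --all-repeated=separate

Each blank-line-separated group in the output is one set of files with identical checksums; true duplicates can be confirmed with cmp.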
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

find and replace a search string recursively in files

Hi, I have a directory structure with a top-level dir, subdirectories under it, files inside those, and so on. I need to find the files under every dir and subdir that contain the search string and replace it. My search string is searchstring=/a/b and the string to replace it with is /a/c/b. Please help. ... (7 Replies)
Discussion started by: mohanpadamata
7 Replies
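For a recursive search-and-replace like the one above, find can feed grep -l (so only files that actually contain the string are touched) and sed can use a delimiter other than / so the slashes in the strings need no escaping. A sketch, assuming GNU sed for -i and with /path/to/dir as a placeholder:

#!/bin/sh
# Replace /a/b with /a/c/b in every file under a directory tree.
# The '#' delimiter in sed avoids escaping the slashes in the strings.
searchstring='/a/b'
replacestring='/a/c/b'

find /path/to/dir -type f -exec grep -l "$searchstring" {} + |
while IFS= read -r f; do
    sed -i "s#$searchstring#$replacestring#g" "$f"
done

On BSD/macOS sed, -i needs an explicit (possibly empty) suffix argument, e.g. sed -i ''.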

2. Shell Programming and Scripting

Shell script to find files

Hi Experts, I am trying to write a shell script that should: 1. Find files named ab030.txt, Ab030.TXT, AB030.TXT, ab030.TXT or AB030.txt. 2. If any of the above is found, rename it to AB030.TXT. Thanks. (4 Replies)
Discussion started by: welldone
4 Replies
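One way to cover every capitalisation at once is find's case-insensitive -iname test (a GNU/BSD extension), which matches all five spellings listed above and more. A sketch with /path/to/dir as a placeholder:

#!/bin/sh
# Rename any case variant of ab030.txt to AB030.TXT in place.
find /path/to/dir -type f -iname 'ab030.txt' |
while IFS= read -r f; do
    dir=$(dirname "$f")
    [ "$f" = "$dir/AB030.TXT" ] && continue    # already has the right name
    mv "$f" "$dir/AB030.TXT"
done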

3. Shell Programming and Scripting

Need to recursively find all shell scripts in the /xyz directory

Please advise how to find, or grep for, all shell script files recursively. Some files don't have a .sh or .ksh extension. find / -name "*" | xargs grep bin | grep sh ?? TIA (1 Reply)
Discussion started by: budz26
1 Replies
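When the extension can't be trusted, the shebang line is a more reliable marker. The sketch below checks the first line of every regular file under /xyz; the '#!'*sh* pattern is deliberately loose, so it also catches ksh, bash and zsh scripts.

#!/bin/sh
# Print every file under /xyz whose first line looks like a shell shebang,
# so scripts are found even without a .sh or .ksh extension.
find /xyz -type f |
while IFS= read -r f; do
    case $(head -n 1 "$f" 2>/dev/null) in
        '#!'*sh*) printf '%s\n' "$f" ;;    # matches #!/bin/sh, bash, ksh, zsh, ...
    esac
done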

4. Shell Programming and Scripting

Recursively find *.ext files using find

Hi, I am getting the syntax error "find: missing conjunction" for the code below: D1_DIR=/x/y/z D1_NAME=file_name FILE_DIR=pset for file in `find ${D1_DIR}/${D1_NAME} -name "*\.${FILE_DIR}" /dev/null {} \;` do echo $file done # Trying to find all the files with *.pset... (5 Replies)
Discussion started by: cvsanthosh
5 Replies
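The "missing conjunction" comes from the stray /dev/null {} \; left after -name with no -exec in front of it. A corrected sketch of the same loop, reading find's output line by line instead of word-splitting it inside a for loop:

#!/bin/sh
# Find all *.pset files under ${D1_DIR}/${D1_NAME} and print them.
D1_DIR=/x/y/z
D1_NAME=file_name
FILE_DIR=pset

find "${D1_DIR}/${D1_NAME}" -name "*.${FILE_DIR}" |
while IFS= read -r file; do
    echo "$file"
done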

5. Shell Programming and Scripting

Shell script to copy particular file from directories recursively

I have a directory path under which there are several sub-directories, and each sub-directory contains one env.cnf file. I want to copy the env.cnf file from each sub-directory and place it in a destination path under the name subdir_env.cnf. After copying the env.cnf files from the source... (4 Replies)
Discussion started by: Optimus81
4 Replies
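A sketch of one way to do this, with SRC and DEST as placeholder paths; -mindepth 2 (a GNU/BSD find extension) skips any env.cnf sitting directly in the source directory itself:

#!/bin/sh
# Copy each sub-directory's env.cnf into DEST, renamed to <subdir>_env.cnf.
SRC=/path/to/source
DEST=/path/to/destination

find "$SRC" -mindepth 2 -type f -name env.cnf |
while IFS= read -r f; do
    sub=$(basename "$(dirname "$f")")    # sub-directory that owns this env.cnf
    cp "$f" "$DEST/${sub}_env.cnf"
done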

6. Shell Programming and Scripting

Script to compare files recursively using sdiff

Hi All, I have been searching for ideas on how to compare the same files from two different paths: one path has an oldfiles directory and the other a newfiles directory. Each main directory has sub-directories in it, and each sub-directory in turn has... (3 Replies)
Discussion started by: Optimus81
3 Replies
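One simple pattern is to walk the old tree, take each file's path relative to it, and run sdiff against the file at the same relative path in the new tree. A sketch with OLD and NEW as placeholder paths:

#!/bin/sh
# Compare every file in the old tree against its counterpart in the new tree.
OLD=/path/to/oldfiles
NEW=/path/to/newfiles

( cd "$OLD" && find . -type f ) |
while IFS= read -r rel; do
    if [ -f "$NEW/$rel" ]; then
        echo "==== $rel ===="
        sdiff -s "$OLD/$rel" "$NEW/$rel"    # -s suppresses identical lines
    else
        echo "==== $rel is missing under $NEW ===="
    fi
done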

7. Shell Programming and Scripting

Help with calling a case statement recursively in a UNIX shell script

Hi, I am new to Unix shell scripting. I need some help writing a case statement that is called repeatedly: the case offers options 1 2 3 4, and after the user inputs, say, 1 and it is executed, the case should ask for the options again, but the user should not be able to enter the same value 1,... (7 Replies)
Discussion started by: karthikram
7 Replies
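A loop around the case statement plus a small "already used" list covers both requirements. In the sketch below the option names (backup, cleanup, report) are placeholders for whatever the real menu does:

#!/bin/sh
# Menu loop that keeps asking for options and refuses an option that has
# already been executed.
used=","
while :; do
    printf 'Choose: 1) backup  2) cleanup  3) report  4) quit : '
    read -r choice
    case $choice in
        1|2|3) ;;                                   # valid, still to be checked below
        4) break ;;
        *) echo "Invalid choice"; continue ;;
    esac
    case $used in
        *",$choice,"*) echo "Option $choice already executed, pick another"; continue ;;
    esac
    case $choice in
        1) echo "running backup ..." ;;
        2) echo "running cleanup ..." ;;
        3) echo "running report ..." ;;
    esac
    used="$used$choice,"                            # remember this choice
done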

8. Shell Programming and Scripting

How to recursively /usr/bin/find only readonly files?

I'm having trouble because, for some reason, cp -R missed a few files, and so did xcopy /s. Since I'm running Cygwin on Win10, I decided to see if robocopy would be more effective. The trouble is that something, maybe xcopy /s or cp -R, dutifully set certain files to read-only, so when I try a... (6 Replies)
Discussion started by: siegfried
6 Replies
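To list just the read-only files, find can test the permission bits directly. The first form below uses a symbolic -perm test (POSIX, though very old finds may not accept it); the -writable test and the chmod one-liner are shown as comments because they rely on GNU find, as shipped with Cygwin:

#!/bin/sh
# List regular files that are missing the owner-write bit ("read only").
find . -type f ! -perm -u+w

# GNU find can also test writability for the current user (honours ACLs):
#   find . -type f ! -writable

# To clear the read-only bit in one pass before copying:
#   find . -type f ! -perm -u+w -exec chmod u+w {} +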

9. Shell Programming and Scripting

Find Large Files Recursively From Specific Directory

Hi. I found many scripts on the web for achieving this, but I'd like to use this one: find /EDWH-DMT03 -xdev -size +10000 -exec ls -la {} \; | sort -n -k 5 > LARGE.rst The problem is: why does it still list files of 89 bytes in the output? Is there anything wrong with the command? My... (7 Replies)
Discussion started by: aimy
7 Replies
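A plausible explanation for the stray 89-byte entries is that, without -type f, find can hand a directory to ls -la, and ls then lists everything inside that directory, small files included. A tightened sketch (the bare +10000 is counted in 512-byte blocks, i.e. roughly 5 MB):

#!/bin/sh
# Only regular files larger than about 5 MB, sorted by size (column 5 of ls -l).
# -type f keeps directories out; -d makes ls describe the entry itself rather
# than a directory's contents.
find /EDWH-DMT03 -xdev -type f -size +10000 -exec ls -ld {} \; |
    sort -n -k 5 > LARGE.rst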

10. UNIX for Beginners Questions & Answers

Finding and removing old files and zipping files using a shell script

Hi, I am trying to remove files that are older than 10 days and at the same time gzip the remaining files, using a shell script. The script was written as follows: find /jboss7_homes/JBOSS7/SKYLIV??/SKYLIV??_CRM/jboss-eap-7.0/standalone/log -mtime +10 -type f | xargs rm -f find /cer_skyliv??/log... (6 Replies)
Discussion started by: venkat918
6 Replies
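A sketch of the same cleanup that avoids piping names through xargs (so filenames with spaces survive) and then compresses whatever uncompressed logs remain. Only the first log tree from the post is shown; the ?? globs are left for the shell to expand:

#!/bin/sh
# Delete logs older than 10 days, then gzip the remaining uncompressed logs.
# LOGDIR is deliberately left unquoted below so the shell expands the ?? globs.
LOGDIR="/jboss7_homes/JBOSS7/SKYLIV??/SKYLIV??_CRM/jboss-eap-7.0/standalone/log"

find $LOGDIR -type f -mtime +10 -exec rm -f {} +      # remove old logs
find $LOGDIR -type f ! -name '*.gz' -exec gzip {} +   # compress the rest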