Search Results

Search: Posts Made By: cue
3,342
Posted By pandeesh
use sed 's/^/"/ ; s/$/"/' file.txt | tr -d "\n" ...
use sed 's/^/"/ ; s/$/"/' file.txt | tr -d "\n" > newfile.txt
3,342
Posted By chapeupreto
Hello cue You may use sed and tr for achieving...
Hello cue
You may use sed and tr for achieving this goal.
Try something like:

sed 's/^/"/ ; s/$/"/' file.txt | tr "\n" " " > newfile.txt
By doing that, newfile.txt has all the paths from...
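As a quick illustration (the paths below are made up), if file.txt contains

/tmp/a
/tmp/b

the pipeline leaves newfile.txt with everything on one line, each path wrapped in double quotes:

"/tmp/a" "/tmp/b"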
18,453
Posted By alister
It checks to see if the string you're searching...
It checks to see if the string you're searching for (stored in the variable x) is present in the current line (stored in $0). If so, index() returns a non-zero value which in AWK is equivalent to a...
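A minimal sketch of that idiom (the variable name and file are only examples): a pattern that evaluates to non-zero selects the line, so awk prints it.

awk -v x='/tmp/foo' 'index($0, x)' file.txt    # prints every line containing the literal string /tmp/foo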
18,453
Posted By alister
In that case, the value of x inside AWK is...
In that case, the value of x inside AWK is vulnerable to regular expression metacharacters. Say, for example, that you wanted to match a pathname that had a dot. The dot would not be treated...
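For example (made-up paths, just to contrast the two forms): with x='/tmp/a.b', a dynamic regex match also accepts lines where the dot matched some other character, while index() only accepts the literal string.

printf '/tmp/aXb\n' | awk -v x='/tmp/a.b' '$0 ~ x'         # prints /tmp/aXb (dot is a metacharacter)
printf '/tmp/aXb\n' | awk -v x='/tmp/a.b' 'index($0, x)'   # prints nothing (literal comparison)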
4,635
Posted By admin_xor
Why use those fancy things when you can use a...
Why use those fancy things when you can use a smooth ssh session and do endless things with a terminal? ;) Just kidding!

Try with -depth 32; if that does not work you can decrease it...
2,142
Posted By yazu
sed -r 's/, +,/,/' removes the first empty...
sed -r 's/, +,/,/'
removes the first empty field and so on. I think it doesn't scale for your real needs. Try to reformulate your problem.
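On a made-up sample line (GNU sed; -r enables extended regexes):

echo 'apple,  ,banana' | sed -r 's/, +,/,/'
apple,banana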
2,142
Posted By yazu
In the third line you just remove columns 2-4...
In the third line you just remove columns 2-4 with spaces in it, not extracting any data. And you have different "Granysmithes" in your input and output files. Anyway you can try:
sed -r 's/, +,/,/;...
10,237
Posted By thegeek
alternatively you can use finddup also: Finddup -...
Alternatively, you can use finddup: Finddup - Find duplicate files by content, name (http://finddup.sourceforge.net/usage.html)

Find the duplicate files by name
./finddup -n

Displays files...
9,304
Posted By frans
I'm searching for a similar tool. I found a...
I'm searching for a similar tool.
I found a simple way to print the duplicate file names
#!/bin/bash
FILES=/dev/shm/filelist
find -type f | awk -F'/' '{print $NF}' | sort | uniq -d > $FILES...
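A self-contained sketch along those lines (the search path and temp-file location are only examples): record base names that occur more than once, then list every path carrying one of them.

#!/bin/bash
FILES=/dev/shm/filelist
# base names seen more than once
find . -type f | awk -F'/' '{print $NF}' | sort | uniq -d > "$FILES"
# show the full paths behind each duplicated name
while IFS= read -r name; do
    find . -type f -name "$name"
done < "$FILES"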
9,304
Posted By radoulov
Perl has all of that builtin: perl...
Perl has all of that builtin:

perl -MFile::Find -e'
$d = shift || die "$0 dir\n";
find {
wanted => sub {
push @{$u{$_}}, $File::Find::name if -f and -s > 10000;
}
...
10,237
Posted By DGPickett
xargs is a very nice way to get economy of scale...
xargs is a very nice way to get economy of scale in shell scripting, like calling grep once for every 99 files, not for every file. -n99 does 2 things, recommends trying to fit 99 on the command line...
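A typical form of that batching (the pattern and starting directory are placeholders; -print0/-0 assume GNU or BSD tools): grep is started once per batch of up to 99 files rather than once per file.

find . -type f -print0 | xargs -0 -n99 grep -l 'pattern'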
10,237
Posted By DGPickett
Keep a list of your fie cksums, and use that to...
Keep a list of your file cksums, and use that to filter new files (still cksums everything every time):
#!/usr/bin/bash

# first time only # ( cd AllMyFiles ; find * -type f | xargs -n99 cksum >...
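A rough sketch of the same idea (directory and file names are assumptions): keep the known checksums in a file, re-checksum everything, and print only the entries that were not there before.

#!/usr/bin/bash
# first time only: record checksums of everything under AllMyFiles
# ( cd AllMyFiles && find . -type f | xargs -n99 cksum > ../known.cksums )

# later runs: checksum again and report lines not already in the known list
( cd AllMyFiles && find . -type f | xargs -n99 cksum ) > current.cksums
grep -v -F -x -f known.cksums current.cksums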
1,274
Posted By vbe
Sorry for the delete... unintentional... Lets...
Sorry for the delete... unintentional...
Lets try again:
I suppose it has something to do with the way grep works (parses and searches for a matching string) and the result of "$@" which is all the...
1,274
Posted By markdark
vbe is right, but maybe the extra info will be...
vbe is right, but maybe the extra info will be that the double quotes (") make sure that the shell won't tokenize the string
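A quick way to see the difference (the demo arguments are arbitrary):

set -- "two words" single
for a in "$@"; do printf '<%s>\n' "$a"; done   # <two words> then <single>  - arguments preserved
for a in $@;   do printf '<%s>\n' "$a"; done   # <two> <words> <single>     - split on whitespace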
1,644
Posted By Scrutinizer
You can check this with a case statement, e.g.: ...
You can check this with a case statement, e.g.:
case $input in
  ignore*) bla bla 2 ;;
  *) for i in $input
     do
       delete corresponding file or directory
     done ;;
esac
1,644
Posted By Scrutinizer
I do not think there is a need for trap, since...
I do not think there is a need for trap, since you are looping through a list of files and/or directories and checking user input anyway for every delete. So instead of checking for y/n only you...
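A sketch of that approach (the prompt wording and the rm call are assumptions): ask once per item, so an interrupted run needs no trap handling.

for f in $input
do
    printf 'Delete %s? [y/n/q] ' "$f"
    read -r answer
    case $answer in
        y|Y) rm -rf -- "$f" ;;
        q|Q) break ;;
        *)   : ;;              # skip this one
    esac
done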
Showing results 1 to 16 of 16

 