Search Results

Search: Posts Made By: Nexeu
Posted By Scrutinizer
Hi,

in order for xargs to use NUL character as separator, the utility before the pipe needs to produce them.

try:

mdfind -0 -onlyin . "Note" | xargs -0 -I {} echo {}
or
tag -0 -f "Note" ....
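The same rule can be checked with find, whose -print0 option emits NUL-terminated names just as mdfind -0 does. A minimal sketch (scratch directory and filename are mine):

```shell
# Demo: -print0 makes find emit NUL-terminated names, so xargs -0 can
# split them safely even when a filename contains a space.
cd "$(mktemp -d)"
touch "My Note.txt"
find . -name "*Note*" -print0 | xargs -0 -I {} echo "found: {}"
# prints: found: ./My Note.txt
```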
Posted By Scrutinizer
If you use this:
s/^/\./
Then the . is added at the beginning of the path.
To add it to the last item in the path, try:
s|.*/|$0/.|
Posted By Scrutinizer
Try this instead:
s|(.*)/|$1/.|
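Note that the corrected expression uses an ERE group with a backreference, so plain sed needs either -E or the BRE backslash spelling. A quick sketch (the path is mine):

```shell
# Greedy .* stops at the LAST slash, so the dot lands in front of the
# final path component only. Portable BRE spelling:
echo "dir/sub/file.txt" | sed 's|\(.*\)/|\1/.|'
# Same edit with extended REs (sed -E on both BSD/macOS and GNU sed):
echo "dir/sub/file.txt" | sed -E 's|(.*)/|\1/.|'
# both print: dir/sub/.file.txt
```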
Posted By Don Cragun
Assuming you don't have any tab characters in your filenames, this should work for you. I added a function to process filenames before giving them to xargs to appropriately quote the single-quotes...
Posted By Don Cragun
OK. So you aren't using a Linux system, and you aren't using a BSD system, and you aren't using OS X. What operating system are you using? Show us the output from the commands:
uname -a
type awk...
Posted By Don Cragun
In post #5 in this thread you showed us that you were using the prompt:
Untitleds-MacBook-Pro:~ Nexeu$
I made the obviously bad assumption that that meant you were running OS X on a MacBook Pro...
Posted By Don Cragun
No, you can't look at the names of files and magically guess how many hyphens are in the 3rd lines of those files. And, as noted before, using find | ... | mv ... may miss files depending on...
Posted By Don Cragun
Remember that we're processing a single directory containing 690,000 files. So, we have some constraints...

In theory for i in *.txt should work, but even though no exec is involved, we are still...
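A sketch of one way around that tension: let the glob expand inside the shell (a builtin, so no exec and no ARG_MAX limit) and hand the names to xargs for batching. The scratch files are mine:

```shell
# The glob expands inside the shell; printf is a builtin, so the long
# argument list never goes through exec. xargs then batches the names
# into as few invocations of the external command as the kernel allows.
# Caveat: filenames containing newlines would break this pipeline.
cd "$(mktemp -d)"
touch a.txt b.txt c.txt
printf '%s\n' *.txt | xargs echo processing
# prints: processing a.txt b.txt c.txt
```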
Posted By bakunin
Yes, but the original problem was: read a lot of files (~700k) and extract only a certain part of line 3. Shell expansion can extract that part, but it is not easy to interrupt the reading process after...
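One way to interrupt the reading process after line 3 with shell builtins alone, sketched here (the function name is mine):

```shell
# Read at most three lines per file in the shell itself; return as soon
# as line 3 is captured instead of scanning the rest of the file.
get_line3() {
    n=0
    while IFS= read -r line; do
        n=$((n + 1))
        if [ "$n" -eq 3 ]; then
            printf '%s\n' "$line"
            return 0
        fi
    done < "$1"
}
```

Usage: `get_line3 somefile` prints the third line of somefile and stops reading there.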
Posted By RudiC
Got you. That was the only method that came to my mind; I thought you had another method.

---------- Post updated at 18:37 ---------- Previous update was at 18:29 ----------

You could do it...
Posted By bakunin
By replacing a line with a certain number of hyphens with a word I can easily parse. Like in the following (simplified, just proof-of-concept) example:

sed -n 's/^x$/one/p
...
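A hedged completion of that proof-of-concept (the hyphen counts and replacement words are mine):

```shell
# Map a line consisting of exactly one, two, or three hyphens to a word
# that is trivial to match later; -n plus /p prints only lines where a
# substitution actually happened.
printf -- '-\n--\n---\n' | sed -n 's/^-$/one/p
s/^--$/two/p
s/^---$/three/p'
# prints: one, two, three (one per line)
```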
Posted By RudiC
The -t option is for specifying the target directory BEFORE a list of files, e.g. with, but not limited to, xargs.
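A sketch of both spellings (-t is GNU coreutils mv only; the scratch directory and file names are mine):

```shell
# GNU mv takes the target up front, so xargs can append many sources:
#   find . -name '*.txt' -print0 | xargs -0 mv -t target/
# BSD/macOS mv has no -t; a portable fallback puts the target last via
# -I, at the cost of one mv invocation per file:
cd "$(mktemp -d)"
mkdir target
touch a.txt b.txt
find . -maxdepth 1 -name '*.txt' -print0 | xargs -0 -I {} mv {} target/
ls target
```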
Posted By wisecracker
Hi Don Cragun...

I am not sure if I am missing something but -t is not needed if the target directory ends with a / ...
Last login: Sat May 16 15:10:40 on ttys000
AMIGA:barrywalker~> cp...
Posted By RudiC
Please help me out:
How would you do that in sed?

If not in sed, this might be a possible way:
for FN in file[12]
do  TMP=$(sed -n '3 {s/[^[]*\[|\].*//; s/,.*//; s/[^-]*//gp;q;}' $FN)
    echo...
Posted By bakunin
This is not a complete solution, just a detail for one:

Despite the preference of the majority here for awk, I suppose that sed is the fastest way to search for something at exactly line 3:

sed -n...
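That speed claim rests on sed being able to stop reading early; a minimal sketch (the sample input is mine):

```shell
# Print line 3 and quit immediately; the 3q keeps sed from reading the
# remainder of a potentially large file.
printf 'one\ntwo\nthree\nfour\n' | sed -n '3p;3q'
# prints: three
```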
Posted By Don Cragun
OK. I'll see what I can do this weekend.

Off to bed for now...
Posted By Don Cragun
The number of hyphens in the 1st word is no problem as long as I know that is what you want.

I still have one unanswered question from question #2 in my last post:
Posted By Don Cragun
Thanks for the information. That helps a lot. I think I'll be able to pull something together this weekend that should work, but I still need a little more information.

I believe that you want...
Posted By Don Cragun
OK. We're making progress, but I still don't have clear requirements. Please answer each of the following questions:

In the example you gave in post #21, you showed us a line 3 that contained 20...
Posted By Don Cragun
It looks like awk on OS X has a maximum number of open files (not counting standard input or standard output) of 17. I think we can create an awk script for you that will be able to create shell...
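One way such a script could sidestep the open-file cap, sketched with made-up file contents and destination names (dir_N, where N is the hyphen count in the first word of line 3):

```shell
# awk never opens the destination files itself; it prints one mv command
# per input file and a single sh process executes them all.
cd "$(mktemp -d)"
printf 'x\ny\n--- a\n' > f1.txt
printf 'x\ny\n-- b\n'  > f2.txt
mkdir dir_2 dir_3
awk 'FNR==3 { n = gsub(/-/, "-", $1)
              printf "mv \"%s\" dir_%d/\n", FILENAME, n }' f1.txt f2.txt | sh
```

Here gsub is used only for its return value, the number of hyphens it "replaced" in the first word of line 3.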
Posted By RudiC
Use smaller groups of files, e.g. aa*, ab*, ac*, etc. You can create these with nested for loops.
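A sketch of those nested loops (the prefix letters and scratch files are mine):

```shell
# Walk two-character prefixes so each glob expansion stays small; a
# prefix that matches nothing leaves the pattern unexpanded, which the
# -e test catches so the group is skipped.
cd "$(mktemp -d)"
touch aa1 ab1 ab2 ca1
for a in a b c; do
    for b in a b c; do
        set -- "$a$b"*
        [ -e "$1" ] || continue
        echo "group $a$b: $# files"
    done
done
```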
Posted By RudiC
Is it possible that there are only 14 files out of 600000 whose file names start with "file"? That's what the OS selected based on the file* parameter to awk.

Try awk 'FNR==3 {printf "mv %s...
Posted By RudiC
HOW did it miss the purpose?
Posted By RudiC
Did you test any of the proposals? Are there errors? If yes, post output.

You can run the proposals on groups of files, e.g. all starting with "A", then "B", etc. to keep execution times...
Posted By Don Cragun
There are a few things we could do to speed up processing 690,000 files, but we need a few more details to come up with something that stands a chance of working correctly and reasonably quickly. ...
Showing results 1 to 25 of 43

 