Search Results

Search: Posts Made By: chapeupreto
Posted By chapeupreto
Hello there! Maybe you could use a shell script...
Hello there!
Maybe you could use a shell script in order to accomplish that task.
This is how I thought about it:

From 5 a.m. to 7 a.m. there are 2 hours, i.e. 7200 seconds.
You want to run the...
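The rest of that post is cut off here, but the idea it describes can be sketched as a simple timed loop. This is not the original poster's script: the interval is an assumption, and the values are scaled down so the sketch finishes quickly.

```shell
# Sketch of the timed-window idea (not the original poster's script).
# In the real scenario duration would be 7200 (5 a.m. to 7 a.m.);
# the values here are scaled down for illustration.
duration=3   # seconds in the window
interval=1   # seconds between runs
elapsed=0
runs=0
while [ "$elapsed" -lt "$duration" ]; do
    : "the real task would run here"   # placeholder command
    runs=$((runs + 1))
    sleep "$interval"
    elapsed=$((elapsed + interval))
done
echo "ran $runs times"
```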
Posted By chapeupreto
What about: echo "Date range on 5th May is...
What about:

echo "Date range on 5th May is between -010516 and 050516- please continue " | sed 's/.*\(-.*-\).*/\1/'
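Since the sample line contains exactly two hyphens, the capture group \(-.*-\) matches the whole delimited span, so the command prints just the date range:

```shell
# The group \(-.*-\) captures from one "-" to another; with only
# two hyphens in the input, that is the whole delimited range.
out=$(echo "Date range on 5th May is between -010516 and 050516- please continue " |
      sed 's/.*\(-.*-\).*/\1/')
echo "$out"   # prints: -010516 and 050516-
```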
Posted By chapeupreto
Hey there! If you have the rename utility,...
Hey there!
If you have the rename utility, which is a script written in Perl, you could do this:

rename 's/\.packed\.dir//' *.packed.dir
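If the Perl rename utility isn't installed, the same renaming can be done with plain shell parameter expansion. A sketch, demonstrated in a scratch directory with made-up filenames:

```shell
# Fallback sketch using POSIX parameter expansion instead of rename.
dir=$(mktemp -d)                      # scratch directory for the demo
touch "$dir/a.packed.dir" "$dir/b.packed.dir"
for f in "$dir"/*.packed.dir; do
    mv -- "$f" "${f%.packed.dir}"     # strip the .packed.dir suffix
done
ls "$dir"
```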
Posted By chapeupreto
a few more ways: sed '/#/d' infile.txt >...
a few more ways:
sed '/#/d' infile.txt > newfile.txt

sed -n '/#/!p' infile.txt > newfile.txt

grep -v '#' infile.txt > newfile.txt
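All three commands delete every line containing a #. A quick check with sample data (not from the original thread) shows they produce identical output:

```shell
cd "$(mktemp -d)"                    # work in a scratch directory
printf '%s\n' 'keep one' '# a comment' 'keep two' > infile.txt
sed '/#/d'     infile.txt > out1.txt
sed -n '/#/!p' infile.txt > out2.txt
grep -v '#'    infile.txt > out3.txt
# All three outputs should match: the comment line is gone.
cmp -s out1.txt out2.txt && cmp -s out1.txt out3.txt && echo identical
```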
Posted By chapeupreto
If you need Lane, but not lane (i.e.:...
If you need Lane but not lane (i.e., case sensitivity matters):
sed '/Lane/d' input_file.txt > output_file.txt
Posted By chapeupreto
while loop
Hey digitalviking.
Try doing something like this:


#!/opt/local/bin/bash
echo -n "Enter some text here: "
read text
while [ -z "$text" ]; do
echo -n "You entered nothing. Enter something:...
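The post is cut off at that point. Here is a complete sketch of the same prompt-until-nonempty loop; the message wording is assumed, and the prompts go to stderr so the result is easy to capture:

```shell
#!/bin/sh
# Sketch of the loop the truncated post describes: keep prompting
# until the user enters a non-empty line, then print what was read.
prompt_nonempty() {
    text=
    while [ -z "$text" ]; do
        printf 'Enter some text here: ' >&2
        read -r text || return 1           # stop on EOF
        [ -z "$text" ] && echo 'You entered nothing. Enter something.' >&2
    done
    printf '%s\n' "$text"
}
```

For example, `printf '\n\nhello\n' | prompt_nonempty` re-prompts twice and then prints hello.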
Posted By chapeupreto
Another suggestion. This time using sed: ...
Another suggestion. This time using sed:

#!/bin/bash
for i in file*; do
mv "$i" "$(echo "$i" | sed 's/file/file-/ ; s/\.dat//')"
done
However, I think this solution might be a little bit slower...
Posted By chapeupreto
As gary__w said, further details are required. ...
As gary__w said, further details are required.
However, assuming you're in the directory where your text files are, I think you could use something like:
grep -l string1 * | grep string2 - | cut...
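That snippet is truncated, and note that `grep string2 -` searches the list of file names coming down the pipe, not the files' contents. A sketch of the likely intent (list files containing both strings) using xargs, demonstrated with made-up sample files:

```shell
cd "$(mktemp -d)"                       # scratch directory with sample files
printf 'string1 here\nstring2 too\n' > both.txt
printf 'only string1\n' > one.txt
printf 'neither\n' > none.txt
# List files containing string1, then keep those also containing string2.
# (Assumes file names without whitespace, since xargs splits on it.)
grep -l string1 ./* | xargs grep -l string2
```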
Forum: UNIX and Linux Applications 04-02-2012
Posted By chapeupreto
Hi there. Showing results formatted with html...
Hi there.
Showing results formatted with html table tags isn't the default behavior of MySQL.
Perhaps your MySQL configuration file (my.cnf) was modified.
Try calling the MySQL client command with...
Posted By chapeupreto
Maybe you could do the following: files=$(<...
Maybe you could do the following:

files=$(< newfile.txt)
montage "$files"
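One caveat with that snippet: with the double quotes, every name in newfile.txt reaches montage as a single argument; leaving $files unquoted lets the shell split it into one argument per file (though that in turn breaks on names containing spaces). A quick illustration of the difference, with a made-up file list:

```shell
# Illustration only: count arguments with and without quoting.
files='a.png b.png'          # stand-in for $(< newfile.txt)
set -- "$files"; quoted=$#   # quoted: one argument, "a.png b.png"
set -- $files;   split=$#    # unquoted: word splitting gives two
echo "$quoted $split"
```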
Posted By chapeupreto
Not quite sure, but try using the full path for...
Not quite sure, but try using the full path for grep, like /usr/bin/grep and see what happens.
Posted By chapeupreto
Hello cue You may use sed and tr for achieving...
Hello cue
You may use sed and tr for achieving this goal.
Try something like:

sed 's/^/"/ ; s/$/"/' file.txt | tr "\n" " " > newfile.txt
By doing that, newfile.txt has all the paths from...
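With a small sample file (the paths are made up) the effect is easy to see: each path ends up double-quoted, and the newlines become spaces:

```shell
cd "$(mktemp -d)"
printf '%s\n' '/tmp/a' '/tmp/b' > file.txt      # sample paths
sed 's/^/"/ ; s/$/"/' file.txt | tr "\n" " " > newfile.txt
cat newfile.txt
```

The result is a single line of space-separated, quoted paths (with a trailing space where the final newline used to be).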
Forum: OS X (Apple) 03-25-2012
Posted By chapeupreto
try using ls with -G flag
ls -G also does the job for you.
Give it a try.
Posted By chapeupreto
Hi there. Maybe you should use sed. Try doing...
Hi there.
Maybe you should use sed.
Try doing something like:

sed -i.bkp 's/^/filenameU/' your_input_file.txt

Are your columns delimited by tabs or spaces?
In the case of tabs, and if you're...
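The command edits the file in place and keeps the original under a .bkp suffix. A small demo with sample data (not from the thread):

```shell
cd "$(mktemp -d)"
printf 'one\ntwo\n' > your_input_file.txt       # sample data
sed -i.bkp 's/^/filenameU/' your_input_file.txt # prepend to every line
cat your_input_file.txt                         # lines now start with filenameU
ls                                              # the .bkp file holds the original
```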
Posted By chapeupreto
Hi folks. You're right. That's the key. ...
Hi folks.

You're right. That's the key.
Just for studying purposes, I've created two bash scripts using different for loop syntax for solving this problem.

Here we go:

...
Posted By chapeupreto
Hi methyl Thanks for the reply. I...
Hi methyl
Thanks for the reply.

I mentioned csh because sometimes I use FreeBSD and csh is its default shell.
I understand what you suggested, and that works fine.
However, I'm...
Posted By chapeupreto
Using cp for copying multiple files
Hi all,

I've got this question about using cp for copying multiple files from the same source directory to another directory, considering that my working directory isn't the same as the source...
Showing results 1 to 17 of 17

Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.