Search Results

Search: Posts Made By: allthanksquery
Forum: What is on Your Mind? 05-03-2020
Posted By Neo
Community.UNIX.com login tips
Here are some tips for logging in to our new Community (https://community.unix.com/):


If you are a forum member with a valid email address in your profile and you have not logged in to...
Posted By vbe
The only ways are either to add lines in your job...
The only ways are either to add lines in your job script to echo anything into a log file, or to send the script's output to a log file that you then watch in another terminal using tail -f, which will show...
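A minimal sketch of that idea, with made-up names (job.sh and /tmp/job.log are placeholders):

# inside the job script: append progress messages to a log file
echo "step 1 finished" >> /tmp/job.log

# or capture the whole run instead:
./job.sh > /tmp/job.log 2>&1 &

# in a second terminal, follow the log as it grows:
tail -f /tmp/job.log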
Posted By vbe
Welcome on board! start by removing the \n...
Welcome on board!

Start by removing the \n of your printf on the first line... maybe replacing it with:
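Purely as an illustration, since the replacement snippet is cut off above (the prompt text is invented), the behavioural difference is:

printf "Enter a value:\n"   # prompt appears, cursor jumps to the next line
printf "Enter a value: "    # cursor stays on the same line, ready for the reply
read answer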
Posted By RudiC
Redirection works identically for input as well as...
Redirection works identically for input as well as for output. Try
$ mv /tmp/new_file /tmp/old_file
$ while read product id features
do printf "${product}, %s\n" ${features%#*}
done <...
Posted By RudiC
Would this...
Would this (https://www.unix.com/303046306-post5.html) point you in the right direction to adapt your script?
Posted By apmcd47
Try appending this to your command: 2>...
Try appending this to your command:


2> >(grep -v 'No such file or directory' >&2)
It should remove the above message from stderr but allow other messages to be printed.


Works for bash...
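A self-contained way to try it, using find purely as a stand-in command that produces that error:

#!/bin/bash
# /does-not-exist triggers the "No such file or directory" message on stderr
find /etc /does-not-exist -name '*.conf' 2> >(grep -v 'No such file or directory' >&2)
# other stderr output (e.g. "Permission denied") still comes through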
Posted By RudiC
It doesn't. It would remove any line with an...
It doesn't. It would remove any line with an empty $1, of which there is none. You can leave it out entirely. The $4 < 7000 removes the header line. $4 > 7000 would not; we'd need a different...
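The header behaviour follows from awk's comparison rules: a field that doesn't look like a number is compared to 7000 as a string, not numerically. With the sample header used in this thread:

echo 'Name tDesignation tDepartment tSalary' | awk '{print ($4 < 7000), ($4 > 7000)}'
# prints "0 1": "tSalary" sorts after the string "7000", so < 7000 drops the header while > 7000 keeps it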
Posted By apmcd47
That is close - try this while read product id...
That is close - try this
while read product id features
do
printf "${product}, %s\n" ${features%#*}
done < Check_File
Your example was using only two variables (f1, f3) to read the fields...
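The actual Check_File isn't shown here, so as a rough illustration with an invented record (using a bash here-string): ${features%#*} strips the last '#' and everything after it, and because the expansion is unquoted, printf repeats its format once per remaining word.

line='laptop 42 wifi bluetooth ssd#internal note'   # invented sample record
read product id features <<< "$line"
printf "${product}, %s\n" ${features%#*}
# laptop, wifi
# laptop, bluetooth
# laptop, ssd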
Posted By rbatte1
Could you store a 'previous last line' value...
Could you store a 'previous last line' value somewhere to use between runs? That way you could:-
Copy the log file to a temporary directory (to get a fixed file to work with)
Read the 'previous...
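One way that suggestion could look, reading it as tracking a line count between runs (all file names here are placeholders):

#!/bin/sh
LOG=/var/log/app.log        # placeholder log file
STATE=/tmp/app.lastline     # placeholder state file kept between runs

cp "$LOG" /tmp/app.work                     # fixed copy to work with
prev=$(cat "$STATE" 2>/dev/null || echo 0)  # line count reached last time, 0 on the first run

tail -n "+$((prev + 1))" /tmp/app.work > /tmp/app.new   # only the lines added since then

wc -l < /tmp/app.work > "$STATE"            # remember how far we got this time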
Posted By RudiC
Try this: awk '$1 == "Name"; $4 < 7000 {print |...
Try this:
awk '$1 == "Name"; $4 < 7000 {print | "sort -k4"}' file
Name tDesignation tDepartment tSalary
Thomas Manager Sales 5000
Jason Developer Technology 5500
Vicky...
Posted By MadeInGermany
A shell script with only shell builtins: ...
A shell script with only shell builtins:
#!/bin/sh
set -f # unquoted $f2
IFS=";"
while IFS="," read f1 f2
do
printf "$f1, %s\n" $f2
done < file
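Against the sample data quoted further down in these results (Asley,IBM;Amazon;BOA;Google), the loop splits each line on the first "," via read, the unquoted $f2 is then word-split on ";", and printf repeats its format once per company. Saved under an invented name such as script.sh, it would behave like:

$ printf 'Asley,IBM;Amazon;BOA;Google\n' > file
$ sh ./script.sh
Asley, IBM
Asley, Amazon
Asley, BOA
Asley, Google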
Posted By aigles
A possible solution : $ cat ./eshaqur.sh awk...
A possible solution:
$ cat ./eshaqur.sh
awk '
function printValues() {
if (Values) {
print S, Date, Value["NSMSSMRLTOT"],
...
Posted By nezabudka
Hi How about sed? sed -r...
Hi
How about sed?
sed -r ':1;s/^([^,]+)\s*,\s*([^;]+);/\1, \2\n\1, /;t1' file
Name , Company_Worked (Header)
Asley, IBM
Asley, Amazon
Asley, BOA
Asley, Google
King.Jr, Wipro
King.Jr,...
Posted By sea
I slightly modified your code... Was not my...
I slightly modified your code...
It was not my first attempt, just one of those... let's do this variant with the current variable... moments...

I have no idea WHY this works, because I wanted to...
Posted By RudiC
Or, awk: awk -F"[,;]+" 'NR == 1 {print; next}...
Or, awk:
awk -F"[,;]+" 'NR == 1 {print; next} {for (i=2; i<=NF; i++) print $1, " " $i}' OFS=, file
Name , Company_Worked (Header)
Asley, IBM
Asley, Amazon
Asley, BOA
Asley, Google
King.Jr,...
Posted By sea
Don't do BOLD for code. Do CODE for code.
Don't do BOLD for code.
Do CODE for code.

And proper writing helps.... tSalary....
Or use grep -i ...

And btw...
Your code would not show the 'header' on occasion, but always...
Unless your...
Posted By Skrynesaver
$ cat ~/tmp.dat Name , Company_Worked (Header) ...
$ cat ~/tmp.dat
Name , Company_Worked (Header)
Asley,IBM;Amazon;BOA;Google
King.Jr,Wipro;Microsoft;AMZ

$ perl -ne 'chomp;($name,$emp)=split/,/;for (split/;/,$emp){print "$name, $_\n";}'...
Posted By sea
Something like this? [~/tmp] 0 $ awk '$4 <=...
Something like this?
[~/tmp] 0 $ awk '$4 <= 7000 {print}' "file.txt" | grep -v tSalary | sort -k 4

Thomas Manager Sales 5000
Jason Developer Technology 5500
Vicky DBA Technology 6000...
Posted By RudiC
That's because when your first printout trigger /-/...
That's because when your first printout trigger /-/ is hit, only that array element has been set so far. On top of that, all further date/time stamps do not match the data printed with them; the data are shifted one...
Posted By wisecracker
Not sure if this is what you're after, but fully...
Not sure if this is what you're after, but fully POSIX compliant:-
Longhand OSX 10.14.3, default bash terminal calling dash.
Last login: Wed Apr 29 21:46:11 on ttys000
AMIGA:amiga~> dash...
Posted By nezabudka
Hi Try closing the stderr descriptor at the...
Hi
Try closing the stderr descriptor at the beginning of the script.
exec 2>&-
or redirect
exec 2> /dev/null
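A quick way to see the effect, with a throwaway script (the name and the failing command are arbitrary):

#!/bin/bash
exec 2> /dev/null      # from here on, everything written to stderr is discarded
# exec 2>&-            # the other option from the post: close stderr outright
echo "stdout is still visible"
ls /no/such/path       # its error message no longer appears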
Posted By nezabudka
Hi So, passed by ... grep...
Hi
So, passed by ...
grep --include=*.{org,texi}



Maybe I misunderstood something
grep -hrm8 '.' --include=*.{org,texi} ./
Posted By RudiC
32 bit overflow? Calculate 2^32 -...
32 bit overflow? Calculate


2^32 - 1841560902 = 2453406394


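The arithmetic can be checked directly in the shell (bash arithmetic is 64-bit, so 2^32 fits):

$ echo $(( (1 << 32) - 1841560902 ))
2453406394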
Posted By RudiC
No surprise. Shells don't expand variables...
No surprise. Shells don't expand variables enclosed in single quotes. Use awk's standard mechanism to convey variables: the -v option. Like
awk -F "," -v"USER=$usr" ' $1 == USER {print $2}'...
Posted By RudiC
Unfortunately, awk doesn't have the --recursive...
Unfortunately, awk doesn't have the --recursive option that grep provides. But we can resort to bash's "brace expansion" for sweeping across the directory tree and "extended globbing" ("extended...
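One possible shape of that, assuming bash with globstar enabled; the awk body here just mimics the earlier grep -hrm8 '.' --include=*.{org,texi} example:

#!/bin/bash
shopt -s globstar nullglob           # ** descends into subdirectories; nullglob drops patterns with no match
awk '/./ && ++n[FILENAME] <= 8' ./**/*.{org,texi}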