bash logging all $() command lines


 
# 8  
Old 12-09-2011
I was testing how the restart of daemons reacted to a shortage of mysql connections: I drop the connection limit, kill a daemon, wait for new messages, raise the limit, wait for the message count to quiesce, and move on to the next daemon. It works fine except that bash keeps blowing junk from $() out onto the console. I guess I am singularly cursed.

Maybe it comes from the ENV?
Code:
$ set
BASH=/bin/bash
BASH_ARGC=()
BASH_ARGV=()
BASH_LINENO=()
BASH_SOURCE=()
BASH_VERSINFO=([0]="3" [1]="00" [2]="15" [3]="1" [4]="release" [5]="x86_64-redhat-linux-gnu")
BASH_VERSION='3.00.15(1)-release'
COLORS=/etc/DIR_COLORS.xterm
COLUMNS=81
CYCLOPS_HOME=/var/xxx/6_1_22_rc4
DIRSTACK=()
EDITOR=vim
EUID=10103
GROUPS=()
G_BROKEN_FILENAMES=1
HISTFILE=/var/xxx/.bash_history
HISTFILESIZE=1000000
HISTSIZE=1000000
HOME=/var/xxx
HOSTNAME=xxx.yyy.zzz
HOSTTYPE=x86_64
IFS=$' \t\n'
INPUTRC=/etc/inputrc
LANG=en_US.UTF-8
LESSOPEN='|/usr/bin/lesspipe.sh %s'
LINES=25
LOGNAME=xxx
LS_COLORS='no=00:fi=00:di=00;34:ln=00;36:pi=40;33:so=00;35:bd=40;33;01:cd=40;33;01:or=01;05;37;41:mi=01;05;37;41:ex=00;32:*.cmd=00;32:*.exe=00;32:*.com=00;32:*.btm=00;32:*.bat=00;32:*.sh=00;32:*.csh=00;32:*.tar=00;31:*.tgz=00;31:*.arj=00;31:*.taz=00;31:*.lzh=00;31:*.zip=00;31:*.z=00;31:*.Z=00;31:*.gz=00;31:*.bz2=00;31:*.bz=00;31:*.tz=00;31:*.rpm=00;31:*.cpio=00;31:*.jpg=00;35:*.gif=00;35:*.bmp=00;35:*.xbm=00;35:*.xpm=00;35:*.png=00;35:*.tif=00;35:'
MACHTYPE=x86_64-redhat-linux-gnu
MAIL=/var/spool/mail/c64adm
MAILCHECK=60
OLDPWD=/var/c64adm
OPTERR=1
OPTIND=1
OSTYPE=linux-gnu
PATH=/home2/xxx/6_1_22_rc4/test-local/bin:/home2/xxx/6_1_22_rc4/mon-local/bin:/home2/xxx/6_1_22_rc4/host-local/bin:/home2/xxx/6_1_22_rc4/local/bin:/usr/kerberos/bin:/usr/local/bin:/bin:/usr/bin:/usr/X11R6/bin:/home2/xxx/bin:/sbin:/usr/sbin:/home/yyy/bin
PIPESTATUS=([0]="0")
PPID=19816
PS1='\[\e]0;\u@\h: \w\a\]\[\033[01;32m\]\u@\h\[\033[00m\]:\[\033[01;34m\]\w\[\033[00m\]\$ '
PS2='> '
PS4='+ '
PWD=/var/xxx/6_1_22_rc4
SHELL=/bin/bash
SHELLOPTS=braceexpand:hashall:histexpand:history:interactive-comments:monitor:vi
SHLVL=1
SSH_CLIENT='::ffff:192.168.9.73 46673 22'
SSH_CONNECTION='::ffff:192.168.9.73 46673 ::ffff:192.168.9.109 22'
SSH_TTY=/dev/pts/0
SUPPORTED=en_US.UTF-8:en_US:en
TERM=xterm
UID=10103
USER=xxx
_=--no-diag-kern
f=local/bin/cyclops64-linux-elf-sim
i=host-local/var/log/swras-master.log

I have never heard of a shell caring that its output was feeding mysql. I suspect the messages are going to stderr or the tty, since they clearly are not passing down the pipe into mysql! It is a very obnoxious thing for a shell to do. Maybe I have to take it to bash-dev-land.
# 9  
Old 12-09-2011
Yes, I too had a theory that the last command was sitting in a buffer, but I couldn't account for the selective repetition, the overlap, or the fact that it stops.
Imho it's something to do with MySQL, because the issue starts when MySQL gets its first line of input. I suspect that it is bypassing the normal unix I/O mechanisms and messing with the raw terminal device (and upsetting the shell in the process).
If it doesn't need a terminal context, maybe try backgrounding the MySQL program.

I seem to remember you advising me that the following ksh construct doesn't work in bash (but if it does, it's worth a try):
Code:
(
# shell lines
) | program

# 10  
Old 12-09-2011
in that case, maybe mysql --raw or --batch?
# 11  
Old 12-12-2011
Quote:
Originally Posted by methyl
I seem to remember you advising me that the following ksh construct doesn't work in bash (but if it does, it's worth a try):
Code:
(
# shell lines
) | program

Nothing wrong with that construct. ksh just evaluates pipelines in a different order: a read at the end of a pipe chain sets variables in a subshell under a Bourne-style shell, but in the parent shell under ksh.
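The difference is easy to demonstrate; a minimal sketch, assuming bash with its default pipeline behavior (no lastpipe option in play):

```shell
#!/bin/bash
# In bash, every element of a pipeline runs in a subshell, so a
# variable set by 'read' at the end of a pipe chain is lost when
# the pipeline finishes:
unset x
echo "hello" | read x
echo "x after pipe: '$x'"          # prints: x after pipe: ''  (ksh would keep it)

# A portable recoding captures the output with command substitution:
x=$(echo "hello")
echo "x after substitution: '$x'"  # prints: x after substitution: 'hello'
```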
# 12  
Old 12-12-2011
Yes, bash is less handy at
Code:
(....)|read x

which has to be recoded, often to:
Code:
x=$(....|line)

The 'read x' saves calling 'line', if your Linux/UNIX has 'line' installed; otherwise use 'head -1'. I suspect 'head -1' is not fully equivalent to 'line' when many commands must each read a share of the same fd, but apart from an extra fork and exec it is equivalent to 'read x' for a single consumer (a stdio FILE* stdin that reads ahead is fine for throughput and overhead, but no good when several commands read the same fd in turn). Bash is mostly just as handy with
Code:
(....)|while read x
do
 ...
done

as long as you do not later want the variables set in there, including x, since the loop runs in a subshell. Sometimes a bash script can be fixed with just a more explicit sub-shell where the variable is read and used (only):
Code:
....|(read x; ... $x ....)
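
The subshell loss, and the explicit-subshell fix, can be sketched like this (plain bash, made-up data):

```shell
#!/bin/bash
# Variables set inside a piped while-read loop live in a subshell
# and vanish when the loop ends:
count=0
printf 'a\nb\nc\n' | while read line; do
  count=$((count + 1))
done
echo "count after loop: $count"   # prints: count after loop: 0

# Doing all the work inside one explicit subshell sidesteps the issue:
printf 'a\nb\nc\n' | ( n=0
  while read line; do n=$((n + 1)); done
  echo "lines counted in subshell: $n" )   # prints: lines counted in subshell: 3
```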

It is an article of faith in bash that any ksh thing that malfunctions is a feature.

If I could make the $(...) echoing happen on demand, I could take a shot at seeing what it is writing and where, but alas, it just comes and goes. I am not buying the theory that mysql reaches up the stdin pipe. Pipes have a very small and bland set of behaviors, thank goodness.

I have been wishing for a revised stdio: a unified FILE* library that would flush all output FILE* streams whenever blocking on any of them, and read ahead on inputs while blocked on other FILE* streams. Then there would always be low latency, and fflush() or buffering restrictions like line-buffered or unbuffered would not be necessary for most programs: good buffering during high throughput, yet low latency during a data drought, all with no extra code. You would only need such controls when many threads or processes write the same fd or file, to avoid splitting or mixing lines/packets.
# 13  
Old 12-12-2011
Quote:
I am not buying the mysql reaches up the stdin pipe theory.
Try this:
Code:
$ echo mypassword | mysql -uroot -p
Enter password:

The command-line mysql client features a full-fledged line editor, complete with arrow keys and history recall. It can and does mess with your terminal. It doesn't have to "reach up stdin" to do this -- it can open /dev/tty and mess with it directly. Any interactive program has the potential to do this, really. How else would more work when stuck on the end of a pipe chain?
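
A quick way to see the distinction: [ -t FD ] is the shell's version of the isatty() check that interactive programs use. A sketch:

```shell
#!/bin/bash
# Interactive programs decide how to behave by asking whether a
# stream is a terminal. In shell, [ -t FD ] performs that check:
if [ -t 0 ]; then
  echo "stdin is a terminal"       # printed when run interactively
else
  echo "stdin is a pipe or a file" # printed when stdin is redirected
fi
# A program that wants the terminal anyway (for a password prompt or
# a line editor) can still open /dev/tty directly, which is why
# redirecting mysql's stdin and stdout does not silence it.
```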

Quote:
Originally Posted by DGPickett
Yes, bash is less handy at
Code:
(....)|read x

which has to be recoded, often to:
Code:
x=$(....|line)

You can also do

Code:
read x <<<$(....)
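
A here-string sketch of that recoding, showing that read then sets the variable in the current shell (quotes added around the substitution to preserve whitespace):

```shell
#!/bin/bash
# The here-string feeds the substitution's output to read without a
# pipeline, so no subshell is involved and x survives:
read x <<< "$(echo "hello world")"
echo "x is now: '$x'"   # prints: x is now: 'hello world'
```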

Quote:
It is an article of faith in bash that any ksh thing that malfunctions is a feature.
It's not a BASH problem, or at least not just a BASH problem. Lots of non-bash bourne shells have the same shortcoming. ksh's behavior is an extension.

---------- Post updated at 01:29 PM ---------- Previous update was at 01:17 PM ----------

Quote:
Originally Posted by DGPickett
I have been wishing for a revised stdio, unified FILE* library that would flush all output FILE* when blocking any, and read input ahead when blocking on other FILE*. Then, there would always be low latency, and fflush() or buffering restrictions like line or none would not be necessary for most programs, so you get good buffering during high throughput and yet low latency during data drought, all with no code.
Like the memory buffer used for pipes, and the caching+read-ahead Linux and UNIX do for files? That's handled decently well at the kernel level. (I've noticed some changes lately, actually, in pipes being flushed more often.) The trouble I've found is how to tell the kernel when not to do that. Any operation on a large file fills the file cache with junk that'll probably not be reused, and you need to posix_fadvise every step of the way to prevent it...

I think the default line-buffering for printf() and such is to make situations like this more efficient:
Code:
int x;
for(x = 0; x < 10; x++) printf("asdf");
printf("\n");

With buffering, that's only one context switch: write() is called once, by the printf("\n"). Turn off all buffering and it's 11 context switches.
# 14  
Old 12-14-2011
I suppose a FILE* could be double-buffered and use aio to move the data! Maybe I will write a lib! Double buffering (doing I/O in one buffer area while you read or write memory in another) got seriously neglected in UNIX, except within TCP sockets.

I have never used 'cmd <<< word', it being pretty close to 'echo word | cmd' or to the here-document form:
Code:
cmd <<!
word
!

And in this '<<<$(...)' case it takes data from one command's stdout to another command's stdin, which is normally a pipe's simple job.
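
Side by side, the three near-equivalent ways of feeding one word to a command's stdin, using cat as a stand-in for the command (all three print "word" followed by a newline):

```shell
#!/bin/bash
# 1. here-string (bash, ksh93):
cat <<< "word"

# 2. pipe:
echo "word" | cat

# 3. here-document:
cat <<EOF
word
EOF
```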

I suppose that if all non-ENV shell variables of a session were kept in an mmap'd file, all sub-shells could see and change all variables. Of course, it might get a bit tricky with the locking, and with moving things as they expand; it would soon become another heap in need of GC.