So I have a script that runs across many servers. I'd like to know how many times this script is being used on each server.
The only straightforward, non-intrusive way I can think of doing this is to include a line in the script that makes a web call to a central server. From that central server, I can then count how many times each server runs the script.
I have the following basic code that I'm looking into:
The only problem with this is, I'm not sure whether "nc" is installed by default on most Unix systems.
If it isn't, is there an alternative to it?
Or is there another straightforward, simple way to make an external call to a central server?
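For illustration, a fire-and-forget call along these lines could work whether or not nc is present, by falling back between whichever clients happen to be installed. The collector host central.example.com and the /ping path are made-up placeholders, not a real endpoint:

```shell
#!/bin/sh
# central.example.com and /ping are placeholders for whatever the
# central collector would actually expose.
URL="http://central.example.com/ping?host=$(uname -n)"

# Pick the first HTTP-capable client present on this system.
pick_client() {
    for c in curl wget nc; do
        command -v "$c" >/dev/null 2>&1 && { echo "$c"; return 0; }
    done
    echo none
}

# Fire-and-forget: backgrounded, output discarded, short timeout, so a
# dead or unreachable collector can never slow down the real script.
case $(pick_client) in
    curl) curl -s -m 5 "$URL" >/dev/null 2>&1 & ;;
    wget) wget -q -T 5 -O /dev/null "$URL" >/dev/null 2>&1 & ;;
    nc)   printf 'GET /ping?host=%s HTTP/1.0\r\n\r\n' "$(uname -n)" |
              nc -w 5 central.example.com 80 >/dev/null 2>&1 & ;;
esac
```

The backgrounding and discarded output matter here: a counting side channel should never be able to block or fail the script it is counting.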
..... or syslog messages routed to a central place?
Probably plenty more options too. I'd shy away from a single log file on an NFS server because of competing updates. You could have a shared directory with one log file per server, though.
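If syslog forwarding to a central host were already configured (that part does need an administrator on each sender, e.g. an @central-host line in /etc/syslog.conf), the script side would reduce to a single call. "myscript" is just an arbitrary tag for filtering the entries centrally:

```shell
#!/bin/sh
# One-liner from the script side, assuming an admin has already set up
# syslog forwarding to the central host. The tag "myscript" is arbitrary.
# Guarded and ||-ed so a missing logger or syslog daemon cannot break
# the main script.
if command -v logger >/dev/null 2>&1; then
    logger -t myscript "run on $(uname -n)" || true
fi
```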
You didn't mention the shell version you use. Recent shells (e.g. bash, ksh) offer redirection to a remote port:
Quote:
/dev/tcp/host/port
If host is a valid hostname or Internet address, and port is an integer port number or service name, bash attempts to open the corresponding TCP socket.
/dev/udp/host/port
If host is a valid hostname or Internet address, and port is an integer port number or service name, bash attempts to open the corresponding UDP socket.
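A minimal sketch of that redirection, assuming bash (the /dev/tcp path is a bash feature, not a real device, so a plain /bin/sh may not support it); the host central.example.com and port 8125 are placeholders for the central collector:

```shell
#!/bin/bash
# Sketch of bash's /dev/tcp redirection. central.example.com and port
# 8125 are placeholders; /dev/tcp only works in bash, not in every sh.

log_to_central() {
    # $1 = host, $2 = port; returns nonzero on failure, stderr discarded
    # via the redirection attached to the function definition below.
    local host=$1 port=$2
    echo "$(uname -n) ran-script $(date +%s)" > "/dev/tcp/$host/$port"
} 2>/dev/null

# Fire and forget: never let logging failures break the main script.
log_to_central central.example.com 8125 || true
```

On the central side, anything that listens on that port and logs the source of each line (even a simple nc -l loop) would be enough to count runs per server.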
These servers are remote, and I don't have any kind of privileged access on them. I can't use NFS; the only way is through something similar to a web call.