I use plink.exe to automate remote commands that return data to Windows machines. This works well on newer servers running Red Hat, since the commands were developed for bash and the designated user's login shell is bash. I also need to support older servers running Solaris 10, where the designated user's login shell is sh. I would like to use the same commands, including array variables, which apparently are not supported by sh. The commands are saved in a local text file whose name is passed to plink's -m option; each command is then executed via SSH. An example of the file contents is below.
I cannot figure out how to format these commands so that they run under a bash subshell and return stdout. I thought the following would do it, but I get "undefined variable" messages and no output. I also tried other variations with similar results.
How can I run a series of commands in a bash shell from an sh login shell and return stdout? Extra credit: I also need an alternative to strftime in the nawk statements, since these Solaris 10 boxes do not have that function.
I don't think plink.exe will harm anything.
Did you execute either script in an interactive session? What were the results or errors? I don't see an immediate reason (besides what vgersh99 said) that the script(s) should not run in both bash and sh, as there's nothing shell-dependent in them, but I may be overlooking something, as the script is complex and difficult to read, especially without sample data.
Why do you use those perl scripts? Why so many (n)awk scripts? With some insight, people here might propose a single awk script to you. Post sample data, directory structures, and desired results.
The remote servers have thousands of CSV files organized into 3 groups. The user supplies a start and end date and the script returns the records between the supplied dates/times. The script must first determine which files should contain the correct data (determined by file name = $filelist), then determine which lines from those files fall between the start and end date (determined by a Unix timestamp field). The perl functions convert the user-supplied start and end date/time to an epoch timestamp, so time is relative to the server, not the user. This let me skip time zone and DST handling in the GUI, since users can be in different time zones.
I did run the commands from an interactive session, which is a convenient way to test since it mimics what plink is doing. This is where I was seeing the undefined-variable messages. When I included a pwd command, I found that the cd command was not executed either.
Since the initial post, though, I learned that quoting the heredoc delimiter works. All the commands appear to have run in bash and produced output.
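A minimal sketch of that pattern, with hypothetical file contents (the array name and values are placeholders): quoting the delimiter ('EOF') stops the sh or csh login shell from expanding $-constructs itself, so bash receives them intact and bash-only syntax like arrays works.

```shell
# Hypothetical contents of the file passed to plink -m.
# The quoted 'EOF' delimiter prevents the sh/csh login shell from
# expanding variables before bash runs the commands.
bash <<'EOF'
arr=(alpha beta gamma)                 # bash-only array syntax
echo "count=${#arr[@]} first=${arr[0]}"
EOF
```

Everything between the delimiters runs in the bash child process, and its stdout flows straight back through plink.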
I'm still not sure why this works, but I'm glad it does. I'm still faced with the timestamp conversion issue, though. I played around with similar perl commands inside the nawk statement but am not having much luck. This issue doesn't have as big an impact as running the commands in bash instead of sh did, but it would be nice to resolve it.
---------- Post updated at 12:15 PM ---------- Previous update was at 11:57 AM ----------
Edit: I just realized that the login shell for these Solaris 10 boxes is csh rather than sh. Not sure if that changes things much.