Hello,
I am on Ubuntu 18.04 (Bionic).
I have one shell script, run.sh (whose contents are beyond the scope of this question), that runs files under multiple directories, and one script, control.sh, that controls all the processes running under those directories.
I set a cron job to check each of them at two-minute intervals. When a process dies, or when all processes die during ramp-up, each process waits for the previous one to complete. That is okay, but because run.sh is time-consuming,
it keeps me waiting. Writing a separate script dedicated to each folder does not sound logical.
Is there a modern way to do this without entering all the directory names?
I just want to say to the script: "apply your code to all subfolders."
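For reference, here is a minimal sketch of the "apply to all subfolders" idea. The base path and the per-directory run.sh invocation are assumptions about the layout described above; adjust them to your setup.

```shell
#!/bin/sh
# Sketch: visit every subdirectory of BASE without naming each one.
# BASE defaults to the current directory, or takes the first argument.
BASE="${1:-.}"

for dir in "$BASE"/*/; do
    [ -d "$dir" ] || continue        # skip if the glob matched nothing
    echo "processing $dir"
    # ( cd "$dir" && ./run.sh ) &   # uncomment: launch each directory's job in background
done
# wait                              # uncomment: block until all background jobs finish
```

Launching the jobs in the background and then calling `wait` lets every directory ramp up in parallel instead of serially; cron can then call this one wrapper rather than a dedicated script per folder.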
Thanks in advance
Boris
Last edited by baris35; 10-28-2018 at 02:11 PM..
Reason: typo error fix