can just be
Also, I'm not entirely sure what this line is doing:
...but if you're guarding against blank lines:
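The original snippet wasn't preserved, but a blank-line guard inside a read loop typically looks like this (the input and variable names here are illustrative):

```shell
# Guard against blank lines with a nested test inside the loop
out=""
while IFS= read -r line
do
    if [ -n "$line" ]        # only process non-empty lines
    then
        out="$out$line;"
    fi
done <<'EOF'
first

second
EOF
printf '%s\n' "$out"
```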
Or better yet, do this. It will skip blank lines without another layer of nested if at all:
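A sketch of that flatter version, assuming the same kind of loop as above: `continue` jumps straight to the next line, so no extra nesting is needed.

```shell
out=""
while IFS= read -r line
do
    [ -z "$line" ] && continue   # skip blank lines, no nested if
    out="$out$line;"
done <<'EOF'
first

second
EOF
printf '%s\n' "$out"
```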
Constructs like these are extremely slow, since they can run cut an enormous number of times.
Instead, since you're using a shell that supports arrays, split the line into an array once, then use the array. This should split fine on spaces:
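A minimal sketch of the one-shot split, with a made-up field layout (the original line format wasn't preserved). `read -a` is a bash feature:

```shell
line="alpha beta gamma"

# Slow pattern: one external cut process per field, e.g.
#   first=$(echo "$line" | cut -d' ' -f1)

# Fast: split once into an array, then index it for free
read -r -a fields <<< "$line"
echo "${fields[0]}"   # first field
echo "${fields[2]}"   # third field
```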
You can also split on other characters by changing the IFS variable but be aware that this affects read too.
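One way to sidestep that side effect is to set IFS only for the single read command, so later reads are unaffected (colon-delimited input is just an example here):

```shell
line="a:b:c"

# IFS=: applies only to this one read, not to the rest of the script
IFS=: read -r -a parts <<< "$line"
echo "${parts[1]}"
```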
You're running grep many, many times per loop. This is slow. Instead of
try
This reads the file only once and doesn't execute four extra processes. Note that the =~ regular expression operator only works in bash.
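The original before/after code wasn't preserved, but the single-pass idea looks roughly like this: instead of a grep pipeline per line, test each line in the shell itself with `=~` (the ERROR pattern is invented for illustration):

```shell
count=0
while IFS= read -r line
do
    if [[ $line =~ ^ERROR ]]   # bash regex test, no external grep
    then
        count=$((count + 1))
    fi
done <<'EOF'
ERROR disk full
INFO all good
ERROR timeout
EOF
echo "$count"
```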
Whenever you have VAR=`something | grep something | grep something | grep something`, that's an enormous performance waster, and it's likely replaceable with shell built-ins, though exactly how depends on what bits you want to extract.
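As one hedged example of what "shell built-ins" can mean here, parameter expansion can pull a field out of a string with zero external processes (the key=value layout is invented):

```shell
line="user=alice id=42 status=active"

# Instead of VAR=$(echo "$line" | grep ... | grep ...):
var=${line#*id=}     # strip everything through "id="
var=${var%% *}       # keep everything before the next space
echo "$var"
```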
...and so on. Your script is enormous; you might want to break it into functions so you can tell what's happening where. Functions are easy:
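A minimal sketch (the function name and body are placeholders, not from the script under review):

```shell
# Define a function; call it like any command
greet() {
    echo "hello $1"
    return 0
}

msg=$(greet world)   # capture its output like a command's
greet world          # or call it directly
```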
They act like processes in that they return numbers, not strings, and have their own stdin/stdout/stderr. But they can set global variables (as long as they're not behind a pipe).
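A small sketch of that pipe caveat, with invented names: feeding a function via redirection keeps it in the current shell, so its global assignments stick, whereas the right-hand side of a pipe runs in a subshell in bash and its assignments are lost.

```shell
total=0
count_lines() {
    while IFS= read -r _; do total=$((total + 1)); done
}

# Works: redirection runs the function in the current shell
count_lines <<'EOF'
one
two
EOF
echo "$total"

# Would NOT update total in bash:
#   printf 'one\ntwo\n' | count_lines
# because the function runs in a subshell behind the pipe.
```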