Search Results

Search: Posts Made By: before4
1,874
Posted By before4
Get average and percentage of non-zero values
How to calculate the percentage of non-zero value occurrences based on the values in columns 1 and 2
2017 a 0
2017 a 2
2017 a 4
2017 a 2
2017 a 0
2017 b 2
2017 b 6
2016 a 2
2016 a 2
2016 b 2
2016 b 8
2016 b...
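A hedged awk sketch of one way to get both numbers per (column 1, column 2) group; the filename data.txt and the output layout are assumptions, not from the thread:

awk '
{
    key = $1 FS $2                 # group by columns 1 and 2
    count[key]++
    sum[key] += $3
    if ($3 != 0) nonzero[key]++
}
END {
    for (k in count)
        printf "%s avg=%.2f nonzero=%.1f%%\n", k, sum[k]/count[k], 100 * nonzero[k]/count[k]
}' data.txt

For the "2017 a" group in the sample above this would print avg=1.60 and nonzero=60.0%.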
3,383
Posted By before4
Thanks, that's much cleaner. now=$(date +"%s") ...
Thanks, that's much cleaner.
now=$(date +"%s")
echo "NOW = ${now}";
file=$(ls -1r *.txt|head -1)
echo "file name is $file"
#filename 201710030549.txt
echo ${file:0:8}
fdate=$(date -d ${file:0:8}...
1,273
Posted By before4
Not an awk solution, but a simple shell approach ...
Not an awk solution, but a simple shell approach
#!/bin/bash
read -p "Enter the name : " name
echo "Hi, $name"
command=`curl -v -H 'Authorization: WS 00eWxhTuhoHMBcA' -H 'Accept: application/json' ...
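A self-contained sketch of that shell approach using $( ) instead of backticks; the endpoint and token below are placeholders, not the values from the original post:

#!/bin/bash
# Prompt for a name, greet the user, then call the API and capture the response
read -p "Enter the name : " name
echo "Hi, $name"

# -s keeps curl quiet; the URL and token are hypothetical
response=$(curl -s -H 'Authorization: WS <token>' -H 'Accept: application/json' \
    "https://api.example.com/users/$name")
echo "$response"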
3,383
Posted By before4
Calculate datediff
I'm trying to calculate the date difference between a timestamp string taken from a filename and the current date,
but I have a problem storing the variable from the date command

now=$(date +"%s")
echo "NOW = ${now}";
file=$(ls -1Ap...
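A hedged sketch of the whole calculation, assuming GNU date and filenames that start with YYYYMMDD as in the 201710030549.txt example from this thread:

#!/bin/bash
# Age in days of the newest .txt file, judged by its leading YYYYMMDD
now=$(date +"%s")
file=$(ls -1r *.txt | head -1)           # e.g. 201710030549.txt
fdate=$(date -d "${file:0:8}" +"%s")     # first 8 characters = YYYYMMDD
echo "$file is $(( (now - fdate) / 86400 )) day(s) old"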
2,794
Posted By before4
Redirect script output to file after grep
I have a simple program that generates a log file, one line every second; I need to grep for a specific record and then redirect the output to another file.

#!/bin/bash
for i in `seq 1 20`;
do
...
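A small sketch of that redirect; --line-buffered (GNU grep) keeps matches flowing into the file while the generator is still running. The pattern and filenames are placeholders:

#!/bin/bash
# Generate one log line per second and keep only the matching records
for i in $(seq 1 20); do
    echo "$(date +%T) record $i"
    sleep 1
done | grep --line-buffered "record" >> filtered.log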
3,145
Posted By before4
Convert string to date and add 1 hour
I have a set of date data inside CSV files and need to convert the timezone,

08302016113611861
08302016113623442
08302016113541570
08302016113557732
08302016113548439
08302016112853115...
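A sketch of one way to shift such timestamps, assuming GNU date and that the format is MMDDYYYYHHMMSS followed by milliseconds (that reading of the sample is a guess, and timestamps.txt is a placeholder):

#!/bin/bash
while read -r ts; do
    mm=${ts:0:2}; dd=${ts:2:2}; yyyy=${ts:4:4}
    hh=${ts:8:2}; mi=${ts:10:2}; ss=${ts:12:2}
    epoch=$(date -d "$yyyy-$mm-$dd $hh:$mi:$ss" +%s)
    # add one hour and re-append the millisecond part unchanged
    printf '%s%s\n' "$(date -d "@$((epoch + 3600))" +"%m%d%Y%H%M%S")" "${ts:14}"
done < timestamps.txt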
1,655
Posted By before4
Works perfectly, many thanks root@svr:/var/tmp#...
Works perfectly, many thanks
root@svr:/var/tmp# more sample.txt
151442
15144200113571
190370299
2010
212526
212527
212529
212533
212534
212538
212546
212547

root@svr:/var/tmp# awk...
1,655
Posted By before4
Many thanks Pravin, is it possible to print...
Many thanks Pravin,

Is it possible to print 179 as a single line,

171 176
179
182 183
187 189
1900 1901
1903 1907
1,655
Posted By before4
Works perfectly, and good catch on my mistake...
Works perfectly, and good catch on my mistake, many thanks
1,655
Posted By before4
Detect continuous numbers as ranges
I have 100k records like the data below; I want to group the data into ranges

171
172
173
174
175
176
179
182
183
187
188
189
1900
1901
1903
1904
1905
1906
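A sketch of the range grouping in awk, assuming the numbers are already sorted; singletons such as 179 print on their own line, matching the follow-up shown earlier in these results (numbers.txt is a placeholder):

awk '
function flush() { if (start == prev) print start; else print start, prev }
NR == 1          { start = prev = $1; next }
$1 == prev + 1   { prev = $1; next }
                 { flush(); start = prev = $1 }
END              { if (NR) flush() }
' numbers.txt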
1,128
Posted By before4
Working perfectly .. Many thanks
Working perfectly .. Many thanks
1,128
Posted By before4
Strip some text and format with a new delimiter
Sample input
19:08:12.172; Cat1 74598; Cat2 1366; Cat3 227; Cat4 389; Cat5 572; Cat6 2228; Cat7 1039; Cat8 25;
19:08:22.173; Cat1 75589; Cat2 1388; Cat3 233; Cat4 393; Cat5 582; Cat6 2253; Cat7...
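Assuming the goal is the timestamp followed by just the numeric values, comma separated, a sketch might look like this (input.txt is a placeholder):

awk -F'; *' -v OFS=',' '
{
    out = $1                      # keep the timestamp
    for (i = 2; i <= NF; i++) {
        if ($i == "") continue    # trailing ";" leaves an empty field
        split($i, a, " ")         # "Cat1 74598" -> a[2] is the value
        out = out OFS a[2]
    }
    print out
}' input.txt

The first sample line would then become 19:08:12.172,74598,1366,227,389,572,2228,1039,25.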
903
Posted By before4
Thanks, it's working well and also very fast....
Thanks, it's working well and also very fast. I never thought it could be done with a single line of awk. I would also like to know how to calculate the number of records based on the values in columns 1 and 2, so the data...
903
Posted By before4
Need help with regex
I have several gigabytes of text data; sample data below


803,7282012,343
703,7282013,0
600,7282012,0
600,7282012,0
600,7282012,0
600,7282012,0
803,7282012,277
600,7282012,0
403,7282012,0...
4,766
Posted By before4
Converting decimal to hex
How to convert a decimal value to hex and then take the first digit as a variable
sample data

84844294,5,6
51291736,2,3
84844294,5,6
51291736,2,3


I can use {printf "%x,%d\n",$1,$2} but then I...
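A sketch of keeping the first hex digit as its own variable inside awk (the exact output layout is an assumption, and data.csv is a placeholder):

awk -F',' '
{
    hex = sprintf("%x", $1)       # e.g. 84844294 -> 50e9f06
    first = substr(hex, 1, 1)     # first hex digit as a separate variable
    printf "%s,%s,%s,%s\n", hex, first, $2, $3
}' data.csv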
1,960
Posted By before4
a. I mean the file scanning will be done on an hourly ...
a. I mean the file scanning will be done on an hourly basis (the cron job)
b. I am not sure about the standard format .. just guessing
- 2-digit day, 2-digit month, 4-digit year, 2-digit hour, 2-digit...
2,808
Posted By before4
Did you get your reverse tunnel working? Can...
Did you get your reverse tunnel working? Can you share any errors from executing the last command?
1,960
Posted By before4
How to check for a missing sequence?
I want to list the files every hour and check for the missing sequence

my file format is

CV.020220131430.txt
CV.020220131440.txt
CV.020220131450.txt

CV.ddmmyyyyhhmm.txt


how to check if i...
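A sketch of checking one hour's worth of files, assuming names follow CV.ddmmyyyyhhmm.txt with 10-minute steps; the day and hour arguments are hypothetical:

#!/bin/bash
day=$1     # e.g. 02022013
hour=$2    # e.g. 14
for min in 00 10 20 30 40 50; do
    f="CV.${day}${hour}${min}.txt"
    [ -e "$f" ] || echo "missing: $f"
done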
2,335
Posted By before4
lftp gives a very good solution for this case, but...
lftp gives a very good solution for this case, but I still have some issues with lftp

I want to download only the files that are newer than the latest local files (last downloaded by the previous task),...
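For the "only newer than local" part, lftp's mirror has an --only-newer switch; a sketch with placeholder credentials and paths:

# pull only files newer than what already exists locally
lftp -u user,password -e 'mirror --only-newer --verbose /remote/dir /local/dir; quit' ftp.example.com

The same mirror command, without --only-newer, is the usual way to replicate a remote directory tree over plain FTP.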
Forum: UNIX and Linux Applications 02-12-2013
2,676
Posted By before4
Single table or split the table
I have an application that collects data from 10 servers every minute and stores it in a MySQL DB. There are about 10K records from each server every minute. The data needs to be kept for 1 month....
2,335
Posted By before4
Replicate remote directory to local directory with ftp
I have a system that generates files every hour; I only have an FTP connection from my local server to the remote one.


$ ls -al
-rw-r--r-- 1 water None 0 Feb 7 18:09 a.0800
-rw-r--r-- 1 water None 0...
5,011
Posted By before4
How to combine print and printf in awk
[root@localhost ~]# cat t.txt
2,3,4,5,A,2012-01-01 00:00:28
2,6,4,5,A,2012-01-02 00:00:28
2,7,4,5,A,2012-01-02 02:00:28


[root@localhost ~]# awk -F"," '{OFS=",";print $2,"";printf("%s",...
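One way to mix the two, roughly what the snippet above seems to be after: printf writes without a trailing newline and a following print finishes the line (which fields to keep is a guess):

awk -F',' '{
    printf "%s,", $2             # no newline yet
    print $6                     # print adds the newline
}' t.txt

With the t.txt shown above, the first line would come out as 3,2012-01-01 00:00:28.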
Forum: Red Hat 09-05-2012
2,523
Posted By before4
Unable to copy files due to too many files in directory
I have a directory with some billions of files inside. I tried to copy some files for a specific date, but it just hangs for a long time and never gives any result.. I tried everything with...
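When a glob over billions of entries is the problem, letting find stream the matches usually helps; a sketch assuming GNU find and cp, with the directory, dates and destination as placeholders:

find /big/dir -maxdepth 1 -type f \
     -newermt '2012-09-01' ! -newermt '2012-09-02' \
     -exec cp -t /dest/dir {} +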
2,498
Posted By before4
Thank you.. but I got the wrong output and...
Thank you.. but I got the wrong output, and it also did not work for a Name field of more than 1 word (i.e. Andi ruby) or when the field has an empty value
$ awk -V | head -1
GNU Awk 4.0.0

$ cat...
2,498
Posted By before4
Thank you, I could not imagine that we could use a regex as...
Thank you, I could not imagine that we could use a regex as a field separator,

Sometimes the script is not working; I checked and there is a special character in each record set...
Showing results 1 to 25 of 29

 