Parsing log files, displaying logs between specific dates
Post 302294879 by haris, Friday 6th of March 2009, 04:44:04 AM
Hi,

I have tried your previous code, but it didn't work; that's the main reason for the earlier post.

Could you please help me with this? I just want to know how to convert a figure (a number) to text. If you can, please explain it with an example, as I am quite new to UNIX scripting.

Thanks,
haris
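If by "figure to text" you mean turning a numeric month from a date into its name (an assumption on my part; say so if you meant something else), a portable lookup using the shell's positional parameters works. A minimal sketch:

Code:
#!/bin/sh
# Convert a numeric month (01-12) to its short name, e.g. 03 -> Mar
m=03                                    # the figure to convert
set -- Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec
shift $(( ${m#0} - 1 ))                 # drop a leading zero, move to that slot
echo "$1"                               # prints: Mar

With GNU date you can also do: date -d 2009-03-01 +%b, but the -d option is not available on every UNIX.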
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

reading files for specific dates

Assume the files are in the directory /data. Output of ls -ltr (showing only the date and file name):
01/01/2004 file_3434_typea.dat
01/01/2004 file_3423_typea.dat
01/01/2004 file_3436_typea.dat
01/01/2004 file_3434_typeb.dat
01/01/2004 file_3423_typeb.dat
01/01/2004 file_3436_typeb.dat ... (3 Replies)
Discussion started by: siva_jm
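A sketch for picking out only the files dated 01/01/2004, assuming GNU find (the -newermt test is not POSIX):

Code:
# files whose modification time falls on 2004-01-01
find /data -type f -name '*.dat' \
     -newermt '2004-01-01' ! -newermt '2004-01-02' -print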

2. Shell Programming and Scripting

Getting list of all the log files between two dates

I need to get the list of all the log files for a particular duration, i.e. between two dates, date1 and date2. The two dates are entered by the user. The format of the log file name is /path_name/graph_name_20080801180308.log. I don't... (1 Reply)
Discussion started by: avishekp
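Since the timestamp is embedded in the file name, one approach is to compare the 14-digit stamp directly. A sketch (d1 and d2 are example values; 64-bit shell arithmetic assumed):

Code:
#!/bin/sh
d1=20080701000000   # from, as YYYYMMDDHHMMSS
d2=20080801235959   # to
for f in /path_name/graph_name_*.log; do
    ts=${f##*_}                 # keep everything after the last underscore
    ts=${ts%.log}               # drop the .log suffix
    [ "$ts" -ge "$d1" ] && [ "$ts" -le "$d2" ] && echo "$f"
done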

3. Shell Programming and Scripting

Parsing out the logs and generating report

My file (log.txt) will contain the following:
start testcase: config loading
...... error XXXX .....
end testcase: config loading, result failed
start testcase: ping check
..... error ZZZZZ .....
error AAAAA
end testcase: Ping check, result failed
I am expecting the output below. ... (4 Replies)
Discussion started by: shellscripter
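A sketch of the kind of awk pass that could build such a report (the exact output format is an assumption, since the expected output above is truncated):

Code:
awk '/^start testcase:/ { name = $0; sub(/^start testcase: */, "", name); errs = 0 }
     /error/            { errs++ }
     /^end testcase:/   { status = ($0 ~ /failed/) ? "FAILED" : "PASSED"
                          printf "%-20s %d error(s)  %s\n", name, errs, status }' log.txt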

4. UNIX for Dummies Questions & Answers

To find files with specific dates and cp another folder.

Hi all, we have an existing script:
find /u03/oraprod/perpcomn/admin/out -type f -ctime +7 \
     -exec cp {} "/u08/oraprod/backup/cout" \;
It finds all files older than 7 days and copies them to another folder. However, I would like to list only the files dated Sep 29 and cp those to another folder. ... (2 Replies)
Discussion started by: *Jess*
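With GNU find you can bracket a single day with -newerct, which matches the ctime the existing script already tests (the year here is an assumption; -newerct is not POSIX):

Code:
# copy only the files whose status-change time falls on Sep 29
find /u03/oraprod/perpcomn/admin/out -type f \
     -newerct '2014-09-29' ! -newerct '2014-09-30' \
     -exec cp {} "/u08/oraprod/backup/cout" \;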

5. Shell Programming and Scripting

Help parsing logs maybe with menu and variables?

I would like to parse through some logs looking for things like "exception" or "failed" (grep -i failed). Ideally it would be in a menu format, so someone without UNIX experience could just choose option 1, 2, or 3, etc. If I could also pass the hostname to a variable, that would be awesome, so someone... (5 Replies)
Discussion started by: taekwondo
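A minimal bash sketch of such a menu (the log path and the pattern list are assumptions):

Code:
#!/bin/bash
read -r -p "Hostname: " host
LOGFILE="/var/log/app/${host}.log"      # hypothetical log location
PS3='Search for which pattern? '
select pat in failed exception error quit; do
    case $pat in
        quit) break ;;
        "")   echo "Invalid choice" ;;
        *)    grep -i -- "$pat" "$LOGFILE" ;;
    esac
done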

6. UNIX and Linux Applications

Parsing Tuxedo Logs

Right now I am parsing Tuxedo logs to calculate response times for various services. I was hoping to find a log tool that has support for Tuxedo and would generate drill-down HTML reports. I just wanted... (0 Replies)
Discussion started by: Lurch
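The pairing logic for a hand-rolled pass is short; a sketch under an entirely hypothetical log format (epoch milliseconds, service name, REQ or RPLY per line):

Code:
# match each reply to its request and print the elapsed time per service call
awk '$3 == "REQ"               { t[$2] = $1 }
     $3 == "RPLY" && ($2 in t) { printf "%s %d ms\n", $2, $1 - t[$2]; delete t[$2] }' ulog.txt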

7. Debian

Logrotate truncated my log files to 0 bytes and no logs are written

Hi, yesterday I installed and configured logrotate on my Debian machine. I was expecting it to run at 06:25 in the morning, and it actually did. All my old logs were compressed, but the new logs all had a size of 0 bytes. The processes, while still running OK, were not... (2 Replies)
Discussion started by: pmatsinopoulos
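The usual cause is that the daemons keep writing to the old, renamed file via their open descriptors. Either signal each daemon to reopen its logs in a postrotate script, or let logrotate copy and truncate in place. A sketch (the path is hypothetical):

Code:
/var/log/myapp/*.log {
    daily
    rotate 7
    compress
    copytruncate    # copy the log, then truncate the original in place,
                    # so the writing process keeps a valid file handle
}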

8. Shell Programming and Scripting

Redirect all logs files contents into a single log file

Hi, I have a data-cleansing process which creates a different log file for each step. When the process runs, it creates the following log files, in this order:
p1_tranfrmr_log.txt
p1_tranfrmr_stats.txt
p2_globrtr_log.txt
p2_globrtr_stats.txt
p3_cusparse_log.txt
p3_cusparse_stats.txt
... (8 Replies)
Discussion started by: sonu_pal
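A sketch that appends each step's files in name order (which matches the p1/p2/p3, log-before-stats order shown), with a header line per file:

Code:
#!/bin/sh
out=all_steps.log
: > "$out"                                  # start with an empty combined log
for f in p*_*.txt; do
    printf '===== %s =====\n' "$f" >> "$out"
    cat "$f" >> "$out"
done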

9. Shell Programming and Scripting

Search for logs traced between specific date and time from log file

Hi, I want to search a log file for entries traced between a specific date and time. My logs are generated like this:
Tue Jun 18 05:00:02 EEST 2013 | file_check.sh | Message: script has files to process.
Thu Jun 20 05:00:02 EEST 2013 | file_check.sh | Message: script has files to... (5 Replies)
Discussion started by: ketanraut
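One way is to rebuild a sortable YYYYMMDD key from each line's month-name prefix. A sketch (the from/to values are examples):

Code:
awk -v from=20130618 -v to=20130620 '
BEGIN { n = split("Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec", m, " ")
        for (i = 1; i <= n; i++) mon[m[i]] = sprintf("%02d", i) }
      { key = $6 mon[$2] sprintf("%02d", $3) }   # year, month, day
key >= from && key <= to' logfile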

10. Red Hat

Moving files with specific dates

Hi, this is the list of files in one directory on the server:
# ls -lrt
total 10120
-rw-r--r-- 1 root root  4484 Jul 8 2011 install.log.syslog
-rw-r--r-- 1 root root 51890 Jul 8 2011 install.log
-rw------- 1 root root  3140 Jul 8 2011 anaconda-ks.cfg
drwxr-xr-x 2 root root... (2 Replies)
Discussion started by: anaigini45
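With GNU find you can bracket the day exactly (the destination path is hypothetical; -newermt is not POSIX):

Code:
# move only the files last modified on Jul 8 2011
find . -maxdepth 1 -type f \
     -newermt '2011-07-08' ! -newermt '2011-07-09' \
     -exec mv {} /path/to/archive/ \;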