Script to Gather data from logs and export to a CSV file - post 302986335 by Chubler_XL, 11-23-2016
Yes, cfg is the configuration file. Also note that I changed the full path to ./ while testing here, so it is currently looking for ./File_YYYMMDD* input files rather than the /path location you originally had.
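
If it helps, a quick way to confirm the test glob is actually picking up files is something like the below (this assumes the ./ test location from my version; put your original /path back once you are done testing):

Code:
ls -l ./File_*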

Have you checked the Error_Report_YYYMMDD.csv file?

When you say nothing is appearing, does that include the headings?

If you are just getting headings and no data records, it means that the pattern set in the PAT variable on line 1 is not matching any line.
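
You can confirm that outside the script by counting matches yourself, for example (the PAT value below is only a placeholder, paste in the pattern from line 1; grep -E regexes are close enough to awk's for a quick sanity check):

Code:
PAT='<pattern from line 1 of the script>'
grep -cE "$PAT" ./File_*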

If you get lines with all blank totals, then PAT matches the line but fields 28-30 mismatch OR no error codes match.
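
To see which of those two it is, you could dump fields 28-30 for the matching lines, something along these lines (a rough sketch; add the appropriate -F option if your script uses a field separator other than whitespace):

Code:
awk -v PAT="$PAT" '$0 ~ PAT { print FILENAME ": " $28, $29, $30 }' ./File_*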