Full Discussion: CSV file issue
Post 302479269 by gwrm, Friday 10 December 2010, 07:30 AM
Input:

A EMP 58
; NO EMP"
NO EMPL"
TXN 58
; NO EMPL"

Required output -- a line that does not end with a double quote should be joined (with a space) to the following line; a line that already ends with a double quote stays as it is:

A EMP 58 ; NO EMP"
NO EMPL"             -- no change, as it already ends with a double quote
TXN 58 ; NO EMPL"
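
One way to do that is with a short Perl one-liner -- a minimal, untested sketch, assuming the rule really is just "keep appending the next line until the accumulated record ends in a double quote"; the file name file.csv is a placeholder:

    perl -ne 'chomp;
              $buf = length($buf) ? "$buf $_" : $_;             # append to the pending record
              if ($buf =~ /"$/) { print "$buf\n"; $buf = "" }   # flush once it ends in a double quote
              END { print "$buf\n" if length $buf }             # flush a trailing partial record, if any
             ' file.csv

The same buffering idea works just as well in awk or a while-read loop; the only real decision is what to do with a final record that never gets its closing quote (here it is printed as-is).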
 

KiokuDB::Backend(3pm)					User Contributed Perl Documentation				     KiokuDB::Backend(3pm)

NAME
    KiokuDB::Backend - Backend interface role

SYNOPSIS
    package KiokuDB::Backend::Foo;
    use Moose;

    # load the core api and additional interfaces based on backend capabilities
    with qw(
        KiokuDB::Backend
        KiokuDB::Backend::Role::TXN
        KiokuDB::Backend::Role::Clear
        KiokuDB::Backend::Role::Scan
        KiokuDB::Backend::Role::UnicodeSafe
        KiokuDB::Backend::Role::BinarySafe
    );

    sub insert { ... }

    sub get { ... }

    sub delete { ... }

    sub exists { ... }

    # use the backend like this:
    my $dir = KiokuDB->new(
        backend => KiokuDB::Backend::Foo->new(),
    );
DESCRIPTION
    KiokuDB is designed to be fairly backend agnostic. This role defines the
    minimal API for writing new backends.

TRANSACTIONS
    This role is supplemented by KiokuDB::Backend::Role::TXN, a role for first
    class transaction support that issues rollbacks using the KiokuDB::Entry
    objects.

QUERYING
    This role is supplemented by KiokuDB::Backend::Role::Query, a role for
    backend specific queries.

    KiokuDB::Backend::Role::Query::Simple provides a universal query api for
    backends that can perform property based lookup.

    KiokuDB::Backend::Role::Query::GIN is a role for using Search::GIN based
    indexing/querying with backends that do not natively support querying.
REQUIRED METHODS
    get @ids
        Retrieve the KiokuDB::Entry objects associated with the @ids. If any
        other error is encountered, this method should die.

        The backend may store private data in "backend_data", to be used in a
        subsequent update.

        Returns a list of KiokuDB::Entry, with the order corresponding to
        @ids. If an entry does not exist then "undef" should be returned in
        place of it. The backend may abort retrieval on the first
        non-existent entry.

    insert @entries
        Insert entries to the store. If the backend is transactional this
        operation should be atomic with respect to the inserted/updated data.

        The backend is required to store the data in the fields "data" and
        "class", using the key in "id".

        Entries which have an entry in "prev" denote updates (either objects
        that have been previously stored, or objects that were looked up).
        The previous entry may be used to compare state for issuing a partial
        update, and will contain the value of "backend_data" for any other
        state tracking.

        "object" is a weak reference to the object this entry is
        representing, and may be used for high level indexing. Do not use
        this field for storage.

        If this backend implements some form of garbage collection, "root"
        denotes that the object is part of the root set.

        After all entries have been successfully written, "backend_data"
        should be set if necessary, just as in "get".

        Has no return value. If "insert" does not die the write is assumed to
        be successful.

    delete @ids_or_entries
        Delete the specified IDs or entries.

        If the user provided objects then entries will be passed in. Any
        associated state the entries may have (e.g. a revision) should be
        used in order to enforce atomicity with respect to the time when the
        objects were loaded.

        After all entries have been successfully deleted, "deleted" should be
        set. The entry passed in is the same one as was loaded by "get" or
        last written by "insert", so it is already up to date in the live
        objects.

        Has no return value. If "delete" does not die the write is assumed to
        be successful.

    exists @ids
        Check for existence of the specified IDs, without retrieving their
        data. Returns a list of true or false values.
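    To make that contract concrete, below is a rough, untested sketch of a
    trivial in-memory backend that keeps KiokuDB::Entry objects in a plain
    hash keyed by ID. The package name KiokuDB::Backend::MyMemory is made up
    for this illustration, and the sketch ignores optional concerns such as
    serialization, transactions and garbage collection roots:

    # KiokuDB::Backend::MyMemory is a made-up name, for illustration only
    package KiokuDB::Backend::MyMemory;
    use Moose;

    with qw(KiokuDB::Backend);

    # naive entry store: id => KiokuDB::Entry
    has storage => ( is => 'ro', isa => 'HashRef', default => sub { {} } );

    sub insert {
        my ( $self, @entries ) = @_;
        $self->storage->{ $_->id } = $_ for @entries;
        return;    # no return value; not dying means success
    }

    sub get {
        my ( $self, @ids ) = @_;
        # missing IDs fall out as undef, as the role requires
        return map { $self->storage->{$_} } @ids;
    }

    sub delete {
        my ( $self, @ids_or_entries ) = @_;
        delete $self->storage->{ ref($_) ? $_->id : $_ } for @ids_or_entries;
        return;
    }

    sub exists {
        my ( $self, @ids ) = @_;
        return map { exists $self->storage->{$_} ? 1 : 0 } @ids;
    }

    __PACKAGE__->meta->make_immutable;

    1;

    Such a backend would then be plugged in exactly as in the SYNOPSIS, e.g.
    my $dir = KiokuDB->new( backend => KiokuDB::Backend::MyMemory->new ).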
METHODS
    These methods are provided by the KiokuDB::Backend role, and may be
    overridden.

    new_from_dsn
        Parses the second half of the DSN using "parse_dsn_params" and
        instantiates a new object using "new_from_dsn_params".

        See KiokuDB::Util.

    new_from_dsn_params @args
        Takes DSN parameters and converts them to arguments suitable for
        "new".

    parse_dsn_params $str
        The string is split on ";" to produce arguments. Arguments in the
        form "foo=bar" are split on "=" into a key/value pair, and other
        arguments are treated as a boolean key and returned as "$arg => 1".
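    A hedged illustration of those DSN rules (the "bdb" moniker, its backend
    class and the "dir"/"create" options are an assumption about one
    particular backend distribution, not part of this role):

    # Assuming the BDB backend distribution is installed:
    my $dir = KiokuDB->connect("bdb:dir=data/;create=1");

    # The "bdb" moniker selects the backend class, which receives the rest of
    # the string via new_from_dsn. parse_dsn_params("dir=data/;create=1")
    # yields ( dir => "data/", create => 1 ), and new_from_dsn_params passes
    # those pairs on to the backend constructor, so this is roughly:
    #
    #   KiokuDB->new(
    #       backend => KiokuDB::Backend::BDB->new( dir => "data/", create => 1 ),
    #   );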
ADDITIONAL INTERFACES
    Your backend may include more roles, based on its capabilities.

    KiokuDB::Backend::Serialize
    KiokuDB::Backend::Serialize::Delegate
        For the actual serialization of entries, there are a number of
        serialization roles.

    KiokuDB::Backend::Role::Clear
        API for clearing all entries.

    KiokuDB::Backend::Role::Scan
        API for enumerating entries.

    KiokuDB::Backend::Role::BinarySafe
    KiokuDB::Backend::Role::UnicodeSafe
        If your serialization is able to store arbitrary binary data and/or
        unicode strings, these informational roles should be included.

    KiokuDB::Backend::Role::TXN
        If your storage supports nested transactions ("txn_begin",
        "txn_commit" etc) this role provides the api to expose that
        functionality to the high level KiokuDB api.

    KiokuDB::Backend::Role::Query
    KiokuDB::Backend::Role::Query::GIN
        If your backend supports querying of some sort, these are the roles
        to include. The querying API uses backend specific lookups to fetch
        entries, which KiokuDB will then relink into result objects.
SHARED BACKENDS
    A backend may be shared by several KiokuDB instances, each with its own
    distinct live object set. The backend may choose to share cached entry
    data, as that is not mutated by KiokuDB::Linker, but not the
    KiokuDB::Entry instances themselves.
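    For example, two directory handles can share one backend instance while
    each keeps its own live object set (the backend class here is the
    hypothetical in-memory sketch from REQUIRED METHODS above):

    # hypothetical backend class from the sketch above
    my $backend = KiokuDB::Backend::MyMemory->new;

    my $dir_a = KiokuDB->new( backend => $backend );
    my $dir_b = KiokuDB->new( backend => $backend );

    # $dir_a and $dir_b see the same stored entries, but objects loaded
    # through one handle are not the same live instances as those loaded
    # through the other.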