Re-usable function to parse csv files with different number of fields
Posted by jy2k7ca on 03-20-2008 at 10:26 PM

Hi there, I've been pondering how to deal with this and hoping someone can give me some insight.

I need help creating a reusable bash function to parse CSV files containing different numbers of fields (comma-separated).

My initial thought was to create a function for each input CSV file (20+ of them, which means I would have to write 20+ functions).

Appreciate any help.

Thanks.
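Not an official answer from the thread, just a sketch of one way to avoid writing 20+ separate functions: a single bash function that splits any comma-separated line into an array, so the caller can work with however many fields come back. The function and file names below are made up for illustration, and it assumes simple CSV with no quoted or embedded commas.

Code:
#!/bin/bash
# parse_csv_line: split one comma-separated line into the global array FIELDS.
# Works for any number of fields; does not handle quoted commas.
parse_csv_line() {
    local line=$1
    IFS=',' read -r -a FIELDS <<< "$line"
}

# Example usage: print every field of every record, whatever the field count.
while IFS= read -r line; do
    parse_csv_line "$line"
    echo "record has ${#FIELDS[@]} fields"
    for f in "${FIELDS[@]}"; do
        printf '  field: %s\n' "$f"
    done
done < input.csv    # input.csv is a placeholder name

Because the field count is always available as ${#FIELDS[@]}, the same function can be reused for every input file; only the code that consumes the fields needs to know what each column means.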
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Parse apart strings of comma separated data with varying number of fields

I have a situation where I am reading a text file line by line. Those lines contain comma-separated fields of data; however, each line can vary in the number of fields it contains. What I need to do is parse apart each line and write each field of data found (left to right) into a file.... (7 Replies)
Discussion started by: 2reperry
7 Replies
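A rough sketch of how that might look in bash (the file names data.txt and fields.out are assumptions, not from the thread): split each line on commas and append every field, left to right, to an output file, regardless of how many fields the line has.

Code:
while IFS= read -r line; do
    IFS=',' read -r -a flds <<< "$line"    # split on commas into an array
    for f in "${flds[@]}"; do
        printf '%s\n' "$f" >> fields.out   # write each field, left to right
    done
done < data.txt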

2. UNIX for Dummies Questions & Answers

Fields in csv files using sed

Hi, I am working with a CSV file and I want to insert an Excel formula into, say, the 6th column. Sample CSV file: 1234,lag,0,77,544,,7 1234,lag,222,0,7,,7. At first I used a simple command: sed 's/^\(.\{17\}\)/\1word/' file.csv but the result is this: ... (2 Replies)
Discussion started by: paoie
2 Replies
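For column-based edits, awk is usually a better fit than sed, because sed counts characters while awk addresses fields. A hedged sketch (FORMULA and the column number are placeholders, not values from the thread):

Code:
# Append text to the 6th comma-separated field of every line.
awk 'BEGIN { FS = OFS = "," } { $6 = $6 "FORMULA"; print }' file.csv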

3. Shell Programming and Scripting

How to read and parse the content of csv file containing # as delimeter into fields using Bash?

#!/bin/bash
i=0
cat 1.csv | while read fileline
do
echo "$fileline"
IFS="#"
flds=( $fileline )
nrofflds=${#flds}
echo "noof fields$nrofflds"
fld=0
while
do
echo "noof counter$fld"
echo "$nrofflds"
#fld1="${flds}"
trying to store the content of line to fields but i... (4 Replies)
Discussion started by: barani75
4 Replies
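A hedged sketch of what the quoted script appears to be attempting: read 1.csv line by line, split each line on '#', and report the field count. Note that ${#flds[@]} (not ${#flds}) gives the number of array elements.

Code:
while IFS= read -r fileline; do
    IFS='#' read -r -a flds <<< "$fileline"   # split the line on '#'
    echo "number of fields: ${#flds[@]}"
    for ((fld = 0; fld < ${#flds[@]}; fld++)); do
        echo "field $fld: ${flds[$fld]}"
    done
done < 1.csv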

4. Shell Programming and Scripting

How to (n)awk lines of CSV with certain number of fields?

I have a CSV file with a variable number of fields per record. How do I print only the lines with a certain number of fields? Several permutations of the following (including the use of escape characters) have failed to retrieve the line I'm after (1,2,3,4)... $ cat myfile 1,2,3,4 1,2,3 $ # Print... (1 Reply)
Discussion started by: cs03dmj
1 Replies
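In awk this is a one-liner, since NF holds the field count for the current record. A sketch using the sample file from the post:

Code:
# Print only the records that have exactly four comma-separated fields.
awk -F',' 'NF == 4' myfile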

5. Shell Programming and Scripting

How to create a CSV File by reading fields from separate files

Hi, I have 3 separate files within a folder. Every file contains data in a single column; for example, File1 contains mayank sushant dheeraj, File2 contains DSA_AT MG_AT FLAT_09, and File3 contains 123123 232323. (2 Replies)
Discussion started by: mayanksargoch
2 Replies
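One common approach, sketched here with the file names from the post: paste joins the nth line of each input file into one record, so three single-column files become a three-column CSV.

Code:
paste -d',' File1 File2 File3 > combined.csv   # combined.csv is a placeholder name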

6. Shell Programming and Scripting

split a csv file into specified number of files (not lines)

Hi, I really need this... it's not simple to explain, but as it's part of a crontab I can't split the file manually, and the file can change every day, so line counts are not a good basis. For example: how do I split 1 CSV file into 15 files? Thank you very much. Regards. (4 Replies)
Discussion started by: 7stars
4 Replies
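If GNU coreutils is available, split can divide a file into a fixed number of pieces rather than a fixed number of lines. A sketch under that assumption (input.csv and the part_ prefix are placeholders):

Code:
# -n l/15 produces 15 output files without breaking lines in half.
split -n l/15 -d input.csv part_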

7. Shell Programming and Scripting

checking csv files with empty fields..!

Hi! I need to learn how a shell script can traverse a CSV file and check whether any field is empty, meaning it contains two consecutive commas or only a space between commas, i.e., "" or " ". Can anyone help me out with how I can do that? (10 Replies)
Discussion started by: sukhdip
10 Replies
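A sketch of one way to do that check with awk (the file name is assumed): loop over the fields of each record and flag any that are empty or contain only spaces.

Code:
awk -F',' '{ for (i = 1; i <= NF; i++) if ($i ~ /^ *$/) { print "empty field " i " on line " NR; break } }' file.csv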

8. Shell Programming and Scripting

Parse csv files by their names

Hi all, I have multiple CSV files named VAR1_VAR2_VAR3_VAR4.csv. All the files have the same structure inside; just the values change. I am trying to retrieve data from those files by fixing one or more VAR at a time. I tried to write a script but I have 2 problems: 2-... (1 Reply)
Discussion started by: Jhon.c
1 Replies
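A sketch of one way to select files by fixing part of the name with a glob and then splitting the remaining name components on underscores (fixedA stands in for whatever VAR1 value is being fixed):

Code:
for f in fixedA_*_*_*.csv; do
    base=${f%.csv}
    IFS='_' read -r v1 v2 v3 v4 <<< "$base"   # name parts become variables
    echo "processing $f (VAR2=$v2, VAR3=$v3, VAR4=$v4)"
done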

9. Shell Programming and Scripting

Matching two fields in two csv files, create new file and append match

I am trying to parse two CSV files and make a match on one column, then print the entire file to a new file and append an additional column that gives the description from the match. If a match is not made, I would like to add "NA" at the end instead. The command I've been using... (6 Replies)
Discussion started by: dis0wned
6 Replies
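The usual two-file awk join is a reasonable starting point; a sketch follows, in which the key column, the description column, and the file names lookup.csv and data.csv are assumptions rather than details from the thread.

Code:
awk -F',' -v OFS=',' '
    NR == FNR { desc[$1] = $2; next }                 # first file: key, description
    { print $0, ($1 in desc ? desc[$1] : "NA") }      # second file: append match or NA
' lookup.csv data.csv > merged.csv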

10. UNIX for Beginners Questions & Answers

Is there a UNIX command that can compare fields of files with differing number of fields?

Hi, Below are the sample files. x.txt is from an Excel file that is a list of users from Windows, and y.txt is a list of database accounts. $ head -500 x.txt y.txt ==> x.txt <== TEST01 APP_USER_PROFILE USER03 APP_USER_PROFILE TEST02 APP_USER_EXP_PROFILE TEST04 APP_USER_PROFILE USER01 ... (3 Replies)
Discussion started by: newbie_01
3 Replies
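A sketch of one way to compare just the first field of each file, ignoring how many other fields the records carry, assuming the user/account name is the first whitespace-separated field in both files: load the keys from y.txt, then report any x.txt user not found among them.

Code:
awk 'NR == FNR { seen[$1] = 1; next } !($1 in seen) { print $1 " not in y.txt" }' y.txt x.txt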