CSV Split field to check multiple codes
Posted by RavinderSingh13 on 01-05-2016
Hello SDohmen,

I am confused by your sample output; my comments on it are below.
Code:
Barcode;Art_ours;Art_Sup;SKU;stock;price;manufacturer
4960999865300;testnderp;8935214;0023030102;555;342.70;REV
### In the above line you are comparing the first columns of File1 and File2.
1230000000010, 1240000000010;ND010;12345;99263999;555;33.01;Pac
### In the above line you are NOT comparing the first columns of File1 and File2?

If you always want to compare the first columns of both Input_files, then the following may help you.
Code:
awk -F'[,;]' 'FNR==NR{A[$1]=$0;next} ($1 in A){q=$1;sub($1,"");print A[q] $0}' Input_file1 Input_file2

The output of the above command will be as follows.
Code:
4960999865300;testnderp;8935214;0023030102;555;11.70;REV
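
For readability, here is the same command written out over multiple lines with comments. The logic is identical to the one-liner above; treat it as an untested sketch of the same approach.
Code:
awk -F'[,;]' '
FNR == NR {               # while reading Input_file1 (the first file)
    A[$1] = $0            # store the whole line, keyed by its first field
    next                  # do not run the second block for Input_file1
}
($1 in A) {               # Input_file2: its first field was seen in Input_file1
    q = $1                # remember the matching key
    sub($1, "")           # strip the first field from the current line
    print A[q] $0         # print the stored Input_file1 line plus the remainder
}' Input_file1 Input_file2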

If your query has other conditions, please let us know the complete details; that will make it easier for us to help you. Hope this helps.
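
In case the intention is that the first field of Input_file2 may hold several comma-separated barcodes (as in your second sample line) and a match on any one of them should count, an untested sketch along the following lines could be a starting point. It assumes both files are semicolon-separated and that only the first field of Input_file2 may carry multiple codes; adjust the printed format to your needs.
Code:
awk -F';' '
FNR == NR { A[$1] = $0; next }        # Input_file1: store lines keyed by barcode
{
    n = split($1, codes, " *, *")     # split a possible multi-code first field on commas
    for (i = 1; i <= n; i++)
        if (codes[i] in A) {          # any one of the codes matches a stored barcode
            print A[codes[i]], $0     # print the Input_file1 line, then the Input_file2 line
            break
        }
}' Input_file1 Input_file2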

Thanks,
R. Singh