Awk to convert a flat file to CSV file


 
# 15, 07-17-2008
If you already know the field widths and they don't change, you can just use one large printf statement, e.g. printf "%-9s,%-7s, ... ,%-20s\n", $1, $2, $3, ... etc.
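For example, a minimal sketch of that idea, assuming a whitespace-separated input with three columns; the widths (9, 7, 20) and the file names are placeholders, not from the original post:

# pad each field to a fixed width and separate the fields with commas (widths are examples only)
awk '{ printf "%-9s,%-7s,%-20s\n", $1, $2, $3 }' input.txt > output.csv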

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Converting csv file to flat file

Hi All, I have a csv file which is comma separated. I need to convert it to a flat file with preferred column lengths: country,id Australia,1234 Africa,12399999 Expected output: country id Australia 1234 Africa 12399999 The flat file should have a predefined length on the respective... (8 Replies)
Discussion started by: rohit_shinez
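A hedged sketch for this one, assuming the two-column sample above and arbitrary column widths of 15 and 10 characters (widths and file names are illustrative):

# read comma-separated input and write fixed-width columns
awk -F',' '{ printf "%-15s%-10s\n", $1, $2 }' input.csv > output.flat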

2. UNIX for Dummies Questions & Answers

Convert flat file to csv

Hi, I have a file like this: a=1 b=2 c=3 a=4 b=2 d=3 a=3 c=4 How can I change this to CSV format? a,b,c,d 1,2,3,, 4,2,,3 3,,4,, Please use code tags next time for your code and data. Thanks (10 Replies)
Discussion started by: sandip_2014
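One possible approach for this layout, sketched on the assumption that the file is read twice: the first pass collects the key names in order of first appearance, the second pass prints the header and one comma-separated row per line ('file' is a placeholder name):

# pass 1: remember every key in order of first appearance
# pass 2: print a header, then one comma-separated row per input line
awk '
NR == FNR {
    for (i = 1; i <= NF; i++) {
        split($i, kv, "=")
        if (!(kv[1] in seen)) { seen[kv[1]] = 1; keys[++n] = kv[1] }
    }
    next
}
FNR == 1 {
    for (i = 1; i <= n; i++) printf "%s%s", keys[i], (i < n ? "," : "\n")
}
{
    split("", val)                      # clear the values kept from the previous line
    for (i = 1; i <= NF; i++) { split($i, kv, "="); val[kv[1]] = kv[2] }
    for (i = 1; i <= n; i++) printf "%s%s", val[keys[i]], (i < n ? "," : "\n")
}' file file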

3. Shell Programming and Scripting

How to convert excel file to csv file or text file?

Hi all, I need to find a way to convert an Excel file into a csv or a text file with a Linux command. The reason is I have hundreds of files to convert. Another complication is that I need to delete the first 5 lines of the Excel file before conversion. So for instance input.xls description of... (6 Replies)
Discussion started by: johnkim0806
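awk alone cannot read .xls binaries, so this one usually needs an external converter. A hedged sketch, assuming LibreOffice is installed (the loop and file names are illustrative, not from the thread):

# convert every .xls in the directory to CSV, then drop the first 5 lines
for f in *.xls; do
    libreoffice --headless --convert-to csv "$f"        # writes ${f%.xls}.csv in the current directory
    tail -n +6 "${f%.xls}.csv" > "${f%.xls}_trimmed.csv"
done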

4. Shell Programming and Scripting

Awk to convert a text file to CSV file with some string manipulation

Hi, I have a simple text file with contents as below: 12345678900 971,76 4234560890 22345678900 5971,72 5234560990 32345678900 71,12 6234560190 The new csv file should look like: Column1;Column2;Column3;Column4;Column5 123456;78900;971,76;423456;0890... (9 Replies)
Discussion started by: FreddyDaKing
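A hedged sketch matching the sample above: split the 1st and 3rd whitespace-separated fields after their 6th character and join everything with semicolons (the header text is taken from the expected output; the file name is a placeholder):

# print a header, then split fields 1 and 3 at position 6 and emit 5 semicolon-separated columns
awk 'BEGIN { print "Column1;Column2;Column3;Column4;Column5" }
     { print substr($1,1,6) ";" substr($1,7) ";" $2 ";" substr($3,1,6) ";" substr($3,7) }' input.txt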

5. Shell Programming and Scripting

reading a csv file and creating a flat file

Hi, I have written a script for reading a csv file and creating a flat file; please suggest if this script can be optimized: #---------------- FILENAME="$1" SCRIPT=$(basename $0) #-----------------------------------------// function usage { echo "\nUSAGE: $THIS_SCRIPT file_to_process\n"... (3 Replies)
Discussion started by: mprakasheee

6. Shell Programming and Scripting

Convert CSV file (with double quoted strings) to pipe delimited file

Hi, could someone help me convert a CSV file (with double quoted strings) to a pipe delimited file? Here is the sample data: 1,Friends,"$3.99 per 1,000 listings",8158 Here "1,000 listings" should be a single field. Thanks, Ram (8 Replies)
Discussion started by: Ram.Math
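One way to handle the embedded commas, sketched here with GNU awk's FPAT (this needs gawk 4.0 or later; plain awk has no FPAT, and the file names are placeholders):

# FPAT treats either a double-quoted string or a comma-free run as one field
gawk 'BEGIN { FPAT = "([^,]+)|(\"[^\"]+\")"; OFS = "|" }
{
    for (i = 1; i <= NF; i++) gsub(/^"|"$/, "", $i)   # strip the surrounding quotes
    $1 = $1                                           # force the record to be rebuilt with OFS
    print
}' input.csv > output.psv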

7. Programming

awk script to convert a text file into csv format

Hi, thanks for allowing me to start a discussion. I am collecting USB usage details of all users and converting them into csv files so that I can export them into some database. The input text file is as follows: USB History Dump by nabiy (c)2008 (1) --- Kingston DataTraveler 130 USB... (2 Replies)
Discussion started by: certteam

8. Shell Programming and Scripting

Flat file to csv conversion

Hi guys, can someone help me in converting the following? I have a flat text file which has several thousand lines which I need to convert to a csv. It's got a consistent format, but basically I want, every time it hits txt, to create a new line with the subsequent lines comma delimited. For example ... (6 Replies)
Discussion started by: p1_ben
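The requirement is only partly visible above, so the exact format is a guess: a hedged sketch that starts a new output record each time a line contains "txt" and joins the following lines onto it with commas (file names are placeholders):

# start a new record at every line containing "txt"; append the other lines with commas
awk '/txt/ { if (rec != "") print rec; rec = $0; next }
     { rec = rec "," $0 }
     END { if (rec != "") print rec }' input.txt > output.csv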

9. Shell Programming and Scripting

Convert case on specified position of flat file

Please help. I need a script which will do the following: search a fixed width file, go to position (25,2), which means the 25th and 26th positions, and find if there are any chars in lower case. For example, (25,2) can be (9T) or (9w) or (Ww) or (wW)... The two positions can be numeric or alpha... no... (13 Replies)
Discussion started by: ssantoshss
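For a fixed-width record this can be done positionally. A minimal sketch that upper-cases whatever sits in columns 25-26 (lower-case letters are converted, digits and upper-case letters pass through unchanged; the file name is a placeholder):

# rebuild each line with characters 25-26 forced to upper case
awk '{ print substr($0, 1, 24) toupper(substr($0, 25, 2)) substr($0, 27) }' input.txt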

10. Shell Programming and Scripting

Need help to convert Flat file to HTML

Hello, I need help to convert flat file data to HTML table format. I am generating a flat file every day and want to convert it into HTML table format. The format of my file is: version host Total YRS NO APPS PSD 10 Sun 30 2 4 6 7 and flat... (11 Replies)
Discussion started by: getdpg
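A minimal sketch, assuming the first line of the flat file holds the column headings and the fields are whitespace-separated (file names are placeholders and table styling is left out):

# wrap each line in a table row; the first line becomes the header row
awk 'BEGIN { print "<table border=\"1\">" }
{
    tag = (NR == 1 ? "th" : "td")
    printf "<tr>"
    for (i = 1; i <= NF; i++) printf "<%s>%s</%s>", tag, $i, tag
    print "</tr>"
}
END { print "</table>" }' input.txt > output.html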