Full Discussion: Converting txt file in csv
Shell Programming and Scripting: Post 302441269 by mkashif on Friday 30th of July 2010, 03:52:32 AM
Converting txt file in csv

Hi all,

I have a text file, memory.txt, which has the following values.


Code:
   Average:       822387   7346605     89.93    288845   4176593   2044589     51883      2.47      7600

I want to convert this file to CSV format, and I am using the following command to do it.


Code:
  sed s/_/\./g < memory.txt | awk -F. '{ print $1","$2","$3","$4}' > test.csv

but I am getting output in just 3 columns. I want every value to be printed in its own column.

I have also used this one, but the result is not accurate.

Code:
awk -F_ '{ print $1","$2","$3}' < memory.txt >  MemoryTemp.csv

Please tell me where I am going wrong. I got this code from Google.

Thanks in advance.

Last edited by zaxxon; 07-30-2010 at 05:35 AM. Reason: added 1 more pair of missing code tags
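
Since the sample line in memory.txt is separated by runs of spaces rather than underscores, the -F. split above only matches the decimal points in 89.93 and 2.47, which is why just three columns come out. One possible alternative, sketched here on the assumption that every field in memory.txt is whitespace-separated as in the sample:

Code:
  # rebuild each record with OFS="," so every whitespace-separated value gets its own column
  awk 'BEGIN { OFS = "," } { $1 = $1; print }' memory.txt > test.csv

Reassigning $1 to itself forces awk to rebuild the record with the comma output separator, so each value lands in its own CSV column; leading and trailing blanks are dropped in the process.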
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

AWK CSV to TXT format, TXT file not in a correct column format

Hi guys, I have created a script to read 1 column in a CSV file and then place it in a text file. However, when I checked the text file, it is not in a column format... Example: CSV file contains name,age aa,11 bb,22 cc,33 After using awk to get the first column, TXT file... (1 Reply)
Discussion started by: mdap
1 Replies

2. Shell Programming and Scripting

converting .txt to comma-delimited file

Dear all, I have a file with 5L records. One of the records in the file is as shown below: MARIA THOMAS BASIL 1000 FM 1111 MD GHANA YY 77354 4774 99999999 1234567 I need to convert this record into the below format: "","","","","MARIA","THOMAS","BASIL","","1000 FM 1111 MD","STE... (1 Reply)
Discussion started by: OSD
1 Replies

3. UNIX for Advanced & Expert Users

converting a .txt file to a comma-delimited file

Dear all, I have a file with 5L records. One of the records in the file is as shown below: MARIA THOMAS BASIL 1000 FM 1111 MD ... (1 Reply)
Discussion started by: OSD
1 Replies

4. Shell Programming and Scripting

converting xls file to txt file and xls to csv

I need to convert an Excel file into a text file and an Excel file into a CSV file. Any code to do that is appreciated, thanks. (6 Replies)
Discussion started by: bandar007
6 Replies
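
For the spreadsheet conversion asked about in the thread above, a hedged sketch, assuming LibreOffice is installed and using file.xls as a placeholder name:

Code:
  # headless conversion; writes file.csv into the current directory
  soffice --headless --convert-to csv file.xls
  # crude CSV-to-text pass; does not handle commas inside quoted fields
  tr ',' '\t' < file.csv > file.txt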

5. UNIX for Dummies Questions & Answers

Converting txt file to csv file

Hi, using rsync, I've sent the output to a text file. This is the text file: Please help me convert this text file to a CSV file, probably with a script or something similar. (3 Replies)
Discussion started by: anaigini45
3 Replies

6. Shell Programming and Scripting

txt file to CSV

Hi, I have a text file which looks like this: 2258 4569 1239 258 473. I need to convert it into comma-separated format, e.g. 2258,4569,1239,258,473. Please help. (8 Replies)
Discussion started by: born
8 Replies
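
For the thread above, a small sketch, assuming the values sit one per line in a hypothetical numbers.txt:

Code:
  # join all lines into a single comma-separated record
  paste -sd, numbers.txt

If the values are instead on one space-separated line, tr -s ' ' ',' < numbers.txt gives a similar result.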

7. Shell Programming and Scripting

need to save the space when converting to CSV file

Hi, I have a text file with the following format. Some of the fields are blank. 1234 3456 23 45464 327837283232 343434 5654353 34 34343 3434345 434242 .... .... .... I need to convert this file to a CSV file, like 1234, ,23, ... (3 Replies)
Discussion started by: wintersnow2011
3 Replies

8. Shell Programming and Scripting

Converting txt file to Excel file

Hi all, I have a text file with the below content. TradeDate Name SecurityMnc ReasonDesc ======================================= 20120501 Robin ABC FO System Defect 20120502 Robin ABC FO System Defect I would want this in an Excel file in 4 columns,... (3 Replies)
Discussion started by: robinbannis
3 Replies

9. Shell Programming and Scripting

Converting txt file into CSV using awk or sed

Hello folks, I have a txt file of information about journal articles from different fields. I need to convert this information into a format that is easier for computers to manipulate for some research that I'm doing on how articles are cited. The file has some header information and then details... (8 Replies)
Discussion started by: ksk
8 Replies

10. UNIX for Beginners Questions & Answers

Data extraction and converting into .csv file.

Hi all, I have a data file and need to extract and convert it into CSV format: 1) Read and extract the line containing a string ending with "----" (file sample_linebyline.txt) and make a .csv file from this. 2) Read the flat file flatfile_sample.txt, which consists of similar data (... (9 Replies)
Discussion started by: abhi_123
9 Replies