Converting data from text file to csv


 
# 1  
Old 08-24-2014
Converting data from text file to csv

Gents

Using the attached script (raw2csv), I create the .csv file. The input file is called 201.raw.

Kindly, can you check if there is an easier way to do it? The script works fine but takes a lot of time to process.

Thanks for your help
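
Since the attached raw2csv script and the layout of 201.raw are not shown here, the sketch below is only an illustration of the kind of single-pass awk conversion that usually runs much faster than a line-by-line shell loop. It assumes whitespace-separated fields that map one-to-one onto CSV columns; the real field handling would have to follow whatever raw2csv does.

Code:
# Minimal sketch, not the attached raw2csv script.
# Assumes 201.raw holds whitespace-separated fields with no embedded commas.
# awk rebuilds each record with OFS="," in one pass over the file, which is
# typically far quicker than a while-read loop calling external commands.
awk 'BEGIN { OFS = "," } { $1 = $1; print }' 201.raw > 201.csv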
 

10 More Discussions You Might Find Interesting

1. UNIX for Beginners Questions & Answers

Data extraction and converting into .csv file.

Hi All, I have a data file and need to extract and convert it into csv format: 1) Read and extract the line containing a string ending with "----" (file sample_linebyline.txt) and make a .csv file from this. 2) Read the flat file flatfile_sample.txt, which consists of similar data (... (9 Replies)
Discussion started by: abhi_123

2. Shell Programming and Scripting

Read csv file, convert the data and make one text file in UNIX shell scripting

I have input data that looks like this, which is part of a csv file: 7,1265,76548,"0102:04" 8,1266,76545,"0112:04" I need the output data to look like this, and the output will be part of a text file: 7|1265000 |7654899 |A| 8|12660000 |76545999 |B| The logic behind the... (6 Replies)
Discussion started by: RJG

3. Shell Programming and Scripting

Compare 2 csv files, match column data and create a new csv file from them

Hi, I am a newbie in shell scripting. I need your help to solve my problem. I have 2 csv files and I want to compare their contents; the output will be written to a new csv file. File1: SourceFile,DateTimeOriginal /home/intannf/foto/IMG_0713.JPG,2015:02:17 11:14:07... (8 Replies)
Discussion started by: refrain
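
For the comparison described above, a minimal sketch of the usual awk lookup approach, assuming both files are comma-separated and should be matched on their first column; the post is truncated, so the file names, column numbers and output layout are placeholders only.

Code:
# Sketch only: remember every line of file1.csv keyed on its first column,
# then print a combined row for each line of file2.csv whose first column
# matches. Assumes no embedded commas inside fields.
awk -F, 'NR == FNR { row[$1] = $0; next }
         ($1 in row) { print row[$1] "," $2 }' file1.csv file2.csv > matched.csv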

4. Shell Programming and Scripting

Format problem while converting text file to csv

Hi, I need help in the following scenario. I tried searching in Google but couldn't find the exact answer. Sorry if I am re-posting an already answered query. While trying to convert a log file into csv I couldn't get the format I am looking for. I converted file... (4 Replies)
Discussion started by: varmas424

5. Shell Programming and Scripting

Converting variable space width data into CSV data in bash

Hi All, I was wondering how I can convert each line in an input file, where fields are separated by variable-width spaces, into a CSV file. Below is the scenario I am looking at. My input data in inputfile.txt: 19 15657 15685 Sr2dReader 107.88 105.51... (4 Replies)
Discussion started by: vharsha
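
For the variable-width spacing described above, a minimal sketch, assuming the fields themselves contain no embedded spaces or commas and inputfile.txt is the file named in the post:

Code:
# Sketch only: strip leading whitespace, then turn each run of spaces or tabs
# into a single comma. This breaks if any field contains embedded spaces.
sed -E 's/^[[:space:]]+//; s/[[:space:]]+/,/g' inputfile.txt > output.csv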

6. Shell Programming and Scripting

Text file to CSV with field data separated by blank lines

Hello, I have some data in a text file where fields are separated by blank lines. There are only 6 fields; however, some fields have several lines of data, as I will explain. Also, the data in a particular field is not consistently the same size but does end on a blank line. The first field starts with... (6 Replies)
Discussion started by: vestport
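
For the blank-line-separated layout described above, a rough sketch under the stated assumptions: each blank-line-separated block is one field, every 6 blocks form one record, and no field contains a comma.

Code:
# Sketch only. RS="" makes awk treat each blank-line-separated block as one
# record; gsub joins the lines inside a block with spaces. Every 6 blocks are
# then written out as one CSV row (the count of 6 comes from the post).
awk 'BEGIN { RS = "" }
     { gsub(/\n/, " "); printf "%s%s", $0, (NR % 6 ? "," : "\n") }' input.txt > output.csv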

7. Shell Programming and Scripting

Data fetched from a text file and saved in a csv file

Hi, I have written a script which fetches the data from a text file and saves the output in a text file, but I want the output to be saved in different columns. I have output like: For Channel:response_time__24.txt 1547 data points 0.339 0.299 0.448 0.581 7.380 ... (1 Reply)
Discussion started by: rohitkalia

8. Shell Programming and Scripting

converting text to csv format

I am trying to check each line and, based on the first two digits, place the commas. I checked an earlier post where the text is converted to csv with a tab delimiter. Here is the test file that needs to be changed to csv: 11 051701 22 051701 330123405170105170112345... (13 Replies)
Discussion started by: gthokala

9. Shell Programming and Scripting

Converting a text file to a csv file

I have a text file that is the output of a Netbackup report. The file it generates is just a plain text file with only white space between fields. For example: Date Policy Type Kilobytes Retention 12/5/2005 WinNT Full 18329948 6 Months I... (4 Replies)
Discussion started by: primowalker
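
For the Netbackup report described above, a small sketch, assuming five logical columns where only the last one (Retention, e.g. "6 Months") can contain a space; report.txt is a placeholder name.

Code:
# Sketch only: print the first four whitespace-separated fields as CSV columns
# and join whatever remains on the line into the fifth (Retention) column.
awk 'NF >= 5 {
    ret = $5
    for (i = 6; i <= NF; i++) ret = ret " " $i
    print $1 "," $2 "," $3 "," $4 "," ret
}' report.txt > report.csv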

10. Shell Programming and Scripting

Exporting text file data to csv

Could anyone help me with a basic shell script to export text file data to csv? I need to export only particular data from the text file to a csv column. I am a newbie to UNIX; could anyone help me with sample script code? (3 Replies)
Discussion started by: l_jayakumar