Top Forums > Shell Programming and Scripting
Performance issue in UNIX while generating .dat file from large text file
Post 302312292 by durden_tyler on Thursday, 30 April 2009, 09:59:37 PM
Try using perl. It was designed for fast text processing.
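
For what it's worth, here is a minimal sketch of the streaming Perl approach meant above. The file names, the pipe delimiter and the choice of output fields are placeholders, since the actual record layout discussed in this thread is not shown in this post:

Code:
#!/usr/bin/perl
# Minimal sketch only: input/output names, the pipe delimiter and the field
# selection below are placeholders for illustration.
use strict;
use warnings;

my ($in_file, $out_file) = ('input.txt', 'output.dat');   # hypothetical names

open my $in,  '<', $in_file  or die "Cannot open $in_file: $!";
open my $out, '>', $out_file or die "Cannot open $out_file: $!";

while (my $line = <$in>) {           # stream one line at a time; no large arrays
    chomp $line;
    my @fields = split /\|/, $line;  # assumed pipe-delimited input
    next unless @fields >= 3;        # skip blank or short lines
    print {$out} join('|', @fields[0, 2]), "\n";   # keep only the fields needed
}

close $in;
close $out;

Reading line by line keeps memory use flat no matter how large the input file gets, which is usually where shell loops with repeated cut/awk calls per line fall down.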

tyler_durden
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Unix File System performance with large directories

Hi, how does the Unix File System perform with large directories (containing ~30,000 files)? What kind of structure is used for the organization of a directory's contents: linear lists, (binary) trees? I hope the description 'Unix File System' is exact enough; I don't know more about the file... (3 Replies)
Discussion started by: dive
3 Replies

2. Shell Programming and Scripting

How to attach an Excel file / .dat file through Unix mail

Hi. I want to attach a .xls or .dat file when sending mail through Unix. I have come across different options for sending attachments, but all of them embed the content in the body of the mail. I want the attachment to be sent as an actual attachment. Please help me out. Regards, Diwakar (1 Reply)
Discussion started by: diwakar82
1 Replies
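
A common way to send a file as a real attachment rather than inline text is the CPAN module MIME::Lite. A rough sketch, assuming the module is installed; the addresses, subject and file path are placeholders:

Code:
#!/usr/bin/perl
# Sketch only: addresses, subject and file path are placeholders, and it
# assumes the non-core CPAN module MIME::Lite is installed.
use strict;
use warnings;
use MIME::Lite;

my $msg = MIME::Lite->new(
    From    => 'sender@example.com',
    To      => 'recipient@example.com',
    Subject => 'Daily report',
    Type    => 'multipart/mixed',
);

$msg->attach(Type => 'TEXT', Data => "Please find the report attached.\n");

$msg->attach(
    Type        => 'application/vnd.ms-excel',   # use 'text/plain' for a .dat file
    Path        => '/path/to/report.xls',
    Filename    => 'report.xls',
    Disposition => 'attachment',
);

$msg->send;   # uses sendmail by default; $msg->send('smtp', 'mailhost') also works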

3. Shell Programming and Scripting

How to read from a .dat file in Unix

Hi All, I have a .dat file named test.dat where I have stored some process IDs. Now I need to pick the process IDs one by one and fire kill -9 for each of them. The logic should be: 1. open file <filename.dat> 2. read until the last line of the file 3. if a process ID is found, fire kill -9... (5 Replies)
Discussion started by: Sibasish
5 Replies
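
A small Perl sketch of that logic, assuming test.dat holds one numeric process ID per line (the exact file layout is not shown in the truncated post):

Code:
#!/usr/bin/perl
# Sketch only: assumes test.dat contains one numeric process ID per line.
use strict;
use warnings;

open my $fh, '<', 'test.dat' or die "Cannot open test.dat: $!";
while (my $pid = <$fh>) {
    chomp $pid;
    next unless $pid =~ /^\d+$/;          # skip blank or malformed lines
    kill 'KILL', $pid or warn "Could not signal $pid: $!";   # equivalent of kill -9
}
close $fh;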

4. UNIX for Dummies Questions & Answers

How do I delete a data string from a .dat file in unix

I have a .dat file in Unix and it keeps failing file validation on line x. How do I delete a data string from a .dat file in UNIX? I tried the following: sed -e 'data string' -e file name, and it tells me the command is unrecognized. (4 Replies)
Discussion started by: supergirl3954
4 Replies
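
For reference, deleting every line that matches a pattern with sed would be sed '/data string/d' file.dat. The same thing as a tiny in-place Perl script, with 'data string' standing in for the real offending text:

Code:
#!/usr/bin/perl -ni.bak
# Sketch only: run as  ./strip_line.pl yourfile.dat
# -n loops over every input line, -i.bak edits the file in place and keeps a
# .bak backup.  'data string' is the placeholder pattern from the post above.
print unless /data string/;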

5. Shell Programming and Scripting

Severe performance issue while 'grep'ing on large volume of data

Background: The Unix flavor can be any among Solaris, AIX, HP-UX and Linux. I have the 2 flat files below. File-1 contains 50,000 rows with 2 fields in each row, separated by pipe. The row structure is Object_Id|Object_Name, as follows: 111|XXX 222|YYY 333|ZZZ ... (6 Replies)
Discussion started by: Souvik
6 Replies
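
One standard cure for this kind of per-key grep is to load the 50,000 Object_Id|Object_Name pairs into a hash and stream the large file just once. A sketch with placeholder file names, assuming Object_Id is also the first pipe-separated field of the large file (the post is truncated before that detail):

Code:
#!/usr/bin/perl
# Sketch only: file names and the layout of the second (large) file are
# assumptions, since the original post is truncated here.
use strict;
use warnings;

my %name_for;
open my $keys, '<', 'file1.txt' or die "Cannot open file1.txt: $!";
while (<$keys>) {
    chomp;
    my ($id, $name) = split /\|/;
    $name_for{$id} = $name if defined $id;
}
close $keys;

open my $data, '<', 'file2.txt' or die "Cannot open file2.txt: $!";
while (<$data>) {
    chomp;
    my ($id) = split /\|/;                 # assumed: Object_Id is the first field
    print "$_|$name_for{$id}\n" if exists $name_for{$id};
}
close $data;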

6. Shell Programming and Scripting

Remove <CR><LF> from the dat file in unix

Hi, The source system has created the file in dat format and put it into the Linux directory as mentioned below. I want to do the following things: a) Delete the line starting with <CR><LF> in the record b) Also the line ...........................................................<CR><LF> ... (1 Reply)
Discussion started by: mr_harish80
1 Replies
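
Stripping the DOS-style <CR> characters can be done with tr -d '\r', or with a short in-place Perl script like the sketch below. This only covers removing the carriage returns; the truncated part of the post may be asking for more than that:

Code:
#!/usr/bin/perl -pi.bak
# Sketch only: strips DOS-style carriage returns in place, keeping a .bak copy.
# Run as:  ./strip_cr.pl file.dat
s/\r//g;   # drop every <CR>; the <LF> stays as the normal Unix line ending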

7. Shell Programming and Scripting

Performance issue in Grepping large files

I have around 300 files (*.rdf, *.fmb, *.pll, *.ctl, *.sh, *.sql, *.prog) which are of large size. Around 8000 keywords (which will be in the file $keywordfile) need to be searched for inside those files. If a keyword is found in a file, I have to insert the filename, extension, category, keyword, occurrence... (8 Replies)
Discussion started by: millan
8 Replies
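
A sketch of one way to keep this to a single pass per file: read the keywords, build one combined regex, and count matches as each file is scanned. It assumes one keyword per line in $keywordfile, and it leaves out the 'category' column because its source is not shown in the post:

Code:
#!/usr/bin/perl
# Sketch only: run as  ./scan_keywords.pl keywords.txt *.sql *.sh ...
# The report format is simplified compared with the original post.
use strict;
use warnings;

my ($keywordfile, @files) = @ARGV;

open my $kf, '<', $keywordfile or die "Cannot open $keywordfile: $!";
chomp(my @keywords = <$kf>);
close $kf;

# quotemeta() protects characters like '.' or '$' inside the keywords;
# recent perls compile a literal alternation like this into a trie.
my $pattern = join '|', map { quotemeta } grep { length } @keywords;
my $re = qr/($pattern)/;

for my $file (@files) {
    my %count;
    open my $fh, '<', $file or do { warn "Skipping $file: $!"; next };
    while (<$fh>) {
        $count{$1}++ while /$re/g;
    }
    close $fh;
    my ($ext) = $file =~ /\.([^.]+)$/;
    print join(',', $file, $ext // '', $_, $count{$_}), "\n" for sort keys %count;
}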

8. UNIX for Dummies Questions & Answers

Generating a CSV file from a text file

Hi Guys, I have a simple request. I have a file in W3C format. Each file has 2 header lines; the rest of the lines have 16 columns each, separated by tabs. I need to discard the first 2 lines and then write each column of the txt file into a separate column of a CSV. I tried the command below... (1 Reply)
Discussion started by: tinkugadu
1 Replies
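
A sketch of that conversion in Perl, with placeholder file names. It drops the two W3C header lines and turns each tab-separated record into a comma-separated one, assuming the fields themselves contain no commas or quotes (Text::CSV from CPAN is the safer choice otherwise):

Code:
#!/usr/bin/perl
# Sketch only: input/output names are placeholders.
use strict;
use warnings;

open my $in,  '<', 'input.log'  or die "Cannot open input.log: $!";
open my $out, '>', 'output.csv' or die "Cannot open output.csv: $!";

while (<$in>) {
    next if $. <= 2;                 # $. is the line number: drop the 2 headers
    chomp;
    my @cols = split /\t/, $_, -1;   # -1 keeps trailing empty columns
    print {$out} join(',', @cols), "\n";
}

close $in;
close $out;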

9. Answers to Frequently Asked Questions

How to split a dat file based on another file in UNIX?

I have two files: one is var.txt and the other is the res.dat file. var.txt contains information like below: date,request,sales,item 20171015,1,123456,216 20171015,1,123456,217 20171015,2,345678,214 20171015,3,456789,218 and res.dat is one huge file containing information like... (1 Reply)
Discussion started by: pogo
1 Replies
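
The usual pattern is to load the keys from var.txt into a hash and then route the lines of res.dat in a single pass. Because the post truncates the layout of res.dat, the choice of join column below (the sales value from var.txt matched against the first field of res.dat) is purely an assumption for illustration:

Code:
#!/usr/bin/perl
# Sketch only: the join columns and output file names are assumptions.
use strict;
use warnings;

my %wanted;
open my $var, '<', 'var.txt' or die "Cannot open var.txt: $!";
<$var>;                                   # skip the date,request,sales,item header
while (<$var>) {
    chomp;
    my @f = split /,/;
    $wanted{ $f[2] } = 1 if @f >= 3;      # assumed key: the sales column
}
close $var;

open my $res,  '<', 'res.dat'       or die "Cannot open res.dat: $!";
open my $hit,  '>', 'res_match.dat' or die "Cannot open res_match.dat: $!";
open my $miss, '>', 'res_rest.dat'  or die "Cannot open res_rest.dat: $!";
while (my $line = <$res>) {
    my ($key) = (split /,/, $line)[0];    # assumed: key is the first field of res.dat
    print { exists $wanted{$key} ? $hit : $miss } $line;
}
close $_ for $res, $hit, $miss;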

10. Shell Programming and Scripting

Generating xml file from UNIX

I have a Unix script which generates a csv file. The data in the csv file is dynamic. How can I convert/move the data from the csv file to xml? Please suggest. (1 Reply)
Discussion started by: archana25
1 Replies
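
A rough sketch of a CSV-to-XML conversion in Perl. The file names are placeholders, the element names are taken from the CSV header row, and it assumes a simple CSV with no embedded commas, quotes or characters needing XML escaping (Text::CSV from CPAN handles the messier cases):

Code:
#!/usr/bin/perl
# Sketch only: file and element names are derived from placeholder inputs,
# and XML escaping of &, < and > is omitted for brevity.
use strict;
use warnings;

open my $in,  '<', 'report.csv' or die "Cannot open report.csv: $!";
open my $out, '>', 'report.xml' or die "Cannot open report.xml: $!";

chomp(my $header = <$in>);
my @tags = split /,/, $header;

print {$out} qq{<?xml version="1.0"?>\n<rows>\n};
while (<$in>) {
    chomp;
    my @vals = split /,/, $_, -1;
    print {$out} "  <row>\n";
    for my $i (0 .. $#tags) {
        my $v = $vals[$i] // '';
        print {$out} "    <$tags[$i]>$v</$tags[$i]>\n";
    }
    print {$out} "  </row>\n";
}
print {$out} "</rows>\n";

close $in;
close $out;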