Shell Programming and Scripting
Post 302472107 by chriss_58 on Tuesday 16th of November 2010 09:51:02 AM
Concatenation of a large number of files

Hello, I have a large number of files that I want to concatenate into one. These files start with the prefix 'VOICE_', for example:
VOICE_0000000000
VOICE_1223o23u0
VOICE_934934927349

I use the following code:
Code:
 cat /ODS/prepaid/CDR_FLOW/MEDIATION/VOICE_* >> /ODS/prepaid/CDR_FLOW/WORK/VOICE

And I get an error that the argument list is too long!

How can I overcome this?

Best Regards,
Christos

Last edited by vbe; 11-16-2010 at 10:56 AM. Reason: code tags please
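
A common workaround (a sketch only, not from the original post) is to let find build the cat argument lists in batches with -exec ... +, so the shell never has to expand every VOICE_* name onto a single command line. Note that, unlike the glob, find also descends into subdirectories, and the files are not guaranteed to be concatenated in alphabetical order:

Code:
 # batch the VOICE_* files into as many cat invocations as needed;
 # the output file lives in WORK, so find will not pick it up again
 find /ODS/prepaid/CDR_FLOW/MEDIATION -name 'VOICE_*' -type f \
     -exec cat {} + >> /ODS/prepaid/CDR_FLOW/WORK/VOICE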
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

moving large number of files

I have a task to move more than 35000 files every two hours, from the same directory to another directory, based on a file that has the list of filenames. I tried the following logics: (1) find . -name \*.dat > list; for i in `cat list`; do mv $i test/; done (2) cat list|xargs -i mv "{}"... (7 Replies)
Discussion started by: bryan
7 Replies
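
A portable way to drive mv from such a list file, avoiding both the backtick word-splitting and any argument-length limit, is a read loop. This is only a sketch and assumes one plain filename per line in list:

Code:
 # move each file named in 'list' into test/, one per line, spaces preserved
 while IFS= read -r f; do
     mv -- "$f" test/
 done < list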

2. Shell Programming and Scripting

Script to Compare a large number of files.

I have a large filesystem on an AIX server and another one on a Red Hat box. I have synced the two filesystems using rsync. What I'm looking for is a script that would compare the two filesystems to make sure the bits match up and the number of files match up. It's around 2.8 million... (5 Replies)
Discussion started by: zippdawg2001
5 Replies
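
Since the two filesystems were synced with rsync, rsync itself can verify them: a checksum dry run lists any file whose contents differ without copying anything. A sketch only; the host and paths are placeholders:

Code:
 # -r recurse, -c compare by checksum, -n dry run, -v list differing files
 rsync -rcnv /local/filesystem/ user@redhat-box:/remote/filesystem/
 # quick sanity check of the file counts, run on each box separately
 find /local/filesystem -type f | wc -l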

3. Shell Programming and Scripting

Creating large number of files of specific size

Hi, I am new to shell scripting. I want to create a batch file which creates a desired number of files with a specific size, say 1MB each, to consume space. How can I go about it using a for loop or any other loop construct in a shell script? Thanks (3 Replies)
Discussion started by: swatideswal
3 Replies
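
A minimal sketch of such a loop using dd; the count of 100 files and the file name pattern are only examples:

Code:
 # create 100 files of 1 MB each, filled with zero bytes
 i=1
 while [ "$i" -le 100 ]; do
     dd if=/dev/zero of="file_$i.dat" bs=1024 count=1024 2>/dev/null
     i=$((i + 1))
 done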

4. Shell Programming and Scripting

Need help combining large number of text files

Hi, I have more than 1000 data files (.txt) like this. First file: 178.83 554.545 179.21 80.392 Second file: 178.83 990.909 179.21 90.196 etc. I want to combine them into the following format: 178.83,554.545,990.909,... 179.21,80.392,90.196,... (7 Replies)
Discussion started by: mr_monocyte
7 Replies
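
One way to build that comma-joined output is a single awk pass over all the files. This sketch assumes every file holds "key value" lines and that all files share the same keys; the output key order is not guaranteed:

Code:
 # append each file's second column to the row keyed by the first column
 awk '{ row[$1] = row[$1] "," $2 } END { for (k in row) print k row[k] }' *.txt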

5. UNIX for Dummies Questions & Answers

Question regarding tar of a large number of files

I want to tar a large number of files, about 150k. I am using the find command as below to create a file with all the file names, and then trying to use the tar -I command as below. # find . -type f -name "gpi*" > include-file # tar -I include-file -cvf newfile.tar This I got from one of the posts... (2 Replies)
Discussion started by: crux123
2 Replies
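
With GNU tar the list is read with -T (--files-from), which keeps the 150k names off the command line entirely (-I plays a similar role only on some other tar implementations). A sketch:

Code:
 find . -type f -name 'gpi*' > include-file
 # GNU tar: read the member list from include-file instead of the command line
 tar -cvf newfile.tar -T include-file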

6. UNIX for Dummies Questions & Answers

Delete large number of files

Hi. I need to delete a large number of files listed in a txt file. There are over 90000 files in the list. Some of the directory names and some of the file names do have spaces in them. In the file, each line is a full path to a file: /path/to/the files/file1 /path/to/some other/files/file 2... (4 Replies)
Discussion started by: inakajin
4 Replies
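
Because the paths contain spaces, the list should be read line by line rather than word-split. A minimal sketch; filelist.txt is a placeholder name:

Code:
 # IFS= and -r keep spaces and backslashes intact; -- guards against
 # names that start with a dash
 while IFS= read -r path; do
     rm -- "$path"
 done < filelist.txt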

7. Shell Programming and Scripting

Using find in a directory containing large number of files

Hi All, I have searched this forum for related posts but could not find one that fits mine. I have a shell script which removes all the XML tags including the text inside the tags from some 4 million XML files. The shell script looks like this (MODIFIED): find . "*.xml" -print | while read... (6 Replies)
Discussion started by: shoaibjameel123
6 Replies
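
For that many files the key points are a proper -name test and letting find batch the arguments with -exec ... +. The sed expression below only strips the tags themselves and is an illustration, not the poster's original script; sed -i is a GNU extension:

Code:
 # batch the XML files into as few sed invocations as possible
 find . -type f -name '*.xml' -exec sed -i 's/<[^>]*>//g' {} +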

8. UNIX for Dummies Questions & Answers

Rename a large number of files in subdirectories

Hi, I have a large number of subdirectories (>200), and in each of these directories there is a file with a name like "opp1234.dat". I'd like to know how I could change the names of these files to say "out.dat" in all these subdirectories in one go. Thanks! (5 Replies)
Discussion started by: lost.identity
5 Replies
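
A sketch of a one-shot rename loop, assuming each subdirectory holds a single opp*.dat file:

Code:
 # rename every opp*.dat found one level down to out.dat in its own directory
 for f in */opp*.dat; do
     [ -e "$f" ] || continue
     mv "$f" "$(dirname "$f")/out.dat"
 done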

9. Shell Programming and Scripting

Sftp large number of files

Want to sftp a large number of files ... approx 150 files will come to the server every minute (AIX box). Also need to make sure each file has been sftped successfully... Please let me know: 1. What is the best/fastest way to transfer the files? 2. Should I use batch option -b so that connectivity will be... (3 Replies)
Discussion started by: vegasluxor
3 Replies
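
Batch mode (-b) does make the transfer non-interactive. A minimal sketch, assuming key-based authentication so no password prompt appears; host, paths, and file pattern are placeholders:

Code:
 # write the batch commands to a file, then run sftp non-interactively;
 # sftp -b aborts with a nonzero status if a put in the batch fails
 printf '%s\n' 'cd /remote/incoming' 'put /local/outgoing/*.dat' 'bye' > /tmp/sftp.batch
 sftp -b /tmp/sftp.batch user@remotehost && echo "transfer OK"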

10. Shell Programming and Scripting

Removing large number of temp files

Hi All, I have a situation where I need to delete a huge number of temp files created during run time, approx. 16700+ files. We never imagined that we would get such a big list of files during run time. It worked fine for a smaller number of files in the list, but when the list is huge we are... (7 Replies)
Discussion started by: mad man
7 Replies
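
If the failure is the usual "argument list too long" from expanding all 16700+ names at once, handing the deletion to find avoids building one giant command line. A sketch; the directory and name pattern are placeholders:

Code:
 # -exec ... + batches the names so no single rm command line gets too long
 find /path/to/tempdir -type f -name 'tmp_*' -exec rm -f {} +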
flow-cat(1)						      General Commands Manual						       flow-cat(1)

NAME
     flow-cat -- Concatenate flow files

SYNOPSIS
     flow-cat [-aghmp] [-b big|little] [-C comment] [-d debug_level]
              [-o filename] [-t start_time] [-T end_time] [-z z_level]
              [file|directory ...]

DESCRIPTION
     The flow-cat utility processes files and/or directories of files in the
     flow-tools format. The resulting concatenated data set is written to the
     standard output or to the file specified by -o. If file is a single dash
     (`-') or absent, flow-cat will read from the standard input.

OPTIONS
     -a              Do not ignore filenames that begin with tmp.

     -b big|little   Byte order of output.

     -C comment      Add a comment.

     -d debug_level  Enable debugging.

     -g              Sort the file list by capture start time before
                     processing.

     -h              Display help.

     -m              Disable the use of mmap().

     -p              Preload headers. Use to preserve meta information such
                     as lost flows.

     -o file         Write to file instead of the standard output.

     -t start_time   Select flow files up to start_time. If used with -T,
                     select files between start_time and end_time.

     -T end_time     Select flow files after end_time. If used with -t,
                     select files between start_time and end_time.

     -z z_level      Configure compression level to z_level. 0 is disabled
                     (no compression), 9 is highest compression.

     file|directory ...
                     Process the files and/or directories.

TIME/DATE PARSING
     start_time and end_time parsing is implemented with getdate.y, a
     commonly used function to process free-form time/date specifications.
     Example usage borrowed from cvs:

         1 month ago
         2 hours ago
         400000 seconds ago
         last year
         last Monday
         yesterday
         a fortnight ago
         3/31/92 10:00:07 PST
         January 23, 1987 10:05pm
         22:00 GMT

EXAMPLES
     Concatenate all flow files beginning with ft-v05.2001-05-01 and use
     flow-print to display the results:

         flow-cat ft-v05.2001-05-01.* | flow-print

     Concatenate the flow files in /flows/krc4 and store the output in
     compressed.flows at compression level 9 (best). The headers are
     preloaded so metadata such as the flow count is correct in the result.
     Filenames beginning with tmp, which are typically in-progress flow
     files from flow-capture, are not processed:

         flow-cat -p -z9 /flows/krc4 > compressed.flows

BUGS
     None known.

AUTHOR
     Mark Fullmer
     maf@splintered.net

SEE ALSO
     flow-tools(1)