Top Forums > Shell Programming and Scripting > awk does not work well with huge data?
Post 302912778 by ariesto on Monday 11th of August 2014, 10:46:15 PM
Thank you very much, it works well.
 

10 More Discussions You Might Find Interesting

1. UNIX for Advanced & Expert Users

A variable and the sum of its values in huge data.

Hi Experts, I have a question. In the following output of `ps -elf | grep DataFlow` I get: 242001 A mqsiadm 2076676 1691742 0 60 20 26ad4f400 130164 * May 09 - 3:02 DataFlowEngine EAIDVBR1_BROKER 5e453de8-2001-0000-0080-fd142b9ce8cb VIPS_INQ1 0 242001 A mqsiadm... (5 Replies)
Discussion started by: varungupta
5 Replies
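For the thread above, a minimal awk sketch of this kind of task, assuming the goal is to sum one numeric column of the ps -elf output across all DataFlowEngine processes (field 10 is a guess based on the sample line and may need adjusting for your ps layout):

    ps -elf | awk '/DataFlowEngine/ && !/awk/ { sum += $10 } END { print "total:", sum }'

The !/awk/ guard keeps the awk process itself out of the sum.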

2. Shell Programming and Scripting

Split huge data into a few different files?!

Input file data contents: >seq_1 MSNQSPPQSQRPGHSHSHSHSHAGLASSTSSHSNPSANASYNLNGPRTGGDQRYRASVDA >seq_2 AGAAGRGWGRDVTAAASPNPRNGGGRPASDLLSVGNAGGQASFASPETIDRWFEDLQHYE >seq_3 ATLEEMAAASLDANFKEELSAIEQWFRVLSEAERTAALYSLLQSSTQVQMRFFVTVLQQM ARADPITALLSPANPGQASMEAQMDAKLAAMGLKSPASPAVRQYARQSLSGDTYLSPHSA... (7 Replies)
Discussion started by: patrick87
7 Replies
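A common pattern for splitting FASTA-style input like the sample above is to open a new output file at every ">" header line. A sketch, assuming the file starts with a header and the sequence names are safe to use as file names:

    awk '/^>/ { if (out) close(out); out = substr($1, 2) ".fa" } { print > out }' input.fa

Calling close() as each sequence finishes keeps the script from running out of open file descriptors on large inputs.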

3. UNIX for Dummies Questions & Answers

Copy huge data into vi editor

Hi All, HP-UX dev4 B.11.11 U 9000/800 3251073457. I need to copy a large amount of text from a Windows text file into the vi editor. When I tried copying it, the formatting was not preserved and the text appeared scattered throughout vi, something like given below. Please let me know how I can correct this. ... (18 Replies)
Discussion started by: alok.behria
18 Replies
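The scattering described above is usually vi re-indenting the pasted lines via its autoindent/wrapmargin options, sometimes combined with DOS carriage returns in the Windows file. A minimal workaround sketch (file names are illustrative):

    :set noautoindent wrapmargin=0

entered in vi before pasting, and afterwards stripping any carriage returns:

    tr -d '\r' < pasted.txt > clean.txt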

4. Red Hat

Disk is Full but really does not contain huge data

Hi All, My disk usage shows 100%. When I check `df -kh` it shows my root partition is full, but running `du -skh /` shows only 7 GB used. Filesystem Size Used Avail Use% Mounted on /dev/sda1 30G 28G 260MB 100% / How can I identify what is using the remaining 20 GB of disk space? OS: CentOS... (10 Replies)
Discussion started by: kalpeer
10 Replies
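A frequent cause of "df says full, du says not" is files that were deleted but are still held open by a running process, so their blocks are never freed. A quick check, assuming lsof is installed:

    # list open files whose link count is zero, i.e. deleted but still open
    lsof +L1
    # compare per-directory usage on this filesystem only
    du -xsk /* 2>/dev/null | sort -n

Restarting or signalling the process that holds the deleted file releases the space.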

5. Shell Programming and Scripting

Work with huge Zipped files

Hello dear members, I have one general and one specific question, and I will be very grateful if you could help me with them. Let's start with my general question: 1. I am working on a cluster computer shared with other people, and I need to manipulate a big zipped text file of 13 GB. There is... (1 Reply)
Discussion started by: Homa
1 Replies
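For a 13 GB zipped file like the one above, the usual approach is to stream the decompressed data through a pipe instead of writing an uncompressed copy to disk. A minimal sketch (the pattern and file name are placeholders):

    gzip -dc bigfile.txt.gz | awk '/pattern/ { n++ } END { print n, "matching lines" }'

gzip -dc (equivalently zcat) decompresses to stdout, so only the pipeline buffer lives in memory.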

6. Shell Programming and Scripting

awk - fetch multiple data from huge dump

Hello Experts, I have a requirement wherein I need to fetch multiple patterns from a huge dump: egrep -f Pattern.txt Dump.txt My pattern file has about 300 entries and the dump file is about 8 GB. It is taking an eternity to complete on my machine. Is there a faster way to search patterns, like using... (5 Replies)
Discussion started by: navkanwal
5 Replies
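If the 300 patterns in the thread above are fixed strings rather than regular expressions, grep -F is typically far faster than egrep, and forcing the C locale often helps as well. A sketch using the thread's own file names:

    LC_ALL=C grep -F -f Pattern.txt Dump.txt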

7. Shell Programming and Scripting

Aggregation of huge data

Hi Friends, I have a file with sample amount data as follows: -89990.3456 8788798.990000128 55109787.20 -12455558989.90876 I need to exclude the '-' symbol in order to treat all values as absolute, and then I need to sum them up. The record count is around 1 million. How... (8 Replies)
Discussion started by: Ravichander
8 Replies
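A minimal awk sketch for the absolute-value sum, assuming one amount per line as in the sample:

    awk '{ x = $1; if (x < 0) x = -x; sum += x } END { printf "%.6f\n", sum }' amounts.txt

One caveat: awk's double-precision arithmetic may lose digits on values as wide as the samples above; if exact totals matter, a fixed-point tool such as bc would be needed.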

8. Solaris

The fastest way to copy huge data

Dear Experts, I would like to know the best method for copying around 3 million files (spread across a hundred folders, each file around 1 KB) between 2 servers. I have already tried the rsync and tar commands, but they take too long. Please advise. Thanks, Edy (11 Replies)
Discussion started by: edydsuranta
11 Replies
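With millions of tiny files, per-file overhead dominates, so batching everything into a single stream usually beats rsync. A common, portable sketch (host and paths are placeholders):

    ( cd /source/dir && tar cf - . ) | ssh user@desthost '( cd /target/dir && tar xf - )'

Adding compression to the pipeline can help on slow links but tends to hurt on fast ones.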

9. Shell Programming and Scripting

Parse XML with Huge Data

Hi Guys, I have a big XML file in the format below. Input: <pokl>MKL=1,FN=1,GBNo=B10C</pokl> <d>192</d> <d>315</d> <d>35</d> <d>0,7,8</d> <pokl>MKL=1,dFN=1,GBNo=B11C</pokl> <d>162</d> <d>315</d> <d>35</d> <d>0,5,6</d> <pokl>MKL=1,dFN=1,GBNo=B12C</pokl> <d>188</d> (4 Replies)
Discussion started by: pareshkp
4 Replies
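Since the sample above is line-oriented rather than truly nested XML, awk can flatten each <pokl> record and its following <d> values onto one line. A sketch, assuming one tag per line as shown:

    awk -F'[<>]' '
        /<pokl>/ { if (rec) print rec; rec = $3 }
        /<d>/    { rec = rec "|" $3 }
        END      { if (rec) print rec }
    ' input.xml

With the sample input this prints, e.g., MKL=1,FN=1,GBNo=B10C|192|315|35|0,7,8.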

10. UNIX for Advanced & Expert Users

Need an optimized shell/awk script to aggregate (sum) all the columns of a huge data file

I need an optimized shell/awk script to aggregate (sum) all columns of a huge data file. The file delimiter is "|". I need the sum of every column, labelled by column number, e.g. Column 1 : total, Column 2 : total, ... The file has no header. ...... (2 Replies)
Discussion started by: kartikirans
2 Replies
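A per-column summation sketch for the pipe-delimited file described above (the file name is a placeholder):

    awk -F'|' '
        { for (i = 1; i <= NF; i++) sum[i] += $i; if (NF > ncols) ncols = NF }
        END { for (i = 1; i <= ncols; i++) printf "Column %d : %s\n", i, sum[i] }
    ' datafile.txt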
fiocompress(1M)                 System Administration Commands                 fiocompress(1M)

NAME
       fiocompress - file compression utility

SYNOPSIS
       /sbin/fiocompress -c [-m] [-b block_size] input_file output_file
       /sbin/fiocompress -d input_file output_file

DESCRIPTION
       The fiocompress utility is a file compression tool that works together with the
       dcfs(7FS) file system to perform per-file compression. You can use fiocompress to
       decompress a compressed file or mark a compressed file as compressed, causing
       automatic decompression on read. The primary use of fiocompress is to compress
       files in the boot archive.

       Note that this utility is not a Committed interface. See attributes(5).

OPTIONS
       The following options are supported:

       -b block_size    Specify a block size for compression. The default block size
                        is 8192.

       -c               Compress the specified file.

       -d               Decompress the specified file.

       -m               Mark the compressed file for automatic decompression on read.
                        Can be used only in conjunction with -c.

EXIT STATUS
       0     The command completed successfully.

       -1    The command exited due to an error.

ATTRIBUTES
       See attributes(5) for descriptions of the following attributes:

       +-----------------------------+-----------------------------+
       |       ATTRIBUTE TYPE        |       ATTRIBUTE VALUE       |
       +-----------------------------+-----------------------------+
       | Availability                | SUNWcsr                     |
       +-----------------------------+-----------------------------+
       | Interface Stability         | Private                     |
       +-----------------------------+-----------------------------+

SEE ALSO
       boot(1M), bootadm(1M), dcfs(7FS), ufs(7FS), attributes(5)

NOTES
       This compression/decompression utility works only with files stored in a UFS file
       system. There is no obvious way to determine whether a given file is compressed,
       other than copying the file and comparing the number of disk blocks of the copy
       against the original.

SunOS 5.11                          10 Dec 2008                            fiocompress(1M)
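Based on the SYNOPSIS above, a typical invocation might look like the following (the file paths are purely illustrative):

    # compress a file with the default 8192-byte block size and mark it
    # for automatic decompression on read
    /sbin/fiocompress -c -m /tmp/file.orig /tmp/file.cmp

    # decompress it again
    /sbin/fiocompress -d /tmp/file.cmp /tmp/file.plain

Per the NOTES section, this only has an effect on files stored in a UFS file system.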