Sorry Rudic. I could already see that the first line is converting the timestamp to epoch seconds. There is something that needs to be modified in the body of the script.
I tried it, but I was not successful, and I am very sorry for that as I am very new to AWK.
Could you please help me with this, if you don't mind!
Last edited by Raghuram717; 07-03-2019 at 04:07 AM..
Hi,
I need to split a large file into small files based on a string.
At different places in the large file I have the string ^Job.
I need to split the file into different files, each starting from a ^Job up to the last character before the next ^Job.
Also, all the small files should be automatically named... (4 Replies)
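One minimal awk sketch for this, assuming "^Job" means a line beginning with the literal text "Job", and using the placeholder names "big_file" and "job_N" for the input and outputs:

```shell
#!/bin/sh
# Split big_file into job_1, job_2, ... cutting at every line that
# starts with "Job". Each output runs from one marker line up to
# (but not including) the next marker line.
awk '/^Job/ { close(out); out = "job_" ++n }
     n      { print > out }' big_file
```

Anything before the first ^Job line is discarded here; drop the `n` guard and give `out` a starting value if that preamble should be kept.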
Hi
I want to split a file that has 'n' number of records into 16 small files.
Can some one suggest me how to do this using Unix script?
Thanks
rrkk (10 Replies)
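With GNU coreutils this is a one-liner; `split -n l/16` divides by line count into 16 pieces without breaking any line. "records.txt" and the "part_" prefix are placeholder names:

```shell
#!/bin/sh
# Split records.txt into 16 roughly equal files: part_aa, part_ab, ...
# -n l/16 (GNU split) balances by lines; no record is cut in half.
split -n l/16 records.txt part_
```

On systems without `-n`, the portable equivalent is to compute the chunk size first: `split -l $(( ($(wc -l < records.txt) + 15) / 16 )) records.txt part_`.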
I have one large file; after every 200 lines I have to split the file and add a header and footer to each small file.
Is it possible to add a different header and footer to each file? (7 Replies)
Dear All,
Could you please help me split a file containing around 240,000,000 lines into 4 files, all roughly equal in size, noting that we need to maintain that each file starts with the start flag (MSISDN) and ends with the end flag (End); also the number of the lines between the... (10 Replies)
I have a large zone file dump that consists of
; DNS record for the adomain.com domain
data1
data2
data3
data4
data5
CRLF
CRLF
CRLF
; DNS record for the anotherdomain.com domain
data1
data2
data3
data4
data5
data6
CRLF (7 Replies)
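For a dump shaped like that, the domain name in each comment line can name the output file directly. A sketch, assuming the comment format is exactly "; DNS record for the NAME domain" (so the domain is the sixth whitespace-separated field) and that the CRLF lines above stand for blank separators; "zonedump.txt" is a placeholder:

```shell
#!/bin/sh
# Split zonedump.txt into one file per domain, cutting at each
# "; DNS record for the ... domain" comment and naming the piece
# after the domain in that comment. Blank separator lines are dropped.
awk '/^; DNS record for the / { close(out); out = $6 ".txt" }
     out && NF               { print > out }' zonedump.txt
```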
Hi,
I need to split a large array "@sharedArray" into 10 small arrays.
The arrays should be like @sharedArray1,@sharedArray2,@sharedArray3...so on..
Can anyone help me with the logic to do so :(:confused: (6 Replies)
Dear shell experts,
I would like to split a txt file into small ones. However, I do not know how to program this in shell. If someone could help, it would be greatly appreciated!
Specifically, suppose there is a file named A.txt. The content of the file looks like this:
Subject run condtion ACC time... (3 Replies)
Dear all,
I have a huge txt file with the input files for some setup_code. However, for running my setup_code, I require txt files with a maximum of 1000 input files.
Please help me by suggesting a way to break down this big txt file into small txt files of 1000 entries each.
thanks and Greetings,
Emily (12 Replies)
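If each entry is one line, `split -l` does this directly. "input_list.txt" and the "inputs_" prefix are placeholder names; `-d` (GNU coreutils) gives numeric suffixes instead of alphabetic ones:

```shell
#!/bin/sh
# Break input_list.txt into numbered pieces of at most 1000 lines
# each: inputs_00, inputs_01, ...
split -l 1000 -d input_list.txt inputs_
```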
Dears,
Need your help with the below file manipulation. I want to split the file into 8 smaller files without cutting/disturbing the entries (meaning every small file should start with an entry and end with an empty line). It would be helpful if you could provide a one-liner command for this... (12 Replies)
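A one-liner sketch, assuming the entries are blank-line-separated blocks: `RS=""` puts awk into paragraph mode, so each whole entry is one record and is written round-robin with its trailing blank line intact. "entries.txt" and the "piece_N.txt" names are placeholders:

```shell
#!/bin/sh
# Deal blank-line-separated entries from entries.txt across 8 files
# without ever splitting an entry; ORS restores the blank separator.
awk 'BEGIN { RS = ""; ORS = "\n\n" }
     { print > ("piece_" (NR % 8 + 1) ".txt") }' entries.txt
```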
Split large xml into mutiple files and with header and footer in file
I tried the below; it splits unevenly, and I also need help in adding a header and footer.
command :
csplit -s -k -f my_XML_split.xml extrfile.xml "/<Document>/" {1}
sample xml
<?xml version="1.0" encoding="UTF-8"?><Recipient>... (36 Replies)
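An awk sketch as an alternative to csplit, assuming each `<Document>` opening and `</Document>` closing tag sits on its own line and that the XML declaration plus a `<Recipient>` wrapper are the desired header/footer (both are assumptions from the truncated sample, not confirmed structure). For XML that does not keep tags on separate lines, an XML-aware tool such as xmllint or xmlstarlet is safer:

```shell
#!/bin/sh
# Split extrfile.xml at every <Document> element into
# my_XML_split_1.xml, my_XML_split_2.xml, ..., giving each piece the
# XML declaration as a header and a closing root tag as a footer.
awk 'BEGIN { hdr = "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<Recipient>"
             ftr = "</Recipient>" }
     /<Document>/   { out = "my_XML_split_" ++n ".xml"; print hdr > out }
     out            { print > out }
     /<\/Document>/ { if (out) { print ftr > out; close(out); out = "" } }' extrfile.xml
```

This puts exactly one Document per output file; to get evenly sized files instead, group several Documents before emitting the footer.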
Discussion started by: karthik
largefile(5) Standards, Environments, and Macros largefile(5)

NAME
largefile - large file status of utilities
DESCRIPTION
A large file is a regular file whose size is greater than or equal to 2 Gbyte ( 2**31 bytes). A small file is a regular file whose size is
less than 2 Gbyte.
Large file aware utilities
A utility is called large file aware if it can process large files in the same manner as it does small files. A utility that is large file
aware is able to handle large files as input and generate as output large files that are being processed. The exception is where additional
files are used as system configuration files or support files that can augment the processing. For example, the file utility supports the
-m option for an alternative "magic" file and the -f option for a support file that can contain a list of file names. It is unspecified
whether a utility that is large file aware will accept configuration or support files that are large files. If a large file aware utility
does not accept configuration or support files that are large files, it will cause no data loss or corruption upon encountering such files
and will return an appropriate error.
The following /usr/bin utilities are large file aware:
adb awk bdiff cat chgrp
chmod chown cksum cmp compress
cp csh csplit cut dd
dircmp du egrep fgrep file
find ftp getconf grep gzip
head join jsh ksh ln
ls mdb mkdir mkfifo more
mv nawk page paste pathchk
pg rcp remsh rksh rm
rmdir rsh sed sh sort
split sum tail tar tee
test touch tr uncompress uudecode
uuencode wc zcat
The following /usr/xpg4/bin utilities are large file aware:
awk cp chgrp chown du
egrep fgrep file grep ln
ls more mv rm sed
sh sort tail tr
The following /usr/xpg6/bin utilities are large file aware:
getconf ls tr
The following /usr/sbin utilities are large file aware:
install mkfile mknod mvdir swap
See the USAGE section of the swap(1M) manual page for limitations of swap on block devices greater than 2 Gbyte on a 32-bit operating system.
The following /usr/ucb utilities are large file aware:
chown from ln ls sed
sum touch
The /usr/bin/cpio and /usr/bin/pax utilities are large file aware, but cannot archive a file whose size exceeds 8 Gbyte - 1 byte.
The /usr/bin/truss utility has been modified to read a dump file and display information relevant to large files, such as offsets.
cachefs file systems
The following /usr/bin utilities are large file aware for cachefs file systems:
cachefspack cachefsstat
The following /usr/sbin utilities are large file aware for cachefs file systems:
cachefslog cachefswssize cfsadmin fsck
mount umount
nfs file systems
The following utilities are large file aware for nfs file systems:
/usr/lib/autofs/automountd /usr/sbin/mount
/usr/lib/nfs/rquotad
ufs file systems
The following /usr/bin utility is large file aware for ufs file systems:
df
The following /usr/lib/nfs utility is large file aware for ufs file systems:
rquotad
The following /usr/xpg4/bin utility is large file aware for ufs file systems:
df
The following /usr/sbin utilities are large file aware for ufs file systems:
clri dcopy edquota ff fsck
fsdb fsirand fstyp labelit lockfs
mkfs mount ncheck newfs quot
quota quotacheck quotaoff quotaon repquota
tunefs ufsdump ufsrestore umount
Large file safe utilities
A utility is called large file safe if it causes no data loss or corruption when it encounters a large file. A utility that is large file
safe may be unable to properly process a large file, but it returns an appropriate error.
The following /usr/bin utilities are large file safe:
audioconvert audioplay audiorecord comm diff
diff3 diffmk ed lp mail
mailcompat mailstats mailx pack pcat
red rmail sdiff unpack vi
view
The following /usr/xpg4/bin utilities are large file safe:
ed vi view
The following /usr/xpg6/bin utility is large file safe:
ed
The following /usr/sbin utilities are large file safe:
lpfilter lpforms
The following /usr/ucb utilities are large file safe:
Mail lpr
The following /usr/lib utility is large file safe:
sendmail
SEE ALSO
lf64(5), lfcompile(5), lfcompile64(5)

SunOS 5.10 7 Nov 2003 largefile(5)