UNIX for Dummies Questions & Answers: Help splitting files greater than 250mb
Post 302775871 by tomj5141, 03-05-2013 02:15 PM
ksh[7]: suff++: 0403-053 Expression is not complete; more tokens expected.

Not sure what is missing, but I guess I can just leave the pieces ending with the letter and apply the file type later. I do have a question, though: when we run split -b 250m, why do the split files come out larger than 250 MB?

-rw-rw-r-- 1 dfelp dfelpdev 262144000 Mar 05 13:52 P8.PECO_Energy.Residential.PA.12.2010_.txta

-rw-rw-r-- 1 dfelp dfelpdev 184936511 Mar 05 13:52 P8.PECO_Energy.Residential.PA.12.2010_.txtb

I just decreased the size to 200m to compensate, but I was wondering why that happens.
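For what it's worth, a minimal ksh sketch of the "apply the file type later" idea; the file name, prefix, and suffix length here are placeholders, not the actual script:

# split into 250m pieces named bigfile_aa, bigfile_ab, ...
split -b 250m bigfile.txt bigfile_

# add the extension afterwards instead of building it into the prefix
for piece in bigfile_??; do
    mv "$piece" "${piece}.txt"      # bigfile_aa -> bigfile_aa.txt
done

# if this is the older ksh88 (which has no ++ operator), increment a counter like this instead of suff++
suff=$(( suff + 1 ))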
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Deleting specific files older than 90 days

I have a directory that contains a number of history files for a process. Every file starts with the string "EVACK". For example: EVACK0101.000001, EVACK0102.095940, EVACKmmdd.hhmiss. I want to erase, in the specific directory ONLY (no sub-directories), all EVACK files older than 90 days. I... (2 Replies)
Discussion started by: beilstwh
2 Replies
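A sketch of one common approach, assuming GNU or BSD find (older finds may lack -maxdepth) and a placeholder directory:

# delete EVACK* files older than 90 days in this directory only, without descending into sub-directories
find /path/to/history -maxdepth 1 -type f -name 'EVACK*' -mtime +90 -exec rm -f {} \;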

2. UNIX for Advanced & Expert Users

Problem creating files greater than 2GB

With the C code I am able to create files greater than 2GB if I use the 64-bit compile option -D_FILE_OFFSET_BITS=64. There I am using the function fprintf to write into the file. But when I use C++ and ofstream, the file is getting truncated when the size grows beyond 2GB. Is there any special... (1 Reply)
Discussion started by: bobbyjohnz
1 Reply
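The macro mentioned above generally has to be defined for the C++ compile as well; a hypothetical compile line (compiler, source file, and output name are placeholders):

# with glibc/libstdc++ this typically lets ofstream grow past 2 GB on 32-bit systems
g++ -D_FILE_OFFSET_BITS=64 writer.cpp -o writer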

3. Shell Programming and Scripting

Important finding --- find files greater than 1 MB

We can find files greater than 1 MB with the find command as: find /dir -name '*' -size +1M or find /dir/* -name '*' -size +1M, but what it's doing is finding files only in the current directory, not in sub-directories. I want files from sub-directories too. Please help... Thanx in... (3 Replies)
Discussion started by: manoj_dahiya22
3 Replies
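A sketch, assuming GNU find for the +1M suffix (POSIX find counts -size in 512-byte blocks); /dir is a placeholder:

# find descends into sub-directories by default; no /dir/* glob is needed
find /dir -type f -size +1M

# roughly portable equivalent: more than 2048 blocks of 512 bytes = more than 1 MB
find /dir -type f -size +2048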

4. Shell Programming and Scripting

Trying to find files equal to and greater than

Hi Guys and Gals, I'm having some difficulty putting this check into a shell script. I would like to search a particular directory for a number of files. The logic I have is pretty simple: find files named *.txt that are newer than <this file> and count them. If the number of files is equal to... (4 Replies)
Discussion started by: bbbngowc
4 Replies
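A small sketch of that logic; the directory, reference file, and threshold of 5 are placeholders:

# count the *.txt files newer than the reference file, then test the count
count=$(find /some/dir -name '*.txt' -newer /some/dir/reference.txt | wc -l)

if [ "$count" -ge 5 ]; then
    echo "found $count newer files"
fi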

5. UNIX for Dummies Questions & Answers

List of Files which are Greater than a specific date

A newbie question... I need to get a list of the files and folders which are newer than a specific date, and I want to write the output to a text file. What I know: ls -lrt gives me a list of all the files ordered by date, and ls > fileName will write the results to a text file. Please help (6 Replies)
Discussion started by: rkaif
6 Replies
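One way to sketch it: stamp a marker file with touch -t and use find -newer (the cut-off date and output file name are placeholders):

# marker file stamped with the cut-off date (touch -t CCYYMMDDhhmm)
touch -t 201301010000 /tmp/cutoff

# list everything modified after that date and save the result
find . -newer /tmp/cutoff > filelist.txt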

6. Shell Programming and Scripting

Find files greater than a particular date in filename.

I need a unix command which will find all the files newer than a particular date that is part of the file name. Say, for example, I have files like this (file naming convention: filename.YYDDMMSSSS.txt): abc.201206015423.txt abc.201207013456.txt abc.201202011234.txt abc.201201024321.txt efg.201202011234.txt... (11 Replies)
Discussion started by: lijjumathew
11 Replies
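Since the date is embedded in the name, a plain string comparison on that field is enough; a sketch using the abc.*.txt names above and a hypothetical cut-off value:

cutoff=201205010000                # hypothetical cut-off, same digit layout as the file names

for f in abc.*.txt; do
    stamp=${f#abc.}                # strip the leading "abc."
    stamp=${stamp%.txt}            # strip the trailing ".txt"
    [[ "$stamp" > "$cutoff" ]] && echo "$f"    # lexicographic compare works for fixed-width digit stamps
done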

7. How to Post in The UNIX and Linux Forums

Very Urgent --- Need to delete the log files when the disk used% is greater than 85% using df -k

Hi, I am new to shell scripts. I have an urgent requirement to find the disk space using "df -k". From that output, I need to check whether the used% is greater than 85%; if it is, then I need to delete my log files. It is very urgent, please someone help me. Thanks in Advance... (1 Reply)
Discussion started by: Anandbarnabas
1 Reply
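A sketch of that check; the filesystem, the log path, and the awk field number are assumptions (the used% column position differs between df implementations):

# grab the use% value for the filesystem holding the logs (field 5 on many df -k outputs)
used=$(df -k /var/log | awk 'NR==2 { sub("%", "", $5); print $5 }')

if [ "$used" -gt 85 ]; then
    rm -f /var/log/myapp/*.log     # placeholder log location
fi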

8. Shell Programming and Scripting

Need to delete the log files when the disk used% is greater than 85% using df -k

Hi, I am new to shell scripts. I have an urgent requirement to find the disk space using "df -k". From that output, I need to check whether the used% is greater than 85%; if it is, then I need to delete my log files. It is very urgent, please someone help me. Thanks in Advance... (2 Replies)
Discussion started by: Anandbarnabas
2 Replies

9. Shell Programming and Scripting

Please help list/find files greater than 1G and move to a different directory

I have 6 empty directories below. I would like to write a bash script: if any file is less than "1000000000" bytes, then move it to the "/export/home/mytmp/final" folder first, and if any file is greater than "1000000000" bytes, then move it to final1, final2, final3, final4, final5, and that depends on how many files,... (6 Replies)
Discussion started by: dotran
6 Replies
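A rough sketch of the size test; the source directory is a placeholder and the spread across final1..final5 is simplified to a single target:

for f in /export/home/mytmp/*; do
    [ -f "$f" ] || continue
    size=$(wc -c < "$f" | tr -d ' ')           # file size in bytes
    if [ "$size" -lt 1000000000 ]; then
        mv "$f" /export/home/mytmp/final/
    else
        mv "$f" /export/home/mytmp/final1/     # a real script would rotate through final1..final5
    fi
done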

10. UNIX for Beginners Questions & Answers

Automate splitting of files, scp files as each split completes and combine files on target server

I use the split command to split a one terabyte backup file into 10 chunks of 100 GB each. The files are split one after the other. While the files are being split, I would like to scp the files one after the other, as soon as the previous one completes, from server A to server B. Then on server B,... (2 Replies)
Discussion started by: malaika
2 Replies
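If the source server has GNU coreutils, split's --filter option can hand each chunk to a command as it is cut, which keeps the copy in step with the split; a sketch (user, host, paths, and the chunk_ prefix are placeholders):

# each 100 GB chunk is written locally, then copied before the next chunk is produced
split -b 100G --filter='cat > $FILE && scp $FILE user@serverB:/restore/' backup.tar chunk_

# later, on server B, the chunks can be recombined in order:
# cat chunk_* > backup.tar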
split(1)                         User Commands                        split(1)

NAME
       split - split a file into pieces

SYNOPSIS
       split [-linecount | -l linecount] [-a suffixlength] [file [name]]

       split [-b n | nk | nm] [-a suffixlength] [file [name]]

DESCRIPTION
       The split utility reads file and writes it in linecount-line pieces
       into a set of output-files. The name of the first output-file is name
       with aa appended, and so on lexicographically, up to zz (a maximum of
       676 files). The maximum length of name is 2 characters less than the
       maximum filename length allowed by the filesystem. See statvfs(2). If
       no output name is given, x is used as the default (output-files will
       be called xaa, xab, and so forth).

OPTIONS
       The following options are supported:

       -linecount | -l linecount
                        Number of lines in each piece. Defaults to 1000
                        lines.

       -a suffixlength  Uses suffixlength letters to form the suffix portion
                        of the filenames of the split file. If -a is not
                        specified, the default suffix length is 2. If the sum
                        of the name operand and the suffixlength
                        option-argument would create a filename exceeding
                        NAME_MAX bytes, an error will result; split will exit
                        with a diagnostic message and no files will be
                        created.

       -b n             Splits a file into pieces n bytes in size.

       -b nk            Splits a file into pieces n*1024 bytes in size.

       -b nm            Splits a file into pieces n*1048576 bytes in size.

OPERANDS
       The following operands are supported:

       file             The path name of the ordinary file to be split. If no
                        input file is given or file is -, the standard input
                        will be used.

       name             The prefix to be used for each of the files resulting
                        from the split operation. If no name argument is
                        given, x will be used as the prefix of the output
                        files. The combined length of the basename of prefix
                        and suffixlength cannot exceed NAME_MAX bytes. See
                        OPTIONS.

USAGE
       See largefile(5) for the description of the behavior of split when
       encountering files greater than or equal to 2 Gbyte (2**31 bytes).

ENVIRONMENT VARIABLES
       See environ(5) for descriptions of the following environment variables
       that affect the execution of split: LANG, LC_ALL, LC_CTYPE,
       LC_MESSAGES, and NLSPATH.

EXIT STATUS
       The following exit values are returned:

       0     Successful completion.

       >0    An error occurred.

ATTRIBUTES
       See attributes(5) for descriptions of the following attributes:

       +-----------------------------+-----------------------------+
       |       ATTRIBUTE TYPE        |       ATTRIBUTE VALUE       |
       +-----------------------------+-----------------------------+
       |Availability                 |SUNWesu                      |
       +-----------------------------+-----------------------------+
       |CSI                          |enabled                      |
       +-----------------------------+-----------------------------+
       |Interface Stability          |Standard                     |
       +-----------------------------+-----------------------------+

SEE ALSO
       csplit(1), statvfs(2), attributes(5), environ(5), largefile(5),
       standards(5)

SunOS 5.10                       16 Apr 1999                          split(1)
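Tying the -b forms above back to the numbers in the post: -b nm means n*1048576 bytes, so 250m is 250 * 1048576 = 262144000 bytes, which is exactly the size of the first piece in the listing. A quick sketch with a placeholder input file:

# every full piece is exactly 262144000 bytes; only the last piece is smaller
split -b 250m bigfile.txt piece_
ls -l piece_*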