Trying To Split a Large File

Post 303030783 by sub terra in UNIX for Beginners Questions & Answers, 02-15-2019 12:38 PM

Trying to split a 35 GB file into 1000 MB parts. My research shows I should use this:
Code:
 split -b 1000m file.txt

and the output is "split: cannot open 'crunch1.txt' for reading: No such file or directory", so I tried
Code:
split -b 1000m Documents/Wordlists/file.txt

and I get nothing; the cursor just drops down a line, but that's all. My path is
Code:
/root/Documents/Wordlists/file.txt


For what I am doing, everything I have found says my first option is correct.
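For reference, a minimal sketch of the usual invocation, assuming the file really is at /root/Documents/Wordlists/file.txt (the part_ prefix is illustrative):
Code:
# confirm the file exists and is readable before splitting
ls -lh /root/Documents/Wordlists/file.txt

# cut into 1000 MB pieces named part_aa, part_ab, ...
split -b 1000m /root/Documents/Wordlists/file.txt part_

On GNU split the m suffix means mebibytes (1024 x 1024 bytes), and with no prefix argument the pieces are named xaa, xab, and so on. Note that the error quoted above names 'crunch1.txt' while the command shown names file.txt, which suggests the command that actually ran referenced a different file.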
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Split A Large File

Hi, I have a large file (csv format) that I need to split into 2 files. The file looks something like Original_file.txt first name, family name, address a, b, c, d, e, f, and so on for over 100,000 lines. I need to create two files from this one file. The condition is I need to ensure... (4 Replies)
Discussion started by: nbvcxzdz
4 Replies
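The splitting condition is truncated in the preview above, so this is a hedged sketch only: the test on the first field is a placeholder, and the output names file1.txt and file2.txt are assumed. It copies the header line into both outputs:
Code:
# keep the header in both files; the $1 test is a placeholder condition
awk -F, 'NR == 1   { print > "file1.txt"; print > "file2.txt"; next }
         $1 == "a" { print > "file1.txt"; next }
                   { print > "file2.txt" }' Original_file.txt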

2. Shell Programming and Scripting

Split a large file with patterns and size

Hi, I have a large file with a repeating pattern in it. Now I want the file split into blocks of the pattern, with a specified number of lines in each file, i.e. the file is like 1... 2... 2... 3... 1... 2... 3... 1... 2... 2... 2... 2... 2... 3... where 1 is the start of the block... (5 Replies)
Discussion started by: sudhamacs
5 Replies
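A hedged awk sketch, assuming each block starts with a line beginning "1" as in the sample; the block_N.txt names are illustrative:
Code:
# open a new output file whenever a block-start line ("1...") appears
awk '/^1/ { close(out); n++; out = "block_" n ".txt" } { print > out }' input.txt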

3. Shell Programming and Scripting

Split Large File

Hi, I have to split a large file whose input looks like: Input file name_file.txt 00001|AAAA|MAIL|DATEOFBIRTHT|....... 00001|AAAA|MAIL|DATEOFBIRTHT|....... 00002|BBBB|MAIL|DATEOFBIRTHT|....... 00002|BBBB|MAIL|DATEOFBIRTHT|....... 00003|CCCC|MAIL|DATEOFBIRTHT|.......... (1 Reply)
Discussion started by: AMARA
1 Replies
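A sketch, assuming the goal is one output file per key in the first pipe-delimited field; the .txt names are assumptions:
Code:
# one file per value of the first |-separated field (00001.txt, 00002.txt, ...)
awk -F'|' '{ print > ($1 ".txt") }' name_file.txt

With many distinct keys, close() each file as the key changes to stay under the open-file-descriptor limit.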

4. Shell Programming and Scripting

split large file based on field criteria

I have a file containing date/time sorted data of the form ... 2009/06/10,20:59:59.950,XAG/USD,Q,1,1115, 14.3025,100,1,1 2009/06/10,20:59:59.950,XAG/USD,Q,1,1116, 14.3026,125,1,1 2009/06/10,20:59:59.950,XAG/USD,R,0,0, , 0,0,0 2009/06/10,20:59:59.950,XAG/USD,R,1,0, 14.1910,100,1,1... (6 Replies)
Discussion started by: asriva
6 Replies
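The split criteria are truncated in the preview, so purely as an illustration: routing rows on the fourth comma-separated field (Q or R in the sample); the data.csv and out_ names are assumed:
Code:
# route each row by the value of field 4 (Q or R)
awk -F, '{ print > ("out_" $4 ".csv") }' data.csv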

5. Shell Programming and Scripting

Splitting a large file, split command will not do.

Hello Everyone, I have a large file that needs to be split into many separate files; however, the text between the blank lines needs to stay intact. The file looks like SomeText SomeText SomeText SomeOtherText SomeOtherText .... Since the number of lines of text are different for... (3 Replies)
Discussion started by: jwillis0720
3 Replies
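A sketch using awk's paragraph mode (empty RS), which keeps each blank-line-separated block intact; the chunk_ names are illustrative:
Code:
# one file per blank-line-separated block of text
awk -v RS= '{ f = "chunk_" NR ".txt"; print > f; close(f) }' input.txt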

6. Shell Programming and Scripting

Split large file based on last digit from a column

Hello, What's the best way to split a large file into multiple files based on the last digit in the first column? input file: f 2738483300000x0y03772748378831x1y13478378358383x2y23743878383802x3y33787828282820x4y43748838383881x5y5 Desired Output: f0 3738483300000x0y03787828282820x4y4 f1... (9 Replies)
Discussion started by: alain.kazan
9 Replies
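Assuming whitespace-separated columns (the preview above looks line-wrapped), a sketch keyed on the last digit of column 1; the f0..f9 names match the post:
Code:
# send each line to f<d>, where d is the last character of column 1
awk '{ d = substr($1, length($1), 1); print > ("f" d) }' input.txt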

7. Shell Programming and Scripting

Split a large file

I have a 3 GB text file that I would like to split. How can I do this? It's a giant comma-separated list of numbers. I would like to make it into about 20 files of ~100 MB each, with a custom header and footer. The file can only be split on commas, but they're plentiful. Something like... (3 Replies)
Discussion started by: CRGreathouse
3 Replies
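With GNU split, a hedged sketch: -t ',' makes the comma the record separator and -C keeps whole records together, so pieces only break at commas. The numbers.txt and piece_ names and the HEADER/FOOTER text are placeholders:
Code:
# ~100 MB pieces that only break at commas (GNU split)
split -t ',' -C 100m numbers.txt piece_

# wrap each piece with a custom header and footer
for f in piece_*; do
    { echo "HEADER"; cat "$f"; echo "FOOTER"; } > "$f.out"
done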

8. UNIX for Dummies Questions & Answers

Split large file to smaller fastly

Hi, I have a requirement. Input file: 1 1111111111111 108 1 1111111111111 109 1 1111111111111 109 1 1111111111111 110 1 1111111111111 111 1 1111111111111 111 1 1111111111111 111 1 1111111111111 112 1 1111111111111 112 1 1111111111111 112 The output should be, (19 Replies)
Discussion started by: mechvijays
19 Replies
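The expected output is cut off in the preview, so this is only a guess at the intent: one output file per distinct value in the third column (108, 109, ...); the out_ names are illustrative:
Code:
# one output file per distinct value in column 3
awk '{ print > ("out_" $3 ".txt") }' input.txt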

9. UNIX for Beginners Questions & Answers

sed awk: split a large file to unique file names

Dear Users, I would appreciate your help with splitting a large file (> 1 million lines) with sed or awk. Below is the text in the input file file.txt: scaffold1 928 929 C/T + scaffold1 942 943 G/C + scaffold1 959 960 C/T +... (6 Replies)
Discussion started by: kapr0001
6 Replies
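A sketch assuming one file per scaffold name in column 1 is wanted and that rows are already grouped by scaffold, as in the sample; the .txt suffix is an assumption:
Code:
# start a new output file each time the scaffold name in column 1 changes
awk '$1 != prev { close(out); out = $1 ".txt"; prev = $1 } { print > out }' file.txt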

10. UNIX for Advanced & Expert Users

How to split large file with different record delimiter?

Hi, I have received a file which is 20 GB. We would like to split the file into 4 equal parts and process it to avoid memory issues. If the record delimiter were a Unix newline, I could use the split command with either option -l or -b. The problem is that the line terminator is |##| How to use... (5 Replies)
Discussion started by: Ravi.K
5 Replies
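GNU split's -t only takes a single-character separator, so a hedged gawk sketch instead: two passes over the file, the first to count the |##|-terminated records, the second to write a quarter of them to each part (gawk-specific regex RS; the part_ names are illustrative):
Code:
# pass 1 counts records; pass 2 writes them into part_0 .. part_3
gawk -v RS='[|]##[|]' '
    NR == FNR { total++; next }
    { per = int((total + 3) / 4)
      out = "part_" int((FNR - 1) / per)
      printf "%s|##|", $0 > out }
' bigfile bigfile

This streams record by record, so memory use stays small even though the file has no newline terminators.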
flow-split(1)                General Commands Manual                flow-split(1)

NAME
       flow-split -- Split flow files into smaller files.

SYNOPSIS
       flow-split [-gGhn] [-b big|little] [-C comment] [-d debug_level]
                  [-N nflows] [-o outfile_basename] [-T nseconds] [-z z_level]

DESCRIPTION
       The flow-split utility will split a flow file into smaller files based
       on the number of flows or the amount of time that has passed.

OPTIONS
       -b big|little        Byte order of output.

       -C comment           Add a comment.

       -d debug_level       Enable debugging.

       -g                   Split on source tag.

       -G                   Split on destination tag.

       -h                   Display help.

       -n                   Use symbols for tag field in filename.

       -N nflows            Split after processing nflows.

       -o outfile_basename  The basename of the resulting files.

       -T nseconds          Split after processing an interval of nseconds of
                            flows.

       -z z_level           Configure compression level to z_level. 0 is
                            disabled (no compression), 9 is highest
                            compression.

EXAMPLES
       Create 1 minute flow files from the flow archive in /flows/krc4. Store
       the results in /flows/krc4.split

              flow-cat /flows/krc4 | flow-split -T60 -o /flows/krc4.split/1min.

BUGS
       None known.

AUTHOR
       Mark Fullmer
       maf@splintered.net

SEE ALSO
       flow-tools(1)