Full Discussion: Split then rename
UNIX for Dummies Questions & Answers: Post 302433728 by coach5779 on Wednesday, 30 June 2010, 10:21 AM
Split then rename

Hi everyone,

I am trying to write an if statement that, when a file contains more than 1 million records/lines, splits it into files of roughly 900,000 records each and then renames those files, replacing the aaa, aab, aac suffixes that split normally produces with a specific naming convention. For instance, if my file were named Test.File.txt and looked like the following:
Code:
Todd
1
2
3
4
5
6
7
8
9
Jim
10
11
12
13
14
15
16
17
18
19
Scott
20
21
22
23
24
25
26
27
28
29

I would want to split the file so that there would be 3 files (imagine each name represents 900,000 records) and they would be named Test.A1File.txt, Test.A2File.txt, Test.A3File.txt.

Thanks.
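One way to approach this, as a minimal untested sketch (it assumes the Test.File.txt name, the 1,000,000-line threshold, and the 900,000-line chunk size from the question; the .part. prefix is just a throwaway name):
Code:
#!/bin/sh
# Split only when the file exceeds 1,000,000 lines, then rename
# split's aa/ab/ac pieces to Test.A1File.txt, Test.A2File.txt, ...
file=Test.File.txt
lines=$(wc -l < "$file")

if [ "$lines" -gt 1000000 ]; then
    # split still writes its usual alphabetic suffixes here ...
    split -l 900000 "$file" "${file}.part."

    # ... and each piece is then renamed to the requested pattern
    n=1
    for part in "${file}.part."*; do
        mv "$part" "Test.A${n}File.txt"
        n=$((n + 1))
    done
fi

The shell glob expands in sorted order, which matches the order in which split writes its pieces, so the numbering follows the original line order.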

 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

how can I rename the following=-^

I have a file named -^; I want to look at it, rename it, etc. Any help out there? (5 Replies)
Discussion started by: nj78

2. UNIX for Dummies Questions & Answers

Split a file with no pattern -- Split, Csplit, Awk

I have gone through all the threads in the forum and tested out different things. I am trying to split a 3GB file into multiple files; some files are even larger than this. For example: split -l 3000000 filename.txt This is very slow, and it splits the file into pieces of 3 million records each... (10 Replies)
Discussion started by: madhunk

3. Shell Programming and Scripting

split and rename the file

Hi All, I have a requirement: I want to split a file, and the split files should have certain names. Currently, when I use the split command split -1000 testdata testdata_ the output is testdata_aa testdata_bb testdata_cc and so on. But I want the output as testdata1.snd... (3 Replies)
Discussion started by: dnat

4. Shell Programming and Scripting

awk split and rename files

I have a file test1.html like the one below: <dctm_topnav_en_US> <html> ..... </html> <dctm_topnav_en_CA> <html> ..... </html> <dctm_topnav_en_FR> <html> ..... </html> I need to use awk to split this into three files named en_US.html, en_CA.html, en_FR.html, each having the content between... (4 Replies)
Discussion started by: vijay52

5. UNIX for Dummies Questions & Answers

Split and Rename files using Terminal and bin/bash

I have a file named Me_thread_spell.txt that I want to split into smaller files. I want it to be split at each place there is a ;;;. For example, blah blah blah ;;; blah bhlah hlabl awasnceuir asenduhfoijhacseiodnbfxasd;;; oabwcuhaweoir;;; This full file would become three separate files... (7 Replies)
Discussion started by: mschpers
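(As a reference for the delimiter-based split described above, a sketch with GNU awk treating ";;;" as the record separator; the input and output names are placeholders:)
Code:
# gawk only: a multi-character RS is a GNU awk extension
awk 'BEGIN { RS=";;;" } NF { n++; print > ("Me_thread_part_" n ".txt") }' Me_thread_spell.txt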

6. Shell Programming and Scripting

awk to split one field and print the last two fields within the split part.

Hello; I have a file consisting of 4 columns separated by tabs. The problem is the third field. Some of them are very long but can be split by the vertical bar "|". Also, some of them do not contain the string "UniProt", but I can ignore that for the moment and sort the file afterwards. Here is... (5 Replies)
Discussion started by: yifangt

7. UNIX for Dummies Questions & Answers

Split and Rename multiple files

Hi, I have a data file like below messageid|email|timestamp 750452173|123@googlemail.com|2013-05-24 16:14:32 750464921|000@gmail.com|2013-06-13 19:38:01 750385426|001@googlemail.com|2013-01-06 12:06:36 750373470|000@wz.eu|2012-11-30 22:32:07 . . I want to split the files based on the... (4 Replies)
Discussion started by: armsaran

8. Shell Programming and Scripting

Split and rename files

Hello, Need to split files into n number of files and rename the files Example: Input: transaction.txt.1aa transaction.txt.1ab ...... Output: transaction.txt.1 transaction.txt.2 transaction.txt.3 (3 Replies)
Discussion started by: krux_rap

9. Shell Programming and Scripting

awk : split file and rename and save in path according to content

Hello, I'm using Windows 7; sed, awk and gnuwin32 are installed. I have a big text file I need to manipulate. In short, I will have to split it into thousands of short files, then rename and save them in a folder whose name is based on the filename. Here is a snippet of my big input.txt file (this... (4 Replies)
Discussion started by: sellig

10. UNIX for Beginners Questions & Answers

Split and Rename Split Files

Hello, I need to split a file by number of records and rename each split file with the actual filename prepended with a 3-digit split number. What I have tried is the command below, with a 2-digit numeric value: split -l 3 -d abc.txt F (# will produce split files F00 F01 F02). How to produce... (19 Replies)
Discussion started by: techedipro
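(The thread above is truncated, but if the goal is simply wider split numbers, GNU split's -a option sets the suffix length; F is the prefix from the post:)
Code:
split -l 3 -d -a 3 abc.txt F    # produces F000, F001, F002, ...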