Read Lines from One File and Create Another
Posted by chetanojha, 07-28-2016

Hello Members,

I have one file which contains millions of supplier codes. I need to load these codes into the database 1,000 at a time.
A database procedure reads from an external table which is based on the Unix file.

All I want to do is read from the bigger file, e.g. MAIN_FILE.txt, and create another file, LOAD_FILE.txt, with only the first 1,000 records. Then run a PL/SQL block. Once that is done, recreate LOAD_FILE.txt with the next set of 1,000 records (i.e. lines 1001-2000).

But I need to make sure that the first 10 lines of LOAD_FILE.txt always contain a header, so the 1,000 records should be appended from line 11 onwards.

I have the PL/SQL bit of code ready in another script, which I will call from this script every time 1,000 lines are written to LOAD_FILE.txt.

Can anyone help?

Thanks
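A minimal sketch of one way to do this in POSIX shell. It assumes the 10 header lines are kept in their own file and that the PL/SQL block is wrapped in a callable script; HEADER.txt and run_plsql.sh are placeholder names.

Code:
#!/bin/sh
# Rebuild LOAD_FILE.txt for each batch: 10 header lines first, then the
# next 1000 data lines from MAIN_FILE.txt, then invoke the loader.
MAIN=MAIN_FILE.txt
LOAD=LOAD_FILE.txt
HEADER=HEADER.txt            # assumed: the 10 header lines in their own file
BATCH=1000

total=$(wc -l < "$MAIN")
start=1
while [ "$start" -le "$total" ]; do
    cat "$HEADER" > "$LOAD"                                  # lines 1-10: header
    tail -n +"$start" "$MAIN" | head -n "$BATCH" >> "$LOAD"  # lines 11 onwards: data
    ./run_plsql.sh "$LOAD" || exit 1                         # assumed PL/SQL wrapper
    start=$((start + BATCH))
done

split -l 1000 MAIN_FILE.txt would produce every chunk up front instead; the loop above writes one LOAD_FILE.txt at a time, which matches the requirement that the loader always sees the same filename.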
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Read words from file and create new file using K-shell.

Hi All, please help me in creating files through ksh scripts. I have one file in this format: OWNER.TABLE_NAME OWNER.TABLE_NAME1 OWNER1.TABLE_NAME OWNER1.TABLE_NAME1. I want to read the above file and create a new file through a ksh script. The new file should look like this.... (4 Replies)
Discussion started by: bsrajirs

2. Shell Programming and Scripting

read mp3 filename and create one XML for each file

Hi: I have a collection of mp3s and I need to create one XML file per mp3. I have recording1.mp3, recording2.mp3, etc., and I want to generate files like recording1.xml, recording2.xml; inside each XML file I need to add a URL prefix and then the filename at the end. ... (4 Replies)
Discussion started by: jason7
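A hedged sketch of the loop that thread is after; the URL prefix and the XML element names are assumptions, since the excerpt does not give them.

Code:
#!/bin/sh
# For every *.mp3, write a matching *.xml containing an assumed URL
# prefix plus the mp3 filename.
prefix="http://example.com/media"        # placeholder prefix
for f in *.mp3; do
    [ -e "$f" ] || continue              # no mp3 files present
    base=${f%.mp3}
    printf '<track><url>%s/%s</url></track>\n' "$prefix" "$f" > "$base.xml"
done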

3. Shell Programming and Scripting

Read a file and search a value in another file create third file using AWK

Hi, I have two files with the format shown below. I need to read the first field (the value before the comma) from file 1, search for a record in file 2 that has the same value in its "KEY=" field, and write the complete file 2 record, together with the corresponding field 2 of the first file, into the result file. ... (11 Replies)
Discussion started by: King Kalyan
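The exact layouts are truncated in the excerpt, but the usual awk pattern for this kind of keyed join looks roughly like the sketch below; a comma-separated file 1 and a KEY=value field somewhere in file 2 are assumptions.

Code:
# Pass 1 caches field 2 of file1 by its first field; pass 2 scans each
# file2 record for a KEY=... field and, on a match, prints the record
# plus the cached value.
awk -F',' '
    NR == FNR { val[$1] = $2; next }
    {
        for (i = 1; i <= NF; i++)
            if ($i ~ /^KEY=/) {
                k = substr($i, 5)
                if (k in val) print $0, val[k]
            }
    }
' file1 file2 > result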

4. Shell Programming and Scripting

Read a file and create egrep variable

I want to create an egrep variable from a file. For example: string=`cat query.txt`; cat myfile.txt | egrep "$string". The pattern file has one or more lines, so the end result of cat myfile.txt | egrep "$string" would be: cat myfile.txt | egrep... (2 Replies)
Discussion started by: numele
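Two standard ways to get that effect, sketched; the second avoids building the variable at all.

Code:
# Join the lines of query.txt into a single alternation pattern...
string=$(paste -sd'|' query.txt)
grep -E "$string" myfile.txt

# ...or let grep read one pattern per line directly:
grep -E -f query.txt myfile.txt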

5. UNIX for Dummies Questions & Answers

When reading a CSV file, read 20 lines, wait a minute, then read the next 20 till the end

Hello All, I am a newbie and need some help reading a CSV file in a Bourne shell script. I want to read 10 lines, wait for a minute, then read another 10 lines, and so on until the end of the file. Any inputs are appreciated. ... (3 Replies)
Discussion started by: victor.s
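A sketch of the pacing loop. The thread's title says 20 lines and its body says 10; 10 is used here, and data.csv is a placeholder name.

Code:
#!/bin/sh
# Process the CSV one line at a time, sleeping a minute after every
# 10th line until the file is exhausted.
n=0
while IFS= read -r line; do
    echo "$line"                     # replace with the real per-line work
    n=$((n + 1))
    [ $((n % 10)) -eq 0 ] && sleep 60
done < data.csv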

6. UNIX for Dummies Questions & Answers

Read a flat file, evaluate and create output. UNIX SCRIPT.

Hi all, I have a flat file as below: 470423495|1||TSA-A000073800||1|||1 471423495|1||TSA-A000073800||5|||5 472423495|1||TSA-A000073800||2|||7 473423495|1||TSA-A000073800||3|||3. I'd like to create a Unix script. The script has to evaluate the last two columns; if the values are... (4 Replies)
Discussion started by: mrreds
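The evaluation rule is truncated in the excerpt, so the test below (last two pipe-delimited fields equal) is only a stand-in to show the shape of the script.

Code:
# Tag each record according to whether its last two fields agree;
# substitute the thread's real rule for the comparison.
awk -F'|' '{ print $0 ($(NF-1) == $NF ? "|OK" : "|MISMATCH") }' input.txt > output.txt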

7. Shell Programming and Scripting

How to create a shell script to read a foldername from a text file and go to the folder?

Hi, I am trying to write a shell script which can read folder names from a text file, go to each folder, pick up an XML file, and write it into my SIPP script so that I can run the SIPP script. For example: I have a text file called thelist.txt where I have provided all the folders... (7 Replies)
Discussion started by: pm1504
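A sketch of the outer loop, with the per-folder action left as a placeholder since the SIPP details aren't in the excerpt.

Code:
#!/bin/sh
# Visit each folder named in thelist.txt; the subshell keeps each cd local.
while IFS= read -r dir; do
    (
        cd "$dir" || exit 1
        for x in *.xml; do
            [ -e "$x" ] || continue                  # folder has no XML files
            echo "would feed $x to the sipp script"  # placeholder action
        done
    )
done < thelist.txt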

8. Shell Programming and Scripting

Script to read through a file and create new users/assign them to groups in Ubuntu

Hi all. I need a shell script that can, in short, read through a text file line by line and create a new user in Ubuntu, as well as assign that user to a group. The format of the text file is not important, but preferably 'username:group'. I don't have much programming knowledge, no matter shell... (3 Replies)
Discussion started by: LewisWeekly
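For 'username:group' lines, the skeleton on Ubuntu is roughly the following; users.txt is a placeholder name and the script assumes root privileges.

Code:
#!/bin/sh
# Create each user and, if needed, its group from 'username:group' lines.
while IFS=: read -r user group; do
    getent group "$group" > /dev/null || groupadd "$group"
    useradd -m -g "$group" "$user"
done < users.txt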

9. Shell Programming and Scripting

How to compare 2 files and create a result file with unmatched lines from first file.?

Hi, I have 2 text files, file1 and file2. file1.txt (there are no duplicates in this file): 1234 3232 4343 3435 6564 6767 1213. file2.txt: 1234,wq,wewe,qwqw 1234,as,dfdf,dfdf 4343,asas,sdds,dsds 6767,asas,fdfd,fdffd. I need to search for each number in file1.txt in file2.txt's 1st... (6 Replies)
Discussion started by: Little
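The classic two-pass awk for that comparison, as a sketch:

Code:
# Remember every first column of file2, then print the file1 lines
# whose value was never seen there.
awk -F',' 'NR == FNR { seen[$1]; next } !($1 in seen)' file2.txt file1.txt > result.txt

Given the sample data, result.txt would contain 3232, 3435, 6564, and 1213.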

10. Shell Programming and Scripting

Read log file to create Performance index

I am required to create a CSV file by reading the last 200000 lines from a log file. I have to grep 3 parameters from this log file and write these parameters to the .csv file with a timestamp. This script will be set up in a cron job which will run every 10 minutes. I have written the script but it is... (5 Replies)
Discussion started by: Crazy_Nix
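A sketch of the cron job's core; the log path and the three parameter patterns are placeholders.

Code:
#!/bin/sh
# Count three placeholder parameters in the last 200000 log lines and
# append one timestamped CSV row per run.
ts=$(date '+%Y-%m-%d %H:%M:%S')
tail -n 200000 /var/log/app.log |
awk -v ts="$ts" '
    /PARAM1/ { a++ }
    /PARAM2/ { b++ }
    /PARAM3/ { c++ }
    END { printf "%s,%d,%d,%d\n", ts, a, b, c }
' >> perf_index.csv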