SQL dynamic table / dynamic inserts

I have two files. File X.txt:

Contents of record 1:
rdrDESTINATION_ADDRESS (String) "91 971502573813"
rdrDESTINATION_IMSI (String) "000000000000000"
rdrORIGINATING_ADDRESS (String) "d0 movies"
rdrORIGINATING_IMSI (String) "000000000000000"
rdrTRAFFIC_EVENT_TIME (String) "09090801212416"
rdrTRAFFIC_EVENT_TYPE (Long) 8


File Y.txt:

Contents of record 2:
rdrDESTINATION_ADDRESS (String) "91 9715023423813"
rdrDESTINATION_IMSI (String) "00000002340000"
rdrORIGINATING_ADDRESS (String) "d0 etisalat"
rdrORIGINATING_IMSI (String) "000000000000000"



I want to dynamically create a table from these files and then insert the values into it.

Code:
tmpdir=/tmp/records ; mkdir -p $tmpdir
tmpfile=$tmpdir/dump.$$
TABLE_COLUMNS=$tmpdir/columns.$$
TABLE_NAME=DUMP_$$
sqlload=$tmpdir/sqldump.sql          # not $tmpdir/records/... -- that directory does not exist

# collect the distinct column names (first field of every rdr* line)
cat X.txt Y.txt | grep -i '^rdr' | awk '{print $1}' | sort -u > $TABLE_COLUMNS

# build the CREATE TABLE statement, one VARCHAR2 column per name
echo "CREATE TABLE $TABLE_NAME (" > $tmpdir/create_dump_tbl.sql
while read line ; do
    echo "$line VARCHAR2(50 BYTE)," >> $tmpdir/create_dump_tbl.sql
    TABLE_COL="${TABLE_COL:+$TABLE_COL,}$line"      # comma-separated column list
done < $TABLE_COLUMNS


## the above leaves a trailing "," after the last column and never closes the statement -- not a big deal, see the cleanup snippet below the script

## my real issue is building the INSERT statements dynamically
## (this loop is as far as I got -- the column/VALUES part is the problem)
for field in `cat X.txt Y.txt`
do
    echo "Insert into $TABLE_NAME ($TABLE_COL) VALUES $ " >> $sqlload
done
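
If the trailing comma ever mattered, I suppose it could be cleaned up and the CREATE TABLE closed with something like this (untested sketch, working on the file built by the loop above):

Code:
# strip the trailing comma from the last column line and close the statement
sed '$ s/,$//' $tmpdir/create_dump_tbl.sql > $tmpdir/create_dump_tbl.fix
mv $tmpdir/create_dump_tbl.fix $tmpdir/create_dump_tbl.sql
echo ");" >> $tmpdir/create_dump_tbl.sql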



Note that file X.txt has 6 columns and file Y.txt has only 4, so how do I construct the INSERT statements, putting NULLs in the two columns that are missing from Y.txt (i.e. rdrTRAFFIC_EVENT_TIME and rdrTRAFFIC_EVENT_TYPE)?
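
In other words, the statement generated for Y.txt would need to come out something like this (12345 just standing in for whatever $$ expands to):

Code:
INSERT INTO DUMP_12345
  (rdrDESTINATION_ADDRESS, rdrDESTINATION_IMSI, rdrORIGINATING_ADDRESS,
   rdrORIGINATING_IMSI, rdrTRAFFIC_EVENT_TIME, rdrTRAFFIC_EVENT_TYPE)
VALUES
  ('91 9715023423813', '00000002340000', 'd0 etisalat',
   '000000000000000', NULL, NULL);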


Note: the columns in X.txt and Y.txt do not appear in a fixed order.
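
Something along these lines is roughly what I am trying to get to in place of the broken loop above -- a rough, untested sketch; it assumes one record per file, fields laid out as name (Type) "value", and the column list already collected in $TABLE_COLUMNS:

Code:
# one INSERT per record file; any column the file does not contain becomes NULL
# (untested sketch -- assumes one record per file and nawk/gawk style awk)
for recfile in X.txt Y.txt ; do
    awk -v table="$TABLE_NAME" -v colfile="$TABLE_COLUMNS" '
    BEGIN {
        q = sprintf("%c", 39)                    # literal single quote
        while ((getline col < colfile) > 0)      # canonical column order
            cols[++ncols] = col
        close(colfile)
    }
    /^rdr/ {
        name = $1
        val  = ""
        for (i = 3; i <= NF; i++)                # value starts after the (Type) token
            val = (val == "" ? $i : val " " $i)
        gsub(/"/, "", val)                       # drop the surrounding quotes
        value[name] = val
    }
    END {
        for (i = 1; i <= ncols; i++) {
            collist = collist (i > 1 ? "," : "") cols[i]
            v = (cols[i] in value) ? q value[cols[i]] q : "NULL"
            vallist = vallist (i > 1 ? "," : "") v
        }
        printf "INSERT INTO %s (%s) VALUES (%s);\n", table, collist, vallist
    }' "$recfile" >> $sqlload
done

The inner loop from field 3 onwards is only there to keep values with spaces (like "d0 movies") in one piece; every value gets quoted since all the columns are VARCHAR2 anyway.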

Thanks
 
