Update a column from a Join

Here is the SELECT I use to identify the child records (t1533, alias e) that are open (e.c7 < 6000) when the parent record (t2068, alias c) has c.c7 > 3:


Code:
SELECT c.c1000000161,
       c.c7,
       c.c1000000019,
       e.c1000000829
FROM   t2068 c
       INNER JOIN t1533 e ON e.c1000000829 = c.c301572100
WHERE  c.c7 > 3
  AND  e.c7 < 6000


What I want to do is update e.c7 to 6000 for the records identified by the statement above.
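
The exact syntax for an UPDATE that drives off a join depends on the database engine, which is not stated in the post, so treat the statements below as sketches rather than a definitive answer. On SQL Server or Sybase, the UPDATE ... FROM form lets you reuse the join and filters from the SELECT as-is:

Code:
-- SQL Server / Sybase style: update the child table through the same join
-- used in the SELECT above (t2068 = parent, alias c; t1533 = child, alias e)
UPDATE e
SET    e.c7 = 6000
FROM   t1533 e
       INNER JOIN t2068 c ON e.c1000000829 = c.c301572100
WHERE  c.c7 > 3
  AND  e.c7 < 6000;

On Oracle, or any engine that does not support UPDATE ... FROM, the same change can be written with a correlated EXISTS subquery; keeping the e.c7 < 6000 predicate ensures only the open child records are touched:

Code:
-- Portable form: correlated subquery instead of UPDATE ... FROM
UPDATE t1533 e
SET    e.c7 = 6000
WHERE  e.c7 < 6000
  AND  EXISTS (SELECT 1
               FROM   t2068 c
               WHERE  c.c301572100 = e.c1000000829
                 AND  c.c7 > 3);

Either way, it is worth running the original SELECT first (or wrapping the UPDATE in a transaction) to confirm the affected row count matches what you expect before committing.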
 
