Full Discussion: sed within awk statement
Post 303010922 by sagar_1986 on Wednesday 10th of January 2018, 06:20:25 AM
I have already tried the DATE_FORMAT(date, format_mask) function, but it gives an error:

Code:
'DATE_FORMAT' is not a recognized built-in function name.

Details of the database are as below.

Code:
Microsoft SQL Server Management Studio    11.0.2100.60
Microsoft Analysis Services Client Tools  11.0.2100.60
Microsoft Data Access Components (MDAC)   6.3.9600.16384
Microsoft MSXML                           3.0 6.0
Microsoft Internet Explorer               9.11.9600.17690
Microsoft .NET Framework                  4.0.30319.34014
Operating System                          6.3.9600

The sample input given is the output of a query, and I want to format it as shown above.


Is it possible to use sed within an awk statement? Otherwise, please let me know how to do it with the gsub function to get the desired output.
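
For reference, DATE_FORMAT() is MySQL syntax; SQL Server 2012 (the 11.0 tools listed above) would use CONVERT() or FORMAT() instead, which is why the query rejects it. If the reformatting has to happen in the shell, the sketch below shows both routes on a hypothetical pipe-delimited record with a YYYY-MM-DD HH:MM:SS field -- the thread's actual input and desired format are not shown here, so the sample data and target layout are assumptions. It rebuilds the field with split() in plain awk, and separately pipes a single field through sed from inside awk via getline; sed -E and the gensub() line assume GNU/BSD sed and GNU awk respectively.

Code:
# Hypothetical sample record (not the thread's actual data):
# id|2018-01-10 06:20:25|value  ->  id|10/01/2018 06:20:25|value

# Option 1: pure awk -- split the date field and rebuild it; gsub() alone
# cannot reorder capture groups in portable awk, so split() is used instead.
printf 'id|2018-01-10 06:20:25|value\n' |
awk -F'|' -v OFS='|' '{
    split($2, dt, " ")        # dt[1] = date, dt[2] = time
    split(dt[1], d, "-")      # d[1] = year, d[2] = month, d[3] = day
    $2 = d[3] "/" d[2] "/" d[1] " " dt[2]
    print
}'

# GNU awk only: gensub() supports backreferences, so one call is enough:
#   $2 = gensub(/([0-9]{4})-([0-9]{2})-([0-9]{2})/, "\\3/\\2/\\1", 1, $2)

# Option 2: run sed on one field from inside awk and read the result back.
printf 'id|2018-01-10 06:20:25|value\n' |
awk -F'|' -v OFS='|' '{
    cmd = "echo \047" $2 "\047 | sed -E \047s,([0-9]{4})-([0-9]{2})-([0-9]{2}),\\3/\\2/\\1,\047"
    cmd | getline line        # capture the reformatted field
    close(cmd)
    $2 = line
    print
}'

Since awk's own sub()/gsub()/split() cover most substitutions, shelling out to sed for every record is usually only worth it when the edit cannot be expressed in awk at all.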
 
