How to remove new line characters from data rows in a Pipe delimited file?

I have a pipe-delimited file as below; some records contain embedded newline characters, so a single record gets split across multiple lines:

Code:
Emp1|FirstName|MiddleName|LastName|Address|Pincode|PhoneNumber
1234|FirstName1|MiddleName2|LastName3| Add1 || ADD2|123|000000000
2345|FirstName2|MiddleName3|LastName4|
 Add1 || ADD2|
 234|000000000

Expected output:
Code:
Emp1|FirstName|MiddleName|LastName|Address|Pincode|PhoneNumber
1234|FirstName1|MiddleName2|LastName3| Add1 || ADD2|123|000000000
2345|FirstName2|MiddleName3|LastName4| Add1 || ADD2|234|000000000
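
One possible approach, just a sketch: it assumes every genuine record starts with a numeric Emp id followed by a pipe, and that the wrapped fragments only need to be glued back onto the previous line. The file names input.txt and fixed.txt are placeholders.

Code:
awk '
NR == 1 { print; next }          # header passes through unchanged
/^[0-9]+\|/ {                    # assumed: a real record starts with a numeric Emp id
    if (rec != "") print rec     # previous record is now complete, emit it
    rec = $0
    next
}
{
    # continuation of a wrapped record: glue it onto the current one
    # (add sub(/^[[:space:]]+/, "") here if the wrap introduced leading spaces)
    rec = rec $0
}
END { if (rec != "") print rec } # emit the last record
' input.txt > fixed.txt

If the leading Emp id is not a reliable marker of a new record, an alternative is to accumulate lines and only print once a record has collected the expected number of pipe delimiters.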

