Sendmail ignoring line endings
Post 302985312 by vbe on Tuesday 8th of November 2016, 08:25:39 AM
I would add that in the old days, knowing that an MS Exchange server would mess up the content or display error messages about incompatible characters, I used a .mailrc file with the following content:
Code:
# =======================================================#
set crt=21              # use the pager for messages longer than 21 lines
set encoding=8bit       # send the body as plain 8-bit rather than quoted-printable/base64
set charset=iso-8859-1  # declare ISO-8859-1 as the character set for outgoing mail
# set mimeheader=yes
# =======================================================#

It solved the issue, but here you are using HTML tags, so it is not quite the same...
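For completeness, here is a minimal sketch of how those settings come into play when sending from a script. It assumes a mailx/BSD-mail style client that reads ~/.mailrc; the address and file name are made-up examples, not from the original thread:

Code:
#!/bin/sh
# Build a small body containing 8-bit (ISO-8859-1) characters and send it.
# The mail client picks up the crt/encoding/charset settings from ~/.mailrc above.
BODY=/tmp/report.txt                                  # hypothetical body file
printf 'Résumé: état du traitement de nuit\n' > "$BODY"
mail -s "Nightly report" user@example.com < "$BODY"

The point is simply that the body then goes out declared as 8-bit ISO-8859-1 instead of being re-encoded along the way.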
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

ignoring blank line in a file

I have a file called Cleaner1.log. This file also has some blank lines. My requirement is that it should ignore the blank lines and give me only the lines that contain some data. I'm using this logic in a script: below the contents of the file: Maximum Time Taken for Processing(Failed) RR... (4 Replies)
Discussion started by: ali560045

2. Shell Programming and Scripting

How to extract a string from a file ignoring new line

Hi, some days ago I posted a query with the same subject. I got some great help from great people which solved my problem then. But now there is a small problem with the code that I need the experts' help with, for parsing a text like this where $ had been the delimiter between... (3 Replies)
Discussion started by: suresh_kb211

3. Shell Programming and Scripting

Problems with Sed/awk/grep and line endings

Hello, I have created the following script, which is designed to manipulate a text document: #!/bin/sh # Get 3 lines (last of which is "Quantity"); adjust order; put all three on one line with tabs. FILENAME=~/Desktop/email.txt LIST=$(grep -B2 "Quantity" ${FILENAME} |awk 'BEGIN { FS = "\n"; RS... (6 Replies)
Discussion started by: benwiggy

4. UNIX for Advanced & Expert Users

Vi line endings conversions

I was reading these 2 articles. Why does the wikia one suggest :e ++ff=dos? Or am I just misunderstanding it? :e ++ff=unix :e ++ff=dos File format - Vim Tips Wiki; Managing/Munging Line-Endings with Vi/Vim | Jeet Sukumaran (1 Reply)
Discussion started by: cokedude

5. UNIX for Advanced & Expert Users

line endings help of non-ASCII files

When you are dealing with ASCII files it is easy to check the line ending type: you can just use the file command. You are not always lucky enough to be dealing with ASCII files, so in the cases where you don't have ASCII files, how can you check what type of line endings you have? Please list all... (5 Replies)
Discussion started by: cokedude

6. UNIX for Dummies Questions & Answers

Find a file that could have different endings

Hello all. Hope you can help. I am looking for a complete command to search for a file named HOSPCHK. The file could be listed with numbers after it, with letters after it, a combination of both, or just by itself. The other catch is the file that I want to look for... (27 Replies)
Discussion started by: azapp51

7. UNIX for Advanced & Expert Users

vimrc help with line endings

I was reading this and thought I could put this in my vimrc and it would convert the line endings to unix. Am I doing something wrong or am I missing something? set ff=unix (Managing/Munging Line-Endings with Vi/Vim | Jeet Sukumaran). I used this command and it confirms that my global option is... (2 Replies)
Discussion started by: cokedude

8. Red Hat

Sendmail ignoring Mailertable

Hi friends, I am running sendmail 8.14 on RHEL 6. I have 2 mail servers, serv1.home.com and test.home.com. Currently test.home.com is the MX for the home.com domain. What I am trying to do is route emails arriving at the test.home.com server to serv1.home.com using the mailertable. I have... (0 Replies)
Discussion started by: Rohit Bhanot

9. Shell Programming and Scripting

"sed" ignoring last line

Hi, I am running the command script below and getting the output shown. I tried using "sed", which is ignoring the 4th line. Can you please help me get the expected output shown below? Code: echo "dis clusqmgr(*) cluster(BT.CL.OFSSTAT4) conname qmtype deftype"| runmqsc -e $QMGR|egrep... (7 Replies)
Discussion started by: darling

10. UNIX for Beginners Questions & Answers

Tip to remove line endings and spaces on a pre-formatted text file?

Hi, at the moment I am using Notepad++ to do a search and replace, manually section by section, which is really painful. Copying each section of the line of text, putting it into a file, and then doing the search and replace needs at least 3 operations in Notepad++. Here's hoping I will be able to... (1 Reply)
Discussion started by: newbie_01
Ns_ConnType(3aolserver)                  AOLserver Library Procedures                  Ns_ConnType(3aolserver)

NAME
       Ns_ConnGetType, Ns_ConnSetType - Routines to manage the HTTP response type

SYNOPSIS
       #include "ns.h"

       char *Ns_ConnGetType(conn)
       void  Ns_ConnSetType(conn, type)

ARGUMENTS
       Ns_Conn conn (in)    Pointer to open connection.
       char *type (in)      Character string with response mimetype.

DESCRIPTION
       These routines manage the eventual content-type header generated by a later call to
       Ns_ConnFlush or Ns_ConnFlushDirect, as well as the appropriate character encoding for
       text output types.

       char *Ns_ConnGetType(conn)
              Return the current HTTP mime type (e.g., "text/html; charset=iso-8859-1") or
              NULL if no type has yet been set.

       void Ns_ConnSetType(conn, type)
              Set the mimetype of the response to the given type. A later call to Ns_ConnFlush
              will include a header of the form "content-type: type" when generating the
              response.

CHARSETS AND ENCODINGS
       For text types, a call to Ns_ConnSetType can also include an optional "charset="
       attribute. If no charset is specified, the server will append a default charset if one
       is specified as the outputcharset server configuration variable. With a given or
       automatically appended charset for text types, the server will then set the output
       encoding to the corresponding Tcl_Encoding, for example, mapping the charset
       "iso-8859-1" to the Tcl_Encoding equivalent "iso8859-1". All text later sent via
       Ns_ConnFlush will first be encoded using the determined Tcl_Encoding (calls to
       Ns_ConnFlushDirect bypass this encoding step). See the man pages on
       Ns_GetCharsetEncoding for details on how these mappings are configured.

       The charset modification feature was added in later versions of AOLserver to support
       legacy code which may have been sprinkled with direct calls to set text types without
       specifying the charset, e.g., calls such as:

              ns_return 200 text/html "<body>hello</body>"

EXAMPLES
       The following example demonstrates sending Japanese character data. In this case,
       assume "utf8string" contains a series of UTF-8 bytes with various Japanese characters.
       The call to Ns_ConnSetType will set up the appropriate "shiftjis" output Tcl_Encoding
       to match the given "shift_jis" charset:

              Ns_ConnSetStatus(conn, 200);
              Ns_ConnSetType(conn, "text/html; charset=shift_jis");
              Ns_ConnFlushDirect(conn, utf8string, -1, 0);

       The following demonstrates the behavior of the default server charset encoding. Assume
       the following is set in the config file:

              ns_section ns/server/serverName
              ns_param outputcharset iso-8859-1

       In this case, a call to Ns_ConnSetType(conn, "text/html") without a specific charset
       would be modified to include "charset=iso-8859-1". Based on this modification, the
       output encoding would be set to the "iso8859-1" Tcl_Encoding.

SEE ALSO
       Ns_ConnGetType(3), Ns_ConnSetType(3), Ns_ConnFlush(3), Ns_ConnFlushDirect(3),
       Ns_ConnSetRequiredHeaders(3), Ns_ConnQueueHeaders(3), Ns_GetCharsetEncoding(3),
       Ns_GetTypeEncoding(3), ns_conn(n)

KEYWORDS
       connection, response, status, encoding, charset

AOLserver 4.0                                                           Ns_ConnType(3aolserver)