Hello,
How do I determine from the ftp command line whether files on an FTP server are ASCII or binary? Every other common Windows FTP program does this automatically.
regards
Thomas (5 Replies)
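The FTP protocol itself has no command that reports whether a remote file is text or binary; graphical Windows clients just guess from the file extension. A workable approach is to fetch the file in binary mode (so nothing gets translated) and test the local copy with file(1). A minimal sketch, using a local file to stand in for the downloaded copy (the name sample.dat is made up):

```shell
# file(1) classifies content by inspecting the bytes; the word "text"
# in its output is a reasonable (not infallible) sign of an ASCII file.
printf 'hello world\n' > sample.dat     # stand-in for the fetched file

case "$(file -b sample.dat)" in
    *text*) echo "ascii" ;;
    *)      echo "binary" ;;
esac
```

The same case statement can drive an automatic ascii/binary mode choice in a scripted transfer.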
I tried to decode a binary file using the command 'uudecode', but it gives the error 'No begining line'.
'uudecode -o <outfile name> <binary file>'
Please help me in resolving this. (4 Replies)
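That error usually means the input is not uuencoded at all: uudecode only reverses uuencode, and it looks for a header line of the form `begin <mode> <name>`. A raw binary file has no such line. A round-trip sketch, assuming the sharutils uuencode/uudecode pair is installed (file names are made up):

```shell
printf 'some payload\n' > original.bin
uuencode original.bin original.bin > encoded.uu   # wraps data in begin/end lines
head -1 encoded.uu                                # begin <mode> original.bin
uudecode -o restored.bin encoded.uu               # works: the header is present
cmp original.bin restored.bin && echo "round trip OK"
```

Running uudecode directly on original.bin would reproduce the "no begin line" failure, since the header is missing.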
I want to verify whether a file is binary or ASCII and switch my program accordingly on the return code,
i.e. 0 for success and 1 for failure.
Can anyone tell me if this is the correct syntax? I am getting an error:
#!/bin/ksh
$file filename
if
echo "ascii fie Found"
else
echo " binary... (6 Replies)
hi
I am receiving a file from another system, and I have to verify the format of the file data, i.e. whether the data is in ASCII format or binary format.
please help
thanks in advance
satya (1 Reply)
Hi,
Is there a way to convert a binary file to ASCII? The binary file is pipe delimited.
From the source, the file (pipe delimited) is FTPed to the mainframe, and from the mainframe it is FTPed to the Unix box in binary format. Is there a way to change it back to ASCII and view it?
Thanks! (3 Replies)
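If the binary transfer preserved the bytes, the file is most likely still EBCDIC text, and dd can translate it. A self-contained sketch that first fabricates an EBCDIC sample; in real use you would run only the second dd, against the file you FTPed down (file names are made up):

```shell
printf 'FIELD1|FIELD2|FIELD3\n' > plain.txt
dd conv=ebcdic if=plain.txt of=mainframe.bin 2>/dev/null  # simulate the transfer
dd conv=ascii  if=mainframe.bin of=readable.txt 2>/dev/null
cat readable.txt    # the pipe-delimited text is legible again
```

dd's conv=ascii uses one fixed translation table; if the mainframe uses a different code page, iconv with an explicit source encoding such as IBM1047 may be needed instead.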
Hello all,
I am working with FTP servers on Unix, and I always have to get and put files, but I never know exactly whether I should transfer them as ASCII or binary. Some files that I use are: .txt, .sav, .fmb, .pct, .sh, .ksh, .dat, .log.
Can somebody tell me what the difference is between... (2 Replies)
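A rule of thumb: ascii mode rewrites line endings between systems, while binary mode transfers the bytes untouched, so anything that is not plain text must go as binary, and when in doubt binary is the safe default. For the extensions listed, a session might look roughly like this (file names below are made up):

```
ftp> ascii              # text files: .txt, .sh, .ksh, .log, .dat (if it holds text)
ftp> get notes.txt
ftp> binary             # everything else: .sav, .fmb, .pct, compressed/archive files
ftp> get screens.fmb
```

A .dat file can be either, so inspect one copy with file(1) before deciding.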
What is the difference between an ASCII file and a binary file?
My understanding is that:
an ASCII file has only a line feed (\n) in it,
whereas
a binary file has both a line feed and a carriage return (\r\n).
Is that correct?
Also, what is the ksh command to identify whether a file is binary or ASCII... (1 Reply)
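Not quite: \r\n versus \n distinguishes DOS-style text from Unix-style text, and both are ASCII files. A binary file is one containing bytes that are not printable text at all (NULs, control codes, etc.). Two quick checks usable from ksh (file names are examples):

```shell
printf 'unix line\n'  > unix.txt
printf 'dos line\r\n' > dos.txt

# od -c shows carriage returns as \r, so grep can spot DOS line endings
if od -c dos.txt | grep -q '\\r'
then
    echo "dos.txt has CRLF line endings"
fi

# for the binary-vs-ascii question itself, file(1) is the usual tool
file unix.txt dos.txt
```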
Hi All,
I have a binary file which is being exported from a database, and I need to convert it to ASCII format. How can I achieve that? The solution should work for any file given to us, as they will send different files from different tables.
Thanks in advance. (8 Replies)
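There is no universal binary-to-ASCII converter; the right answer depends on how the database wrote the export, and the vendor's own export/unload utility is usually the clean route. For inspecting an unknown file, though, strings(1) pulls out the printable runs and od(1) shows every byte. A sketch with a fabricated stand-in file:

```shell
# stand-in for a DB export: text fields separated by NUL bytes
printf 'name\0Alice\0city\0Delhi\0' > export.bin

strings export.bin   # prints the printable runs: name, Alice, city, Delhi
od -c export.bin     # raw byte-by-byte view, NULs shown as \0
```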
Hi All ,
I have a mainframe file which contains data in EBCDIC format. I have downloaded this file from the mainframe to Windows in binary format (unreadable raw data). Now I want to convert this file to ASCII format (readable data) with a Unix command. I have tried iconv but that is not working... (2 Replies)
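iconv can do this, but it has to be told the correct source code page; a wrong or missing page is the usual reason it "does not work". IBM037 and IBM1047 are the common mainframe candidates. A self-contained sketch that fabricates an EBCDIC sample first; in practice you would run only the second iconv, on the downloaded file:

```shell
printf 'MAINFRAME RECORD\n' | iconv -f ASCII -t IBM1047 > ebcdic.bin
iconv -f IBM1047 -t ASCII ebcdic.bin > readable.txt
cat readable.txt
```

`iconv -l` lists the encodings the local build supports; if IBM1047 yields garbage, try IBM037.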
Discussion started by: STCET22
2 Replies
LEARN ABOUT DEBIAN
glshaderbinary
GLSHADERBINARY(3G) [FIXME: manual] GLSHADERBINARY(3G)NAME
glShaderBinary - load pre-compiled shader binaries
C SPECIFICATION
void glShaderBinary(GLsizei count, const GLuint *shaders, GLenum binaryFormat, const void *binary, GLsizei length);
PARAMETERS
count
Specifies the number of shader object handles contained in shaders.
shaders
Specifies the address of an array of shader handles into which to load pre-compiled shader binaries.
binaryFormat
Specifies the format of the shader binaries contained in binary.
binary
Specifies the address of an array of bytes containing pre-compiled binary shader code.
length
Specifies the length of the array whose address is given in binary.
DESCRIPTION
glShaderBinary loads pre-compiled shader binary code into the count shader objects whose handles are given in shaders. binary points to
length bytes of binary shader code stored in client memory. binaryFormat specifies the format of the pre-compiled code.
The binary image contained in binary will be decoded according to the extension specification defining the specified binaryFormat token.
OpenGL does not define any specific binary formats, but it does provide a mechanism to obtain token values for such formats provided by such
extensions.
Depending on the types of the shader objects in shaders, glShaderBinary will individually load binary vertex or fragment shaders, or load
an executable binary that contains an optimized pair of vertex and fragment shaders stored in the same binary.
ERRORS
GL_INVALID_OPERATION is generated if more than one of the handles in shaders refers to the same shader object.
GL_INVALID_ENUM is generated if binaryFormat is not an accepted value.
GL_INVALID_VALUE is generated if the data pointed to by binary does not match the format specified by binaryFormat.
ASSOCIATED GETS
glGet() with parameter GL_NUM_SHADER_BINARY_FORMATS.
glGet() with parameter GL_SHADER_BINARY_FORMATS.
SEE ALSO
glGetProgram(), glGetProgramBinary(), glProgramBinary()
COPYRIGHT
Copyright (C) 2010 Khronos Group. This material may be distributed subject to the terms and conditions set forth in the Open Publication
License, v 1.0, 8 June 1999. http://opencontent.org/openpub/.
[FIXME: source] 05/30/2012 GLSHADERBINARY(3G)