Full Discussion: multi part compressed files
Top Forums > UNIX for Dummies Questions & Answers > multi part compressed files
Post 302166237 by denn on Monday, 11 February 2008, 11:53:44 AM
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

import compressed files using pipe

I am trying to import compressed files using a pipe on a server, IBM AIX UNIX 3.4, with very little disk space. The command is: nohup cat xaa xab xac xad xae xaf xag | uncompress - > imp_pip & Then the imp_pip file is used in the import statement, files=imp_pip. Does this statement... (0 Replies)
Discussion started by: pengwyn
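A named pipe keeps the uncompressed data off the disk entirely, which is the point on a box with little free space. A minimal sketch, assuming the xaa..xag chunks are pieces of a single compress(1) stream and that the import tool reads imp_pip like an ordinary file (the import invocation itself is only a placeholder):

    # create the named pipe the importer will read from
    mkfifo imp_pip
    # reassemble the chunks and uncompress into the pipe in the background;
    # with no file operands, uncompress reads standard input
    nohup cat xaa xab xac xad xae xaf xag | uncompress > imp_pip &
    # the importer then consumes the pipe as if it were a flat file,
    # e.g. ... files=imp_pip (placeholder, as in the post)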

2. UNIX for Dummies Questions & Answers

Best practice for bulk multi-part mail

Hi, I am currently building a PHP/MySQL database that handles our office's newsletters. Now everything works great in the alpha with only a few names in the list, but I anticipate that once we fill it up (around 10,000) this will not work from the PHP. I already have the script echo the... (0 Replies)
Discussion started by: begin23
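On the shell side, one low-tech way to keep a 10,000-address run from overwhelming the mailer is to send in throttled batches; a PHP loop would batch the same way. A minimal sketch, where recipients.txt (one address per line) and newsletter.txt are hypothetical file names:

    # split the (hypothetical) recipients.txt into batches of 100 addresses
    split -l 100 recipients.txt batch_
    for b in batch_*; do
        while read -r addr; do
            # newsletter.txt is a placeholder for the prepared message body
            mailx -s "Newsletter" "$addr" < newsletter.txt
        done < "$b"
        # pause between batches so the mail queue can drain
        sleep 60
    done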

3. UNIX for Dummies Questions & Answers

delete compressed files from year 2005

I'm trying to delete files that were created/modified in the year 2005, were compressed, and have the .Z extension on them. I tried using the awk utility but the syntax is incorrect. I don't know how to use a wildcard to capture all the compressed files. Here's the code I used ( ls -lR |... (5 Replies)
Discussion started by: igidttam
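Rather than parsing ls -lR output, find can select on both the .Z suffix and the 2005 timestamp directly. A minimal sketch, assuming GNU find, since the -newermt test is a GNU extension:

    # list .Z files last modified during 2005 (dry run first)
    find . -name '*.Z' -newermt '2005-01-01' ! -newermt '2006-01-01' -print
    # once the listing looks right, delete them
    find . -name '*.Z' -newermt '2005-01-01' ! -newermt '2006-01-01' -exec rm {} +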

4. UNIX for Dummies Questions & Answers

To view compressed files

Hello All, I compressed a file hello by using the compress command: compress hello (enter). I got the file as hello.Z. 1. My question is how can I see the file hello.Z? 2. How can I uncompress it back to change it to filename hello? Thanks (4 Replies)
Discussion started by: supercops
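Note that compress(1) appends an uppercase .Z suffix. Both questions have one-line answers; a minimal sketch:

    # 1. view the compressed file without restoring it
    zcat hello.Z | more
    # 2. restore the original file name "hello" (this removes hello.Z)
    uncompress hello.Z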

5. UNIX for Dummies Questions & Answers

How to distribute compressed files as text?

Hello everybody, I've seen some text documents where they publish blocks of text and tell you to save it as "file.tgz", for example, and when you decompress the file, it actually works. How is that done? Is there a program? Because I tried cat and it doesn't work, tried less, more, hexedit and... (2 Replies)
Discussion started by: semash!
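The trick is an ASCII encoding of the binary archive, not the archive itself; cat fails because the raw bytes are not printable text. A minimal sketch with uuencode(1), plus base64 as an alternative where it is available:

    # sender: turn the archive into a plain-text block
    uuencode file.tgz file.tgz > file.txt
    # receiver: decode the text back into file.tgz
    uudecode file.txt

    # alternative encoding with base64
    base64 file.tgz > file.txt
    base64 -d file.txt > file.tgz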

6. UNIX for Dummies Questions & Answers

Extracting data from many compressed files

I have a large number (50,000) of pretty large compressed files and I need only certain lines of data from them (each relevant line contains a certain keyword). Each file contains 300 such lines. The individual file names are indexed by file number (file_name.1, file_name.2, ... ,... (1 Reply)
Discussion started by: Boltzmann
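zgrep searches compressed files in place, so nothing has to be unpacked to disk first. A minimal sketch, with KEYWORD as a placeholder for the actual key word, assuming the files decompress with zcat regardless of suffix (true of GNU gzip); the loop avoids putting 50,000 names on one command line:

    i=1
    while [ "$i" -le 50000 ]; do
        # KEYWORD is a placeholder; zgrep reads the compressed file directly
        zgrep 'KEYWORD' "file_name.$i"
        i=$((i + 1))
    done > matches.out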

7. UNIX for Dummies Questions & Answers

Reading compressed files during a grep search

All, The bottom line is that I'm reading a file, storing it as variables, recursively grep-searching it, and then piping it to allow word counts as well. I am unsure how to open .zip, .tar, and .gzip files, search for keywords, and return results. Any help would be much appreciated! Thanks (6 Replies)
Discussion started by: ryan.lee
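Each container format needs its own reader to put the text on stdout; after that it is an ordinary grep pipeline. A minimal sketch ('pattern' is a placeholder), assuming GNU tar for -O and Info-ZIP's unzip for -p:

    # gzip: search without unpacking
    zgrep 'pattern' file.gz
    # tar.gz: stream every member to stdout and grep the stream
    tar -xzOf archive.tar.gz | grep 'pattern'
    # zip: same idea with unzip -p
    unzip -p archive.zip | grep 'pattern'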

8. Shell Programming and Scripting

Search compressed files with awk and get FILENAME

I have many compressed files I want to search using awk, and I want to print some file contents along with the filename each record came from (I simplified the awk command). Here are the results with the files uncompressed: awk '{print FILENAME, $0}' test*.txt test1.txt from test1... (3 Replies)
Discussion started by: mjf
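When awk reads from a pipe, FILENAME is empty (or "-"), so the real name has to be handed in from the shell. A minimal sketch, assuming the compressed copies are gzip files named test*.txt.gz:

    for f in test*.txt.gz; do
        # pass the file name in as a variable, since FILENAME
        # only ever sees the pipe
        zcat "$f" | awk -v fname="$f" '{print fname, $0}'
    done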

9. Shell Programming and Scripting

How to zip or rm the multi part named files?

Hello There, There are more than 1000 files in my log folder and I want to zip them to release the space. But my method is throwing a syntax error due to the multi-part file name; how do I overcome this? ls -lart | grep "MDB_Kernel11.1_gwlog_SUN 22_09_2013" | awk '{print $9,$10,$11,$12}' | head... (8 Replies)
Discussion started by: gowthamakanthan
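The awk '{print $9,$10,...}' approach breaks because the embedded space splits one name into several ls fields. Letting find hand each name over as a single unit avoids the problem. A minimal sketch using the name pattern from the post, assuming a find and xargs that support -print0/-0 (GNU and BSD do):

    find . -name 'MDB_Kernel11.1_gwlog_SUN 22_09_2013*' -print0 |
        xargs -0 gzip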

10. Shell Programming and Scripting

Validate compressed files

Hi All, I have a zip file that needs to be validated, checked 5 times with a sleep of 60 seconds. Something like below: #!/bin/bash counter=1 while do curl -i -k -X GET `strings tmp.txt |grep Location| cut -f2 -d" "` -H "Authorization: Token $TOKEN" -o $zip_file ## this is... (6 Replies)
Discussion started by: Master_Mind
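unzip -t test-extracts the archive and reports whether it is intact, which slots naturally into the retry loop. A minimal sketch of the loop skeleton; $zip_file is the variable from the post, and the curl download is elided where the post's own code was cut off:

    #!/bin/bash
    counter=1
    while [ "$counter" -le 5 ]; do
        # ... curl download into "$zip_file" goes here (elided in the post) ...
        if unzip -t "$zip_file" > /dev/null 2>&1; then
            echo "archive is valid"
            break
        fi
        sleep 60
        counter=$((counter + 1))
    done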
Ns_Url(3aolserver)                 AOLserver Library Procedures                 Ns_Url(3aolserver)

NAME
     Ns_AbsoluteUrl, Ns_ParseUrl, Ns_RelativeUrl, Ns_SkipUrl - URL manipulation routines

SYNOPSIS
     #include "ns.h"

     int Ns_AbsoluteUrl(Ns_DString *pds, char *url, char *baseurl)
     int Ns_ParseUrl(char *url, char **pprotocol, char **phost, char **pport, char **ppath, char **ptail)
     char *Ns_RelativeUrl(char *url, char *location)
     char *Ns_SkipUrl(Ns_Request *request, int n)

DESCRIPTION
     Ns_AbsoluteUrl(pds, url, baseurl)
          Construct a URL based on baseurl but with as many parts of the incomplete url as possible. Returns NS_OK or NS_ERROR.

     Ns_ParseUrl(url, pprotocol, phost, pport, ppath, ptail)
          Parse a URL into its component parts. Pointers to the protocol, host, port, path, and "tail" (last path element) will be set by reference in the passed-in pointers. The passed-in url will be modified.

     Ns_RelativeUrl(url, location)
          If the url passed in is for this server, the initial part of the URL is stripped off. E.g., on a server whose location is http://www.foo.com, Ns_RelativeUrl of "http://www.foo.com/hello" will return "/hello". Returns a pointer to the beginning of the relative url in the passed-in url, or NULL on error. Will set errno on error.

     Ns_SkipUrl(request, n)
          Return a pointer n elements into the request's url.

SEE ALSO
     nsd(1), info(n)

KEYWORDS
     AOLserver 4.0