Adding gzip compression to a connection using nc


 
# 1  
Old 03-02-2012

Hello everyone,
As the title suggests, I am trying to add gzip compression, for testing purposes, to a connection used by an application I am working on. Currently the application is set up with httptunnel, which forwards the connection to the remote host.

I would like a script to intercept the connection between the application and httptunnel, gzip the data, and forward it on. On the other side the opposite would happen: gzip -d, with the resulting data passed on to the server application.

However, I need some help, as I am fairly new to some of the things this requires, such as mkfifo and piping outputs. There may be further problems implementing this with multiple connections, but that is a problem for later.
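For reference, the mkfifo/gzip plumbing involved here can be tried on its own, locally, with printf and a background subshell standing in for the real endpoints (the fifo path is arbitrary, chosen just for this sketch):

```shell
# Minimal fifo + gzip plumbing demo: a background writer compresses
# into a named pipe, the foreground reader decompresses from it.
# /tmp/demo.fifo is an arbitrary path chosen for this sketch.
mkfifo /tmp/demo.fifo
( printf 'hello through fifo' | gzip > /tmp/demo.fifo ) &
gzip -d < /tmp/demo.fifo      # prints: hello through fifo
wait
rm /tmp/demo.fifo
```

The background subshell is needed because opening a fifo blocks until both a reader and a writer have it open.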

I have a connection which works fine for sending .gz files, with a while loop to keep it going.

Code:
while true ; do nc -l 3333 | gzip -d ; done

and on the other side, simply

Code:
 nc 192.168.20.84 3333 < a.gz

Can someone assist in developing this so that I can use this method to compress the connection on one side, and decompress it on the other?
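For what it's worth, the compress/decompress pair itself can be sanity-checked locally first, with no nc involved at all (file paths here are arbitrary):

```shell
# Round-trip check of the gzip pair on its own, no network involved.
printf 'some test data' > /tmp/plain.txt
gzip -c  /tmp/plain.txt    > /tmp/plain.txt.gz    # "send" side
gzip -dc /tmp/plain.txt.gz > /tmp/roundtrip.txt   # "receive" side
cmp /tmp/plain.txt /tmp/roundtrip.txt && echo 'round trip OK'
```

If this works but the nc version does not, the problem is in the plumbing rather than in gzip.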

The following is an attempt at what needs to be done on one side; on the other, I am not so sure about decompressing and passing the data on. httptunnel is listening on 8080, of course.

Code:
#!/bin/sh
mkfifo /tmp/gz.tmp
while true; do
cat /tmp/gz.tmp | gzip | nc -l -k localhost 9123 | nc localhost 8080 ;
done

Secondly, I know that this will only compress data travelling in one direction. I would like to have the same idea operating in reverse, from the server back to the host. Would this be possible? This might be quite complicated, due to dynamic source ports. Having gzip know what to do with the data might be another problem. Perhaps attempting "gzip -d" first, then if it gives a fail code, just run "gzip". Is there a cleaner way of detecting gzip compression, if it is needed?

Any help would be greatly appreciated.

Last edited by haggismn; 03-02-2012 at 02:34 PM..
# 2  
Old 03-02-2012
Perhaps other tools can be used for the job? ssh, for instance, can forward ports, and can be told to compress the data as well with -C...

---------- Post updated at 01:16 PM ---------- Previous update was at 01:07 PM ----------

For instance:

Code:
ssh -C -L 3333:hostname:8080 user@remotehost

This will open port 3333 on 127.0.0.1 on your local machine, which you can connect to at will.

Whenever you do, the far side of the ssh connection will connect to hostname on 8080.

The -C will cause all this traffic to be compressed in between.

If you're concerned about performance or CPU use, you can fiddle with the cipher options (e.g. -c blowfish); the blowfish cipher is meant to be very fast.
# 3  
Old 03-02-2012
Thanks for your response.

Yes, ssh is a possible solution; however, I can't use it in this case.

I should have added more detail. The problem is that I can't use protocols like ssh, or other applications like stunnel for ssl. The assumption is that the connection the program runs over is heavily firewalled; in such a case, an http connection may be the only kind that works. The idea is that with gzip, any data going through the http connection is compressed, and the data is also transformed in a way that lets it escape detection by things like regex filtering.

Well, that is the theory. If I can get this part working, I will then move on to using suricata to perform on-the-fly gzip decompression and regex filtering.
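That evasion claim is easy to check locally: a literal pattern that grep finds in the plaintext is not found in the compressed byte stream (the pattern and file paths here are made up for the sketch):

```shell
# A literal pattern visible to grep in plaintext disappears from the
# gzip-compressed bytes, which is what defeats naive regex filtering.
printf 'GET /SECRET_PATTERN HTTP/1.1\n' > /tmp/plain.req
gzip -c /tmp/plain.req > /tmp/req.gz
grep -q 'SECRET_PATTERN' /tmp/plain.req && echo 'found in plaintext'
grep -q 'SECRET_PATTERN' /tmp/req.gz    || echo 'hidden in gzip stream'
```

A filter that decompresses the stream first, as suricata is meant to, would of course see the pattern again.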

I have a normal tunnel now working correctly, with
Code:
 while true ; do cat /tmp/rev | nc -l 3333 | nc localhost 8080  > /tmp/rev ; done

I just need to figure out where to put the gzip commands. Furthermore, would a similar command be used on the server, or would it need altering?
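One way to work out the placement is to model the relay locally, with the two nc processes replaced by fifos and tr standing in for the real service (paths and payload are made up; no network is involved):

```shell
# Local model of a compressed bidirectional relay. /tmp/fwd stands in
# for the nc link toward the server, /tmp/rev for the return path, and
# tr plays the server. Forward traffic is compressed before entering
# /tmp/fwd and decompressed on arrival; the reply is compressed back
# onto /tmp/rev and decompressed at the client.
mkfifo /tmp/fwd /tmp/rev
( gzip -d < /tmp/fwd | tr a-z A-Z | gzip > /tmp/rev ) &   # "server" side
printf 'hello server' | gzip > /tmp/fwd                    # client request
gzip -d < /tmp/rev                                         # prints: HELLO SERVER
wait
rm /tmp/fwd /tmp/rev
```

In the real pipeline the two fifos become the two nc commands, and the same gzip / gzip -d placement should carry over: compress just before data leaves a host, decompress just after it arrives.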

Thanks again in advance for any help.

Last edited by haggismn; 03-02-2012 at 04:53 PM..
# 4  
Old 03-02-2012
Quote:
Originally Posted by haggismn
The assumption is that the connection the program is being used on is heavily firewalled.
So, port 22 is blocked, but port 8080 isn't?

Quote:
In such a case, an http connection may be the only kind that works.
...Oh, I see.

Since the protocol you're running over port 80 is most blatantly not going to be HTTP anyway, why not run an ssh server on port 80 or 443 instead of netcat? You can do that.

Then just
Code:
ssh -C -p 80 -L 3333:hostname:8080 username@host

# 5  
Old 03-02-2012
Hi again.
I know that in most cases using ssh or ssl would be fine. Indeed I have both stunnel (ssl) and ssh versions of this connection working fine, with ssl being slightly faster.

However, the overall objective is to demonstrate suricata's ability to decompress gzip data on the fly and detect patterns, versus regular methods like iptables/layer7, which only look for the plaintext pattern rather than reassembling and decompressing the stream.

To do this, I will of course need a connection which uses gzip compression.

If I can get this nc/gzip connection working, I can then show that a regular layer7 rule will not be able to detect regex patterns that the test program outputs, whereas suricata is expected to detect them.

In the case of using http to form the connection, you are right, it is obvious from the traffic pattern that it isn't a standard http connection. However this isn't relevant as yet.

I am first attempting to get the nc port forwarding working on both the server and the host. Once this is working I will look again at putting in the gzip commands. I have a similar command on both machines; however, on the server, things do not work properly. I have the server set up with the httptunnel server receiving, then forwarding packets to port 8008. I am then using this command (after mkfifo /tmp/rev):
Code:
while true ; do cat /tmp/rev | nc -l 8008 | nc localhost 80 > /tmp/rev ; done

with the test server program listening on port 80.

I notice that when I enter this command, CPU usage increases dramatically and I get a lot of errors; something isn't right:
Code:
TCP: time wait bucket table overflow

If I use this command on the host (with different ports), it forwards correctly between the test program and the tunnel application.

Edit: well, that part is fixed. It turns out the server was using a different version of nc; I had to add "-p" before the port on the listening nc. Now I just need to add gzip / gzip -d in the right places.

---------- Post updated at 08:30 PM ---------- Previous update was at 05:47 PM ----------

Well, I have made a little progress. It seems there is a problem, possibly to do with gzip not knowing when to stop compressing, although I could be wrong. I am using the following command on both sides; the link works fine if I remove the gzip parts.

Code:
while true ; do cat /tmp/rev | gzip -v | nc -l 8080 | gzip -v | nc localhost 80 | gzip -d > /tmp/rev ; done

(with -p before the port on the nc variant that needs it)

Can anyone assist here? If necessary, is there a way to tell gzip to compress x bytes at a time and then flush its output? If what I am attempting is not possible, please let me know so I don't waste too much time!
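On the "compress x bytes, then output" question: gzip itself has no flush switch, but gzip streams are concatenable, so one workaround (a sketch only, not tested against nc) is to compress fixed-size chunks separately and send the concatenation; a single gzip -d on the far end decompresses the whole sequence:

```shell
# Concatenated gzip members decompress as one stream, so a relay can
# compress chunk-by-chunk instead of waiting for end-of-input.
printf 'chunk-one ' | gzip >  /tmp/stream.gz
printf 'chunk-two'  | gzip >> /tmp/stream.gz
gzip -dc /tmp/stream.gz        # prints: chunk-one chunk-two
```

Per-chunk headers cost some compression ratio, but it avoids gzip buffering the whole connection until EOF.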

Again, any help would be greatly appreciated. Thanks.

Last edited by haggismn; 03-02-2012 at 09:35 PM..
# 6  
Old 03-04-2012
Quote:
Originally Posted by haggismn
Hi again
I know that in most cases using ssh or ssl would be fine. Indeed I have both stunnel (ssl) and ssh versions of this connection working fine, with ssl being slightly faster.

However, the overall objective is to demonstrate suricata's ability to decompress gzip data on the fly and detect patterns, versus regular methods like iptables/layer7, which only look for the plaintext pattern rather than reassembling and decompressing the stream.
Hmmmm. How about two separate netcat connections then, one going each way?

I'm not sure how the ability to decompress gzip on the fly is useful. Wouldn't that make it bigger?

Quote:
To do this, I will of course need a connection which uses gzip compression.
So send a gzip over netcat...
Quote:
If I can get this nc/gzip connection working, I can then show that a regular layer7 rule will not be able to detect regex patterns that the test program outputs, whereas suricata is expected to detect them.
I have not been able to make nc do what you want, despite much fiddling and trying. I have tried to do it the way you want; unfortunately, the way you want is a kludge. Anything which takes more than one named pipe to wrestle into place cannot be the right tool for the job.

Do you have the C language available for a custom solution to be written?