I am a Ph.D. student working on some image processing tasks, and I have run into an interesting problem that I thought someone on here might have an idea about. This paper discusses a method to compare two images based upon how much they can be compressed. [edit] Sorry, since this is my first post, I can't post URLs. So search for: "On the Contribution of Compression to Visual Pattern Recognition" by Gunther Heidemann.
The formula is simply D_comp(I1, I2) = S(I1) + S(I2) - S(I12)
Basically, I have two images, one.bmp and two.bmp, and S(I) is the compressed size of image I. If I compress each one separately to get S(I1) and S(I2), then combine the two images into I12 and compress that to get S(I12), I can feed the three sizes into the formula and it tells me how similar the images are. Sounds simple enough.
Then, in the description of the theory, it states that a large value of D_comp indicates that the two images are similar, and a value of zero is found when they're not similar. All good so far... (although I would have thought that if two images are identical, combining them would push D_comp closer to zero).
So I have implemented this as a quick test: gzip one.bmp and two.bmp separately, then concatenate the two files and gzip the combined file, and compare the resulting sizes.
Therefore my D_comp value is 937867 + 907797 - 1846241 = -577
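For reference, the commands I used were along these lines (a reconstruction rather than a verbatim transcript; the two `head -c` lines just create random stand-in files of the same size as an 800x600 24-bit BMP, so the script runs anywhere — in my actual test they were the two bridge photos):

```shell
# Synthetic stand-ins for the two 800x600 BMPs (800*600*3 + 54-byte header).
head -c 1440054 /dev/urandom > one.bmp
head -c 1440054 /dev/urandom > two.bmp

gzip -c one.bmp > one.bmp.gz        # S(I1)
gzip -c two.bmp > two.bmp.gz        # S(I2)
cat one.bmp two.bmp > both.bmp      # I12 = simple concatenation
gzip -c both.bmp > both.bmp.gz     # S(I12)

s1=$(wc -c < one.bmp.gz)
s2=$(wc -c < two.bmp.gz)
s12=$(wc -c < both.bmp.gz)
echo "D_comp = $((s1 + s2 - s12))"
```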
Note that this is a negative number, which is not mentioned in the paper; I am guessing the paper assumes that combining two images into I12 always achieves a greater level of compression.
Here are the two images I have tried this on;
one.bmp = once again, I can't post the URL - Brooklyn Bridge
two.bmp = once again, I can't post the URL - Golden Gate Bridge
Since these were JPEG files, I used ImageMagick to convert them to BMP so that they were the same byte size. Both images are 800x600, and as you can see from the gzip -l output, the uncompressed file sizes are the same.
So my question is: why would the two files compressed separately produce a smaller total number of bytes than the concatenated file compressed once? I would have thought that the gzip algorithm would be at least as efficient at compressing the joint image.
Also, when both images are the same (one.bmp concatenated with itself), the result is a D_comp value of -60.
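The identical-image case was run roughly like this (again a reconstruction, with a random stand-in for one.bmp so the script is self-contained):

```shell
# Synthetic stand-in for the 800x600 BMP.
head -c 1440054 /dev/urandom > one.bmp

cat one.bmp one.bmp > same.bmp      # I12 = the image concatenated with itself
s1=$(gzip -c one.bmp | wc -c)
s12=$(gzip -c same.bmp | wc -c)
# For identical images the formula becomes D_comp = S(I1) + S(I1) - S(I11)
echo "D_comp = $((s1 + s1 - s12))"
```

(With the random stand-in data the printed value will differ from my -60; the point is the procedure, not the number.)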
Apologies for the long post. Does anyone know why gzip might do this? Does it look normal? I was thinking it might be something to do with header information that it couldn't compress.