02-23-2011
Training Set Compression by Incremental Clustering
HPL-2011-25
Training Set Compression by Incremental Clustering - Li, Dalong; Simske, Steven
Keyword(s): Clustering, Support vector machine, KNN, Pattern recognition, CONDENSE.
Abstract: Compression of training sets is a technique for reducing training set size without degrading classification accuracy. By reducing the size of a training set, training will be more efficient in addition to saving storage space. In this paper, an incremental clustering algorithm, the Leader algorithm, ...
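The abstract describes compressing a training set with the Leader incremental clustering algorithm. As a rough illustration only, and not the authors' exact procedure, a minimal Python sketch of the Leader idea follows; the distance threshold and per-class grouping of leaders are assumptions made for the example.

import numpy as np

def leader_compress(X, y, threshold=1.0):
    # Single-pass Leader clustering: a sample joins the first leader of the
    # same class within `threshold` Euclidean distance; otherwise it becomes
    # a new leader. Only the leaders are kept as the compressed training set.
    # The threshold value and per-class grouping are illustrative assumptions.
    leaders, labels = [], []
    for xi, yi in zip(X, y):
        for lj, cj in zip(leaders, labels):
            if cj == yi and np.linalg.norm(xi - lj) <= threshold:
                break  # absorbed by an existing leader; nothing is stored
        else:
            leaders.append(xi)  # starts a new cluster and is kept
            labels.append(yi)
    return np.array(leaders), np.array(labels)

# Toy usage: two tight groups collapse to two representative samples,
# which could then be fed to any classifier (SVM, KNN, ...).
X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 4.9]])
y = np.array([0, 0, 1, 1])
Xc, yc = leader_compress(X, y, threshold=0.5)
print(Xc.shape)  # (2, 2)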
9 More Discussions You Might Find Interesting
1. UNIX for Dummies Questions & Answers
I would like to use the newest version of JDE, but I am considering whether to use 2 x Wintel servers with clustering or 1 Unix server without clustering. Is Unix stable enough to accept the clustering? (0 Replies)
Discussion started by: superlouis
0 Replies
2. UNIX for Dummies Questions & Answers
I'm looking for a free unix download (for a PC which I think is running XP) to simulate or mirror a unix operating system to teach a friend basic unix skills for an interview. I saw the Solaris link, but couldn't really tell if it would work on a PC, and I don't want to mess with his operating... (9 Replies)
Discussion started by: mmtemp
9 Replies
3. Solaris
SunOS 5.10 Generic_142900-15 sun4u sparc SUNW,SPARC-Enterprise
How can I tell if "clustering" is being used in my shop?
I have two file systems that are identical. These file systems are NFS mounted. But how can I tell if they are being kept in sync as a result of clustering or some other... (2 Replies)
Discussion started by: Harleyrci
2 Replies
4. UNIX for Dummies Questions & Answers
hi guys
Some time ago I used Linux HA (Heartbeat) to set up about 3 clusters.
Now I have to install another 2 clusters and was checking for more info to be sure HA is still used, but I found some other stuff like OpenAIS, Corosync and Pacemaker; to tell you the truth I am kinda confused here.
I get... (0 Replies)
Discussion started by: karlochacon
0 Replies
5. Linux
Hi,
I have done OS clustering in Linux Red Hat 5.6; one of my nodes is down, and when I am trying to reboot the other node it is not coming up. Any pointer to this would be helpful.
The SAN storage LUNs are not coming up as mounted. (2 Replies)
Discussion started by: mohitj.engg
2 Replies
6. Red Hat
Hello,
I'm new in this forum.
I'll have a new project to change architecture for our servers.
We are moving from one server, where we have an Oracle 9i database and the Oracle application EBS 11 installed on HP-UX, to a cluster that contains Red Hat nodes.
Can you give me detailed documentation that... (1 Reply)
Discussion started by: Safi1982
1 Replies
7. HP-UX
Hello guys,
I would like to ask for your assistance, since I am new to HP-UX.
Please give me some documentation about clustering in HP-UX, more precisely design, architecture, configuration, etc. I am working on my master's thesis right now and would like to include some guidance about that.... (1 Reply)
Discussion started by: bazillion
1 Replies
8. Ubuntu
Hi All,
I am a new user here, new to trying clustering with Ubuntu nodes, and I need help. If I should post in another place, please mention it.
I have two nodes with Ubuntu 14.04 installed on them. I need to make a cluster consisting of these two nodes for the purpose of experimenting with... (3 Replies)
Discussion started by: IncognitoExpert
3 Replies
9. UNIX for Beginners Questions & Answers
Hello!
I need some advice from you. How many days do I need to set up a cluster using VirtualBox, for a user with mid-level experience? Do you have any ideas for a master's thesis related to clustering? I need to include some research aspect within that topic. Can you recommend some books/docs about that case?
Thank... (4 Replies)
Discussion started by: protos27
4 Replies
LEARN ABOUT DEBIAN
advzip
AdvanceCOMP ZIP Compression Utility(1) General Commands Manual AdvanceCOMP ZIP Compression Utility(1)
NAME
advzip - AdvanceCOMP ZIP Compression Utility
SYNOPSIS
advzip [-a, --add] [-x, --extract] [-l, --list]
[-z, --recompress] [-t, --test] [-0, --shrink-store]
[-1, --shrink-fast] [-2, --shrink-normal] [-3, --shrink-extra]
[-4, --shrink-insane] [-N, --not-zip] [-p, --pedantic] [-q, --quiet]
[-h, --help] [-V, --version] ARCHIVES... [FILES...]
DESCRIPTION
The main purpose of this utility is to recompress and test zip archives to get the smallest possible size.
For recompression, the 7-Zip (www.7-zip.com) Deflate implementation is used. This implementation generally gives 5-10% more compression than
the zLib Deflate implementation.
For experimental purposes, the 7-Zip LZMA algorithm is also available with the -N option. In this case, the generated zips WILL NOT BE USABLE
by any other program. To make them usable you need to recompress them without the -N option. This algorithm generally gives 10-20% more
compression than the 7-Zip Deflate implementation.
OPTIONS
-a, --add ARCHIVE FILES...
Create the specified archive with the specified files. You must specify only one archive.
-x, --extract ARCHIVE
Extract all the files from the specified archive. You must specify only one archive.
-l, --list ARCHIVES...
List the content of the specified archives.
-z, --recompress ARCHIVES...
Recompress the specified archives. If the -1, -2 or -3 option is specified, the smallest result is used, chosen from: the previously
compressed data, the newly compressed data, and the uncompressed format. If the -0 option is specified, the archive is always rewritten
without any compression.
-t, --test ARCHIVES...
Test the specified archives. The tests may be extended with the -p option.
-N, --not-zip
Use the LZMA algorithm when compressing. The generated zips will not be readable by any other application!
-p, --pedantic
Be pedantic in the zip tests. If this flag is enabled, some more extensive tests on the zip integrity are done. These tests are generally
not done by other zip utilities.
-0, --shrink-store
Disable the compression. The file is only stored, not compressed. This option is very useful to expand archives of .png and
.mp3 files. These files are already compressed; trying to compress them a second time is a waste of time and resources.
-1, --shrink-fast
Set the compression level to "fast".
-2, --shrink-normal
Set the compression level to "normal". This is the default level of compression.
-3, --shrink-extra
Set the compression level to "extra".
-4, --shrink-insane
Set the compression level to "insane".
COPYRIGHT
This file is Copyright (C) 2002 Andrea Mazzoleni, Filipe Estima
SEE ALSO
advpng(1), advmng(1), advdef(1)