On Computing Enterprise IT Risk Metrics

 
# 1  
02-23-2011
On Computing Enterprise IT Risk Metrics

HPL-2011-26 On Computing Enterprise IT Risk Metrics - Bhatt, Sandeep; Horne, William; Rao, Prasad
Keyword(s): No keywords available.
Abstract: Assessing the vulnerability of large heterogeneous systems is crucial to IT operational decisions such as prioritizing the deployment of security patches and enhanced monitoring. These assessments are based on various criteria, including (i) the NIST National Vulnerability Database which reports ten ...
Full Report


5 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Performance Metrics using wget

I am looking for a way to measure performance metrics of streaming audio/video from a content server, YouTube for example. I'm keen to see if I can look at the duration it took for the content to download. I know from the output of wget's log file you can see the duration a url is downloaded in.... (5 Replies)
Discussion started by: rob171171

2. Infrastructure Monitoring

Risk report: Four years of Red Hat Enterprise Linux 4

Red Hat® Enterprise Linux® 4 was released on February 15th, 2005. This report takes a look at the state of security for the first four years from release. We look at key metrics, specific vulnerabilities, and the most common ways users were affected by security issues. We will show some... (0 Replies)
Discussion started by: Linux Bot

3. Virtualization and Cloud Computing

Event Cloud Computing - IBM Turning Data Centers Into ‘Computing Cloud’

Tim Bass Thu, 15 Nov 2007 23:55:07 +0000 I predict we may experience fewer debates on the use of the term “event cloud” related to CEP in the future, now that both IBM and Google have made announcements about “cloud computing” and “computing cloud”. IBM Turning Data Centers Into ‘Computing... (0 Replies)
Discussion started by: Linux Bot

4. UNIX for Advanced & Expert Users

I/O Stats Metrics

What do others use for measuring I/O statistics? I'd like something versatile, as in being able to watch (like iostat, but easier to trend), generate load (like iozone, but more realistic), and perform somewhat generalized benchmarks (like bonnie, but more current.) It would scale from a few... (0 Replies)
Discussion started by: LivinFree

5. UNIX for Advanced & Expert Users

Unix Metrics

Hi, Does anyone know of any programs that can create Unix (Solaris) server metrics such as server uptime, services uptime, processor utilization by hour by day, memory utilization by hour by day, active users by hour by day, etc? Thanks! (2 Replies)
Discussion started by: ghuber
Perl::Metrics::Simple::Analysis::File(3pm)		User Contributed Perl Documentation		Perl::Metrics::Simple::Analysis::File(3pm)

NAME
Perl::Metrics::Simple::Analysis::File - Methods for analyzing a single file.

SYNOPSIS
    use Perl::Metrics::Simple::Analysis::File;
    my $object = Perl::Metrics::Simple::Analysis::File->new( file => 'path/to/file' );

VERSION
    This is VERSION 0.1

DESCRIPTION
    A Perl::Metrics::Simple::Analysis::File object is created by Perl::Metrics::Simple for each file analyzed. These objects are
    aggregated into a Perl::Metrics::Simple::Analysis object by Perl::Metrics::Simple.

    In general you will not use this class directly; instead you will use Perl::Metrics::Simple. Still, there is no harm in
    exposing the various methods this class provides.
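    In other words, each File object holds the per-file numbers and the front-end module collects them for you. A minimal sketch
    of that typical entry point, assuming the front-end's analyze_files() interface; the 'lib/' directory is illustrative and not
    taken from this man page:

        use strict;
        use warnings;
        use Perl::Metrics::Simple;

        my $analyzer = Perl::Metrics::Simple->new;
        # Assumption: analyze_files() accepts files and/or directories and returns
        # the aggregated Perl::Metrics::Simple::Analysis object described above.
        my $analysis = $analyzer->analyze_files('lib/');
        printf "%d file(s), %d countable lines\n", $analysis->file_count, $analysis->lines;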
CLASS METHODS
    new
        Takes named parameters; currently only the path parameter is recognized:

            my $file_results = Perl::Metrics::Simple::Analysis::File->new( path => $path );

        Returns a new Perl::Metrics::Simple::Analysis::File object which has been populated with the results of analyzing the
        file at path. Throws an exception if the path is missing or unreadable.
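    Because new() throws an exception when the path is missing or unreadable, it is convenient to wrap construction in an eval.
    A minimal sketch using only the constructor documented above; 'lib/Foo.pm' is a hypothetical file:

        use strict;
        use warnings;
        use Perl::Metrics::Simple::Analysis::File;

        my $path = 'lib/Foo.pm';    # hypothetical file to analyze
        my $file_results = eval {
            Perl::Metrics::Simple::Analysis::File->new( path => $path );
        };
        if ( !$file_results ) {
            # new() threw: the path was missing or unreadable
            warn "Could not analyze $path: $@";
        }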
OBJECT METHODS
    These methods are called on an object.

    all_counts
        Convenience method. Takes no arguments and returns a hashref of all counts:

            {
                path       => $self->path,
                lines      => $self->lines,
                main_stats => $self->main_stats,
                subs       => $self->subs,
                packages   => $self->packages,
            }

    analyze_main
        Takes a PPI document and an arrayref of PPI::Statement::Sub objects and returns a hashref with information about the
        'main' (non-subroutine) portions of the document:

            {
                lines             => $lines,       # Line count outside subs. Skips comments and pod.
                mccabe_complexity => $complexity,  # Cyclomatic complexity of all non-sub areas
                path              => '/path/to/file',
                name              => '{code not in named subroutines}',  # always the same name
            };

    get_node_length
        Takes a PPI node and returns a count of the newlines it contains. PPI normalizes line endings to newlines, so CR/LF, CR
        and LF all come out the same. The line counts reported by the various methods in this class all exclude blank lines,
        comment lines and pod (the PPI document is pruned before counting).

    lines
        Total non-blank, non-comment, non-pod lines.

    main_stats
        Returns the hashref generated by analyze_main without re-analyzing the document.

    logic_keywords
        Returns an array (in array context) or ref-to-ARRAY of the keywords used in calculating complexity. See the Logic
        keywords section below.

    logic_operators
        Returns an array (in array context) or ref-to-ARRAY of the operators used in calculating complexity. See the Logic
        operators section below.

    measure_complexity
        Takes a PPI element and measures an approximation of the McCabe Complexity (a.k.a. Cyclomatic Complexity) of the code.
        McCabe Complexity is basically a count of how many paths there are through the code. We use a simplified method for
        counting this, which ignores things like the possibility that a 'use' statement could throw an exception. The actual
        measurement we use for a chunk of code is 1, plus 1 for each logic keyword or operator.

        Logic operators
            The default list is:

                @Perl::Metrics::Simple::Analysis::File::DEFAULT_LOGIC_OPERATORS

                ! !~ && &&= // < <<= <=> == =~ > >>= ? and cmp eq gt lt ne not or xor || ||= ~~

            You can supply your own list by setting @Perl::Metrics::Simple::Analysis::File::LOGIC_OPERATORS before creating a
            new object.

        Logic keywords
            The default list is:

                @Perl::Metrics::Simple::Analysis::File::DEFAULT_LOGIC_KEYWORDS

                else elsif for foreach goto grep if last map next unless until while

            You can supply your own list by setting @Perl::Metrics::Simple::Analysis::File::LOGIC_KEYWORDS before creating a
            new object.

        Examples of Complexity
            Here are a couple of examples of how we count complexity.

            Example of complexity count of 1:

                use Foo;
                print "Hello world.\n";
                exit;

            Example of complexity count of 2:

                if ( $a ) {     # The "if" adds 1.
                    # do something
                }

            Example of complexity count of 6:

                sub foo {                          # 1: for non-empty code
                    if ( @list ) {                 # 1: "if"
                        foreach my $x ( @list ) {  # 1: "foreach"
                            if ( ! $x ) {          # 2: 1 for "if" and 1 for "!"
                                do_something($x);
                            }
                            else {                 # 1 for "else"
                                do_something_else($x);
                            }
                        }
                    }
                    return;
                }

    packages
        Arrayref of unique packages found in the file.

    path
        Either the path to the file, or a scalar ref if that was supplied instead of a path.

    subs
        Count of subroutines found.
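    The pieces above can be combined: set the LOGIC_KEYWORDS or LOGIC_OPERATORS package variables before constructing an object,
    then read the results back through main_stats() and packages(). A minimal sketch under those documented assumptions;
    'lib/Foo.pm' is again hypothetical, and the reduced keyword list is purely illustrative:

        use strict;
        use warnings;
        use Perl::Metrics::Simple::Analysis::File;

        # Illustrative override: count only these keywords as decision points.
        @Perl::Metrics::Simple::Analysis::File::LOGIC_KEYWORDS =
            qw( if elsif unless else for foreach while until );

        my $file = Perl::Metrics::Simple::Analysis::File->new( path => 'lib/Foo.pm' );

        my $main = $file->main_stats;    # hashref produced by analyze_main()
        printf "main code: %d lines, complexity %d, %d package(s)\n",
            $main->{lines}, $main->{mccabe_complexity}, scalar @{ $file->packages };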
STATIC PACKAGE SUBROUTINES
    Utility subs used internally, but there is no harm in exposing them for now.

    hashify
        %hash = Perl::Metrics::Simple::Analysis::File::hashify(@list);

        Takes an array and returns a hash using the array values as the keys and with the values all set to 1.

    is_hash_key
        $boolean = Perl::Metrics::Simple::Analysis::File::is_hash_key($ppi_element);

        Takes a PPI::Element and returns true if the element is a hash key; for example, "foo" and "bar" are hash keys in the
        following:

            { foo => 123, bar => $a }

        Copied and somewhat simplified from http://search.cpan.org/src/THALJEF/Perl-Critic-0.19/lib/Perl/Critic/Utils.pm
        See Perl::Critic::Utils.
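    A minimal sketch of hashify() as documented above, handy for quick membership tests; the choice of DEFAULT_LOGIC_KEYWORDS as
    input is just an example:

        use strict;
        use warnings;
        use Perl::Metrics::Simple::Analysis::File;

        my %is_logic_keyword = Perl::Metrics::Simple::Analysis::File::hashify(
            @Perl::Metrics::Simple::Analysis::File::DEFAULT_LOGIC_KEYWORDS
        );
        print "foreach counts toward complexity\n" if $is_logic_keyword{foreach};  # every value is 1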
BUGS AND LIMITATIONS
    None reported yet ;-)

DEPENDENCIES
    Readonly
    Perl::Metrics::Simple::Analysis

SUPPORT
    Via CPAN:

    Discussion Forum
        http://www.cpanforum.com/dist/Perl-Metrics-Simple

    Bug Reports
        http://rt.cpan.org/NoAuth/Bugs.html?Dist=Perl-Metrics-Simple

AUTHOR
    Matisse Enzer
    CPAN ID: MATISSE
    Eigenstate Consulting, LLC
    matisse@eigenstate.net
    http://www.eigenstate.net/

LICENSE AND COPYRIGHT
    Copyright (c) 2006-2009 by Eigenstate Consulting, LLC.

    This program is free software; you can redistribute it and/or modify it under the same terms as Perl itself. The full text
    of the license can be found in the LICENSE file included with this module.

perl v5.10.1                          2010-05-13                          Perl::Metrics::Simple::Analysis::File(3pm)