Computational complexity
Post by gratuitous_arp, 12-04-2009
jim,

Thanks for the reply. The point you and Schneier make is an excellent one; the human is certainly the weakest link in the equation, and the recently published 256-bit AES "crack" referred to in the article should not dissuade anyone from using AES. (Another interesting reference on the security vulnerability posed by the human factor is one of Kevin Mitnick's books on social engineering, The Art of Deception.)

The point you make is that the computational requirements of cracking such encryption are beyond feasibility -- and that a high degree of "infeasibility" is what matters where the algorithm is concerned. To someone asking about a practical measure of "safety" given by an algorithm, as I did in my original post, this is a good answer -- it is plenty safe!

My real interest is to understand the degree to which such a task is infeasible. From my example, if the 256-bit AES crack required 1.9 * 10^25 terabytes (rounded) of storage in some combination of RAM and/or disk space, plus the computation required to actually perform the attack, that would certainly be well beyond the realm of feasibility. If I am to say that something is infeasible, I have to learn the "why" behind it -- otherwise, I have no personal understanding of it; I would just be reciting "something I heard on the Internet". Not very convincing in conversation, even if there has never been a truer statement! Having a quantifiable figure like the above, so I can say, "You would need 1.9 * 10^25 terabytes of storage to implement the attack", is how one knows that something is infeasible. For educational purposes, I am trying to find the ground where the theory meets reality.

So, for example, am I arriving at the figure "1.9 * 10^25 terabytes" correctly, per this explanation from my first post? I believe the arithmetic is right, but am I missing any fundamental ideas, or is this a correct derivation?

Quote:
... if I assume a key length of 256 bits and I need to store 2^119 keys simultaneously, I should be able to calculate the total storage required to make an attempt at this particular attack:

In bits: 256 bits * 2^119 keys = 2^127 bits = 1.7 * 10^38 bits (rounded)

Converted to terabytes (divide by 8 bits per byte, then by 2^40 bytes per terabyte): 2^84 = 1.9 * 10^25 terabytes (rounded)
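As a quick sanity check of that arithmetic, here is a short Python sketch relying on its arbitrary-precision integers (I am assuming the binary sense of "terabyte", i.e. 2^40 bytes):

Code:
# Storage needed to hold 2^119 keys of 256 bits each, all at once.
# Assumes "terabyte" = 2^40 bytes (binary interpretation).

KEY_BITS = 256
NUM_KEYS = 2 ** 119

total_bits = KEY_BITS * NUM_KEYS           # 2^8 * 2^119 = 2^127 bits
total_bytes = total_bits // 8              # 2^124 bytes
total_terabytes = total_bytes / 2 ** 40    # 2^84 terabytes

print(f"{total_bits:.3e} bits")            # ~1.701e+38 bits
print(f"{total_terabytes:.3e} terabytes")  # ~1.934e+25 terabytes

That matches the figures above, so at least the unit conversions hold together.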
Similarly, is my statement that it takes 2^256 guesses to exhaust all possible keys (for a key length of 256 bits) where the idea of a computational complexity of 2^256 comes from? If so, I could estimate how many processing cycles a single guess takes with a specific piece of software on a given platform, then crunch out some numbers to arrive at a measure of time (however obscene and unimaginable it might be).
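For example (the cycles-per-guess and hardware figures below are made-up placeholders, only there to show the shape of the calculation):

Code:
# Back-of-the-envelope time to brute-force a 256-bit keyspace.
# All hardware numbers here are hypothetical placeholders, not measurements.

guesses = 2 ** 256             # worst case: every possible key is tried

cycles_per_guess = 1000        # hypothetical cost of one trial decryption
cpu_hz = 3 * 10 ** 9           # one hypothetical 3 GHz core
cores = 10 ** 9                # a billion such cores working in parallel

seconds = guesses * cycles_per_guess / (cpu_hz * cores)
years = seconds / (60 * 60 * 24 * 365)

print(f"{years:.3e} years")    # ~1.2e+54 years

Even with absurdly generous hardware assumptions, the exponent barely moves -- which is exactly the sense in which 2^256 operations is "computationally infeasible".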

Thanks again for your help.
 
