Entries Tagged “cryptography”
I'm just going to quote without comment:
About the file: the text message file encrypted with a symmetric key combine 3 modes
1st changing the original text with random (white noise) and PHR (Pure Human Randomness) shuffle command , move and replace instruction combine with the key from mode 1 (white noise) and 2 (PHR)
2nd mode xor PHR - Pure Human random ( or ROEE Random Oriented Enhanced Encryption) with a TIME set of instruction , and a computational temporary set of instructions to produce a real one time PAD when every time ,
Text will transform to a cipher the last will be different
3rd mode xor WNS - White Noise Signal with a TIME set of instruction , and a computational temporary set of instructions to produce a real one time PAD when every time ,
Text will transform to a cipher the last will be different
4th Reconstructs file, levels and dimensions to a
this is a none mathematical with zero use of calculation algorithm - so no brute force , Rainbow Crack , or gpu cuda nvidia brute force crack can be applied on this technology . Sorry you have to find a new way to crack chaos theory for that.
We use 0% of any mathematical calculation algorithm so we can perform any ware with unparalleled strength
Key Strength - 1million bit or more
Speed performance 400% faster Compeer to AES
MPU use - Mathematical Process Unit in CPU use 3% - 7% only
Overhead of the file from original 5% +/- (original+5%) +/-
A combination of mode 1 and 2 applied with a new variation of XOR - to perform the encrypted message
Anyone have any ideas?
The most important issue of any encryption product is the 'bit key strength'. To date the strongest known algorithm has a 448-bit key. Crypteto now offers a 49,152-bit key. This means that for every extra 1 bit increase that Crypteto has over its competition makes it 100% stronger. The security and privacy this offers is staggering.
Yes, every key bit doubles an algorithm's strength against brute-force attacks. But it's hard to find any real meaning in a work factor of 2⁴⁹¹⁵².
Coupled with this truly remarkable breakthrough Crypteto does not compromise on encryption speed. In the past, incremental key strength improvements have effected the speed that data is encrypted. The usual situation was that for every 1 bit increase in key strength there was a consequent reduction in encryption speed by 50%.
That's not even remotely true. It's not at all obvious how key length is related to encryption speed. Blowfish has the same speed, regardless of key length. AES-192 is about 20% slower than AES-128, and AES-256 is about 40% slower. Threefish, the block cipher inside Skein, encrypts data at 7.6 clock cycles/byte with a 256-bit key, 6.1 clock cycles/byte with a 512-bit key, and 6.5 clock cycles/byte with a 1024-bit key. I'm not claiming that Threefish is secure and ready for commercial use -- at any keylength -- but there simply isn't a chance that encryption speed will drop by half for every key bit added.
This is a fundamental asymmetry of cryptography, and it's important to get right. The cost to encrypt is linear as a function of key length, while cost to break is geometric. It's one of the reasons why, of all the links in a security chain, cryptography is the strongest.
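Since the argument leans on this asymmetry, here's a small illustrative sketch. The cost model is hypothetical (no real cipher is exactly linear in key length), but it captures the shape of the claim: encryption cost grows at most linearly with key length, while the attacker's work factor doubles with every bit.

```python
def brute_force_work(key_bits: int) -> int:
    """Expected number of trial keys in an exhaustive search: 2^(n-1) on average."""
    return 2 ** (key_bits - 1)

def relative_encrypt_cost(key_bits: int, base_bits: int = 128) -> float:
    """Hypothetical linear cost model: encryption cost scales with key length."""
    return key_bits / base_bits

# Doubling the key length at worst doubles the defender's encryption cost...
print(relative_encrypt_cost(256))  # 2.0
# ...but squares the attacker's brute-force work factor.
print(brute_force_work(256) == 2 * brute_force_work(128) ** 2)  # True
```

This is why, past a modest threshold, adding key bits buys the attacker misery at almost no cost to the defender.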
Normally I wouldn't bother with this kind of thing, but they explicitly asked me to comment:
But Hawthorne Davies has overcome this issue. By offering an algorithm with an unequalled key strength of 49,152 bits, we are able to encrypt and decrypt data at speeds in excess of 8 megabytes per second. This means that the aforementioned Gigabyte of data would take 2 minutes 13 seconds. If Bruce Schneier, the United State's foremost cryptologist, were to increase his Blowfish 448 bit encryption algorithm to Blowfish 49152, he would be hard pressed to encrypt one Gigabyte in 4 hours.
We look forward to receiving advice and encouragement from the good Dr. Schneier.
I'm not a doctor of anything, but sure. Read my 1999 essay on snake-oil cryptography:
Warning Sign #5: Ridiculous key lengths.
Jaws Technology boasts: "Thanks to the JAWS L5 algorithm's statistically unbreakable 4096 bit key, the safety of your most valued data files is ensured." Meganet takes the ridiculous a step further: "1 million bit symmetric keys -- The market offer's [sic] 40-160 bit only!!"
Longer key lengths are better, but only up to a point. AES will have 128-bit, 192-bit, and 256-bit key lengths. This is far longer than needed for the foreseeable future. In fact, we cannot even imagine a world where 256-bit brute force searches are possible. It requires some fundamental breakthroughs in physics and our understanding of the universe. For public-key cryptography, 2048-bit keys have the same sort of property; longer is meaningless.
Think of this as a sub-example of Warning Sign #4: if the company doesn't understand keys, do you really want them to design your security product?
Or read what I wrote about symmetric key lengths in 1996, in Applied Cryptography (pp. 157–8):
One of the consequences of the second law of thermodynamics is that a certain amount of energy is necessary to represent information. To record a single bit by changing the state of a system requires an amount of energy no less than kT, where T is the absolute temperature of the system and k is the Boltzmann constant. (Stick with me; the physics lesson is almost over.)
Given that k = 1.38×10⁻¹⁶ erg/°Kelvin, and that the ambient temperature of the universe is 3.2°Kelvin, an ideal computer running at 3.2°K would consume 4.4×10⁻¹⁶ ergs every time it set or cleared a bit. To run a computer any colder than the cosmic background radiation would require extra energy to run a heat pump.
Now, the annual energy output of our sun is about 1.21×10⁴¹ ergs. This is enough to power about 2.7×10⁵⁶ single bit changes on our ideal computer; enough state changes to put a 187-bit counter through all its values. If we built a Dyson sphere around the sun and captured all its energy for 32 years, without any loss, we could power a computer to count up to 2¹⁹². Of course, it wouldn't have the energy left over to perform any useful calculations with this counter.
But that's just one star, and a measly one at that. A typical supernova releases something like 10⁵¹ ergs. (About a hundred times as much energy would be released in the form of neutrinos, but let them go for now.) If all of this energy could be channeled into a single orgy of computation, a 219-bit counter could be cycled through all of its states.
These numbers have nothing to do with the technology of the devices; they are the maximums that thermodynamics will allow. And they strongly imply that brute-force attacks against 256-bit keys will be infeasible until computers are built from something other than matter and occupy something other than space.
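The quoted arithmetic is easy to check mechanically. A quick sketch using the same constants as the passage:

```python
import math

k = 1.38e-16                      # Boltzmann constant, erg per kelvin
T = 3.2                           # cosmic background temperature, kelvin
energy_per_bit = k * T            # minimum energy per bit change: ~4.4e-16 erg

sun_year = 1.21e41                # annual energy output of the sun, ergs
flips = sun_year / energy_per_bit # ~2.7e56 single-bit state changes

# Counter sizes these energy budgets can exhaust:
print(math.floor(math.log2(flips)))        # 187 -- one solar year
dyson_32_years = flips * 32
print(math.floor(math.log2(dyson_32_years)))  # 192 -- 32 years of Dyson sphere
```

The numbers match the book's 187-bit and 2¹⁹² figures, which is the point: the limit is thermodynamic, not technological.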
Ten years later, there is still no reason to use anything more than a 256-bit symmetric key. I gave the same advice in 2003 in Practical Cryptography (pp. 65-6). Even a mythical quantum computer won't be able to brute-force that large a keyspace. (Public keys are different, of course -- see Table 2.2 of this NIST document for recommendations.)
Of course, in the real world there are smarter attacks than brute-force keysearch. The whole point of cipher cryptanalysis is to find shortcuts to brute-force search (like this attack on AES) -- but a 49,152-bit key is just plain stupid.
EDITED TO ADD (9/30): Now this is funny:
Some months ago I sent individual emails to each of seventeen experts in cryptology, all with the title of Doctor or Professor. My email was a first announcement to the academic world of the TOUAREG Encryption Algorithm, which, somewhat unusually, has a session key strength of over 49,000 bits and yet runs at 3 Megabytes per second. Bearing in mind that the strongest version of BLOWFISH has a session key of 448 bits and that every additional bit doubles the task of key-crashing, I imagined that my announcement would create more than a mild flutter of interest.
Much to his surprise, no one responded.
Here's some more advice: my 1998 essay, "Memo to the Amateur Cipher Designer." Anyone can design a cipher that he himself cannot break. It's not even hard. So when you tell a cryptographer that you've designed a cipher that you can't break, his first question will be "who the hell are you?" In other words, why should the fact that you can't break a cipher be considered evidence of the cipher's security?
If you want to design algorithms, start by breaking the ones out there. Practice by breaking algorithms that have already been broken (without peeking at the answers). Break something no one else has broken. Break another. Get your breaks published. When you have established yourself as someone who can break algorithms, then you can start designing new algorithms. Before then, no one will take you seriously.
EDITED TO ADD (9/30): I just did the math. An encryption speed of 8 megabytes per second on a 3.33 GHz CPU translates to about 400 clock cycles per byte. This is much, much slower than any of the AES finalists ten years ago, or any of the SHA-3 second round candidates today. It's kind of embarrassingly slow, really.
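The arithmetic is easy to reproduce (taking a megabyte as 10⁶ bytes; using 2²⁰ gives roughly 397 instead):

```python
clock_hz = 3.33e9        # 3.33 GHz CPU
throughput = 8e6         # 8 megabytes per second, as claimed
cycles_per_byte = clock_hz / throughput
print(cycles_per_byte)   # 416.25 -- "about 400", versus single digits for Threefish
```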
Texas Instruments' calculators use RSA digital signatures to authenticate any updates to their operating system. Unfortunately, their signing keys are too short: 512 bits. Earlier this month, a collaborative effort factored the moduli and published the private keys. Texas Instruments responded by threatening websites that published the keys with the DMCA, but it's too late.
So far, we have the operating-system signing keys for the TI-92+, TI-73, TI-89, TI-83+/TI-83+ Silver Edition, Voyage 200, TI-89 Titanium, and the TI-84+/TI-84 Silver Edition, and the date-stamp signing key for the TI-73, Explorer, TI-83 Plus, TI-83 Silver Edition, TI-84 Plus, TI-84 Silver Edition, TI-89, TI-89 Titanium, TI-92 Plus, and the Voyage 200.
Moral: Don't assume that your cryptography won't be broken just because your application is obscure, or because there's no obvious financial incentive for breaking it. If you use too-short keys, someone will factor them anyway.
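To make the moral concrete, here is a toy sketch with a deliberately tiny, invented modulus (a real 512-bit modulus takes serious sieving effort, but the consequence of factoring is identical): once n is factored, the private exponent falls out immediately.

```python
import math
import random

def pollard_rho(n: int) -> int:
    """Find a nontrivial factor of composite n using Pollard's rho."""
    while True:
        x = y = random.randrange(2, n)
        c = random.randrange(1, n)
        d = 1
        while d == 1:
            x = (x * x + c) % n          # tortoise: one step
            y = (y * y + c) % n
            y = (y * y + c) % n          # hare: two steps
            d = math.gcd(abs(x - y), n)
        if d != n:                        # retry on the rare failure case
            return d

# Hypothetical tiny "RSA" key -- nothing like TI's actual 512-bit moduli.
p, q = 10007, 10009
n, e = p * q, 65537

f = pollard_rho(n)                        # the attacker factors the modulus...
phi = (f - 1) * (n // f - 1)
d = pow(e, -1, phi)                       # ...and derives the private exponent
assert pow(pow(42, e, n), d, n) == 42     # forged "signatures" now verify
print("recovered private exponent:", d)
```

The gap between "too small" and "big enough" is the whole game; the 2009 TI break simply showed that 512 bits had long since crossed to the wrong side.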
This is an important development:
Shor's algorithm was first demonstrated in a computing system based on nuclear magnetic resonance -- manipulating molecules in a solution with strong magnetic fields. It was later demonstrated with quantum optical methods but with the use of bulk components like mirrors and beam splitters that take up an unwieldy area of several square meters.
Last year, the Bristol researchers showed they could miniaturize this optical setup, building a quantum photonic circuit on a silicon chip mere millimeters square. They replaced mirrors and beam splitters with waveguides that weave their way around the chip and interact to split, reflect, and transmit light through the circuit. They then injected photons into the waveguides to act as their qubits.
Now they've put their circuit to work: Using four photons that pass through a sequence of quantum logic gates, the optical circuit helped find the prime factors of the number 15. While the researchers showed that it was possible to solve for the factors, the chip itself didn't just spit out 5 and 3. Instead, it came up with an answer to the "order-finding routine," the "computationally hard" part of Shor's algorithm that requires a quantum calculation to solve the problem in a reasonable amount of time, according to Jeremy O'Brien, a professor of physics and electrical engineering at the University of Bristol. The researchers then finished the computation using an ordinary computer to finally come up with the correct factors.
I've written about quantum computing (and quantum cryptography):
Several groups are working on designing and building a quantum computer, which is fundamentally different from a classical computer. If one were built -- and we're talking science fiction here -- then it could factor numbers and solve discrete-logarithm problems very quickly. In other words, it could break all of our commonly used public-key algorithms. For symmetric cryptography it's not that dire: A quantum computer would effectively halve the key length, so that a 256-bit key would be only as secure as a 128-bit key today. Pretty serious stuff, but years away from being practical.
Here's a really good essay on the realities of quantum computing and its potential effects on public-key cryptography.
Skein is one of the 14 SHA-3 candidates chosen by NIST to advance to the second round. As part of the process, NIST allowed the algorithm designers to implement small "tweaks" to their algorithms. We've tweaked the rotation constants of Skein. This change does not affect Skein's performance in any way.
The revised Skein paper contains the new rotation constants, as well as information about how we chose them and why we changed them, the results of some new cryptanalysis, plus new IVs and test vectors. Revised source code is here.
The latest information on Skein is always here.
Tweaks were due today, September 15. Now the SHA-3 process moves into the second round. According to NIST's timeline, they'll choose a set of final round candidate algorithms in 2010, and then a single hash algorithm in 2012. Between now and then, it's up to all of us to evaluate the algorithms and let NIST know what we want. Cryptanalysis is important, of course, but so is performance.
Here's my 2008 essay on SHA-3. The second-round algorithms are: BLAKE, Blue Midnight Wish, CubeHash, ECHO, Fugue, Grøstl, Hamsi, JH, Keccak, Luffa, Shabal, SHAvite-3, SIMD, and Skein. You can find details on all of them, as well as the current state of their cryptanalysis, here.
In other news, we're making Skein shirts available to the public. Those of you who attended the First Hash Function Candidate Conference in Leuven, Belgium, earlier this year might have noticed the stylish black Skein polo shirts worn by the Skein team. Anyone who wants one is welcome to buy it, at cost. Details (with photos) are here. All orders must be received before 1 October, and then we'll have all the shirts made in one batch.
But, after many years of evaluating the security of software systems, I'm incredibly down on using the book that made Bruce famous when designing the cryptographic aspects of a system. In fact, I can safely say I have never seen a secure system come out the other end, when that is the primary source for the crypto design. And I don't mean that people forget about the buffer overflows. I mean, the crypto is crappy.
My rule for software development teams is simple: Don't use Applied Cryptography in your system design. It's fine and fun to read it, just don't build from it.
The book talks about the fundamental building blocks of cryptography, but there is no guidance on things like putting together all the pieces to create a secure, authenticated connection between two parties.
Plus, in the nearly 13 years since the book was last revised, our understanding of cryptography has changed greatly. There are things in it that were thought to be true at the time that turned out to be very false....
I agree. And, to his credit, Viega points out that I agree:
But in the introduction to Bruce Schneier's book, Practical Cryptography, he himself says that the world is filled with broken systems built from his earlier book. In fact, he wrote Practical Cryptography in hopes of rectifying the problem.
This is all true.
Designing a cryptosystem is hard. Just as you wouldn't give a person -- even a doctor -- a brain-surgery instruction manual and then expect him to operate on live patients, you shouldn't give an engineer a cryptography book and then expect him to design and implement a cryptosystem. The patient is unlikely to survive, and the cryptosystem is unlikely to be secure.
Even worse, security doesn't provide immediate feedback. A dead patient on the operating table tells the doctor that maybe he doesn't understand brain surgery just because he read a book, but an insecure cryptosystem works just fine. It's not until someone takes the time to break it that the engineer might realize that he didn't do as good a job as he thought. Remember: Anyone can design a security system that he himself cannot break. Even the experts regularly get it wrong. The odds that an amateur will get it right are extremely low.
For those who are interested, a second edition of Practical Cryptography will be published in early 2010, renamed Cryptography Engineering and featuring a third author: Tadayoshi Kohno.
EDITED TO ADD (9/16): Commentary.
Blog post from Steve Bellovin:
It is vital that the keystream values (a) be truly random and (b) never be reused. The Soviets got that wrong in the 1940s; as a result, the U.S. Army's Signal Intelligence Service was able to read their spies' traffic in the Venona program. The randomness requirement means that the values cannot be generated by any algorithm; they really have to be random, and created by a physical process, not a mathematical one.
A consequence of these requirements is that the key stream must be as long as the data to be encrypted. If you want to encrypt a 1 megabyte file, you need 1 megabyte of key stream that you somehow have to share securely with the recipient. The recipient, in turn, has to store this data securely. Furthermore, both the sender and the recipient must ensure that they never, ever reuse the key stream. The net result is that, as I've often commented, "one-time pads are theoretically unbreakable, but practically very weak. By contrast, conventional ciphers are theoretically breakable, but practically strong." They're useful for things like communicating with high-value spies. The Moscow-Washington hotline used them, too. For ordinary computer usage, they're not particularly practical.
I wrote about one-time pads, and their practical insecurity, in 2002:
What a one-time pad system does is take a difficult message security problem -- that's why you need encryption in the first place -- and turn it into a just-as-difficult key distribution problem. It's a "solution" that doesn't scale well, doesn't lend itself to mass-market distribution, is singularly ill-suited to computer networks, and just plain doesn't work.
One-time pads may be theoretically secure, but they are not secure in a practical sense. They replace a cryptographic problem that we know a lot about solving -- how to design secure algorithms -- with an implementation problem we have very little hope of solving.
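Bellovin's requirements are easy to state in code. A minimal one-time pad sketch, including the reuse failure that doomed the Soviets in Venona:

```python
import os

def otp_encrypt(message: bytes, pad: bytes) -> bytes:
    """XOR the message with a truly random pad at least as long as the message."""
    assert len(pad) >= len(message), "pad must be at least message-length"
    return bytes(m ^ p for m, p in zip(message, pad))

otp_decrypt = otp_encrypt  # XOR is its own inverse

msg = b"ATTACK AT DAWN"
pad = os.urandom(len(msg))          # this pad must now be shared securely...
ct = otp_encrypt(msg, pad)
assert otp_decrypt(ct, pad) == msg  # ...stored securely, and never reused.

# Reusing the pad leaks the XOR of the two plaintexts -- the pad cancels out:
ct2 = otp_encrypt(b"RETREAT AT TEN", pad)
leak = bytes(a ^ b for a, b in zip(ct, ct2))  # equals msg XOR msg2
```

The code is trivial; the key distribution and key hygiene around it are not, which is exactly the essay's point.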
Excellent paper: "On Locational Privacy, and How to Avoid Losing it Forever."
Some threats to locational privacy are overt: it's evident how cameras backed by face-recognition software could be misused to track people and record their movements. In this document, we're primarily concerned with threats to locational privacy that arise as a hidden side-effect of clearly useful location-based services.
We can't stop the cascade of new location-based digital services. Nor would we want to -- the benefits they offer are impressive. What urgently needs to change is that these systems need to be built with privacy as part of their original design. We can't afford to have pervasive surveillance technology built into our electronic civic infrastructure by accident. We have the opportunity now to ensure that these dangers are averted.
Our contention is that the easiest and best solution to the locational privacy problem is to build systems which don't collect the data in the first place. This sounds like an impossible requirement (how do we tell you when your friends are nearby without knowing where you and your friends are?) but in fact as we discuss below it is a reasonable objective that can be achieved with modern cryptographic techniques.
Modern cryptography actually allows civic data processing systems to be designed with a whole spectrum of privacy policies: ranging from complete anonymity to limited anonymity to support law enforcement. But we need to ensure that systems aren't being built right at the zero-privacy, everything-is-recorded end of that spectrum, simply because that's the path of easiest implementation.
I've already written about wholesale surveillance.