Secure Hash Algorithm

The Secure Hash Algorithm is a family of cryptographic hash functions published by the National Institute of Standards and Technology (NIST) as a U.S. Federal Information Processing Standard (FIPS), and may refer to:

  • SHA-0: A retronym applied to the original version of the 160-bit hash function published in 1993 under the name “SHA”. It was withdrawn shortly after publication due to an undisclosed “significant flaw” and replaced by the slightly revised version SHA-1.
  • SHA-1: A 160-bit hash function which resembles the earlier MD5 algorithm. This was designed by the National Security Agency (NSA) to be part of the Digital Signature Algorithm. Cryptographic weaknesses were discovered in SHA-1, and the standard was no longer approved for most cryptographic uses after 2010.
  • SHA-2: A family of two similar hash functions, with different block sizes, known as SHA-256 and SHA-512. They differ in the word size; SHA-256 uses 32-bit words where SHA-512 uses 64-bit words. There are also truncated versions of each standard, known as SHA-224 and SHA-384. These were also designed by the NSA.
  • SHA-3: A hash function formerly called Keccak, chosen in 2012 after a public competition among non-NSA designers. It supports the same hash lengths as SHA-2, and its internal structure differs significantly from the rest of the SHA family (a short usage sketch of these variants follows this list).
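
As a minimal sketch (assuming Python 3 and its standard hashlib module, which exposes SHA-1, SHA-2, and SHA-3 but not the withdrawn SHA-0), the variants can be compared by hashing the same input and looking at the digest lengths:

    import hashlib

    # Hash the same message with several members of the SHA family.
    message = b"The quick brown fox jumps over the lazy dog"

    for name in ("sha1", "sha256", "sha512", "sha3_256", "sha3_512"):
        digest = hashlib.new(name, message).hexdigest()
        # Each hex character encodes 4 bits, so SHA-1 prints 160 bits,
        # SHA-256 prints 256 bits, and so on.
        print(f"{name:>9}: {len(digest) * 4} bits  {digest}")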

SHA is pronounced “shah”, and the value it returns is a checksum (hash).

A checksum or hash sum is a small-size datum computed from an arbitrary block of digital data for the purpose of detecting errors that may have been introduced during its transmission or storage.

The actual procedure that yields the checksum, given a data input, is called a checksum function or checksum algorithm. Depending on its design goals, a good checksum algorithm will usually output a significantly different value even for small changes made to the input; this is especially true of cryptographic hash functions. Due to this property, they may be used to detect many data corruption errors and verify overall data integrity: if the computed checksum for the current data input matches the stored value of a previously computed checksum, there is a very high probability the data has not been accidentally altered or corrupted.
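
A minimal sketch of this integrity check, assuming Python's hashlib and an invented payload, recomputes the checksum and compares it with the stored value; note how a single corrupted byte changes the digest completely:

    import hashlib

    def sha256_checksum(data: bytes) -> str:
        """Return the SHA-256 checksum of a block of data as a hex string."""
        return hashlib.sha256(data).hexdigest()

    original = b"important payload"
    stored_checksum = sha256_checksum(original)   # computed earlier and stored

    # Later: recompute over the received copy and compare.
    received = b"important payloaf"               # one byte corrupted in transit
    if sha256_checksum(received) == stored_checksum:
        print("data intact")
    else:
        print("checksum mismatch: data was altered or corrupted")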

Checksum functions are related to hash functions, fingerprints, randomization functions, and cryptographic hash functions. However, each of those concepts has different applications and therefore different design goals. By themselves, checksums are often used to verify data integrity, but they should not be relied upon to also verify data authenticity. However, they are used as cryptographic primitives in larger authentication algorithms. For cryptographic systems with these two specific design goals, see HMAC.
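
As a hedged illustration of that last point, Python's standard hmac module combines a hash function with a secret key so that the result verifies both integrity and authenticity; the key and message below are made up:

    import hashlib
    import hmac

    key = b"shared-secret-key"                 # hypothetical key known to both parties
    message = b"transfer 100 credits to account 42"

    # The sender computes a tag that depends on both the key and the message.
    tag = hmac.new(key, message, hashlib.sha256).hexdigest()

    # The receiver recomputes the tag and compares in constant time;
    # an attacker without the key cannot forge a matching tag.
    expected = hmac.new(key, message, hashlib.sha256).hexdigest()
    print(hmac.compare_digest(tag, expected))  # True only if key and message match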

Check digits and parity bits are special cases of checksums, appropriate for small blocks of data (such as Social Security numbers, bank account numbers, computer words, single bytes, etc.). Some error-correcting codes are based on special checksums that not only detect common errors but also allow the original data to be recovered in certain cases.
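
As a simple sketch (not tied to any particular standard), an even-parity bit for one byte can be computed by counting its 1-bits; appending that bit lets a receiver detect any single flipped bit:

    def parity_bit(byte: int) -> int:
        """Return 1 if the byte has an odd number of 1-bits, else 0."""
        return bin(byte & 0xFF).count("1") % 2

    for value in (0b00000000, 0b00000001, 0b10110100):
        print(f"{value:08b} -> parity bit {parity_bit(value)}")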
