Space–time tradeoff

A space–time trade-off, also known as a time–memory trade-off or the algorithmic space-time continuum in computer science, is a case where an algorithm or program trades increased space usage for decreased time. Here, space refers to the data storage consumed in performing a given task (RAM, HDD, etc.), and time refers to the time consumed in performing a given task (computation time or response time).

The utility of a given space–time tradeoff is affected by related fixed and variable costs (of, e.g., CPU speed, storage space), and is subject to diminishing returns.

History

Biological use of time–memory tradeoffs can be seen in the earlier stages of animal behavior. Using stored knowledge or encoding stimulus reactions as "instincts" in DNA avoids the need for "calculation" in time-critical situations. More specific to computers, look-up tables have been implemented since the very earliest operating systems.

In 1980 Martin Hellman first proposed using a time–memory tradeoff for cryptanalysis.

Types of tradeoff

Lookup tables vs. recalculation

A common situation is an algorithm involving a lookup table: an implementation can include the entire table, which reduces computing time, but increases the amount of memory needed, or it can compute table entries as needed, increasing computing time, but reducing memory requirements.
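
As a concrete illustration, the following Python sketch (with invented names, purely for this example) counts set bits either by recomputing the answer on every call or by consulting a precomputed 256-entry table: the table costs memory up front, but turns each later query into a few lookups.

    def popcount_recompute(x: int) -> int:
        """Count set bits by recomputing each time: no extra memory, more work per call."""
        count = 0
        while x:
            count += x & 1
            x >>= 1
        return count

    # Precompute the answer for every byte value: 256 entries of extra memory,
    # but each later query becomes a handful of table lookups.
    POPCOUNT_TABLE = [popcount_recompute(i) for i in range(256)]

    def popcount_lookup(x: int) -> int:
        """Count set bits in a 32-bit value one byte at a time using the table."""
        return (POPCOUNT_TABLE[x & 0xFF]
                + POPCOUNT_TABLE[(x >> 8) & 0xFF]
                + POPCOUNT_TABLE[(x >> 16) & 0xFF]
                + POPCOUNT_TABLE[(x >> 24) & 0xFF])

    assert popcount_lookup(0b10110010) == popcount_recompute(0b10110010) == 4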

Database indexes vs. table scans

Database management systems offer the capability to create database index data structures. Indexes improve the speed of lookup operations at the cost of additional space. Without an index, time-consuming full table scans are sometimes required to locate desired data.
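
The following minimal Python sketch shows the same tradeoff, with an in-memory list of rows standing in for a real table (all names here are invented): a full scan needs no extra space but examines every row, while a dictionary "index" costs additional memory and answers equality lookups almost immediately.

    rows = [
        {"id": 1, "name": "Ada"},
        {"id": 2, "name": "Grace"},
        {"id": 3, "name": "Alan"},
    ]

    def find_by_name_scan(name):
        """Full 'table scan': no extra space, but O(n) time per query."""
        return [r for r in rows if r["name"] == name]

    # Building the 'index' costs memory proportional to the table size...
    name_index = {}
    for position, row in enumerate(rows):
        name_index.setdefault(row["name"], []).append(position)

    def find_by_name_indexed(name):
        """Indexed lookup: extra space, but near-constant time per query."""
        return [rows[p] for p in name_index.get(name, [])]

    assert find_by_name_scan("Grace") == find_by_name_indexed("Grace")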

Compressed vs. uncompressed data

A space–time trade-off can be applied to the problem of data storage. If data is stored uncompressed, it takes more space but access takes less time than if the data were stored compressed (since compressing the data reduces the amount of space it takes, but running the decompression algorithm takes time). Depending on the particular instance of the problem, either approach can be practical. There are also rare instances where it is possible to work directly with compressed data, such as compressed bitmap indices, where it can be faster to work with the compressed form than without compression.
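
A small Python sketch of this tradeoff, using the standard-library zlib module (the repetitive payload is invented for the example): the compressed copy occupies far less space, but reading it back requires spending CPU time on decompression.

    import zlib

    payload = b"space-time tradeoff " * 10_000   # highly repetitive, so it compresses well

    compressed = zlib.compress(payload)
    print(len(payload), len(compressed))         # the uncompressed copy uses far more space

    # The uncompressed copy can be used directly; the compressed copy must
    # first pay the time cost of decompression.
    assert zlib.decompress(compressed) == payload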

Re-rendering vs. stored images

Storing only the SVG source of a vector image and rendering it as a bitmap image every time the page is requested would be trading time for space; more time used, but less space. Rendering the image when the page is changed and storing the rendered images would be trading space for time; more space used, but less time. This technique is more generally known as caching.
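
A minimal Python sketch of this caching idea, with render_page standing in for an expensive SVG-to-bitmap rendering step (the function and its input are invented for the example):

    from functools import lru_cache

    @lru_cache(maxsize=128)           # keeps up to 128 rendered results in memory (space)
    def render_page(source: str) -> str:
        # Stand-in for an expensive rendering step, e.g. rasterising an SVG.
        return source.upper()

    render_page("<svg>...</svg>")     # first call pays the rendering time
    render_page("<svg>...</svg>")     # repeated call is answered from the cache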

Smaller code vs. loop unrolling

Larger code size can be traded for higher program speed when applying loop unrolling. This technique makes the code longer for each iteration of a loop, but saves the computation time required for jumping back to the beginning of the loop at the end of each iteration.
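
The shape of the transformation can be sketched in Python (purely illustrative: optimising compilers for lower-level languages usually unroll loops automatically, and the gain in interpreted Python is modest at best):

    def total_rolled(values):
        s = 0
        for v in values:              # loop-control overhead paid once per element
            s += v
        return s

    def total_unrolled(values):
        s = 0
        i, n = 0, len(values)
        while i + 4 <= n:             # the longer body handles four elements per pass,
            s += values[i] + values[i + 1] + values[i + 2] + values[i + 3]
            i += 4                    # so the jump back to the loop top happens 4x less often
        while i < n:                  # any leftover elements
            s += values[i]
            i += 1
        return s

    assert total_rolled(list(range(10))) == total_unrolled(list(range(10))) == 45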

Other examples

Algorithms that also make use of space–time tradeoffs include:

  • Baby-step giant-step algorithm for calculating discrete logarithms (a minimal sketch follows this list)
  • Rainbow tables in cryptography, where the adversary is trying to do better than the exponential time required for a brute-force attack. Rainbow tables use partially precomputed values in the hash space of a cryptographic hash function to crack passwords in minutes instead of weeks. Decreasing the size of the rainbow table increases the time required to iterate over the hash space.
  • The meet-in-the-middle attack uses a space–time tradeoff to find the cryptographic key in only 2^(n+1) encryptions (and O(2^n) space) versus the expected 2^(2n) encryptions (but only O(1) space) of the naive attack.
  • Dynamic programming, where the time complexity of a problem can be reduced significantly by using more memory.
  • Time/memory/data tradeoff attack which uses the space–time tradeoff with the additional parameter of data.
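
As an example of the first item above, the following Python sketch implements the baby-step giant-step algorithm with small, illustrative parameters: it spends roughly O(√n) memory on a table of "baby steps" so that a discrete logarithm can be found in roughly O(√n) multiplications instead of the O(n) of trying every exponent.

    from math import isqrt

    def discrete_log(g: int, h: int, p: int):
        """Find x with g**x == h (mod p) for prime p, or None if no solution exists."""
        m = isqrt(p - 1) + 1
        # Baby steps: store g^j for j = 0..m-1 (the memory half of the tradeoff).
        baby = {pow(g, j, p): j for j in range(m)}
        # Giant steps: compare h * (g^-m)^i against the table (the time half).
        factor = pow(g, -m, p)        # modular inverse of g^m (Python 3.8+)
        gamma = h % p
        for i in range(m):
            if gamma in baby:
                return i * m + baby[gamma]
            gamma = (gamma * factor) % p
        return None

    # 3 is a primitive root modulo 101, so every nonzero value has a logarithm.
    x = discrete_log(3, 37, 101)
    assert pow(3, x, 101) == 37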

References

  1. Hellman, Martin (July 1980). "A Cryptanalytic Time-Memory Tradeoff". IEEE Transactions on Information Theory. 26 (4): 401–406. CiteSeerX 10.1.1.120.2463. doi:10.1109/tit.1980.1056220. S2CID 552536.
