In computer science, overhead is any combination of excess or indirect computation time, memory, bandwidth, or other resources that are required to perform a specific task. It is a special case of engineering overhead. Overhead can be a deciding factor in software design, with regard to structure, error correction, and feature inclusion. Examples of computing overhead may be found in functional programming, data transfer, and data structures.
A programmer or software engineer may have a choice of several algorithms, encodings, data types, or data structures, each of which has known characteristics. When choosing among them, their respective overhead should also be considered.
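As a minimal sketch of such a trade-off (CPython-specific sizes, chosen purely for illustration): a set offers constant-time membership tests, but its hash table carries more per-element memory overhead than a plain list holding the same elements.

```python
import sys

# Compare the memory footprint of two data structures holding the
# same 1000 integers. The exact byte counts are CPython
# implementation details, not part of the language specification.
items_list = list(range(1000))
items_set = set(items_list)

print(sys.getsizeof(items_list), "bytes for the list")
print(sys.getsizeof(items_set), "bytes for the set")
# The set's faster O(1) lookups are paid for with extra memory
# overhead for its hash table.
```

Whether that overhead matters depends on the workload: for frequent membership tests the set usually wins despite the larger footprint.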
In software engineering, overhead can influence the decision whether to include features in new products, or indeed whether to fix bugs. A feature with high overhead may not be included, or may need a strong financial incentive to justify it. Often, even though software providers are well aware of bugs in their products, the cost of fixing them is not worth the payoff, because of the overhead.
Algorithmic complexity is generally specified using big O notation. This says nothing about how long an algorithm takes to run or how much memory it uses, but only about how those quantities grow with the size of the input. Overhead is deliberately excluded from this measure, since it varies from one machine to another, whereas the fundamental growth rate of an algorithm's running time does not.
This should be contrasted with algorithmic efficiency, which takes into account all kinds of resources: a combination (though not a trivial one) of complexity and overhead.
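The distinction can be sketched with two hypothetical cost models (the constant factors here are invented for illustration): an O(n) algorithm with a large per-item overhead, and an O(n²) algorithm with almost none. Big O notation hides the constants, yet for small inputs they decide which algorithm is actually cheaper.

```python
def cost_a(n):
    """O(n) growth, but with a large constant-factor overhead."""
    return 1000 * n

def cost_b(n):
    """O(n^2) growth, but with negligible overhead."""
    return n * n

# For small inputs the asymptotically "worse" algorithm wins:
print(cost_a(10), cost_b(10))        # 10000 vs 100
# Beyond the crossover point, complexity dominates overhead:
print(cost_a(10_000), cost_b(10_000))  # 10000000 vs 100000000
```

This is why efficiency in practice must weigh both terms, while complexity analysis alone deliberately ignores the constants.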
Reliably sending a payload of data over a communications network requires sending more than just the payload itself. It also involves sending various control and signalling data (e.g., TCP and IP headers) required to reach the destination. This creates so-called protocol overhead, as the additional data does not contribute to the intrinsic meaning of the message.
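A minimal sketch of this overhead for a single TCP/IPv4 segment, assuming minimum header sizes (no options) and a typical 1460-byte payload filling a 1500-byte Ethernet MTU:

```python
# Minimum header sizes per the IPv4 and TCP specifications;
# the payload size is an assumed typical value for illustration.
IP_HEADER = 20    # minimum IPv4 header, bytes
TCP_HEADER = 20   # minimum TCP header, bytes
PAYLOAD = 1460    # assumed payload per segment, bytes

total = PAYLOAD + IP_HEADER + TCP_HEADER
overhead_pct = (total - PAYLOAD) / PAYLOAD * 100
print(f"{overhead_pct:.2f}% protocol overhead")  # about 2.74%
```

For large segments the overhead is modest; for very small payloads (e.g., a single keystroke in an interactive session) the same 40 bytes of headers can dwarf the payload itself.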
The encoding of information and data introduces overhead too. The date and time "2011-07-12 07:18:47" can be expressed as Unix time with the 32-bit signed integer
1310447927, consuming only 4 bytes. Represented as an ISO 8601 formatted, UTF-8 encoded string,
2011-07-12 07:18:47, the date would consume 19 bytes, a size overhead of 375% over the binary integer representation. As XML this date can be written as follows with an overhead of 218 characters, while adding the semantic context that it is a CHANGEDATE with index 1.
<?xml version="1.0" encoding="UTF-8"?>
<DATETIME qualifier="CHANGEDATE" index="1">
  <YEAR>2011</YEAR>
  <MONTH>07</MONTH>
  <DAY>12</DAY>
  <HOUR>07</HOUR>
  <MINUTE>18</MINUTE>
  <SECOND>47</SECOND>
</DATETIME>
The 349 bytes of the resulting UTF-8 encoded XML correspond to a size overhead of 8625% over the original integer representation.
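The figures above can be recomputed directly. The sketch below packs the timestamp as a 32-bit signed big-endian integer, encodes the string form as UTF-8, and takes the 349-byte XML size as stated:

```python
import struct

# The same instant in three encodings, smallest to largest.
binary = struct.pack(">i", 1310447927)        # 32-bit signed int: 4 bytes
text = "2011-07-12 07:18:47".encode("utf-8")  # ISO 8601-style string: 19 bytes
XML_SIZE = 349                                # byte count of the XML document above

string_overhead = (len(text) - len(binary)) / len(binary) * 100
xml_overhead = (XML_SIZE - len(binary)) / len(binary) * 100
print(string_overhead, xml_overhead)  # 375.0 8625.0
```

The trade-off is not one-sided: each larger encoding buys something, whether human readability or, in the XML case, explicit semantic context.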