There are a number of ways to approach the concept of complexity, none, I think, inherently any better than any other.

A simple cybernetics approach would simply count the number of parts, the number of types of parts, and the number of types of relationships between the parts of the system. Game theory would view those relationships as rules which govern the parameters of the game and which in turn constrain the strategies of the causal parts of the system, called players in the game. General systems theory would go a step further, applying the cybernetic approach within the system itself and the game-theoretic approach to the system interacting with its larger environment (psychology within sociology, sociology within ecology, etc.).
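To make the cybernetic counting concrete, here is a minimal Python sketch that tallies those three numbers for a made-up thermostat-and-room system. All of the part names, types, and relations are invented purely for illustration; the point is only what gets counted.

    # A toy system described as parts with types and typed relationships.
    parts = {
        "thermostat": "sensor",
        "heater": "actuator",
        "room": "environment",
        "person": "agent",
    }

    # typed relationships between parts: (source, target, relation_type)
    relations = [
        ("thermostat", "heater", "controls"),
        ("heater", "room", "heats"),
        ("room", "thermostat", "feeds_back_to"),
        ("person", "thermostat", "adjusts"),
    ]

    # the three crude cybernetic counts
    num_parts = len(parts)
    num_part_types = len(set(parts.values()))
    num_relation_types = len({rel for _, _, rel in relations})

    print(f"parts: {num_parts}, part types: {num_part_types}, "
          f"relation types: {num_relation_types}")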

But these are older perspectives. The newer nonlinear-dynamics folks, as characterized by the Santa Fe group, would be more likely to start from an information-theory perspective on complexity. If you're trying to measure the complexity of a given system, you would speak in terms of its compressibility. In other words, how simple an algorithmic description of the system can you reduce it to while still doing the phenomena justice? If it's a simple repeating pattern adequately captured by a linear equation, then it's not very complex. This view has the interesting consequence that purely random signals are incredibly complex.
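As a rough illustration of the compressibility idea, the sketch below uses zlib as a crude stand-in for "the shortest algorithmic description we can find" and compares a repeating pattern against random bytes. This is only a proxy for that notion of complexity, not a real measure of it.

    import os
    import zlib

    n = 100_000
    repeating = b"ab" * (n // 2)      # simple repeating pattern
    random_bytes = os.urandom(n)      # "pure noise"

    for label, data in [("repeating", repeating), ("random", random_bytes)]:
        ratio = len(zlib.compress(data, 9)) / len(data)
        print(f"{label:9s} compresses to {ratio:.1%} of its original size")

The repeating pattern shrinks to a tiny fraction of its size, while the random bytes barely compress at all, which is exactly the odd consequence mentioned above: by this yardstick, noise counts as maximally complex.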

In a computer science context, "complexity" usually refers to the "cost" of an algorithm or data structure. There are two main forms of complexity: time complexity (how long it takes to run) and space complexity (how much memory it needs). Furthermore, attention is usually focused on the asymptotic complexity in relation to the number n of objects handled, for large n. Constant factors are by and large ignored, because they vary with the execution environment (mainly hardware), not with the problem. But if the time your algorithm takes increases exponentially with n, then you are fucked for large n, no matter how fast your hardware is. This is usually expressed using big-oh notation.
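To see why the growth rate matters more than constant factors, here is a small sketch comparing a linear-time and an exponential-time way of computing the same Fibonacci numbers. The function names and the choice of example are mine, not anything standard.

    import time

    def fib_linear(n: int) -> int:
        """Iterative version: time grows linearly with n, i.e. O(n)."""
        a, b = 0, 1
        for _ in range(n):
            a, b = b, a + b
        return a

    def fib_exponential(n: int) -> int:
        """Naive recursion: the call tree roughly doubles per step, i.e. O(2^n)."""
        if n < 2:
            return n
        return fib_exponential(n - 1) + fib_exponential(n - 2)

    for n in (10, 20, 30):
        t0 = time.perf_counter()
        fib_exponential(n)
        slow = time.perf_counter() - t0

        t0 = time.perf_counter()
        fib_linear(n)
        fast = time.perf_counter() - t0
        print(f"n={n:2d}  O(2^n): {slow:.4f}s   O(n): {fast:.6f}s")

No amount of faster hardware rescues the exponential version for large n; it only shifts the point at which it becomes hopeless.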

Finally, there are different measures of complexity:

  • worst case complexity: you assume that the input is always such that your algorithm performs at its worst. This is the most common form, because of Finagle's Law.
  • average case complexity: you assume that your input is "average"; in most cases this is interpreted as random. The problem is that this interpretation is often wrong.
  • amortized complexity: a mix of the above, where you prove that the worst case cannot happen many times in a row, and instead bound the average cost per operation over a whole sequence of operations (see the sketch after this list).
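
As a sketch of the amortized case, consider the classic growable array that doubles its capacity when full: a single append is occasionally expensive (it copies everything), but the total copying over n appends stays proportional to n, so the amortized cost per append is constant. The doubling policy here is an assumption for illustration, not a claim about any particular library.

    def count_copies(n_appends: int) -> int:
        """Count element copies made by a doubling growable array."""
        capacity = 1
        size = 0
        copies = 0                    # elements moved during resizes
        for _ in range(n_appends):
            if size == capacity:      # worst case: array is full, must resize
                copies += size        # copy every existing element
                capacity *= 2
            size += 1                 # the append itself, O(1)
        return copies

    for n in (1_000, 10_000, 100_000):
        copies = count_copies(n)
        print(f"{n:6d} appends -> {copies} copied elements "
              f"({copies / n:.2f} copies per append, bounded by 2)")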

Com*plex"i*ty, n.; pl. Complexities. [Cf. F. complexité.]

1. The state of being complex; intricacy; entanglement.

   The objects of society are of the greatest possible complexity. --Burke.

2. That which is complex; intricacy; complication.

   Many-corridored complexities / Of Arthur's palace. --Tennyson.

© Webster 1913.
