> Entropy. Complexity equals the entropy, or disorder, of a system as
> measured by thermodynamics.
I don't buy this one.
>
> Information. Complexity equals the capacity of a system to "surprise",
> or inform, an observer.
This may get somewhere with further refinement. Taking it as the minimum
amount of information required to describe a system may take it a bit
further.
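As a rough sketch of that "minimum description" idea: true minimum description length (Kolmogorov complexity) is uncomputable, but compressed size is a crude, computable stand-in. This is purely my own illustration, with made-up names and data, not anything from the definitions above:

```python
import random
import zlib

def description_length(data: bytes) -> int:
    """Crude proxy for minimum description length: compressed size in bytes."""
    return len(zlib.compress(data, 9))

# A highly regular "system" admits a very short description...
ordered = b"AB" * 500

# ...while a pseudo-random one of the same size resists compression.
noisy = random.Random(0).randbytes(1000)

print(description_length(ordered))  # much smaller than...
print(description_length(noisy))    # ...this
```

On this reading, the regular system counts as simple and the noisy one as complex, which already shows the rub: pure noise scores as maximally "complex" here, which is part of why the effective-complexity camp wants to count regularity instead.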
> Fractal Dimension. The "fuzziness" of a system, the degree of detail it
> displays at smaller and smaller scales.
To me, increasing complexity means systems interacting at larger and larger
scales. If you reduce far enough, they all start to look very similar. Toss
this one.
> Effective Complexity. The degree of "regularity" (rather than randomness)
> displayed by a system.
This doesn't work very well.
> Hierarchical Complexity. The diversity displayed by the different levels
> of a hierarchically structured system.
There may be something to this, but how does one evaluate it?
> Grammatical Complexity. The degree of universality of the language required
> to describe a system.
> Thermodynamic Depth. The amount of thermodynamic resources required to put a
> system together from scratch.
> Time Computational Complexity. The time required for a computer to describe
> a system (or solve a problem).
> Spatial Computational Complexity. The amount of computer memory required
> to describe a system.
These four seem similar and have some potential.
> Mutual Information. The degree to which one part of a system contains
> information on, or resembles, other parts.
This is redundancy to me.
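For what it's worth, the "contains information on other parts" reading can be made concrete: mutual information between two parts measures how much knowing one part tells you about the other, and it is maximal exactly when the parts are redundant copies. A small sketch of my own (the data and names are hypothetical, not from the discussion):

```python
from collections import Counter
from math import log2

def mutual_information(pairs):
    """Estimate I(X;Y) = sum over (x,y) of p(x,y) * log2(p(x,y) / (p(x)*p(y)))
    from a list of (x, y) samples."""
    n = len(pairs)
    joint = Counter(pairs)                 # empirical joint distribution
    px = Counter(x for x, _ in pairs)      # marginal of the first part
    py = Counter(y for _, y in pairs)      # marginal of the second part
    return sum(c / n * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in joint.items())

# Two parts that mirror each other exactly: fully redundant.
copies = [(b, b) for b in (0, 1, 0, 1, 1, 0, 1, 0)]

# Two parts that vary independently: no redundancy.
independent = [(0, 0), (0, 1), (1, 0), (1, 1)] * 2

print(mutual_information(copies))       # 1.0 bit
print(mutual_information(independent))  # 0.0 bits
```

So "mutual information" and "redundancy" really are two names for the same quantity here, which supports the objection: it measures how much the parts repeat each other, not how complex the whole is.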
>
> I'm not sure the definition I offered corresponds to any of the ones above.
I don't think so, but I like your inclusion of efficiency in the concept,
because we are dealing with an evolutionary model and evolution selects
for efficiency.
Duane