Antifragility

Antifragility is a property of systems that increase in capability, resilience, or robustness as a result of stressors, shocks, volatility, noise, mistakes, faults, attacks, or failures. The concept was developed by Professor Nassim Nicholas Taleb in his book Antifragile. As Taleb explains, antifragility is fundamentally different from resilience (the ability to recover from failure) and robustness (the ability to resist failure). The concept has been applied in risk analysis, physics, molecular biology, transportation planning, engineering, and computer science.

Antifragile versus robust/resilient
In his book, Taleb stresses the differences between antifragile and robust/resilient. "Antifragility is beyond resilience or robustness. The resilient resists shocks and stays the same; the antifragile gets better."

Antifragile versus adaptive/cognitive
An adaptive system is one that changes its behavior based on information available at run time (as opposed to having its behavior fixed during system design). This characteristic is sometimes referred to as cognitive. While adaptive systems allow for robustness under a variety of scenarios (often unknown during system design), they are not necessarily antifragile. In other words, an adaptive system is one that remains robust in environments that were unknown at design time, whereas an antifragile system actually improves when exposed to volatility in its environment.
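The distinction can be made concrete with a toy sketch (the class and method names below are illustrative, not from any standard library): the adaptive system reroutes around observed slowness at run time, while the antifragile variant additionally grows capacity wherever stress is observed, ending up stronger than before the stressor.

```python
class AdaptiveService:
    """Adaptive: picks the currently fastest backend at run time,
    but its overall capability never grows."""

    def __init__(self, backends):
        self.latency = {b: 1.0 for b in backends}  # observed latencies

    def handle(self, request):
        # Route to the backend with the lowest observed latency.
        backend = min(self.latency, key=self.latency.get)
        self.latency[backend] = request(backend)  # record new observation
        return backend


class AntifragileService(AdaptiveService):
    """Antifragile: each stressor (a slow response) triggers
    provisioning of an extra replica, so capability increases
    as a direct result of the stress."""

    def __init__(self, backends):
        super().__init__(backends)
        self.replicas = {b: 1 for b in backends}

    def handle(self, request):
        backend = super().handle(request)
        if self.latency[backend] > 1.5:      # stressor detected
            self.replicas[backend] += 1      # grow capacity here
            self.latency[backend] /= self.replicas[backend]
        return backend
```

Both classes adapt their routing; only the second converts the shock into a permanent gain in capacity, which is the distinguishing property.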

Applications of antifragility
The concept has been applied in physics, risk analysis, molecular biology, transportation planning, engineering, megaproject management, and computer science.

In computer science, there is a structured proposal for an "Antifragile Software Manifesto", a reaction to traditional approaches to system design. The central idea is to achieve antifragility by design: to build systems that improve in response to input, including stressors, from their environment.
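One minimal sketch of "antifragility by design" (a toy illustration, not code from the manifesto itself) is a wrapper that permanently hardens a component after every observed fault: each failing input is recorded and rejected up front thereafter, so attacks and failures leave the component stronger.

```python
class FailureHardenedFn:
    """Toy antifragility by design: every input that makes the
    wrapped function fail is remembered and rejected up front,
    so each fault permanently hardens the component."""

    def __init__(self, fn):
        self.fn = fn
        self.known_bad = set()  # grows with every observed failure

    def __call__(self, x):
        if x in self.known_bad:
            # The component has already learned from this stressor.
            raise ValueError(f"rejected known-bad input: {x!r}")
        try:
            return self.fn(x)
        except Exception:
            self.known_bad.add(x)  # learn from the fault
            raise
```

The first bad input still fails, but the failure is converted into a lasting defense, which is the sense in which the system "improves from its environment's input".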

Criticisms
Kovalenko and Sornette have argued that antifragile systems do not exist. In general, for systems subjected to variability, noise, shocks, and other random perturbations, it is possible to develop strategies or designs that, on average, benefit from variability, but not from any variability: such strategies are designed to profit from the variability of particular stressors while remaining vulnerable to others. Refusing to accept this fundamental characteristic (or intrinsic weakness), shared by any strategy or system, is dangerous, as it may lead to unexpected shocks or intentional manipulation by insiders. For instance, in the financial sphere, antifragility is a name for the exploitation of a situation that turns losses for most into gains for some by special design (a put option strategy), which is, however, vulnerable to unanticipated occurrences. Moreover, the so-called antifragile strategy can contain the germs of large externalities, leading to systemic crises for which neither the strategy itself nor the system is prepared.