The book provides a perspective on how not to design systems, drawn from systems engineering failures. The primary principle is that large, complex systems are extremely difficult to design correctly, even with the best intentions. It is therefore better to design smaller, less complex systems, and to build them incrementally, with close and continual attention to user needs and measures of effectiveness.
Quotes from the book:
Systems tend to oppose their own proper function.
A complex system cannot be made to work; it either works or it doesn't.
A complex system designed from scratch never works and cannot be patched up to make it work.
A complex system that works is invariably found to have evolved from a simple system that works.
People in systems do not do what the system says they are doing.
The system itself does not do what it says it is doing.
Problems are not the problem, coping is the problem.
In a closed system information tends to decrease and hallucinations tend to increase.
A system that ignores feedback has already begun the process of terminal instability.
To those within the system, the outside reality seems to pale and disappear.
Almost anything is easier to get into than out of.
Loose systems last longer and function better.
Plan to scrap the first system; you will anyway.
The future is no more predictable now than it was in the past, but you can at least take note of trends.
The system is altered by the probe used to test it. The probe is altered also.
In order to remain unchanged, the system must change.
Systems tend to malfunction conspicuously just after their greatest triumph.
The ghost of the old system continues to haunt the new.