Throughout history, humanity has understood the world primarily as built of physical structures (e.g., planets, trees, oceans) about which we could compile information. The field of “information theory” began to treat information as an entity in its own right. Today we operate in a world of informational entities. Some, like smart contracts on blockchains, are inherently informational. Others, like corporations and communities, are hybrids composed of people and the structure of relationships between them.
We see intuitively that informational patterns recur across domains. For example, consider the “cellular architecture” pattern, in which large structures are built from many interchangeable, much smaller independent entities. We see this pattern in: biology, where organisms are built from individual cells; information technology, where data centers are built from individual computers; business, where corporations are composed of individual workers; government, where nations are constituted of individual citizens; and ecology, where populations are composed of individual plants and animals. We can theorize that the pattern recurs because it solves a common underlying informational problem that appears across these domains, such as how to build a structure that is resistant to entropy (damage).
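To make the entropy-resistance argument concrete, here is a minimal toy simulation of that intuition. It compares a “monolithic” structure (one critical part) with a “cellular” one (many interchangeable parts, only a fraction of which are needed) under the same random damage rate. All names and the failure model are illustrative assumptions, not part of any established BIT formalism.

```python
import random

def survival_probability(units, units_needed, damage_rate, trials=10_000, seed=0):
    """Estimate the chance a structure survives random damage.

    Illustrative model: a structure of `units` interchangeable parts
    survives if at least `units_needed` parts remain after each part
    independently fails with probability `damage_rate`.
    """
    rng = random.Random(seed)
    survived = 0
    for _ in range(trials):
        alive = sum(1 for _ in range(units) if rng.random() > damage_rate)
        if alive >= units_needed:
            survived += 1
    return survived / trials

# Monolithic: one critical part, which must survive.
monolithic = survival_probability(units=1, units_needed=1, damage_rate=0.1)

# Cellular: 100 interchangeable parts, only half needed.
cellular = survival_probability(units=100, units_needed=50, damage_rate=0.1)
```

Under a 10% damage rate, the monolithic structure fails roughly 10% of the time, while the cellular structure survives almost always: redundancy among small interchangeable units converts individual fragility into collective robustness, which is one candidate explanation for why the pattern recurs.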
To date, we have not grappled with the problem of identifying and cataloging the broader architecture of informational patterns that recur across different academic and engineering domains. Information theory originally centered on narrow properties of signal propagation. Quantum information theory extended the analysis to quantum systems, which are inherently informational, but did not look across domains. In this talk we introduce a new field, “broad information theory” (BIT), whose objective is to identify, catalog, and understand such informational patterns across domains. Doing so lets us systematically transfer knowledge between domains, opening the way to new breakthrough insights.