Probability theory in "Stochastics as Physics"
Preview of Chapter 2 of my book in preparation
I have now moved on to the second chapter of my book “Stochastics as Physics”, which I introduced in an earlier post:
Thankfully, readers of that post offered several constructive comments, which I have addressed in the new version of Chapter 1. I have not written the Acknowledgments section yet but, when I do, I will include my thanks to them. I have also added Chapter 2, which is an adapted version of my other book, discussed in another earlier post:
I am uploading in this post both chapters 1 and 2:
Chapter 2, entitled “Basic concepts of probability”, is a heavy read, as
it covers the entire theory of probability in 74 pages, with tough conceptual and mathematical content (including several derivations and 213 numbered equations);
it does not follow the standard outline of similar textbooks, but one that is pertinent to the issues that will be visited in the next chapters;
it introduces entropy in a radically different way from those commonly used, so that entropy becomes a fundamental concept in the foundation of probability per se.
On the other hand, it includes numerous explanatory and historical notes in several sections and digressions, which hopefully are more entertaining and enlightening reads. From these, I highlight here two Digressions.
The first Digression refers to the Aristotelian notion of sapheneia, in order to show that, while probability deals with uncertain things, it does so with clarity, accuracy and rigour.
Digression 2.A: What is sapheneia?
It is stunning that, before Kolmogorov, the concept of probability was in wide use for almost three centuries, since its introduction by Jacob Bernoulli, without a proper definition. Earlier definitions were problematic (e.g., affected by circular logic). For this reason, they are not discussed here, but the interested reader can find them in any probability book.
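For contrast, Kolmogorov’s definition can be recalled here in its standard textbook form (the notation below is the conventional one, not necessarily that of the book):

```latex
% Kolmogorov (1933): probability is a normalized measure P defined on a
% probability space (\Omega, \Sigma, P), where \Omega is the sample space
% and \Sigma a \sigma-algebra of its subsets (the events), satisfying:
P(A) \ge 0 \quad \forall A \in \Sigma, \qquad
P(\Omega) = 1, \qquad
P\bigg(\bigcup_{i=1}^{\infty} A_i\bigg) = \sum_{i=1}^{\infty} P(A_i)
\quad \text{for mutually exclusive } A_i \in \Sigma .
```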
One may notice the modern world’s recent disrespect for clarity in science, which also affects definitions. This disrespect is “theorized” in the following statement by Mandelbrot (1999, p. 14):1
Let me argue that this situation [absence of a definition] ought not create concern and steal time from useful work. Entire fields of mathematics thrive for centuries with a clear but evolving self-image, and nothing resembling a definition.
Perhaps the reason why modern science prefers fuzziness over clarity is its strengthening links to politics and finance. Fuzziness indeed serves contemporary politics better. On the other hand, fuzziness per se has been theorized in modern fuzzy set theory, which, however, is one of the several modern reinventions of probability.
Probability and stochastics try to replace fuzziness with rigour in fields where uncertainty dominates. Therefore, probability needs a rigorous definition per se, and this has been provided by Kolmogorov. The Moscow School of Mathematics, and in particular its founders Dimitri Egorov and Nikolai Luzin (the latter being Kolmogorov’s mentor), had an approach opposite to Mandelbrot’s. This is vividly expressed by the following note by Luzin, quoted by Graham (2011):2
Each definition is a piece of secret ripped from Nature by the human spirit. I insist on this: any complicated thing, being illumined by definitions, being laid out in them, being broken up into pieces, will be separated into pieces completely transparent even to a child, excluding foggy and dark parts that our intuition whispers to us while acting; only by separating into logical pieces can we move further, towards new successes due to definition.
In fact, Luzin’s approach was formed much earlier, in the first steps of the development of science. Aristotle promoted sapheneia (σαφήνεια)3, which includes clarity and is also related to the accurate accounting of the phenomena and the attainment of accurate scientific knowledge (Lesher, 2010)4. Aristotle clearly linked sapheneia with truth:
We must always endeavor, from statements that are true but not clearly [οὐ σαφῶς] expressed, to arrive at a result that is both true and clear [σαφῶς] (Aristotle, Eudemian Ethics 1220a).5
The importance Aristotle gave to sapheneia can be seen in the way he likened thinkers who do not practice it to untrained soldiers:
These thinkers […] seem to have grasped […] the causes […] only vaguely and indefinitely [ἀμυδρῶς καὶ οὐθὲν σαφῶς]. They are like untrained soldiers in a battle, who rush about and often strike good blows, but without science; in the same way these thinkers do not seem to understand their own statements, since it is clear that upon the whole they seldom or never apply them (Aristotle, Metaphysics 985a).6
The introduction of terminology, i.e., of sophisticated terms (which either do not exist in the colloquial language or exist with a loose meaning) and their definitions, is another reflection of the sapheneia desideratum. Note that, in Greek, the words for term and definition have a common origin (ὅρος and ὁρισμός, respectively), and Aristotle sometimes used the two interchangeably, perhaps reflecting the fact that a term without a definition is not a proper term. He emphasized the need to name scientific concepts:
Now most of these [concepts] have no names, and we must try […] to invent names ourselves for the sake of clarity [σαφήνεια] and ease to follow (Aristotle, Nicomachean Ethics, 1108a).7
Furthermore, Aristotle gave credit to Socrates for the introduction of definitions and emphasized that the need for them is linked to the use of abstract theoretical concepts rather than of sensible things:
Socrates, disregarding the physical universe and confining his study to moral questions, sought in this sphere for the universal and was the first to concentrate upon definitions [ὁρισμῶν]. [Plato] followed him and assumed that the problem of definition is concerned not with any sensible thing but with entities of another kind; for the reason that there can be no general definition [ὅρος] of sensible things which are always changing (Aristotle, Metaphysics 1.987b).8
The importance of names, especially in mathematics, has been emphasized by Graham (2011), who asserted that naming plays an essential role because mathematical objects that have not yet been named are difficult to work with. For mathematicians, naming is the path to gaining control over the objects they conceive. In their book Naming Infinity, Graham and Kantor (2009)9 gave a detailed account of how the naming of abstract concepts contributed to the development of the Moscow School of Mathematics and the founding of descriptive set theory, which in turn gave birth to the modern definition of probability and the development of stochastics.
The second Digression highlighted in this post tries to show that the most common perceptions and interpretations of entropy are wrong and should be abandoned.
Digression 2.G: On different interpretations of entropy
In the public perception, entropy is a negative notion, typically identified with disorganization, disorder, decadence, decay, deterioration etc. (Koutsoyiannis and Sargentis, 2021)10. This misleading perception has its roots in the scientific community, albeit not with the founders of the concept (except one, as we shall see below). Boltzmann did not identify entropy with disorder, even though he used ‘disorder’ in a footnote appearing in two of his papers (Boltzmann, 1897, 1901)11, speaking about the
agreement of the concept of entropy with the mathematical expression of the probability or disorder of a motion.
Clearly, he referred to the irregular motion of molecules in the kinetic theory of gases, for which his expression makes perfect sense. Boltzmann also used the notion of disorder with the same meaning in his Lectures on Gas Theory (Boltzmann, 1896/1898).12 On the other hand, Gibbs (1902)13, Shannon (1948)14 and von Neumann (1956)15 did not use the terms disorder or disorganization at all.
One of the earliest uses of the term disorder is in a paper by Darrow (1944)16, in which he stated:
The purpose of this article has been to establish a connection between the subtle and difficult notion of entropy and the more familiar concept of disorder. Entropy is a measure of disorder, or more succinctly yet, entropy is disorder: that is what a physicist would like to say.
Epistemologically, it is interesting that a physicist preferred the “more familiar” but fuzzy concept of disorder over the “subtle and difficult”, yet well-defined at his time, concept of entropy.
However, it appears that Wiener (1948b)17 was the most influential scientist to support the disorder interpretation. In his keynote speech at the New York Academy of Sciences, he declared:
Information measures order and entropy measures disorder.
Additionally, in his influential book Cybernetics (Wiener, 1948a, p. 11)18, he stated that
the entropy of a system is a measure of its degree of disorganization
wherein he replaced the term “disorder” with “disorganization”, since in that book he used the former term extensively for mental illness.
Even in the 21st century, the disorder interpretation is dominant. For example, Chaitin (2002)19 stated:
Entropy measures the degree of disorder, chaos, randomness, in a physical system. A crystal has low entropy, and a gas (say, at room temperature) has high entropy.
More recently, Bailey (2009)20 claimed:
As a preliminary definition, entropy can be described as the degree of disorder or uncertainty in a system. If the degree of disorder is too great (entropy is high), then the system lacks sustainability. If entropy is low, sustainability is easier. If entropy is increasing, future sustainability is threatened.
It is worth remarking that in the latter quotations disorder was used as equivalent to uncertainty or randomness, where the latter two terms are in essence identical (Koutsoyiannis, 2010).21 Furthermore, the claim that a high-entropy system lacks sustainability is puzzling, given that the highest entropy occurs when a system is in the most probable, and hence most stable, state (cf. Moore, 2003).22
Interestingly, Atkins (2003)23 also explained entropy as disorder. Additionally, he noted:
That the world is getting worse, that it is sinking purposelessly into corruption, the corruption of the quality of energy, is the single great idea embodied in the Second Law of thermodynamics.
Inevitably, the notion of entropy is hard to grasp, the main reason being that our education is based on the deterministic paradigm and produces a mindset reluctant to incorporate stochastic concepts. The determinist mindset regards order as a friendly concept; thus, whatever is defined as its opposite appears in a negative light.
However, the notions of order and disorder are less appropriate and less rigorous as scientific terms, and more appropriate for describing mental states (as in Wiener’s use described above; cf. personality disorder, stress disorder, bipolar disorder, mental disorder), and even more so for describing socio-political states. The latter is manifest in the frequent use of expressions such as “world order”, “new order”, “new world order”, “global order”, etc., in political texts (Koutsoyiannis and Sargentis, 2021).
In one of the earliest critiques of the disorder interpretation of entropy, Wright (1970)24 made a plea for moderation in the use of “intuitive qualitative ideas concerning disorder”. More recently, with a more absolute tone, Leff (2012)25 stated:
The too commonly used disorder metaphor for entropy is roundly rejected.
Furthermore, in an even more recent article, Styer (2019)26 stated:
we cannot stop people from using the word “entropy” to mean “disorder” or “destruction” or “moral decay.” But we can warn our students that this is not the meaning of the word “entropy” in physics.
Styer attributed much of the misconception of entropy as disorder to the autobiographical book “The Education of Henry Adams” (Adams, 1918).27 As he asserted, the book proved enormously influential: it won the 1919 Pulitzer Prize for biography and, in April 1999, was named by the Modern Library the 20th century’s best nonfiction book in English. As quoted by Styer, Adams disliked chaos and anarchy, stating:
The kinetic theory of gas is an assertion of ultimate chaos. In plain words, Chaos was the law of nature; Order was the dream of man.
This is a very strong statement, contrasting nature with man and also implying that there is only one type of order, namely that dreamt up by man, a rather naïve idea.
Those viewing entropy as disorder have difficulty understanding the concept of life. In the early 20th century, the Swiss physicist C.-E. Guye (1922)28 asked the question: How is it possible to understand life, when the whole world is ruled by such a law as the second principle of thermodynamics, which points toward death and annihilation? He was followed by many other scientists who were puzzled by the existence of life. As insightfully discussed by Brillouin (1949)29, scientists of the era wondered if there was a “life principle”, a new and unknown principle that would explain life as an entity countering the second law of thermodynamics. A year later, Brillouin (1950)30 coined the term negentropy as an abbreviation of negative entropy. In this, he used information-theoretical concepts to express the idea that every observation in a laboratory requires the degradation of energy, and is made at the expense of a certain amount of negentropy, taken away from the surroundings.
The term negative entropy had earlier been used by Schrödinger (1944)31 in his famous book “What is Life?”. Specifically, he argued that “What an organism feeds upon is negative entropy”, without positing a “life principle” additional to the Second Law that would drive life and evolution.
There is no general agreement about the meaning of negative entropy or negentropy. Some (e.g., Lago-Fernández and Corbacho, 2009)32 use them as technical terms referring to the difference between the entropy of a normally distributed variable, with the same mean and variance as the variable of interest, and the entropy of the latter (a distance to normality). However, others, in a rather metaphysical context and assuming a non-statistical definition of negentropy (e.g., Larouche, 1993)33, saw a negentropic principle governing life, the biosphere, the economy, etc., because these convert things that have less order into things with more order.
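In that technical usage, the definition can be written as follows (a sketch in conventional notation, which I assume here rather than take from the cited works):

```latex
% Negentropy as "distance to normality": for a random variable X with
% entropy H[X], and a Gaussian variable X_G having the same mean and
% variance as X,
J[X] = H[X_G] - H[X] \ge 0 ,
% with equality if and only if X is itself Gaussian, since the Gaussian
% maximizes entropy among all distributions with fixed mean and variance.
```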
Today it makes sense to ask: Has this question been answered yet? Or is it even relevant, one hundred years later? Perhaps it is relevant to quote here Atkins (2003), who, as we have seen, explained entropy as disorder. Yet he neatly remarked:
The ceaseless decline in the quality of energy expressed by the Second Law is a spring that has driven the emergence of all the components of the current biosphere. […] The spring of change is aimless, purposeless corruption, yet the consequences of interconnected change are the amazingly delightful and intricate efflorescences of matter we call grass, slugs, and people.
Evidently, if we abandon the disorder interpretation of entropy, we can also stop seeking a negentropic “life principle”, which was never found and probably never will be. For, if we see entropy as uncertainty, we also understand that life is fully consistent with entropy maximization. Human-invented steam engines (and other similar machines) increase entropy all the time, and are fully compatible with the Second Law, yet they produce useful work. Likewise, the biosphere increases entropy, yet it produces interesting patterns, much more admirable than steam engines. Life generates new options and increases uncertainty (Sargentis et al., 2020;34 Koutsoyiannis and Sargentis, 2021). Compare Earth with a lifeless planet: Where is uncertainty greater? On which of the two planets would a newspaper have more events to report every day?
However, if entropy is not disorder, what is its consistent interpretation? This question is not as difficult to answer as the above discussion seems to imply. According to its standard definition (section 2.3), entropy is uncertainty quantified. Hence, maximum entropy means the maximum uncertainty that is allowed in natural processes, given the constraints implied by natural laws (or human interventions). It should be stressed that, with this general definition, entropy and its maximization do not apply only to physics—in particular to thermodynamics—but also to any natural (or even uncontrolled artificial) process in which there is uncertainty necessitating a (macroscopic) probabilistic description. This application is not meant as an “analogy” with physics. Rather, it is a formal application of the general definition of entropy, which relies on stochastics.
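As a reminder of that standard definition (here in the discrete case and in conventional notation; the full development is in section 2.3):

```latex
% Entropy of a discrete random variable X taking values x_1, ..., x_n
% with probabilities p_1, ..., p_n, i.e., uncertainty quantified:
H[X] = \mathrm{E}[-\ln P(X)] = -\sum_{i=1}^{n} p_i \ln p_i ,
% and the principle of maximum entropy selects the p_i that maximize
% H[X] subject to the constraints implied by natural laws (e.g., known
% moments), i.e., it assigns the maximum allowed uncertainty.
```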
Unsurprisingly, if “disorder” is regarded by many as a “bad thing”, the same could be said for uncertainty. The expressions “uncertainty monster” and “monster of uncertainty” appear in about 250 scholarly articles registered in Google Scholar (samples are van der Sluijs, 2005,35 and Curry and Webster, 2011,36 to mention a couple of the most cited with the word “monster” appearing in their title). However, if uncertainty is a monster, it is thanks to this monster that life is liveable and fascinating. Uncertainty is not an enemy of science or of life. Rather, it is the mother of creativity and evolution. Without uncertainty, life would be a “universal boredom” (to borrow a phrase by Saridis, 2004,37 and reverse its connotation), and concepts such as hope, will (particularly, free will), freedom, expectation, optimism, etc., would hardly make sense. A technocratic elite of super-experts who, using super-models, could predict the future without uncertainty would also take full control of society (Koutsoyiannis et al., 2008b)38. Fortunately, this will never happen because entropy, i.e., uncertainty, is a structural property of nature and life. Hence, in our view, uncertainty is neither disorder nor a “bad thing”. How could the most important law of physics (the Second Law) be a “bad thing”?
In a deterministic worldview there is no uncertainty, and therefore no point in speaking about entropy. If there is no uncertainty, each outcome can be accurately predicted, and hence there are no options. In contrast, in an indeterministic world, there is a plurality of options. This corresponds to the Aristotelian idea of δύναμις (Latin: potentia; English: potency or potentiality). The existence of options entails freedom. Thus:
entropy ↔ uncertainty ↔ plurality of options ↔ freedom
This view, also depicted in Figure 2.1, is consistent with what has been vividly expressed by Brissaud (2005)39:
Entropy measures freedom, and this allows a coherent interpretation of entropy formulas and of experimental facts. To associate entropy and disorder implies defining order as absence of freedom.

To conclude this Digression: When discussing entropy, we should always bear in mind that entropy per se is a probabilistic concept based fundamentally on a macroscopic view of phenomena, rather than on a focus on individuals or small subsets, and hence the temporal or spatial scale is important to consider. Analysing a particular die throw, we may say that it was irregular, uncertain, unpredictable, chaotic, or random. However, macroscopization, by removing the details, may also remove irregularity. For example, the application of the principle of maximum entropy to the outcomes of a die throw results in equal probabilities (1/6) for each outcome (see Digression 2.L), and the average outcome after many throws tends to 3.5. This is a regular macroscopic result. In precisely the same manner, the maximum uncertainty in a particular water molecule’s state (in terms of position, kinetic state and phase) results, on a macroscopic scale, in the Clausius–Clapeyron law (see Digression 6.H). Again, we have perfect regularity, as the accuracy of this law is so high that most people believe it is a deterministic law.
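To make the die example concrete, here is a minimal numerical sketch of my own (not taken from the book; it assumes NumPy):

```python
# Maximum entropy for a die: with no constraint other than normalization,
# the entropy-maximizing distribution over {1,...,6} is uniform, p_i = 1/6,
# and the macroscopic average outcome is the regular value 3.5.
import numpy as np

p = np.full(6, 1/6)                          # maximum-entropy probabilities
entropy = -np.sum(p * np.log(p))             # ln 6 ≈ 1.7918 nats, the maximum
mean_outcome = np.sum(np.arange(1, 7) * p)   # theoretical mean: 3.5

# Macroscopization: averaging many individual (irregular) throws removes
# the irregularity and converges to the regular macroscopic value.
rng = np.random.default_rng(42)
throws = rng.integers(1, 7, size=100_000)    # 100 000 simulated die throws

print(f"max entropy = {entropy:.4f} nats (ln 6 = {np.log(6):.4f})")
print(f"theoretical mean = {mean_outcome}, simulated mean = {throws.mean():.3f}")
```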
As ever, I will appreciate your comments.
Mandelbrot, B.B., 1999. Multifractals and 1/f Noise: Wild Self-Affinity in Physics (1963–1976). Springer, New York, NY, USA.
Graham, L., 2011. The Power of Names. Theology and Science, 9 (1), 157-164, doi: 10.1080/14746700.2011.547020.
Greek words related to the noun σαφήνεια (sapheneia) are the adjective σαφής/σαφές (saphes), the adverb σαφῶς (saphos) and the verb σαφηνίζειν (saphenizein).
Lesher, J.H., 2010. Saphêneia in Aristotle: ‘Clarity’, ‘Precision’, and ‘Knowledge’. Apeiron, 43 (4), 143-156.
Ἀεὶ διὰ τῶν ἀληθῶς μὲν λεγομένων οὐ σαφῶς δὲ πειρᾶσθαι λαβεῖν καὶ τὸ ἀληθῶς καὶ σαφῶς. (Ἀριστοτέλης, Ήθικά Ευδήμια, 1220a).
Οὗτοι μὲν οὖν […] ἡμμένοι φαίνονται, […] ἀμυδρῶς μέντοι καὶ οὐθὲν σαφῶς ἀλλ᾽ οἷον ἐν ταῖς μάχαις οἱ ἀγύμναστοι ποιοῦσιν: καὶ γὰρ ἐκεῖνοι περιφερόμενοι τύπτουσι πολλάκις καλὰς πληγάς, ἀλλ᾽ οὔτε ἐκεῖνοι ἀπὸ ἐπιστήμης οὔτε οὗτοι ἐοίκασιν εἰδέναι ὅ τι λέγουσιν: σχεδὸν γὰρ οὐθὲν χρώμενοι φαίνονται τούτοις ἀλλ᾽ ἢ κατὰ μικρόν (Ἀριστοτέλης, Μετά τα Φυσικά, 985a).
Εἰσὶ μὲν οὖν καὶ τούτων τὰ πλείω ἀνώνυμα, πειρατέον δ᾽ […] αὐτοὺς ὀνοματοποιεῖν σαφηνείας ἕνεκα καὶ τοῦ εὐπαρακολουθήτου (Ἀριστοτέλης, Ἠθικὰ Νικομάχεια, 1108a).
Σωκράτους δὲ περὶ μὲν τὰ ἠθικὰ πραγματευομένου περὶ δὲ τῆς ὅλης φύσεως οὐθέν, ἐν μέντοι τούτοις τὸ καθόλου ζητοῦντος καὶ περὶ ὁρισμῶν ἐπιστήσαντος πρώτου τὴν διάνοιαν, [Πλάτων] ἐκεῖνον ἀποδεξάμενος διὰ τὸ τοιοῦτον ὑπέλαβεν ὡς περὶ ἑτέρων τοῦτο γιγνόμενον καὶ οὐ τῶν αἰσθητῶν: ἀδύνατον γὰρ εἶναι τὸν κοινὸν ὅρον τῶν αἰσθητῶν τινός, ἀεί γε μεταβαλλόντων (Αριστοτέλης, Μετά τα Φυσικά, 1.987b).
Graham, L. and Kantor, J.-M., 2009. Naming Infinity: A True Story of Religious Mysticism and Mathematical Creativity. Harvard University Press.
Koutsoyiannis, D., and Sargentis, G.-F., 2021. Entropy and wealth, Entropy, 23 (10), 1356, doi: 10.3390/e23101356.
Boltzmann, L., 1897. On the indispensability of atomism in natural science. Annal. Phys. Chem., 60, 231, doi: 10.1007/978-94-010-2091-6_5.
Boltzmann, L., 1901. On the necessity of atomic theories in physics. Monist, 12, 65–79. https://www.jstor.org/stable/27899285.
Boltzmann, L., 1896/1898. Vorlesungen über Gastheorie. J.A. Barth, Leipzig, Germany. (English translation: Lectures on Gas Theory, University of California Press, Berkeley, CA, USA, 1964.)
Gibbs, J.W., 1902. Elementary Principles of Statistical Mechanics; Reprinted by Dover, New York, 1960; Yale University Press, New Haven, CT, USA, 244 pp., https://www.gutenberg.org/ebooks/50992.
Shannon, C.E., 1948. A mathematical theory of communication. Bell System Technical Journal, 27 (3), 379-423.
von Neumann, J., 1956. Probabilistic logics and the synthesis of reliable organisms from unreliable components. Autom. Stud., 34, 43–98, doi: 10.1515/9781400882618-003.
Darrow, K.K., 1944. The concept of entropy. Am. J. Phys., 12, 183, doi: 10.1119/1.1990592.
Wiener, N., 1948b. Time, communication, and the nervous system. Ann. N. Y. Acad. Sci., 50, 197–220.
Wiener, N., 1948a. Cybernetics or Control and Communication in the Animal and the Machine; MIT Press: Cambridge, MA, USA, 212 pp.
Chaitin, G.J., 2002. Computers, paradoxes and the foundations of mathematics. Am. Sci., 90, 164–171.
Bailey, K.D. 2009. Entropy Systems Theory: Systems Science and Cybernetics. Eolss Publishers, Oxford, UK, 169 pp.
Koutsoyiannis, D., 2010. A random walk on water. Hydrology and Earth System Sciences, 14, 585–601, doi: 10.5194/hess-14-585-2010.
Moore, T.A., 2003. Six Ideas that Shaped Physics. Unit T—Thermodynamics, 2nd ed. McGraw-Hill, New York, NY, USA.
Atkins, P., 2003. Galileo’s Finger: The Ten Great Ideas of Science, Oxford University Press: New York, NY, USA.
Wright, P.G., 1970. Entropy and disorder. Contemp. Phys., 11, 581–588, doi: 10.1080/00107517008202196.
Leff, H.S., 2012. Removing the mystery of entropy and thermodynamics—Part V. Phys. Teach., 50, 274–276, doi: 10.1119/1.3703541.
Styer, D., 2019. Entropy as disorder: History of a misconception. Phys. Teach., 57, 454, doi: 10.1119/1.5126822.
Adams, H. 1918. The Education of Henry Adams. Houghton Mifflin Company, Boston, MA, USA.
Guye, C.-E., 1922. L’Évolution Physico-Chimique. Etienne Chiron, Paris, France.
Brillouin, L., 1949. Life, thermodynamics, and cybernetics. In Maxwell’s Demon. Entropy, Information, Computing, Princeton University Press: Princeton, NJ, USA, 89–103, doi: 10.1515/9781400861521.
Brillouin, L., 1950. Thermodynamics and information theory. Am. Sci., 38, 594–599. https://www.jstor.org/stable/27826339.
Schrödinger, E., 1944. What is Life? The Physical Aspect of the Living Cell. Cambridge University Press, Cambridge, UK.
Lago-Fernández, L.F., and Corbacho, F., 2009. Using the negentropy increment to determine the number of clusters. In International Work-Conference on Artificial Neural Networks; Springer: Berlin/Heidelberg, Germany, 448–455.
Larouche, L.H., Jr., 1993. On Larouche’s Discovery. http://www.archive.schillerinstitute.com/fidelio_archive/1994/fidv03n01-1994Sp/fidv03n01-1994Sp_037-on_larouches_discovery-lar.pdf.
Sargentis, G.-F., Iliopoulou, T., Sigourou, S., Dimitriadis, P., and Koutsoyiannis, D., 2020. Evolution of clustering quantified by a stochastic method — Case studies on natural and human social structures. Sustainability, 12 (19), 7972, doi: 10.3390/su12197972.
van der Sluijs, J., 2005. Uncertainty as a monster in the science–policy interface: Four coping strategies. Water Sci. Technol., 52, 87–92.
Curry, J.A. and Webster, P.J., 2011. Climate science and the uncertainty monster. Bull. Am. Meteorol. Soc., 92, 1667–1682.
Saridis, G.N., 2004. Entropy as a philosophy. In Proceedings of the 2004 Performance Metrics for Intelligent Systems Workshop; National Institute of Standards & Technology (NIST), Manufacturing Engineering Lab, Gaithersburg, MD, USA, http://apps.dtic.mil/sti/citations/ADA515701.
Koutsoyiannis, D., Makropoulos, C., Langousis, A., Baki, S., Efstratiadis, A., Christofides, C., Karavokiros, G., and Mamassis, N., 2008b. Interactive comment on “HESS Opinions, Climate, hydrology, energy, water: Recognizing uncertainty and seeking sustainability” by Koutsoyiannis, D. et al. Hydrol. Earth Syst. Sci. Discuss., 5, S1761–S1774, http://www.hydrol-earth-syst-sci-discuss.net/5/S1761/2008/.
Brissaud, J.-B., 2005. The meanings of entropy. Entropy, 7, 68–96, doi: 10.3390/e7010068.