Readers Respond to the December 2023 Issue


“Inside the Pit Factory,” by Sarah Scoles, reports on how the U.S. is ramping up construction of new “plutonium pits” for nuclear weapons. Given that plutonium-239, one of the main ingredients in nuclear bombs, has a half-life of more than 24,000 years, why do the nuclear weapons from the late 20th century need to be “fixed”?



SCOLES REPLIES: It’s true that the isotope of plutonium used in weapons has a significant half-life. But it is also decaying throughout that time, and as it does, it leaves behind other elements that contaminate the pit. This gives the pit a different composition than when it was made. The uranium atoms that are produced by the decay process also knock plutonium atoms out of their place in the structure, similarly changing the pit’s properties. And all of this occurs while the plutonium is subject to more mundane chemical processes such as oxidation. So the gist is that with such changes, over time, a pit may not function as well or as safely as when it was new.


“This Unexpected Pattern of Numbers Is Everywhere,” by Jack Murtagh [Math], discusses how nature seems to favor numbers beginning with a 1 or a 2, a phenomenon known as Benford’s law. How could that possibly be? The puzzle even invaded my dreams, and in the morning the answer was clear: it means that randomness in nature is often distributed not linearly but rather logarithmically.

The reason this tendency makes a leading digit of 1 or 2 much more common than larger digits is related to the hybrid nature of our numbering system. Individual digits progress linearly: 1, 2, 3, 4. But the number places progress logarithmically: 1s, 10s, 100s, 1,000s. When we express logarithmically spaced items, the lower digits of 1 and 2 become much more common in the leading place than the other digits, just as on a logarithmic scale the length between 1 and 2 is greater than the length between 3 and 4. Multiplying whatever quantity you are looking at or changing the units does not alter how the scale works.
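The reader's point about scale invariance can be checked empirically. Here is a minimal Python sketch (an illustration, not part of the letter): it draws numbers whose logarithms are uniformly distributed across several orders of magnitude, tallies their leading digits, and shows that multiplying every value by an arbitrary constant (a change of units) leaves the distribution essentially unchanged.

```python
import math
import random
from collections import Counter

def leading_digit(x: float) -> int:
    """Return the most significant decimal digit of a positive number."""
    while x >= 10:
        x /= 10
    while x < 1:
        x *= 10
    return int(x)

random.seed(0)
# Data whose logarithm is uniform: values span six orders of magnitude.
data = [10 ** random.uniform(0, 6) for _ in range(100_000)]

counts = Counter(leading_digit(x) for x in data)
scaled = Counter(leading_digit(3.7 * x) for x in data)  # arbitrary unit change

for d in range(1, 10):
    benford = math.log10(d + 1) - math.log10(d)
    print(d, round(counts[d] / len(data), 3),
             round(scaled[d] / len(data), 3),
             round(benford, 3))
```

Both empirical columns track the Benford prediction closely, and rescaling by 3.7 barely moves them, which is the scale invariance the letter describes.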


Murtagh’s article reminded me of using slide rules before we had digital calculators. I found one in a desk drawer and measured the spacing between the digits, and it corresponded very closely to the number distribution in the article. Logarithms must lie at the root of the pattern.


Murtagh appears to be closer to answering the question of why Benford’s law exists when he notes that it “is more likely to apply to data sets spanning several orders of magnitude.” I suspect that the phenomenon is not a law of nature but an artifact of our choice of a notational system. If we instead chose to express the measurements of river lengths in base 16 or base 8, the distribution of digit occurrence would look very different.


MURTAGH REPLIES: Chesnakas and Lannert are absolutely correct. The probabilities we see in Benford’s law are proportional to the distances between notches on a logarithmic scale (which is how numbers are laid out on a slide rule). Benford’s law often arises in situations where the logarithm of the data is uniformly distributed. In fact, the mathematical definition of Benford’s distribution makes this perspective clear: a leading digit n occurs with probability log(n + 1) – log(n), where the logarithms are base 10. For example, the probability of a leading 6 is log(7) – log(6).
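Murtagh's formula is easy to evaluate directly. A short Python sketch (an illustration, not part of the reply):

```python
import math

def benford_prob(d: int) -> float:
    """Probability that the leading digit is d under Benford's law (base 10)."""
    return math.log10(d + 1) - math.log10(d)

# Murtagh's example: the probability of a leading 6.
print(round(benford_prob(6), 4))  # prints 0.0669

# The nine probabilities sum to 1: the logs telescope from log(1) to log(10).
print(round(sum(benford_prob(d) for d in range(1, 10)), 10))  # prints 1.0
```

Note that a leading 1 comes out near 30.1 percent while a leading 9 is under 4.6 percent, matching the "favors 1s and 2s" pattern the letters describe.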

Rutila and other readers have wondered whether Benford’s law is merely a quirk of the base 10 system. The phenomenon actually persists regardless of what base we use for our number system. To calculate the probability distribution for other bases, we just have to change the base of the logarithms in the formula above.
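The change of base Murtagh describes amounts to one extra parameter in the same formula. A hedged sketch in Python (the generalized function is my illustration, not code from the reply); in base b, the possible leading digits run from 1 to b – 1:

```python
import math

def benford_prob(d: int, base: int = 10) -> float:
    """Probability of leading digit d (1 <= d < base) under Benford's law in the given base."""
    return math.log(d + 1, base) - math.log(d, base)

# Base 8, as one reader proposed: leading digits run from 1 to 7.
for d in range(1, 8):
    print(d, round(benford_prob(d, base=8), 4))
```

In every base the probabilities still telescope to 1, and a leading 1 remains the most common digit, so the law is not an artifact of base 10 notation.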


“Boom Times,” Abe Streep’s article about the communities shouldering the U.S.’s nuclear missile revival, resonates with my own family’s atomic legacy. My mother grew up on the other side of the Iron Curtain during the height of the cold war. She was moved from town to town every year or two while her father, an army missile specialist, oversaw what she later surmised to be the construction of silos and installation of intercontinental ballistic missiles.

The immense psychological and physical harm from these projects continues to seep and reverberate through the land, affecting both the environment and the people decades later. The perpetrators (willing or unwilling, knowing or unknowing) are also victims. An individual can feel overwhelmed when contemplating the scale of these atomic legacies. But music and other arts can bring awareness to concepts that are too damaging to silently hold inside and too painful and complex to put into words. I do this in my own abstract musical compositions.



“Misdiagnosing Dyslexia,” Sarah Carr’s article on how a flawed formula deprives children of help with reading, hit a nerve. I began teaching students with learning challenges in the 1970s, about when the term “learning disability” (LD) was introduced in my school. It was conceptualized as “unexplained underachievement.” With respect to reading, the logic went like this: If you were smart enough, you should be able to read. If you couldn’t, you must have a reading disability (dyslexia). Conversely, if you were not smart and couldn’t read well enough, this was likely because of a lower intelligence quotient (IQ) rather than an LD. Hence, the discrepancy model and the need to compare IQ with reading scores. The simplicity was compelling. School psychologists had well-developed tools for assessing intelligence and reading. It was a classic case of the tail wagging the dog.

Intelligence is devilishly complex to quantify meaningfully. It is also unstable and can change depending on a lot of things (culture, nutrition, experiences, and so on). Reading, in the broadest sense, is equally complex. Carr mentions phonological awareness (PA) as a valid approach to ameliorating dyslexia. I believe this to be accurate, but success with true meaning making (that is, comprehension) requires more. PA is like using training wheels while you are learning to ride a bike. Eventually the wheels need to come off.


I am a certified, experienced school psychologist who has worked in Arizona, Nebraska and Illinois. I am very concerned about the portrayal of school psychologists in Carr’s article about diagnosing dyslexia. School psychologists are on the front line of advocating for more valid and useful processes for identifying students with dyslexia and other learning disabilities but are often limited in practice by outdated special education laws that do not reflect up-to-date scientific practice standards. We are more than capable of looking beyond IQ tests and using less biased and more useful assessment procedures that are based on current dyslexia research.




