
# Saturday Night Science: The Analytical Engine

Charles Babbage was one of the preeminent polymaths of the 19th century. He worked in mathematics (appointed Lucasian Professor of Mathematics at Cambridge—Newton’s chair, later Stephen Hawking’s—in 1828), astronomy, economics (his work on the division of labour in manufacturing was one of the first in the field now called operations research), mechanical engineering (he invented an algebraic notation for describing the design of mechanisms), and cryptography. He was a founder of the British Association for the Advancement of Science and the Statistical Society of London. He invented the cow-catcher for locomotives and the ophthalmoscope.

During his studies at Cambridge, Babbage became dismayed by the quality of the mathematical tables used by mathematicians, scientists, and engineers in their calculations. At the time, all computations requiring more precision than the two or three digits of a slide rule were done manually, using printed tables of logarithms, trigonometric functions, ephemerides for astronomy, and sight reduction tables for celestial navigation at sea. Examining these tables, Babbage discovered that they contained numerous errors, and that there were even discrepancies among different editions of the same tables. These errors were due both to mistakes made in computing the tables and to those introduced in typesetting the printed editions. Errors in mathematical tables were not just a matter of pedantry or aesthetics; the Royal Navy used these tables for navigation, and flaws could lead to Her Majesty’s ships running aground or failing to be in the right place at the right time in a naval battle.

In the spirit of the industrial revolution, Babbage began to investigate whether the process of calculation and printing these tables might be performed by machinery, just as machines had transformed the textile and other industries in Britain which previously relied on manual labour. In 1822, he began work on what he called the Difference Engine. The Difference Engine can be thought of as a series of coupled mechanical adding machines which, by using the method of finite differences, is able to evaluate polynomial equations of an order limited only by the number of columns in the machine. Most of the functions and quantities tabulated in mathematical tables can be approximated by polynomials, and the finite difference method permits evaluating them purely by addition and subtraction, avoiding the need to multiply and divide. The machine could not only print the results of its calculations on paper, but directly mould stereotype plates (the technical term for which is the delightful word “flong”) from which type could be set, eliminating the possibility of typesetting errors. The machine was designed to jam under circumstances which would lead to an error, and Babbage argued that it would reliably produce flawless tables.
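The method of finite differences lends itself to a short modern illustration. The following Python sketch (an illustration of the principle, not of Babbage’s mechanism) tabulates the polynomial *x*² + *x* + 41, which Babbage used in demonstrations, using, after the initial seeding, nothing but addition, just as the Difference Engine’s columns of wheels did:

```python
def difference_table(values, order):
    """Seed the engine: the initial value and its first `order`
    finite differences, computed from a few known samples."""
    diffs = [list(values)]
    for _ in range(order):
        prev = diffs[-1]
        diffs.append([b - a for a, b in zip(prev, prev[1:])])
    return [d[0] for d in diffs]

def tabulate(seed, n):
    """Extend the table n steps using additions alone, one
    'column' per difference order, as in the Difference Engine."""
    cols = list(seed)
    out = []
    for _ in range(n):
        out.append(cols[0])
        for i in range(len(cols) - 1):
            cols[i] += cols[i + 1]   # each column absorbs the one above it
    return out

f = lambda x: x * x + x + 41         # Babbage's demonstration polynomial
seed = difference_table([f(0), f(1), f(2)], 2)   # value, Δ, Δ²
print(tabulate(seed, 6))             # [41, 43, 47, 53, 61, 71]
```

A second-order polynomial needs two difference columns; a machine with more columns evaluates higher-order polynomials the same way, which is why the order was limited only by the number of columns.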

In 1823, Babbage received a grant of £1,700 from the British government to build the Difference Engine, but, as so often happens with visionary projects pushing the limits of technology, the project quickly ran into schedule slips and budget overruns, many due to lack of experience in manufacturing components with the required tolerances and in the quantity needed (the original design had more than 20,000 parts). The government continued to fund the effort until 1842, having sunk a total of £17,000 without producing anything which worked. By that time Babbage had lost interest in the Difference Engine. He envisioned something much more ambitious: the Analytical Engine.

While the Difference Engine was a special-purpose machine for evaluating polynomials, the Analytical Engine was a general-purpose mechanical computer, able to add, subtract, multiply, divide, scale numbers by powers of ten, read numerical values from punched cards, and store and recall numbers in a thousand memory locations. The operations performed by the machine were determined by another set of punched cards, and the machine was able to make decisions based upon the results of its calculations, backing up the cards to repeat computations or advancing forward to skip them. This set of operations was sufficient to perform all of the operations of mathematics, and to evaluate any function. The specifications of the machine varied between Babbage’s first description in 1837 and his last word on it in his 1864 autobiography, but based upon the latter document, they were breathtaking, even compared to early electronic computers of a century hence.

**The mill** corresponded to the arithmetic and logic unit of a modern computer. It could add and subtract 50 digit signed decimal numbers in a fixed time (independent of the number of digits) of around a second, multiply two 50 digit numbers producing a 100 digit product, and divide a 100 digit dividend by a 50 digit divisor yielding a 50 digit quotient and remainder. Unlike addition and subtraction, the speed of multiplication and division depended upon the number of digits in the operands: the worst case took about a minute for each operation. The mill kept track of the signs of quantities, and activated a “run-up” lever in case of exceptional conditions, for example the sign changing due to an overflow or underflow in addition or subtraction. Inputs to the mill were placed on “ingress axes” and results on “egress axes”.
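For a feel of the scale of these numbers, Python’s `decimal` module can mimic the mill’s operand widths (a rough modern analogue; the mill’s precision was fixed in hardware, not configurable):

```python
from decimal import Decimal, getcontext

# The mill multiplied two 50-digit numbers into a 100-digit product;
# set the working precision high enough to hold that product exactly.
getcontext().prec = 100

x = Decimal("9" * 50)      # the largest 50-digit operand
y = Decimal("9" * 50)
product = x * y
print(len(str(product)))   # 100 digits, matching the mill's product width
```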

**The store** consisted of 1000 columns, each able to hold a 50 digit signed number. The value in a column of the store could be placed on an ingress axis of the mill, or a value from the egress axis placed in a store column by a variable card specifying the mill axis and store column.

**The card readers** read operation cards, which specify which of the four arithmetic operations the mill is to perform; variable cards, which transfer values between the mill and store; number cards, which place numerical values in store columns; and combinatorial cards, which allow advancing or backing up the cards in the readers, optionally depending on the state of the mill’s run-up lever.

**The printer** directly printed results from the mill, and a **stereotype casting** apparatus allowed direct production of masters for printing plates.

**The curve drawing apparatus** plotted mathematical functions on a Cartesian grid, allowing insight into their behaviour difficult to perceive from tables of numbers.

The Analytical Engine would have been a massive machine; estimates of its size and mass were comparable to those of a locomotive, although unlike the locomotive it would contain myriad precision-machined components, all moving as orchestrated by the program read by the card readers. The machine would be steam powered; no human would have the strength to manually turn the crank.

From the time Charles Babbage first imagined the Analytical Engine until his death in 1871, he elaborated and refined the design, produced detailed drawings and prototypes of components, and attempted to persuade any person or institution that would listen to fund the project. But with the failure of the much simpler and less ambitious Difference Engine, he found no takers. After his death, a committee of the British Association for the Advancement of Science (which Babbage had founded), in 1878 formally recommended against constructing the Analytical Engine.

Very few of Babbage’s contemporaries grasped the potential of his invention. One notable exception was Ada Augusta, Countess of Lovelace (the daughter of Lord Byron), who translated an 1842 paper describing the Analytical Engine by L. F. Menabrea, adding extensive notes and explanations. Her “Sketch of the Analytical Engine” contained, among other foundations of computer science, the first published computer programs, in a notation she invented. Charles Babbage’s son, Major-General Henry P. Babbage, along with his military career in India, pursued the project, producing, in 1910, an operating sub-scale model of the mill and printer components which may now be seen in the Science Museum in London. The son experienced the exasperation of the father in trying to persuade people of the utility and feasibility of the Engine, writing in 1888, “The History of Babbage’s Calculating Machines is sufficient to damp the ardour of a dozen enthusiasts.”

**The Analytical Engine Emulator**

I have always been a fan of retro technology and lost causes—heck, when I lived in California, my house was on Saint Jude Road. In 1996, I decided to see if it would be possible, using modern technology, to build an emulator for the Analytical Engine, which would demonstrate its capabilities and allow programmers today to experience working on the first general-purpose computer ever conceived by the mind of man: the charter member of the steam-powered brass vapourware club. I collected all of the original documents I could lay my hands on (all of which are now available from the Table of Contents of the project), and quickly discovered that there were numerous inconsistencies among them and ambiguities in how the machine was intended to operate. I concluded that all involved, including Charles Babbage and Lady Ada, were just as talented at arm waving in the face of complex but not-entirely-baked designs as their intellectual descendants today.

Well, I wasn’t going to let *that* stop me! I made what seemed to me to be reasonable assumptions about Babbage’s intentions and how the machine would have to operate, and produced a detailed specification of a machine guided by the principles that it should be able to perform any operation of which Babbage explicitly said his design would be capable, and would have no capability he did not mention. I had to make a number of inferences about details such as the operation of the card readers and the decision-making facilities (combinatorial cards), but again my choices were based upon the same principles. If you’re interested in these details and my justifications for the design decisions made, please see my document “Is the Emulator Authentic?”.

The emulator was originally posted on the Web in 1997, using the Java language to allow users to run Analytical Engine programs either stand-alone from the command line, or within a Web browser with a Java “applet”. As many readers may have experienced, Web applications are subject to “code rot”, and attempting to keep the Java components working over twenty years of security patches, incompatibilities, platform sensitivities, and other slings and arrows was sufficient to damp the ardour of a dozen enthusiasts. By 2017, Web browsers were abandoning Java applets entirely, which rendered the emulator inaccessible to many potential users. On March 19, 2017, I released the Twentieth Anniversary Edition of the Analytical Engine Emulator, which included an entirely new Web-based version implemented in JavaScript (which, notwithstanding the name, has nothing to do with the Java language) and HTML5, the Analytical Engine Web Emulator, new examples for Engine Programming Cards which use it, and a new collection of Sample Programs which illustrate more complicated tasks for which the Analytical Engine might have been employed. The collection of original documents which accompany the emulator has been updated to modern Web standards and typography. To explore on your own, start at the Introduction, then move on to the Table of Contents.

Doubtless, had the Analytical Engine been built, one of the principal “customers” would have been the Admiralty, not just for navigation but also computation of gunnery tables for warships and shore batteries. The following illustrates how the Analytical Engine could have done this job via the technique of numerical integration, which allows direct simulation of the physical effects of gravity and air resistance. Other more subtle effects on the projectile’s flight (for example, wind, barometric pressure, and humidity) would complicate the program, but could be easily added to it and computed by the Engine.

**Numerical Integration: Naval Gunnery**

One of the first problems to which mechanical, electro-mechanical, and electronic computers were applied was calculating tables for naval gunnery and army artillery. Indeed, ENIAC, one of the first electronic computers, was funded by the U.S. Army with that application in mind. The flight of an artillery shell, and hence how a gun must be aimed for it to hit its target, depends in a complex manner upon a multitude of factors: the muzzle velocity with which it leaves the gun, its mass, size, and shape (which determines how much air resistance it will encounter in flight and the degree to which wind may cause it to diverge from a pure ballistic trajectory), the barometric pressure and humidity of the air through which it is passing, the elevation of the gun’s barrel, and more. Naval gunnery further complicates the problem, since both the ship from which the gun is fired and the target may be moving with respect to one another, and pitching and rolling due to rough seas.

So formidable is this problem that in the 20th century naval fire control systems were developed: mechanical or electromechanical analogue computers which could solve the problem in close to real time. Before the advent of such machines, gunners relied upon range tables, compiled by a combination of experiment and manual calculation. Given the importance of naval gunnery to the British Empire, it is almost certain that one of the first tasks for which the Analytical Engine would have been employed was the computation of artillery tables for warships and shore batteries.

Problems such as this are best solved through the process of numerical integration. The flight of the artillery shell is traced along its trajectory from muzzle to target in discrete time steps, with all of the factors affecting its path (momentum, gravitational force, air resistance, etc.) computed at each step. This process is lengthy and tedious to do by hand, but it’s ready-made for a computing machine, which can work through the successive steps without human intervention.

This sample program illustrates numerical integration by modeling the trajectory of a projectile fired from a British BL 15 inch Mark I naval gun, which served the Royal Navy from 1915 through 1959, including both World Wars. As a sample program, the computation is much simpler than would have been used to create real artillery tables, but the basic principle of numerical integration is the same.

We start by defining the parameters of the problem, physical constants, and options for the computation as number cards at the top of the program. The following table lists these parameters, giving the symbol by which they are referred to below and the column in the Store into which each is placed. Consistent with the Victorian vintage of the Analytical Engine, these definitions and all calculations in the program will be done in quaint imperial units.

From the calibre (inside barrel diameter) *c* of the gun, we can calculate the frontal area *a* of the projectile, which will figure in the calculation of atmospheric drag, from the formula for the area of a circle.

At each step *n* in the calculation, we start with the current velocity of the projectile *v_n*, its distance downrange *x_n*, its altitude *y_n*, and the angle of its motion to the horizontal *θ_n*. At the start of the computation, these values are initialised with velocity equal to the gun’s muzzle velocity, distance and altitude 0, and angle that of the gun’s muzzle elevation. In each step, the effects of gravity and air resistance on the projectile will be calculated and used to update these quantities, which are then passed on to the next iteration. The process continues until the altitude goes negative, indicating the projectile has impacted the Earth.

We start by calculating the effect of air resistance, *d*. This is proportional to the product of the air density ρ, the coefficient of drag *C_d* (a measure of how the projectile’s shape affects the drag it will encounter), the frontal area of the projectile *a* (computed above), and the square of the projectile’s velocity *v_n*². Drag has the units of force and, acting over the period of the time step *s*, reduces the projectile’s momentum, originally *v_n* *m*, by the amount *d s*, giving the new momentum *p*. Dividing the momentum by the mass of the projectile *m* gives us its velocity *v_d* as reduced by air resistance. Multiplying this velocity by the cosine and sine of the angle of motion *θ_n* gives the new horizontal *v_x* and vertical *v_y* components of the projectile’s velocity.

The horizontal and vertical velocities must be treated separately since the acceleration of gravity only acts upon the vertical component of velocity, decreasing it over time *s* by *g s*. Taking the square root of the sum of the squares of the horizontal and gravity-affected vertical velocities gives the new velocity *v*_{*n*+1} at the end of the time step, and the arc tangent of the vertical velocity divided by the horizontal velocity gives *θ*_{*n*+1}, its new angle of motion. Multiplying the horizontal and vertical velocities by the time step and adding them respectively to the range and altitude updates these quantities: *x*_{*n*+1} and *y*_{*n*+1}. If the new altitude is negative, the projectile has impacted and we’re done. Otherwise, we continue for another step, starting with the updated position, velocity, and angle just computed.
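The stepping scheme just described can be sketched in Python. The constants below (muzzle velocity, shell weight, drag coefficient, time step) are illustrative stand-ins, not the actual values from the program’s number cards, and the drag formula carries the conventional ½ factor:

```python
import math

# Euler-step integration of a projectile with quadratic air drag,
# following the update scheme described above. All constants are
# illustrative stand-ins, not the Engine program's card values.
g   = 32.174                      # ft/s^2, acceleration of gravity
rho = 0.0023769                   # slug/ft^3, sea-level air density
cd  = 0.3                         # drag coefficient (assumed)
c   = 15.0 / 12.0                 # calibre in feet (15-inch gun)
a   = math.pi * (c / 2) ** 2      # frontal area from the calibre
m   = 1920.0 / g                  # mass in slugs (illustrative shell weight)
s   = 0.1                         # time step, seconds

v     = 2640.0                    # muzzle velocity, ft/s (illustrative)
theta = math.radians(20.0)        # muzzle elevation
x = y = 0.0                       # range and altitude

while True:
    d = 0.5 * rho * cd * a * v * v       # drag force
    v = (v * m - d * s) / m              # momentum reduced by drag impulse
    vx = v * math.cos(theta)
    vy = v * math.sin(theta) - g * s     # gravity acts on vertical only
    v     = math.hypot(vx, vy)           # new speed
    theta = math.atan2(vy, vx)           # new angle of motion
    x += vx * s
    y += vy * s
    if y < 0:                            # altitude negative: impact
        break

print(f"range {x:.0f} ft")
```

Shrinking the time step improves the accuracy of the approximation at the cost of proportionally more iterations, the trade-off that made the Engine’s running time for this program so long.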

Note that in the computation, we use mathematical functions such as sin, cos, square root, and arctan. How can the Analytical Engine, which can only add, subtract, multiply, and divide, compute these functions? All of the basic functions of analysis can be approximated by series expansions involving only the four basic arithmetic operations and which, by computing a sufficient number of terms, can provide the value of the required function to any desired precision. For example, the following series, given an angle *x* in radians, approximates its sine: sin *x* = *x* − *x*³/3! + *x*⁵/5! − *x*⁷/7! + ⋯
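Each term of such a series derives from its predecessor by a multiply and a divide, so the Engine’s four operations suffice. Here is a Python sketch of evaluating the sine series this way (the term count is an arbitrary choice):

```python
import math

# Approximate sin(x) with its series x - x^3/3! + x^5/5! - ...,
# using only multiplication, division, addition, and subtraction,
# as the Engine's function-library cards would have.
def sine(x, terms=10):
    term = x
    total = x
    for n in range(1, terms):
        # next term = previous term * (-x^2 / ((2n)(2n+1)))
        term *= -x * x / ((2 * n) * (2 * n + 1))
        total += term
    return total

print(sine(math.pi / 6))   # close to 0.5
```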

Babbage envisioned the Analytical Engine as having a library of mathematical functions, consisting of program cards to evaluate all of the standard functions, which could be incorporated in other programs as needed. This program uses cards from such a library to compute the functions it employs.

At the end of the computation, a summary appears on the Printer, both with and without the effects of atmospheric drag. The latter is computed not because the Royal Navy anticipated naval battles on the Moon, but to illustrate the degree to which air resistance affects the flight of an artillery shell. Try increasing the coefficient of drag on the N104 card to see how it reduces the range and altitude reached by the projectile. A massive projectile moving faster than the speed of sound delivers a large amount of kinetic energy to its target, quite apart from any explosives with which it may have been filled. This kinetic energy is calculated in “foot pounds” (another example of sloppy terminology in imperial units, users of which often fail to make the distinction between weight and mass—the correct term is “foot lbf”, where “lbf” stands for “pounds of force”). The energy is then compared to the quantity in pounds of the high explosive TNT which would release the same energy.

The Curve Drawing Apparatus will display the trajectory of the projectile, both taking into account air resistance (the black curve) and without air resistance (blue curve). Note how increasing the drag coefficient (the N104 number card at the top of the program) affects the trajectory, including deviation from the parabola expected absent the effects of atmospheric drag. With very high drag, the projectile almost comes to a stop and falls from the sky.

Using rudimentary estimates of the speed of calculation, the emulator estimates that the Analytical Engine would have taken around 8 days and 11 hours to run this program. Most of the time is spent evaluating mathematical functions at each step in the integration. If those functions were replaced by values supplied from pre-computed tables on number cards, as envisaged by Babbage, this time would be dramatically reduced. Also, the series used to compute the mathematical functions were chosen to be straightforward, not efficient. Replacing them with optimised (but messy) series used in contemporary mathematical libraries would speed up the program substantially.

**Could it Have Been Built?**

This has been the largest question concerning the Analytical Engine from Babbage’s time to the present day. Setting aside questions of budget and whether the funds could have been raised to attempt to build it, were the materials available in Babbage’s day, the fabrication technologies he could have employed, and the tolerances they could have produced for the components sufficient for such a huge and complicated machine to have worked? Babbage was adamant that they were, and his son, who was best acquainted with the details of the design, concurred. But there was little hard data one way or another to answer this question before 1990. Between 1847 and 1849, after the cancellation of the original Difference Engine project, and based upon technologies he had developed for the Analytical Engine, Babbage redesigned the Difference Engine to use fewer parts (8,000 instead of 20,000) and run faster. He called this new version Difference Engine No. 2, and prepared complete drawings for the machine. He never secured funding to build it (nor does it appear he tried very hard to do so, concentrating on the Analytical Engine), and the plans ended up in the archives of the Science Museum in London.

Between 1989 and 1991, based upon these drawings and using only materials and machining techniques available to Babbage and tolerances measured from surviving components of Babbage’s machines, the Computing department of the museum built a Difference Engine No. 2, and subsequently built a second copy of the machine for a financial contributor to the project. Apart from correcting some errors which were discovered in the drawings, the machine is considered faithful to Babbage’s design and representative of what could have been built in his time.

It works. Since the machine uses many of the same mechanisms anticipated for the Analytical Engine, it provides evidence that not only was the design sound, but that it could have been built. However, this is a very simple machine compared to the Analytical Engine, and there remains much room for doubt that something so large and complicated could have been made to work, or would have been sufficiently reliable to be useful.

Here is an introduction to and demonstration of the replica of Babbage’s Difference Engine #2 at the Computer History Museum.

This is a TED talk about the Analytical Engine.

Here is an hour and a half talk by Doron Swade, who led the project to build Difference Engine No. 2, about Babbage, his engines, and the plan to build an Analytical Engine.


Difference Engine → Analytical Engine: maybe not the first instance of feature creep, but one of the best known. If he had kept it tight and finished the Difference Engine, it is hard to imagine just what changes would have proceeded from that.

On the other hand, a total of 19 years were spent on the government-funded attempt to build the Difference Engine, consuming ten times the budget originally allocated for its construction. The total spent on the project was £17,000, which, using the common conversion factor of one Victorian pound sterling to 80 present-day US$, is more than one and a quarter million dollars. The design they were trying to build was Difference Engine No. 1, which had more than 20,000 parts.

Babbage’s revised design, Difference Engine No. 2, designed between 1847 and 1849, reduced the parts count by more than half, to around 8,000. It is this machine which was built in the 1990s to his specifications, and building it (albeit as a non-crash museum project) took more than 15 years and was, even with modern manufacturing techniques, a difficult project. While the museum project demonstrated Difference Engine No. 2 could have been built in Babbage’s time, it’s not clear that the original design, with far more parts and not benefiting from what Babbage learned from working on the Analytical Engine, could ever have been finished or made to work.

It is true that Babbage did not actively promote Difference Engine No. 2, but even if he had, it is unlikely that, following the failure of the original 19 year project, any funding agency would have taken him sufficiently seriously to start over on a new design.

Doron Swade, who led the project to build the modern Difference Engine No. 2, estimates the Analytical Engine would be around ten times as complicated, measured by parts count, as the machine he built. He estimates the time required to build a modern Analytical Engine as 20–25 years with a budget of US$30–40 million.

Some will be acquainted with the idea through Gibson and Sterling’s The Difference Engine (1991), which IMHO is a frustrating mix of fascinating, well worked out “What if”, and somewhat precious, artsy storytelling in places. Well worth checking out despite its flaws as a novel.

Shouldn’t an organization like that have a web site?

This one comes close:

http://www.douglas-self.com/MUSEUM/museum.htm

Wow. Thanks, Fourmilab guy @johnwalker. I recommend http://www.fourmilab.ch/

One thing I always found amusing about TV shows like “Futurama” and “The Big Bang Theory” was their lifelike depiction, if exaggerated for comic purposes, of some scientists as being like the ones I’d known: angelically gifted but greatly uneven in wisdom, common sense, and people skills. Some really did have it all. Some really did have nothing but a grasp of blood flocculation rates graphed against various other criteria. Prone to human jealousies, certainly.

I enjoy Alex Wellerstein’s Nuclear Secrecy Blog and have spent hours exploring its narrow, twisted alleys of history, but I am amused and impressed that even Wellerstein, regularly published in The New Yorker, is a bit jealous of Ricochet’s own John Walker. Wellerstein’s NUKEMAP had a vogue as the most popular web-based fantasy of nuclear destruction, but it clearly gets Wellerstein’s goat that the real cognoscenti out there bypass his simple-enough-for-Barney-Rubble calculation of rubble, and go directly to the source, John’s own calculations of atomic blast effects.

As a point of comparison, it was just in the past few decades that truly real-time simulations of projectiles such as rockets and guided missiles have become possible. Now it’s the industry standard to build a full hardware-in-the-loop test bed in which the rocket’s avionics are mounted on a bench in a lab and connected to a computer running a real-time simulation. The simulation runs and provides flight-like data to the avionics. The avionics don’t know they aren’t really flying.

I know I’m kind of the anachronism guy around here. Heck, I’m so naive I once asked John if they still used gun assembly-format A-bombs anymore. But when I was a kid, just approaching the interface of manned flight, Aviation Week was full of ads and references to analog applications (in the old sense, not “apps”, of course) that were acknowledged to be likely to be superseded by digital, but hadn’t been yet. There’s a sentimental case to be made for analog, and for a few postwar years, analog still ruled, in the real world. 3 volts plus 5 volts equals 8 volts; what could be more obvious?

Audio recording is notable for the number of fanatics and retro-fanatics who insist to this day that the time-sampling steps of CD and iTunes level digital processing aren’t sufficient even now to equal the infinitely variable, if inevitably imperfect, sound of the finest recorded music prior to about 1985.

Music synthesizers have their own paean to the analog years. I remember ’em well and fondly, though they seemed cussed at the time. Synthesizer people considered themselves more than just keyboard artists, because they also had to remember how to get ready for a bridge by re-patching the ring modulator through the envelope follower of the voltage-controlled amplifier, with five different modes patched in advance for each song, for an hour on end in front of 20,000 people. A different sort of analog hero.

Thanks John. I was going to drink myself stupid tonight — but then you had to write this. Luckily, I managed to have two glasses of bourbon before reading this and as a result — you’ve defeated Mankind’s nemesis: Entropy; by enlisting this spherical cow in trying to comprehend the sum totality of concepts you laid out — a frictionless plane came into existence in the vacuum of my head — as this essay moved from one ear, out the other.

Please, entertain this simp with, go ahead and boo, a Liberal Arts degree. Babbage’s analytical engine made way in my final bluebook in cognitive psych, mainly through its foundation precept of conditional branching. Conditional programming, unconditional and conditional branching, and the boolean logic input that guides all these produce the sum-of-their-parts: Control Flow (arguable), where these come together to collect, store, disseminate and disperse the data which guides choice.

In perception, sensation is the natural, hardwired input for the 3-pound cheese that is our analytical engine — and our family, friends, culture provide the if-then-(else) programming for abstract possibilities, probabilities, and eventualities. The same for switch statements, which I would argue use sensation, at least more, as the qualitative input that is the “game-changing” variable for the whole shebang (esp. if hyphenated, wink-wink). Is perception an assembly language, batch program, shell, etc.? And here is where we overlap Babbage, Programming, Purpose, Ideology, Religion, Innovation, Free Will, and Psychology:

Heuristics. (part 1/2 whiskies)

Sorry for the meandering. My question Mr. Walker: Entropy is inescapable it seems. Even in machines that don’t exist in our realm of physical mechanics. You mentioned code rot; in music recording, degradation of audio-vibrancy is called generation loss — aren’t these examples of isolated, closed, and open systems — therefore immune to the 2nd law of thermodynamics? Or is it theoretical validation of entropy, considering the varying computers, internet travel, and multiple interfaces as “matter and force” permeating through both systems — making upgrades to software or remastering of original reel-to-reel analogous to biotic-systems manipulating environment through travel or tool-use towards thermodynamic equilibrium?

And could Gravity, rather than being a result of the curvature of space-time, be Entropy? Or at least an entropic system, say the permeability that allows space and time to intersect?

And am I still writing dumb stuff?

Sorry, I’ll stop after this: I was going to thank you for bringing me to a transcendental realization when I remembered what we were actually talking about — trajectory, drag, gravity, and its effect on projectiles in terms of parabolas. Gary brought up sinusoidal waves, and I have to ask another stupid question: do those parabolas that vary in expected and actual outcomes represent sinusoidal curves? Or are they the transcendental ellipses, therefore any algebraic function, such as Cartesian plotting, are not applicable?

And while I’m sure if I look up Brahe or Kepler, they have some equation that accounts for this in their planetary orbit theories — but I pay for Rico and want to throw out some more wildly incoherent bluster. If light and sound waves move in sinusoidal curves, meaning the presence of algebraic numbers that operate on Pythagorean outcomes of a moving unit circle — wtf is up with PI? Is this where God decided to be irrational? And do sine waves of light and sound oscillations vary with gravitational influence? Is there a reconciliation between Pi and Algebraic algorithm (please don’t say Parametrics)

I’ll upgrade to Thatcher for this falsidical hullabaloo.

[edit] So, I decided to relax my brain by playing some guitar. I use pinch harmonics for more precise tuning, as any frequency disparity oscillates clearly. Looking it up, I see Pythagorean tuning and the circle of 5ths is a mathematical coincidence.
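For anyone curious why the circle of fifths never quite closes, the mismatch can be computed directly. A quick sketch (illustrative only; the figures are standard music-theory arithmetic, not taken from the comment above):

```python
from fractions import Fraction

# Stack twelve perfect fifths (ratio 3/2) and compare with seven octaves (2/1).
# If the circle of fifths closed exactly, the two would be equal.
twelve_fifths = Fraction(3, 2) ** 12
seven_octaves = Fraction(2, 1) ** 7

comma = twelve_fifths / seven_octaves  # the Pythagorean comma
print(comma)         # 531441/524288
print(float(comma))  # ≈ 1.0136 — roughly a quarter of a semitone sharp
```

The leftover ratio is the Pythagorean comma, which is why tuning systems must temper the fifths somewhere to make the circle close.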

Enter music theory into the mind-bending world of spheres.

Those birds annoy me. They hear finely-tuned distortion and call it “fidelity.” Digital rules, analog drools.

As late as the X-15 program in the 1960s, analogue computers were used in flight simulators. The X-15 flight simulator was a pure analogue computer, configured to model the characteristics of the vehicle in flight. No affordable digital computer at the time could run fast enough to provide real-time response to a pilot flying the simulator, but this was no problem for an analogue computer, which evaluates differential equations as its primitive operation.
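An analogue computer integrates continuously; a digital machine of that era had to approximate the same integration in many discrete steps, which is why real-time response was so expensive. As a rough illustration (a hypothetical damped oscillator, not the actual X-15 equations of motion), this is the kind of step-by-step grind a digital simulator must perform:

```python
# A digital computer approximates what an analogue computer's integrator
# blocks do continuously: here, a simple damped oscillator
#   x'' = -k*x - c*x'
# stepped with Euler's method. Purely illustrative; real flight simulators
# solved far richer equations of motion.
def simulate(k=4.0, c=0.5, x=1.0, v=0.0, dt=0.001, steps=5000):
    for _ in range(steps):
        a = -k * x - c * v  # acceleration from the current state
        x += v * dt         # integrate velocity into position
        v += a * dt         # integrate acceleration into velocity
    return x, v

x, v = simulate()
print(x, v)  # the oscillation decays as the damping term dissipates energy
```

Each of those thousands of tiny steps is an explicit multiply-and-add on a digital machine, while the analogue computer's integrators produce the whole trajectory at once, in real time.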

For the X-15A-2, the simulator was augmented by a digital computer to become a hybrid computer. This allowed modeling the more complicated dynamics of the higher speed vehicle.

Pilots said the analogue simulator was very faithful to how the X-15 actually flew. They also loved the reset button. Here is Milton O. Thompson, veteran of 14 X-15 flights.

To clarify, the calculations of nuclear weapons effects at my site in Strangelove Slide Rule and The Effects of Nuclear Weapons are not my own. I simply made Web editions available of resources originally produced by the U.S. government which went out of print in the 1970s and had become collector’s items out of reach of many people interested in them.

To close some kind of synchronicity loop, the nuclear bomb effects computer upon which Strangelove Slide Rule is based was developed under contract to the Atomic Energy Commission by The Lovelace Foundation which, to the best of my knowledge, has no connection to Lady Ada. The Lovelace Foundation were also the people who devised the devilish medical tests for the Mercury astronauts described in The Right Stuff.

I used to go to stereo stores and mess with the EQs on their “comparison” rigs, then challenge the salesmen to tell the difference.

Rolled off the high end and brought up the low end on the CDs, and they couldn’t tell the difference.

There is a family of theories, called entropic gravity, which argue that gravity is not a force like electromagnetism, but an emergent phenomenon due to quantum effects at the very small scale. Erik Verlinde is closely associated with the theory, having proposed the original model in 2010. Proponents of the theory argue that it can explain some of the effects attributed to dark matter without the need to invoke unseen mass.

Those birds annoy me, too. Though I do have substantial respect for the analog pre-amp designers of modern 24-bit audio A/D converters.

Darn you @johnwalker ! Every article you publish here sends me out into the time-wasting weeds of fascinating side topics. I thought I’d get some relief this time, as I was already a fan of Babbage, but NO, you do it again! /-:

Is there any technology topic where you don’t humble me? Ah well, good for the soul.

One of the questions I’m occasionally asked is why I bothered to build an emulator for The Analytical Engine. My answer is that I wanted to experience, and to allow others to experience, what it would have been like to solve problems on a machine with its properties. How easy or difficult would it have been to transform algorithms like those expressed in Lady Ada’s “Sketch of the Analytical Engine” into number, variable, and operation cards for the mechanical Engine? What limitations of Babbage’s design might become apparent when people tried to actually use it to solve real-world problems?

What I mainly discovered was, in fact, a rediscovery of the shortcomings users of the very first electronic computers found in their machines, which were remedied in various ways in the next round of designs. As a computing machine, the Analytical Engine is more than adequate, albeit slow compared to electronic computers. Its numerical precision and storage capacity are much greater than those of early electronic computers, so there are no difficulties using fixed point arithmetic for quantities normally expressed as floating point numbers on modern computers. (If you’re unfamiliar with fixed point computation, here is a brief tutorial.)
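For readers who skip the tutorial link: the essence of fixed point computation is doing all arithmetic on plain integers with an implied scale factor. A minimal sketch (the four-digit scale is an arbitrary choice for illustration, far smaller than the Engine’s 50-digit columns):

```python
# Fixed-point arithmetic as scaled integers: keep 4 implied decimal places,
# so the integer 31416 represents 3.1416. Addition is ordinary integer
# addition; multiplication must rescale to drop the doubled scale factor.
SCALE = 10_000  # 4 decimal digits after the implied point

def fx(x: float) -> int:
    """Convert a float to fixed-point representation."""
    return round(x * SCALE)

def fx_add(a: int, b: int) -> int:
    return a + b            # scales match, so plain integer addition works

def fx_mul(a: int, b: int) -> int:
    return (a * b) // SCALE  # product carries SCALE**2; divide one out

a, b = fx(3.1416), fx(2.5)
print(fx_add(a, b) / SCALE)  # 5.6416
print(fx_mul(a, b) / SCALE)  # 7.854
```

With enough digits of precision, as the Engine had, the programmer simply decides in advance where the implied point sits and keeps track of it through every operation.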

The largest problems one encounters coding algorithms for the Engine are the lack of program-controlled memory addressing and subroutine calls. Let’s consider each of these.

Program-Controlled Memory Addressing

In the Analytical Engine, transferring numbers and results between the Mill and Store is performed by Variable Cards, each of which specifies the Mill axis and Store column of the transfer. These cards are prepared in advance, with the axis and Store column punched into them. But often in a program you wish to create a table of values which grows as the computation progresses. For example, when computing Bernoulli numbers, the principal example in Ada’s paper, the most commonly-used algorithm makes each successive number in the series depend upon all of the previously-computed values, which are usually kept in a table as they are computed. To do this, you’d like to place each number, as it’s computed, in successively higher-numbered Store columns, then retrieve them when computing the next number in the series. But you can’t do this on the Analytical Engine, because a program cannot specify or modify the Store columns punched into the Variable Cards. The only alternative is to “unwind” the loop manually, repeating the same operations but specifying different Store columns each time. Worse, even if you know you need to add, say, 10 numbers starting in column 30 of the Store, you have to explicitly include cards to fetch and add each one—you can’t just loop over the Store columns.
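The difference is easy to see in modern terms. In this sketch the Store is modelled as a plain Python list, with column numbers made up for illustration: a machine with program-controlled addressing varies the column number inside a loop, while the Engine’s pre-punched Variable Cards force the equivalent of the unwound form:

```python
# The Store modelled as a list of columns, filled with a table of values.
store = [0] * 40
for i in range(10):
    store[30 + i] = i + 1  # columns 30..39 hold 1..10

# With program-controlled addressing, one loop suffices: the column
# number (30 + i) is computed by the running program.
total = 0
for i in range(10):
    total += store[30 + i]

# On the Analytical Engine each column number is punched into a Variable
# Card in advance, so the "loop" must be unwound by hand — conceptually:
total_unwound = (store[30] + store[31] + store[32] + store[33] + store[34]
                 + store[35] + store[36] + store[37] + store[38] + store[39])

assert total == total_unwound == 55
```

Ten columns means ten separately punched cards; a thousand columns means a thousand, which is why the lack of indexing dominates the size of Engine programs.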

Contemporary computers have index registers, which allow the program to modify the memory address an instruction references. Earlier computers used indirect addressing or instruction modification to accomplish the same goal. The Analytical Engine had neither an index register nor indirect addressing, and since it was not a stored-program machine (the program was on cards, not kept in the Store), instruction modification was not possible. When you read Ada’s description of her program for the Bernoulli numbers, it’s evident she ran into this problem and dealt with it by ellipses and arm waving.

Subroutine Calls

When you are programming an algorithm, it is common to have sub-computations, for example extracting the square root of a number, which are performed frequently within the program, but with different arguments and result destinations each time. On modern computers, it is possible to code these portions of the program as subroutines, which are invoked as needed by the main program, do their work, and then return the result to the point from which they were called. To do this, there must be a way for the computer to remember where it was before the subroutine was called, and then return there to resume the main program when it’s done. No such facility exists in the Analytical Engine. If there are twenty places in a program where you need a square root, the cards to compute it must be interpolated into the program independently in each place. This increases the size of the program, but doesn’t slow it down, as the time to execute the code is the same regardless of whether it is called as a subroutine or included in-line.
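In modern terms, the Engine’s restriction amounts to mandatory inlining. A sketch of the contrast, using Newton’s iteration for the square root (a plausible but hypothetical choice of algorithm for the Engine):

```python
# With subroutine calls, one copy of the square-root code serves every caller:
def newton_sqrt(n: float) -> float:
    x = n if n > 1 else 1.0
    for _ in range(30):          # Newton's iteration for x*x = n
        x = 0.5 * (x + n / x)
    return x

r1 = newton_sqrt(2.0)
r2 = newton_sqrt(10.0)

# Without a call/return mechanism, the same cards must be copied in-line
# at every place the result is needed — once per use site:
x = 2.0
for _ in range(30):
    x = 0.5 * (x + 2.0 / x)
r1_inline = x

x = 10.0
for _ in range(30):
    x = 0.5 * (x + 10.0 / x)
r2_inline = x

assert r1 == r1_inline and r2 == r2_inline
```

Both forms execute the same operations in the same time, just as described above; the cost of the Engine’s approach is purely in the bulk of duplicated cards.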

Writing programs for the Analytical Engine Emulator will make you appreciate why index registers (or indirect addressing) and subroutine call instructions were among the first features added to early electronic computers.

What do you think?

Not getting into the math (which would take embarrassingly long at this point), but just looking at the words, I spotted this:

Seeing the surface of things, not the volume inside, is such a normal act that it seems intuitively plausible, although why this would make volume “illusory” rather than just “the thing we’re not reading, since we read the surface”, I couldn’t say.

Also, my favorite Babbage quote:

I ain’t gonna fingo no hypotheses here. The fact is that all of these supposed theories of quantum gravity are highly speculative, which their proponents (although not necessarily hyperventilating journalists writing about them in the popular press) would be the first to admit, and all have challenges in confronting experimental data already in hand.

This gets us pretty far away from the Analytical Engine. I’m not sure a comment thread is the best place to get into the briar patch of the holographic principle or the AdS/CFT conjecture, both of which are themselves highly speculative (both from the standpoint of theory and the utter absence of experimental evidence for them). By stating that the volume is “illusory”, the conjecture claims that all of the information in a volume of spacetime is encoded on the surface of its cosmological horizon, just as it is asserted that all of the information which falls into a black hole is encoded as information on its event horizon.

Good grief—write about 19th century brass and steam, and look where you end up!

Mr Babbage was never able to comprehend how difficult it is for politicians to grasp the concept of “garbage in, garbage out”.

So it was then, and so it ever shall be, although this may have been the first time it was stated so pellucidly.

Not to mention the rich vein of dumbness.

That said, in the early days of CDs, when the market exploded much more rapidly than the record labels anticipated and vinyl sales cratered, there were some hideous analogue transfers to CDs which gave them a bad name. To meet the demand, record companies (Deutsche Grammophon was one of the most notorious) would simply take their cutting master tapes and transfer directly to digital with little or no equalisation to reverse the curve intended to compensate for the high-end roll-off of the cutting lathe. The result was so hideous that listening to it on a system with flat response would eliminate the need for teeth cleaning for a good six months.

This only happened for a year or two before the labels cleaned up their act, but it contributed to the bad initial reputation of CDs and especially, among the cognoscenti, anything marked AAD on the label.

I’ve been aware of Babbage engines from a wide variety of mentions in various SciFi stories over the years. It always struck me as one of those gloriously ridiculous ideas humans are prone to. It isn’t even the reality of getting the thing to work, it’s the sheer impossibility of keeping it working. Tens of thousands of mechanical parts, created before modern precision manufacturing. One part is a few mils off of spec, and you could spend years trying to find the problem.

Debugging is bad enough now, just dealing with the software stack in most systems. Doing that on a physical hardware basis would be no fun at all.

I agree about the unexpectedly rapid adoption of the CD format; it was a surprise. We early adopters tend to think everyone jumped aboard the moment a new technology went on sale. By contrast, it took color TV about ten years to establish itself in a major way, home videotape about eight, personal computers about the same, HDTV about six or seven, and about the same for DVD to replace VHS.

CD is a rare exception that, like black and white TV circa 1948-’52, took over much faster than planned.

Not me. I didn’t buy a CD player until I went into a Record and Tape Outlet and they didn’t have any records. (But even I was aware of the AAD thing.)

John’s crafty idea of doing a software emulation raises the exercise to an idealized plane, where it belongs as a matter of logic. But as a matter of mechanical engineering, I can see why it took so long to build so much as a Mark II machine using 19th-century metallurgy. How do you account for third-, fourth-, and fifth-order torque effects of propelling a varying number of wheels, planetary gears, ratchets, levers, etc., knowing that at times in the computation/storage/retrieval cycle the rotating master shaft will be subject to unbalanced stresses at, say, 15 to 29 degrees and 260 to 310 degrees, and moments later the stress points of the cycle would have shifted kaleidoscopically?

Presumably they had the ability to drill lubrication channels through brass rod stock for full pressurized lubrication. At extremely low RPMs you don’t want a “splasher”.

What about simple wear and tear? Brass is hard, but not that hard. That lever that starts out swinging smoothly starts to jiggle.