June 13, 2007

Shortcut Searches

Firefox bookmarks have a few features that can save you a lot of time, if used to their full potential. For starters, each bookmark can have an associated keyword, which, when typed in the location bar, loads the bookmarked page. For example, if your bookmark to websudoku.com has the keyword sudoku, you can type <Ctrl-L>sudoku<Enter> and attempt to solve a sudoku.

OK, so that's not so great. I'm all for keyboard shortcuts, but mousing to that bookmark is almost as fast. But wait, there's more! If you put the string %s in your bookmark, it will be replaced by anything you type after the keyword. So a bookmark google.com/search?q=%s, with keyword g, will let you type <Ctrl-L>g optimus<Enter> to find out how much you'd have to spend to get the ultimate keyboard. That's significantly faster, mostly because you don't have to load the search page, just the results. You can get the same savings by using Firefox's search box, but it's much faster to choose a search engine by typing a keyword than by picking it from the menu with the mouse.

Choosing short, memorable keywords is important. You want to Huffman code them, so that your most frequently used searches have the shortest keywords. Some of mine:

Because you have full control of these search bookmarks, you can customize them to your heart's content. For example, you can make Google return 20 results per page instead of 10: google.com/search?num=20&q=%s. (Useful for people who discard all cookies at the end of a browsing session, and therefore can't save preferences for long.)

More importantly, you can add things to the search string. I use the keyword m for Google Maps. Since Google doesn't know I live in Toronto, entering a street address without specifying a city causes problems. So I also have the keyword mt for "Maps Toronto", maps.google.com/maps?q=%s%2C%20Toronto, which appends ", Toronto" to the address I type or, more likely, paste into the location bar.

You can take this technique a long way. I have one bookmark, keyword cia, that uses Google's "I'm feeling lucky" feature to search the subdirectory of the CIA World Factbook the country profiles are served from. So I can type <Ctrl-L>cia canada<Enter> to look up the latest estimate of Canada's population and GDP.

The %s replacement isn't perfect. Most notably, when the user doesn't type anything after the keyword, the %s isn't removed, which can lead to some strange search results or even 400 Bad Request errors. To deal with this, you can introduce javascript into your bookmarks to send you to the search page if you don't enter anything and to search results if you do.
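Here's a sketch of that trick (the URLs are illustrative, not necessarily the ones I use). A keyword bookmark can hold a javascript: URI instead of a plain URL, and %s is substituted in before it runs, so the bookmark can check whether you typed anything:

```javascript
// Sketch: decide where to go based on whether a query was typed.
function searchUrl(q) {
  // No query after the keyword: fall back to the plain search page.
  if (!q) return 'https://www.google.com/';
  // Otherwise, go straight to the results.
  return 'https://www.google.com/search?q=' + encodeURIComponent(q);
}

// The same logic, packed into a bookmark's URL field:
// javascript:var q='%s';location=q?'https://www.google.com/search?q='+encodeURIComponent(q):'https://www.google.com/';
```

Typing the bare keyword then lands you on the search page, while typing the keyword plus a query jumps directly to results.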

Bookmarklets, bookmarks that execute javascript, can do some amazing things, but they don't have an easy way to get user input. Combining them with keywords and %s fills that gap. The most useful one I've created so far is a site search, which uses Google to search the domain you're currently browsing.

You can look at my shortcut bookmarks here, and import them into Firefox if you want to try them out yourself. (You'll have to use a text editor to remove the GeoCities server-generated crud at the bottom of the file (clearly marked). Sorry about that.)

June 08, 2007

How many could there possibly be?

I don't know how successful anything that caters specifically to programmer/birders can be, but I'm happy to find a webcomic tailored to my mini-demographic: The Boids

June 03, 2007

The Discoveries: Summary

Done at last!

Here's a list of the Discoveries posts, with blurbs:

The Quantum, Max Planck, 1900.
Explained black-body spectra by introducing the idea that physical quantities can be discontinuous, an idea that many other discoveries build upon. An understanding of black-body spectra can help you understand the cosmic microwave background, too.

Hormones, William Bayliss and Ernest Starling, 1902.
Discovered the first chemical messenger in living tissues. The beginnings of biochemistry.

The Particle Nature of Light, Albert Einstein, 1905.
Max Planck's quantum taken to the next step: light behaves as though it is composed of particles. Einstein effectively discovered the photon here.

Special Relativity, Albert Einstein, 1905. Unified the electric and magnetic forces, and redefined space and time. There's a good reason why "Einstein" is in the dictionary as a synonym for "genius". Special Relativity led to General Relativity, which explained gravity and predicted things like gravitational lensing. (I screwed up my original description of one of his thought experiments. Corrected now.)

The Nucleus of the Atom, Ernest Rutherford, 1911.
Bombarded atoms with radiation, and discovered evidence of internal structure. This eventually led to a much more useful understanding of the chemical properties of atoms, and opened the new field of nuclear physics.

Sidebar: The Properties of Atoms.
The chain of logic by which the masses and volumes of the various elements were originally worked out.

The Size of the Cosmos, Henrietta Leavitt, 1912.
Discovered Cepheid variable stars, the first reliable way to measure distances beyond 500 light-years. Led to the discovery of galaxies, the expansion of the universe, and much more.

X-ray Crystallography, Max von Laue, 1912.
The single most useful observational technique discovered during the 20th century. X-ray crystallography is the way we figure out the three-dimensional structure of molecules.

The Quantum Atom, Niels Bohr, 1913.
Combined Planck and Rutherford's work to explain many of the properties of Hydrogen. This is where quantum physics began to take off. (I really dislike my summary of this discovery. If you don't understand it, you can blame it on me not understanding it.)

Neurotransmitters, Otto Loewi, 1921.
Discovered that nerve cells, previously thought to transmit signals entirely through electricity, use chemical messengers to communicate with each other. The electric signals only travel along single nerve cells. Opened the way to the study of mind-altering chemicals.

The Uncertainty Principle, Werner Heisenberg, 1927.
Examined the way measurements can perturb a system, preventing you from knowing everything about it at once. Destroyed the ideal of absolute precision and cemented the place of probability in modern physics.

The Chemical Bond, Linus Pauling, 1928.
Extended the quantum model of the atom to the point where it could explain and predict chemical bonds, and thus the structure of simple molecules. Predicting the structure of large molecules is still beyond our computational power, but the theoretical basis is there.

The Expansion of the Universe, Edwin Hubble, 1929.
Using Cepheid variables, discovered galaxies, and the fact that they're all moving away from us as though the universe itself is expanding. (I skimped on this one. It's mostly an examination of the scientists who made these discoveries. Conclusion: there are no stereotypical scientists. The great ones are usually just the students of previous great ones. Maybe I'll revise this into a sidebar and write more about the expansion of the universe, which is pretty amazing.)

Antibiotics, Alexander Fleming, 1929. Serendipity strikes: the accidental discovery of fungi that produce anti-bacterial chemicals. Led to the search for more, and eventually to drug companies designing their own. Oh, yeah, and to the near eradication of bacterial diseases.

The Krebs Cycle, Hans Krebs, 1937.
Mapped the core energy-producing metabolic pathway, present in every aerobic organism. Oxygen and food are combined to produce carbon dioxide, water, and ATP, the molecule that powers everything in a cell. A key step in understanding the big networks of chemical reactions that keep us alive.

Nuclear Fission, Lise Meitner and Otto Hahn, 1939.
Discovered that with a little nudge, big atoms will fall apart. Led to nuclear power and nuclear bombs.

The Movability of Genes, Barbara McClintock, 1948.
Really, the discovery of all kinds of complexities in genetics. Genes that switch other genes on and off, DNA that gets moved from one part of the chromosome to another, genes that affect mutation rate, structural changes of chromosomes, and much more.

The Structure of DNA, James Watson, Francis Crick, and Rosalind Franklin, 1953.
X-ray crystallography and cardboard models used to figure out the three-dimensional structure of the molecule that carries genetic information. The most important bit is the way the structure of DNA automatically leads to replication.

The Structure of Proteins, Max Perutz, 1960.
Showed how to understand the function of proteins (in this case Haemoglobin), the molecules that do practically everything in a cell. Opened the way to a detailed understanding of the way cells work.

The Cosmic Microwave Background, Arno Penzias and Robert Wilson, 1965.
Discovered the light left over from a time when the universe was dense enough for all matter to take the form of plasma. Strong evidence that the universe has been expanding for quite some time now.

A Unified Theory of Forces, Steven Weinberg, 1967.
Found a way to describe electricity, magnetism, and the weak interaction as a single, unified system. Elegant.

Quarks, Jerome Friedman, Henry Kendall, and Richard Taylor, 1969.
Bombarded protons with radiation, and discovered evidence of internal structure. This time we're certain we've found the indivisible particles, right?

Genetic Engineering, Paul Berg, 1972.
Used a toolkit of enzymes to hijack a virus and insert arbitrary DNA into a cell's genome. Allows us to do things that were once impossible.

And that's it, for now. Credit must go to Alan Lightman for collecting these papers, researching them, and putting them all together for us in his book, which inspired me to spend so much time writing about them myself.

Hope you learned as much as I did!

Genetic Engineering

David A. Jackson, Robert H. Symons, and Paul Berg, Biochemical Method for Inserting New Genetic Information into DNA of Simian Virus 40: Circular SV40 DNA Molecules Containing Lambda Phage Genes and the Galactose Operon of Escherichia Coli, 1972

To most people, genetic engineering is practically magic. The wizard, possessed of arcane knowledge, toils in a laboratory and emerges with an organism altered by methods entirely incomprehensible to the ordinary person. The reality is much more mundane. The tools of genetic engineers are clumsy, time consuming, and mostly borrowed from nature rather than designed from scratch.

The basic tools of genetic engineering detect, isolate, and measure DNA. Centrifuges and filters can separate DNA from other substances and sort DNA fragments by size. Radioactive labels allow scientists to track particular samples of DNA through a series of steps. Electron microscopes can even create images of individual DNA molecules, showing their rough shapes (linear, coiled, circular, branched, etc.). All of these are observational tools: they don't alter DNA, they just allow researchers to figure out what goes on in their test tubes.

The tools that allow researchers to manipulate DNA are almost all enzymes taken from bacteria and other living organisms, extracted and refined using basic tools similar to those used to study DNA. Each enzyme does one particular thing and no other: restriction enzymes cut DNA strands wherever a particular target sequence occurs; λ exonuclease snips nucleotides from the 5' ends of DNA strands; terminal transferase adds new nucleotides to the 3' ends of DNA strands; DNA Polymerase I replaces missing complementary DNA; DNA ligase repairs a specific type of break in DNA strands; exonuclease III converts one type of break into another. And so on, and so forth. The researchers know what these enzymes do, but for the most part don't know how they do their job. (Figuring that out requires a long study of the structure of the protein, as with Haemoglobin.)

By carefully putting samples of a particular enzyme into a test tube of DNA, at the temperature and pH it operates best at, allowing time for it to work, then separating out the DNA again, you can make one small change to a sample of DNA molecules. Well, to most of them, anyway. Every step in the process is statistical; there is always some small percentage of DNA that remains unaffected, or is changed in the wrong way. With additional filtering and centrifuging, these defective molecules can sometimes be removed before you go on to the next step, but this is not always possible.

Paul Berg and company figured out how to use newly discovered enzymes to perform what may well have been the first act of genetic engineering. It had recently been discovered that viruses insert their DNA into host cells, where it then merges with the cell's DNA. Berg worked out a general method for inserting arbitrary DNA into a virus, which could then insert it into a living cell, there to be expressed and passed on to the cell's descendants.

For his initial experiments, Berg chose to use Simian Virus 40, which infects monkey cells with few side effects. SV40 only has a short loop of DNA, short enough for Berg to find a restriction enzyme (restriction endonuclease RI) that breaks it in exactly one location (because RI's target sequence, GAATTC, occurs only once in SV40). He put some broken SV40 DNA into a test tube with terminal transferase and a supply of adenine (the A base), which added a string of As to both ends of the DNA. Then he prepared a sample of the DNA he wanted to insert into SV40, using terminal transferase to add thymine (the T base, complementary to A) to both ends of it. When mixed, the adenine ends of the SV40 stuck to the thymine ends of the arbitrary DNA, and the result was a test tube full of SV40 with a bit of extra DNA spliced in.

I've skipped several steps in this (a bunch of repair enzymes are needed to finish the bonding, for example), but that's the basic idea: isolate the bit of DNA you want using restriction enzymes, build "sticky ends" onto it, build corresponding sticky ends on the target DNA, and mix the two samples together. The whole process requires the application of about five or six different enzymes, not including the work to isolate the bits of DNA you want. It's important to note that genetic engineering does not (yet) involve designing new genes. Heck, we don't even know what most "known" genes do. For now, the best we can do is move a gene whose function is known (otherwise what's the point?) from one place to another. Even then, if it's put in the wrong place it won't do anything except possibly kill your experimental organism. Genetic engineering isn't magic.

Berg and many of his colleagues quickly realized the potential dangers of genetic engineering (Berg already had the tools to insert a toxin-producing gene into E. coli, potentially producing a new type of food-poisoning bacteria), and worked to establish guidelines for experimenters in the field. The first commercial application of genetic engineering was the production of human insulin using bacteria, starting in 1982.

June 02, 2007

Quarks

J. I. Friedman, H. W. Kendall, R. E. Taylor, et al., Observed Behavior of Highly Inelastic Electron-Proton Scattering, 1969

Ernest Rutherford investigated the structure of atoms by hitting them with various fast-moving particles. The particles came from radioactive substances, and were merely channelled toward the target by heavy shielding around the radiation source. In the years following, scientists began observing the collisions of cosmic rays (extremely fast moving particles from outer space) with Earth-based targets and using electromagnets to push and pull electrically charged particles, reliably accelerating them towards their targets at steadily increasing speeds. During the 1950s, a whole mess of new particles were discovered, some two-dozen quickly decaying fragments observed spinning away from high energy collisions.

During the 1960s, Murray Gell-Mann and others brought order to the menagerie of new particles, first with the eight-fold way, then with the quark model. This model treated the observed particles as though they were composed of two or three smaller particles. Because these smaller particles were never observed on their own, they were initially assumed to be entirely theoretical abstractions.

Particle accelerator technology was, by this time, increasing in leaps and bounds. Using the 2-mile long particle accelerator at Stanford (SLAC), the authors of this paper fired electrons at speeds in excess of 99.9999999% of the speed of light, aiming them at protons and expecting, much like Rutherford's graduate student, only to confirm that protons have no internal structure. Instead, they found that the electrons bounced off of the protons as though there were three point-like particles inside. Suddenly, quarks became much more than theoretical!

Later work went on to show how the strong force prevents quarks from ever breaking away from one another, explaining why they are never observed independently. The strong force binds quarks together into protons, neutrons, and the rest of the non-fundamental particles in much the same way that electromagnetism binds protons and electrons together into atoms. Also, in the same way that residual electric charge can hold atoms together, forming molecules, residual strong force binds protons and neutrons together to form atomic nuclei.

Yet more work has shown how the electroweak force acts on quarks, rather than protons and neutrons; how the strong force is transmitted by gluons, in the same way that the electroweak force is transmitted by the W+, W-, Z, and photon; and has attempted (several times) to unify the strong and electroweak forces.

June 01, 2007

A Unified Theory of Forces

Steven Weinberg, A Model of Leptons, 1967

As frequently happens in physics, the story of the unification of forces begins with Einstein. Pre-1905, the electric and magnetic forces were described by Maxwell's equations, which vary depending on how observers move relative to a postulated cosmic frame of rest, the ether. In the opening paragraph of his 1905 paper on special relativity, Einstein notes that this "leads to asymmetries which do not appear to be inherent in the phenomena", in particular that the pre-1905 interpretation brings two different forces into play (electric and magnetic) depending on whether a conductor or a magnet is at rest relative to the ether. He notes that the two forces have exactly the same effects if the relative motion between magnets and conductors is considered, rather than their motion relative to a cosmic frame of rest.

This "principle of motional symmetry" allowed Maxwell's equations to be considerably simplified. This is akin to the situation, frequently encountered in programming, where two sections of code are discovered to strongly resemble each other, and combined into a single subroutine. In this case, two sets of equations were found to produce identical results, and were combined into a single, unifying form. Physicists call this "discovering a principle of symmetry".

In 1967, Steven Weinberg discovered another principle of symmetry, this one noting a similarity in the way electrons, neutrinos, and their antiparticles (e-, e+, ν, and ν̄, collectively "leptons") interact with other particles. In the most familiar of these interactions (which isn't saying much) a neutron (n) changes to a proton (p+) and emits an e- and a ν̄. This is called "beta decay". Interestingly, physicists like to describe forces in terms of particle interactions. For example, a pair of electrons repelling each other can be described as one e- emitting a photon which carries some of its momentum to the other e-. Conversely, physicists will describe almost any particle interaction as the result of a force. In particular, beta decay is said to be the result of the weak force.

A whole family of additional interactions is also ascribed to the weak force, all of them involving leptons interacting with p+, p-, or n. By 1967, a way had been found to describe these interactions in terms of two intermediate "force carrier" particles (like the photon in the electron-electron example above), the W- and W+. In the case of beta decay, an n changes to a p+ and emits a W- particle, which travels a short distance and then decays into an e- and a ν̄.

Steven Weinberg noted that the W- particle can decay into an e- and a ν̄, and the W+ particle can decay into an e+ and a ν, and created a mathematical model that assumed the equivalence of electrons and neutrinos under the weak interaction. This led to the need for two more force carrying particles, one of which could decay into a ν and a ν̄, and another which could decay into an e- and an e+. The former he called the Z particle. The latter turns out to be well known: the photon, which acts as the force carrier for the electromagnetic force.

The symmetry here is obviously broken. The weak force and the electromagnetic force act quite differently. This is explained partly by the differences between the force carrying particles: the W and Z particles are quite massive, so they travel slowly and decay quickly, restricting their effects to a very short range. Photons, on the other hand, are massless, travel at c, and need not decay, giving electromagnetism its infinite range. To explain these differences, Weinberg invokes the Higgs mechanism, which I don't understand well enough to explain. The result, though, is a set of equations that predict the masses of the W, Z, and photon, the properties of the various leptons, and the probabilities of all the interactions between them. The theory also predicted several reactions involving the Z particle and neutrinos which hadn't yet been observed.

In short, Weinberg discovered a simple, yet powerful way to describe two previously distinct phenomena in a unified way.