Reported on Ars Technica:
The alternative is to try to drive a reaction that will simply not occur, but then you have no starting signal to optimize. Instead, you have to know a reasonably good pulse shape to use right from the start of the experiment.
This is what makes the experiment reported in Physical Review Letters remarkable.
The other important factor is that any sufficiently intense light pulse would cause bond formation. That meant that, as with destroying molecules, the researchers could optimize a pulse shape starting from a non-zero signal. So, in this special case, molecules can be made. And, after optimization, the process occurs eight times faster than it would if driven by a generic pulse from the laser.
You might think that such a low enhancement means there is no real control here, and that the experiment just adds energy to make the reaction proceed. However, the molecule is simple enough that reasonably accurate calculations of its states were possible. These calculations showed that certain pulse shapes should enhance Mg2 production, while others should suppress it. Experiments confirmed that this was the case, indicating that molecular formation was due to a coherent process, rather than just the injection of lots of energy.
The way we do a lot of chemistry today is akin to brewing beer: You add a certain quantity of inputs, provide particular environmental conditions, give it some time, and you get a mix of product and waste at the end. Advancements typically come from discovering more efficient and faster recipes, or from reducing the number of steps needed. I’m oversimplifying, but not by too much I hope.
The technology quoted above is a qualitative change in methods and capability. The end-game for this tech is computer controlled arrays of coherent energy (lasers mostly) acting on a raw substrate or gaseous cloud to generate desired chemical end products, or to at least shift the statistical distribution of products toward a more useful one. The metaphor moves from cooking/brewing to something a bit closer to construction, with a precisely ordered expenditure of energy and materials used to construct molecules.
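The closed-loop idea behind this kind of experiment is straightforward even though the hardware is exotic: try a pulse shape, measure the product yield, and let an optimizer propose the next pulse. The sketch below is purely illustrative, not the researchers' actual procedure; genetic algorithms are a common choice in closed-loop coherent control, but for brevity this uses a simpler mutate-and-keep-if-better loop, and measured_yield is a made-up stand-in for what a real detector would report.

```python
import random

def measured_yield(pulse):
    """Stand-in for the detector: a toy function whose peak sits at a
    hypothetical 'ideal' pulse shape. In the lab, this number would come
    from measuring how much product the pulse actually produced."""
    target = [0.2, 0.8, 0.5, 0.1]  # made-up ideal pulse parameters
    return -sum((p - t) ** 2 for p, t in zip(pulse, target))

def optimize_pulse(steps=2000, seed=0):
    """Simple (1+1) evolutionary loop: perturb the current pulse shape,
    keep the perturbed version only if the measured yield improves."""
    rng = random.Random(seed)
    best = [rng.random() for _ in range(4)]  # random initial pulse shape
    best_yield = measured_yield(best)
    for _ in range(steps):
        candidate = [p + rng.gauss(0, 0.05) for p in best]
        y = measured_yield(candidate)
        if y > best_yield:
            best, best_yield = candidate, y
    return best, best_yield

best, y = optimize_pulse()
print(best, y)  # converges toward the target pulse; yield approaches 0
```

The key point is that the optimizer never needs a theory of the reaction; it only needs a non-zero starting signal to climb, which is exactly why the quoted experiment's non-zero baseline mattered.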
To think about this technology, I find that conceiving of information technology and computers as “control systems” brings much more clarity than thinking of them as just computers. A computer is something you write emails and play games on, a familiar part of your home and maybe your workplace. But from a pure physical capability point of view, personal computers and even cell phones are among the least interesting things to come out of the entire information technology complex. They are pedestrian, mass market tools. They are not the cutting edge, and focusing too much on them draws attention away from the more radical uses of transistors operating in the GHz and higher ranges.
The experiment above is a case in point.
No human could turn a laser on and off in 100 femtoseconds, and most people would be lucky to do it in one tenth of a second. Our hands are our own control systems, and the computer beats them easily. Manipulation of matter and energy on a very small and precise scale in both time and space opens up new possibilities. In this case, a chemical reaction that does not naturally occur in a given environment was made to happen. A qualitative change in capabilities emerges from a quantitative change in speed and scale.
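The size of that speed gap is easy to understate in words; a quick back-of-envelope check of the ratio between the two timescales from the paragraph above:

```python
human_toggle = 0.1     # seconds: a lucky human flipping a switch
laser_pulse = 100e-15  # seconds: a 100 femtosecond pulse

# The computer-controlled system operates about a trillion times faster.
ratio = human_toggle / laser_pulse
print(f"{ratio:.0e}")  # → 1e+12
```

Twelve orders of magnitude is roughly the same gap as between one second and thirty thousand years, which is why a quantitative change this large reads as a qualitative one.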
Computers are control systems that segment space and time into ultra-fine quanta, thereby enabling, with the right tools, small amounts of precisely tuned energy/information to be used to very rapidly manipulate matter and other energy. Or I should say, “smaller” amounts of precisely tuned energy/information. Femtosecond laser pulses are definitely small, but the macro-scale effect of these systems is that you replace large nukes on ballistic missiles that can only get “close enough” to a city with small warheads and smart munitions that can specify which square meter of which city block you want to explode.
The results are completely novel capabilities, and I believe we’ve only seen the tip of the iceberg so far.
The War Nerd has made a similar observation re: weapons of war:
Compare the Patriot MIM-104 PAC-3 with the MIM-14 and you see a very weird progression, from massive warhead to…no warhead at all. Computing power replaces explosive power, or tries to.
It’s an odd development. I can’t think of another moment in military history when contending powers dialed down their weapons to something close to zero, as the US and Soviet developers did.
Missiles that once needed small nukes to kill another missile they could only get “close enough” to can now be used as kinetic darts that strike the other missile precisely.
I can’t drive it home enough: This is what computers-as-control-systems do. Email, instant messaging, word processing, digital painting, computer gaming, and programming are all high-level derivatives of this capability. The computing device you’re reading this on is derivative. How these derivatives are used and evolved is very important, but we get distracted by and obsessed with them. The central action is taking place far, far under the hood and mostly out of sight, and it has implications for virtually every aspect of human existence, from laser control of chemical reactions to the use of supersonic missiles against entire cities.