Friday, May 4, 2012

Advanced Extremely High Frequency Satellite II


The United States Air Force launched an advanced communications satellite Friday, the second in a new fleet of spacecraft that should improve American and allied military commanders' ability to control their forces around the globe.
The Air Force's Advanced Extremely High Frequency 2 (AEHF 2) satellite lifted off at 2:42 p.m. EDT from Florida's Cape Canaveral Air Force Station, riding toward a preliminary orbit aboard an Atlas 5 rocket. The spacecraft will work its way toward its final geosynchronous orbit, about 22,300 miles (35,888 kilometers) up, over the next three months or so, officials have said.
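The quoted altitude can be sanity-checked from Kepler's third law: a geosynchronous orbit is one whose period equals one sidereal day. The sketch below uses standard values for Earth's gravitational parameter and equatorial radius; the grid of constants is illustrative, not drawn from the article.

```python
import math

MU_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
SIDEREAL_DAY = 86164.0905   # one rotation of Earth, seconds
EARTH_RADIUS_KM = 6378.137  # equatorial radius, km

# Kepler's third law: a^3 = mu * T^2 / (4 * pi^2)
semi_major_axis_m = (MU_EARTH * SIDEREAL_DAY**2 / (4 * math.pi**2)) ** (1 / 3)
altitude_km = semi_major_axis_m / 1000 - EARTH_RADIUS_KM
altitude_mi = altitude_km / 1.609344

print(f"geosynchronous altitude ~ {altitude_km:,.0f} km ({altitude_mi:,.0f} mi)")
```

The textbook value comes out near 22,236 miles (35,786 km); news accounts commonly round this to about 22,300 miles.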
The launch was originally slated for Thursday, but a flow problem in one of the Atlas 5's systems pushed things back a day.
The $1.7 billion satellite is part of the AEHF network, which could ultimately include up to six spacecraft. The new constellation is an upgrade over the military's current Milstar system of five functioning satellites, the first of which launched in 1994.
"The second AEHF spacecraft will provide greater connectivity, flexibility and control to U.S. and international partner forces," said Col. Michael Sarchet, the government's AEHF program manager, in a statement. "The AEHF constellation will augment and replace the venerable Milstar constellation, improving on many capabilities to include 10 times greater throughput."
AEHF will provide global, secure, jam-resistant communications for military operations on land, sea and air, officials said. The network features the highest levels of encryption, and it will allow commanders to control their forces "at all levels of conflict through general nuclear war," according to an Air Force fact sheet.
As its name implies, AEHF 2 is the second satellite in the fleet to launch. AEHF 1 blasted off in August 2010, but its main engine failed to fire as planned to lift it to its final orbit. The spacecraft's controllers managed to save it, however, using secondary thrusters to boost it to the correct location over a span of 14 months.




Aerospace firm Lockheed Martin builds the AEHF spacecraft for the Air Force. The satellites weigh about 7 tons and carry power-generating solar arrays spanning 89 feet (27 meters). They're designed to operate for at least 14 years in orbit.
The Air Force's current plan calls for launching a total of four AEHF satellites, though negotiations to add two more spacecraft to the fleet are ongoing, Air Force officials said.

Monday, March 26, 2012

The NROL-25 Mission



Within the enclosed confines of the massive Space Launch Complex 6 pad at the southern end of California's Vandenberg Air Force Base, a site once envisioned to fly the space shuttle, a Delta 4 rocket and its classified satellite cargo are undergoing final preps for blastoff next week.
Liftoff is scheduled for Thursday, March 29 on the NROL-25 mission to deploy a hush-hush payload for the U.S. National Reconnaissance Office, the secretive government agency that designs and operates the country's fleet of orbiting spy satellites.
Although the exact launch time hasn't been revealed, officials say the liftoff will happen sometime between 2 and 5:15 p.m. local time (5-8:15 p.m. EDT; 2100-0015 GMT).
The launch will be the first of four that the NRO has planned this year, a batch of missions that also includes an Atlas 5 on June 20 and a Delta 4-Heavy on June 28, both from Cape Canaveral, and another Atlas 5 from Vandenberg on Aug. 2.


"Last year we executed the most aggressive launch campaign in over 25 years. We successfully launched six satellites in seven months and this year with the same determination we're scheduled to launch four more in five months," Betty Sapp, the NRO's principal deputy director, said in testimony before Congress on March 8. [Photos: Declassified U.S. Spy Satellites Revealed]
"These successful launches are a very important and visible reminder of the space reconnaissance mission the NRO started over 50 years ago, and continues with such great success today. We are committed to smart acquisition investments and practices to ensure the continued coverage and availability of our vital national security systems and we work tirelessly to deliver these systems on time and within budget."
Spy satellite surge
Last year's remarkable launch surge used various types of Atlas and Delta rockets to launch replacement satellites into virtually all of the NRO's networks of imaging, eavesdropping, surveillance and data-relay spacecraft, while a small Minotaur booster lofted a research and development payload.
"From launching and operating the most technically capable systems to continued operations of legacy satellites, the NRO remains the premier space reconnaissance organization in the world," said Sapp.
The identities of the satellites going up this year are not disclosed to the public. But NRO Director Bruce Carlson recently said the upcoming deployments will refresh the agency's ability to continue guarding U.S. national security.
"The launch of these systems will not only improve on the NRO's capabilities, they will also help reduce the overall age of our constellation and better deal with today's and tomorrow's global threats," he said.
More often than not, the purpose of any NRO launch is the rejuvenation of the existing constellation by replacing an aging orbiting asset with a new satellite or bringing the next generation on line. That was the major achievement of last year's surge, which came as the NRO was celebrating its 50th anniversary.
"Most aggressive launch schedule in 25 years and the satellites we launched were more complex and technically demanding than any we have launched before," Carlson said. "Through this campaign and the dedicated efforts of the NRO workforce, we proved once again that the NRO knows how to develop, acquire, launch, and operate our nation's intelligence collection satellite constellation and our worldwide coverage is as good as it has been in years."
The average age of the NRO's satellites has been reduced thanks to the newest birds put on orbit, he added, while other spacecraft see their missions evolving from the original intent to face the current threats around the globe.
"Majority of constellation is aging, but despite age of some satellites, still very robust, adaptable," he said. "Some designed to monitor Soviet communication in Northern Fleet are now used to geo-locate sensitive signals in the war zone."
Launch date looms
Next week's deployment will use the United Launch Alliance's Delta 4 rocket flying for the first time in its Medium+ (5,2) configuration, which features a single core stage filled with liquid hydrogen and liquid oxygen, a pair of strap-on solid-fuel boosters, a five-meter-diameter cryogenic upper stage and similarly sized nose cone to shroud the payload during the climb through Earth's atmosphere. [Spaceflight Now Launch Status Updates]
The towering vehicle will stand about 217 feet tall.
This is the only one of the five Delta 4 configurations that hasn't been used in the program's 18 previous launches from Florida and California. The most recent launch, in January, flew a closely related version, but it carried the maximum of four strap-on boosters for extra thrust off the pad instead of the two needed for the NROL-25 mission.
The payload's size likely drove mission planners to pick a Delta 4 with the roomier five-meter nose cone rather than the four-meter alternative.
The rocket will soar away from Vandenberg leaving a smoky contrail that should be visible for miles around, heading over the Pacific towards an undisclosed orbital perch.
Hobbyist satellite observers around the world will have their eyes on the sky, looking to spot the new object and figure out which segment of the NRO constellation it was launched to fill.


U.S. satellite spies
It is widely understood that the NRO operates different types of satellites that include eavesdropping for intelligence-gathering, high-resolution imaging birds that collect exquisite pictures of ground targets, all-weather radar platforms to perform surveillance day and night, ship-tracking spacecraft, and the necessary communications craft to relay data from the lower-orbiting assets when they are flying outside the range of tracking stations.
All of the information obtained is shared with analysts, policy makers and the warfighters in the global hotspots.
"In 2011 alone, NRO provided extremely valuable intelligence supporting more than 15 operations to capture or kill high value targets in combat areas. In addition, NRO supported more than 120 tactical operations locating Improvised Explosive Devices, helping to prevent the most lethal attacks against our ground combat forces.
These tactical support operations also included support to ground and air tactical actions; counter-terrorist actions; and maritime anti-piracy/interdiction. We also provided vital overhead support to 17 critical Combat Search and Rescue missions. In addition to ground combat operations support, NRO supported 33 Strait of Hormuz transits ensuring U.S. Naval Forces had the intelligence assistance needed for safe passage," Sapp said in open testimony to Congress.
"In both the U.S. Central and African Command Areas of Operations, NRO has developed and deployed more than 25 reference emitters which have been used over 13,000 times, and provided a significant enhancement in our ability to geo-locate surface to air missile radar systems. This new capability has allowed U.S. and Coalition military forces to be extremely precise in targeting these significant threats." [Top 10 Space Weapons Concepts]
What's more, the NRO has sped up the turnaround time from the collection of information by the satellites to delivering that data to users like combatant commanders through new state-of-the-art systems.
"Ongoing counter-insurgency and counter-terrorism activities have underscored the tremendous impact of these systems in support of combat operations throughout the Eastern Hemisphere," said Sapp. "NRO has responded with an accelerated fielding of these ground systems that can quickly support finding and alerting potential insurgent events and meeting United States Central Command (USCENTCOM) requirements for near-real-time situational awareness battlespace."
Sophisticated space surveillance
The NRO spacecraft are considered to be some of the most sophisticated and technologically advanced in the world. But their exact capabilities, appearances and features are classified, with the public finding out only generally what they do.
"The NRO is doing amazing things today. Our reconnaissance satellites are saving lives, protecting our nation from those who would do us harm and informing our national command authorities and policy makers," said Carlson.
"In the past, the process had built-in delays. Days passed before intelligence community analysts could analyze imagery that we recovered from space. That has all changed. Today we are putting data into the hands of analysts, products into the hands of warfighters, and critical information into the hands of policy makers in time to make a difference."
Spaceflight Now will provide complete coverage of next week's launch as the NRO's latest bird takes flight from the Central Coast of California.

Friday, February 24, 2012

Mobile User Objective System (MUOS)




The Navy has launched a new communications satellite after two weather-related scrubs last week.
An Atlas V rocket carrying the Mobile User Objective System (MUOS) satellite launched from Cape Canaveral at 5:15 p.m. Friday.
The $2.1 billion narrowband tactical satellite communications system was built by Lockheed Martin and is designed to improve communications with mobile forces.


It's the first of five satellites that will replace the current Ultra High Frequency Follow-On (UFO) system. MUOS is designed to move 10 times more information than the existing system, carrying voice, video and data simultaneously using 3G cellular technology.

    Monday, January 23, 2012

    Wideband Global Satellites





    Senior defense officials from six countries announced on Jan. 17 a multilateral partnership in wideband global satellite (WGS) communication valued at more than $10 billion.
    The officials from Canada, Denmark, Luxembourg, the Netherlands, New Zealand and the U.S. held an initial WGS partnership steering committee meeting prior to the announcement.
    “This new WGS partnership provides an example of how the U.S. plans to continue exploring opportunities to strengthen our existing cooperative relationship and to build new partnerships,” said Heidi Grant, the Deputy Under Secretary of the Air Force for International Affairs. “These activities will bolster our mutual trust, help to achieve further interoperability for our warfighters, and will increase the capabilities and capacity of all partners.”
    Currently, there are three WGS satellites in orbit, with six additional satellites scheduled for launches from 2012 through 2018, including a ninth satellite that is enabled by the new partnership.
    “With this arrangement, each partner’s unique level of requirement will be accommodated corresponding to each partner’s level of contribution,” Grant said. “The United States’ contribution to the agreement includes the development, fielding and operation of eight satellites, and the launch services and operations for a ninth satellite.”
    According to Grant, the multilateral partners contributed $620 million of the approximate $1 billion cost to expand the WGS System with a ninth satellite.
    “This is a model of a good way to do business,” said Maj. Gen. John Hyten, the director of Space Programs in the Office of the Secretary of the Air Force for Acquisition. “From an Air Force acquisition perspective, it improves our ability to acquire the constellation in an efficient manner because it keeps an active production line going, it allows us to achieve efficiencies in the production line (and) it saves us money in the long term by having a very efficient program.
    “From an operational perspective for our Air Force operators, it puts (them) on the same system as the coalition partners,” he said.
    The general explained that Air Force operators receive air tasking orders via wideband communications, and now each partner nation has access to the system and can receive ATOs through that same system.

    Wednesday, November 9, 2011

    The Theory of Everything






    The Theory of Everything is a term for the ultimate theory of the universe—a set of equations capable of describing all phenomena that have been observed, or that will ever be observed. It is the modern incarnation of the reductionist ideal of the ancient Greeks, an approach to the natural world that has been fabulously successful in bettering the lot of mankind and continues in many people's minds to be the central paradigm of physics. A special case of this idea, and also a beautiful instance of it, is the equation of conventional nonrelativistic quantum mechanics, which describes the everyday world of human beings—air, water, rocks, fire, people, and so forth. The details of this equation are less important than the fact that it can be written down simply and is completely specified by a handful of known quantities: the charge and mass of the electron, the charges and masses of the atomic nuclei, and Planck's constant. For experts we write

    \[
    \mathcal{H} = -\sum_{j}\frac{\hbar^{2}}{2m}\nabla_{j}^{2}
      - \sum_{\alpha}\frac{\hbar^{2}}{2M_{\alpha}}\nabla_{\alpha}^{2}
      - \sum_{j,\alpha}\frac{Z_{\alpha}e^{2}}{|\mathbf{r}_{j}-\mathbf{R}_{\alpha}|}
      + \sum_{j<k}\frac{e^{2}}{|\mathbf{r}_{j}-\mathbf{r}_{k}|}
      + \sum_{\alpha<\beta}\frac{Z_{\alpha}Z_{\beta}e^{2}}{|\mathbf{R}_{\alpha}-\mathbf{R}_{\beta}|},
    \qquad \mathcal{H}\,|\Psi\rangle = E\,|\Psi\rangle .
    \]

    The symbols Zα and Mα are the atomic number and mass of the αth nucleus, Rα is the location of this nucleus, e and m are the electron charge and mass, rj is the location of the jth electron, and ℏ is the reduced Planck constant.
    Less immediate things in the universe, such as the planet Jupiter, nuclear fission, the sun, or isotopic abundances of elements in space are not described by this equation, because important elements such as gravity and nuclear interactions are missing. But except for light, which is easily included, and possibly gravity, these missing parts are irrelevant to people-scale phenomena; the equation is, for all practical purposes, the Theory of Everything for our everyday world.
    However, it is obvious glancing through this list that the Theory of Everything is not even remotely a theory of every thing. We know this equation is correct because it has been solved accurately for small numbers of particles (isolated atoms and small molecules) and found to agree in minute detail with experiment. However, it cannot be solved accurately when the number of particles exceeds about 10. No computer existing, or that will ever exist, can break this barrier because it is a catastrophe of dimension. If the amount of computer memory required to represent the quantum wavefunction of one particle is N, then the amount required to represent the wavefunction of k particles is N^k. It is possible to perform approximate calculations for larger systems, and it is through such calculations that we have learned why atoms have the size they do, why chemical bonds have the length and strength they do, why solid matter has the elastic properties it does, why some things are transparent while others reflect or absorb light. With a little more experimental input for guidance it is even possible to predict atomic conformations of small molecules, simple chemical reaction rates, structural phase transitions, ferromagnetism, and sometimes even superconducting transition temperatures. But the schemes for approximating are not first-principles deductions but are rather art keyed to experiment, and thus tend to be the least reliable precisely when reliability is most needed, i.e., when experimental information is scarce, the physical behavior has no precedent, and the key questions have not yet been identified. There are many notorious failures of alleged ab initio computation methods, including the phase diagram of liquid 3He and the entire phenomenology of high-temperature superconductors. Predicting protein functionality or the behavior of the human brain from these equations is patently absurd.
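The "catastrophe of dimension" is easy to make concrete: storing a k-particle wavefunction on a one-particle grid of N points takes N^k complex amplitudes. A short sketch, where the grid size N = 1000 is an arbitrary illustrative choice:

```python
BYTES_PER_AMPLITUDE = 16  # one complex number in double precision

N = 1000  # points in a modest one-particle grid (illustrative)

for k in (1, 2, 5, 10):
    amplitudes = N ** k  # dimension of the k-particle state vector
    terabytes = amplitudes * BYTES_PER_AMPLITUDE / 1e12
    print(f"k = {k:2d}: {amplitudes:.0e} amplitudes, {terabytes:.0e} TB")
```

Already at k = 10 the required store (about 10^30 amplitudes) dwarfs every computer ever built, which is the sense in which the equation "cannot be solved" beyond a handful of particles.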
So the triumph of the reductionism of the Greeks is a Pyrrhic victory: We have succeeded in reducing all of ordinary physical behavior to a simple, correct Theory of Everything only to discover that it has revealed exactly nothing about many things of great importance.
    In light of this fact it strikes a thinking person as odd that the parameters e, ℏ, and m appearing in these equations may be measured accurately in laboratory experiments involving large numbers of particles. The electron charge, for example, may be accurately measured by passing current through an electrochemical cell, plating out metal atoms, and measuring the mass deposited, the separation of the atoms in the crystal being known from x-ray diffraction. Simple electrical measurements performed on superconducting rings determine to high accuracy the quantum of magnetic flux hc/2e. A version of this phenomenon also is seen in superfluid helium, where coupling to electromagnetism is irrelevant. Four-point conductance measurements on semiconductors in the quantum Hall regime accurately determine the quantity e^2/h. The magnetic field generated by a superconductor that is mechanically rotated measures e/mc. These things are clearly true, yet they cannot be deduced by direct calculation from the Theory of Everything, for exact results cannot be predicted by approximate calculations. This point is still not understood by many professional physicists, who find it easier to believe that a deductive link exists and has only to be discovered than to face the truth that there is no link. But it is true nonetheless. Experiments of this kind work because there are higher organizing principles in nature that make them work. The Josephson quantum is exact because of the principle of continuous symmetry breaking. The quantum Hall effect is exact because of localization. Neither of these things can be deduced from microscopics, and both are transcendent, in that they would continue to be true and to lead to exact results even if the Theory of Everything were changed. Thus the existence of these effects is profoundly important, for it shows us that for at least some fundamental things in nature the Theory of Everything is irrelevant. P. W. Anderson's famous and apt description of this state of affairs is "more is different".
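Those combinations are now fixed numbers: since the 2019 SI redefinition, h and e are exact by definition, so the flux quantum and the quantum Hall resistance can be evaluated directly. (Note that hc/2e above is the Gaussian-units form; in SI units the flux quantum is h/2e.)

```python
h = 6.62607015e-34   # Planck constant, J*s (exact by SI definition)
e = 1.602176634e-19  # elementary charge, C (exact by SI definition)

flux_quantum = h / (2 * e)  # superconducting (Josephson) flux quantum, Wb
von_klitzing = h / e**2     # quantum Hall resistance h/e^2, ohms

print(f"h/2e  = {flux_quantum:.6e} Wb")   # ~ 2.067834e-15 Wb
print(f"h/e^2 = {von_klitzing:.3f} ohm")  # ~ 25812.807 ohm
```

These are exactly the quantities that superconducting-ring and quantum Hall measurements pin down to many digits, independent of the messy microscopic details of the samples.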
    The emergent physical phenomena regulated by higher organizing principles have a property, namely their insensitivity to microscopics, that is directly relevant to the broad question of what is knowable in the deepest sense of the term. The low-energy excitation spectrum of a conventional superconductor, for example, is completely generic and is characterized by a handful of parameters that may be determined experimentally but cannot, in general, be computed from first principles. An even more trivial example is the low-energy excitation spectrum of a conventional crystalline insulator, which consists of transverse and longitudinal sound and nothing else, regardless of details. It is rather obvious that one does not need to prove the existence of sound in a solid, for it follows from the existence of elastic moduli at long length scales, which in turn follows from the spontaneous breaking of translational and rotational symmetry characteristic of the crystalline state. Conversely, one therefore learns little about the atomic structure of a crystalline solid by measuring its acoustics.
    The crystalline state is the simplest known example of a quantum protectorate, a stable state of matter whose generic low-energy properties are determined by a higher organizing principle and nothing else. There are many of these, the classic prototype being the Landau fermi liquid, the state of matter represented by conventional metals and normal 3He. Landau realized that the existence of well-defined fermionic quasiparticles at a fermi surface was a universal property of such systems independent of microscopic details, and he eventually abstracted this to the more general idea that low-energy elementary excitation spectra were generic and characteristic of distinct stable states of matter. Other important quantum protectorates include superfluidity in Bose liquids such as 4He and the newly discovered atomic condensates, superconductivity, band insulation, ferromagnetism, antiferromagnetism and the quantum Hall states. The low-energy excited quantum states of these systems are particles in exactly the same sense that the electron in the vacuum of quantum electrodynamics is a particle: They carry momentum, energy, spin, and charge, scatter off one another according to simple rules, obey fermi or bose statistics depending on their nature, and in some cases are even "relativistic," in the sense of being described quantitatively by Dirac or Klein-Gordon equations at low energy scales. Yet they are not elementary, and, as in the case of sound, simply do not exist outside the context of the stable state of matter in which they live. These quantum protectorates, with their associated emergent behavior, provide us with explicit demonstrations that the underlying microscopic theory can easily have no measurable consequences whatsoever at low energies. The nature of the underlying theory is unknowable until one raises the energy scale sufficiently to escape protection.
    Thus far we have addressed the behavior of matter at comparatively low energies. But why should the universe be any different? The vacuum of space-time has a number of properties (relativity, renormalizability, gauge forces, fractional quantum numbers) that ordinary matter does not possess, and this state of affairs is alleged to be something extraordinary distinguishing the matter making up the universe from the matter we see in the laboratory. But this is incorrect. It has been known since the early 1970s that renormalizability is an emergent property of ordinary matter either in stable quantum phases, such as the superconducting state, or at particular zero-temperature phase transitions between such states called quantum critical points. In either case the low-energy excitation spectrum becomes more and more generic and less and less sensitive to microscopic details as the energy scale of the measurement is lowered, until in the extreme limit of low energy all evidence of the microscopic equations vanishes away. The emergent renormalizability of quantum critical points is formally equivalent to that postulated in the standard model of elementary particles right down to the specific phrase "relevant direction" used to describe measurable quantities surviving renormalization. At least in some cases there is thought to be an emergent relativity principle in the bargain. The rest of the strange agents in the standard model also have laboratory analogues. Particles carrying fractional quantum numbers and gauge forces between these particles occur as emergent phenomena in the fractional quantum Hall effect. The Higgs mechanism is nothing but superconductivity with a few technical modifications. Dirac fermions, spontaneous breaking of CP, and topological defects all occur in the low-energy spectrum of superfluid 3He.
    Whether the universe is near a quantum critical point is not known one way or the other, for the physics of renormalization blinds one to the underlying microscopics as a matter of principle when only low-energy measurements are available. But that is exactly the point. The belief on the part of many that the renormalizability of the universe is a constraint on an underlying microscopic Theory of Everything rather than an emergent property is nothing but an unfalsifiable article of faith. But if proximity to a quantum critical point turns out to be responsible for this behavior, then just as it is impossible to infer the atomic structure of a solid by measuring long-wavelength sound, so might it be impossible to determine the true microscopic basis of the universe with the experimental tools presently at our disposal. The standard model and models based conceptually on it would be nothing but mathematically elegant phenomenological descriptions of low-energy behavior, from which, until experiments or observations could be carried out that fall outside its region of validity, very little could be inferred about the underlying microscopic Theory of Everything. Big Bang cosmology is vulnerable to the same criticism. No one familiar with violent high-temperature phenomena would dare to infer anything about the equations above by studying explosions, for they are unstable and quite unpredictable from one experiment to the next. The assumption that the early universe should be exempt from this problem is not justified by anything except wishful thinking. It could very well turn out that the Big Bang is the ultimate emergent phenomenon, for it is impossible to miss the similarity between the large-scale structure recently discovered in the density of galaxies and the structure of styrofoam, popcorn, or puffed cereals.
    Self-organization and protection are not inherently quantum phenomena. They occur equally well in systems with temperatures or frequency scales of measurement so high that quantum effects are unobservable. Indeed the first experimental measurements of critical exponents were made on classical fluids near their liquid-vapor critical points. Good examples would be the spontaneous crystallization exhibited by ball bearings placed in a shallow bowl, the emission of vortices by an airplane wing, finite-temperature ferromagnetism, ordering phenomena in liquid crystals, or the spontaneous formation of micelle membranes. To this day the best experimental confirmations of the renormalization group come from measurements of finite-temperature critical points. As is the case in quantum systems, these classical ones have low-frequency dynamic properties that are regulated by principles and independent of microscopic details. The existence of classical protectorates raises the possibility that such principles might even be at work in biology.
    What do we learn from a closer examination of quantum and classical protectorates? First, that these are governed by emergent rules. This means, in practice, that if you are locked in a room with the system Hamiltonian, you can't figure the rules out in the absence of experiment, and hand-shaking between theory and experiment. Second, one can follow each of the ideas that explain the behavior of the protectorates we have mentioned as it evolved historically. In solid-state physics, the experimental tools available were mainly long-wavelength, so that one needed to exploit the atomic perfection of crystal lattices to infer the rules. Imperfection is always present, but time and again it was found that fundamental understanding of the emergent rules had to wait until the materials became sufficiently free of imperfection. Conventional superconductors, for which nonmagnetic impurities do not interfere appreciably with superconductivity, provide an interesting counterexample. In general it took a long time to establish that there really were higher organizing principles leading to quantum protectorates. The reason was partly materials, but also the indirectness of the information provided by experiment and the difficulty in consolidating that information, including throwing out the results of experiments that have been perfectly executed, but provide information on minute details of a particular sample, rather than on global principles that apply to all samples.
    Some protectorates have prototypes for which the logical path to microscopics is at least discernable. This helped in establishing the viability of their assignment as protectorates. But we now understand that this is not always the case. For example, superfluid 3He, heavy-fermion metals, and cuprate superconductors appear to be systems in which all vestiges of this link have disappeared, and one is left with nothing but the low-energy principle itself. This problem is exacerbated when the principles of self-organization responsible for emergent behavior compete. When more than one kind of ordering is possible the system decides what to do based on subtleties that are often beyond our ken. How can one distinguish between such competition, as exists for example, in the cuprate superconductors, and a “mess”? The history of physics has shown that higher organizing principles are best identified in the limiting case in which the competition is turned off, and the key breakthroughs are almost always associated with the serendipitous discovery of such limits. Indeed, one could ask whether the laws of quantum mechanics would ever have been discovered if there had been no hydrogen atom. The laws are just as true in the methane molecule and are equally simple, but their manifestations are complicated.
    The fact that the essential role played by higher organizing principles in determining emergent behavior continues to be disavowed by so many physical scientists is a poignant comment on the nature of modern science. To solid-state physicists and chemists, who are schooled in quantum mechanics and deal with it every day in the context of unpredictable electronic phenomena such as organic conductors, Kondo insulators, or cuprate superconductivity, the existence of these principles is so obvious that it is a cliché not discussed in polite company. However, to other kinds of scientist the idea is considered dangerous and ludicrous, for it is fundamentally at odds with the reductionist beliefs central to much of physics. But the safety that comes from acknowledging only the facts one likes is fundamentally incompatible with science. Sooner or later it must be swept away by the forces of history.
    For the biologist, evolution and emergence are part of daily life. For many physicists, on the other hand, the transition from a reductionist approach may not be easy, but should, in the long run, prove highly satisfying. Living with emergence means, among other things, focusing on what experiment tells us about candidate scenarios for the way a given system might behave before attempting to explore the consequences of any specific model. This contrasts sharply with the imperative of reductionism, which requires us never to use experiment, as its objective is to construct a deductive path from the ultimate equations to the experiment without cheating. But this is unreasonable when the behavior in question is emergent, for the higher organizing principles—the core physical ideas on which the model is based—would have to be deduced from the underlying equations, and this is, in general, impossible. Repudiation of this physically unreasonable constraint is the first step down the road to fundamental discovery. No problem in physics in our time has received more attention, and with less in the way of concrete success, than that of the behavior of the cuprate superconductors, whose superconductivity was discovered serendipitously, and whose properties, especially in the underdoped region, continue to surprise. As the high-Tc community has learned to its sorrow, deduction from microscopics has not explained, and probably cannot explain as a matter of principle, the wealth of crossover behavior discovered in the normal state of the underdoped systems, much less the remarkably high superconducting transition temperatures measured at optimal doping. Paradoxically, high-Tc continues to be the most important problem in solid-state physics, and perhaps physics generally, because this very richness of behavior strongly suggests the presence of a fundamentally new and unprecedented kind of quantum emergence.
    In his book “The End of Science,” John Horgan argues that our civilization is now facing barriers to the acquisition of knowledge so fundamental that the Golden Age of Science must be thought of as over. It is an instructive and humbling experience to attempt explaining this idea to a child. The outcome is always the same. The child eventually stops listening, smiles politely, and then runs off to explore the countless infinities of new things in his or her world. Horgan's book might more properly have been called The End of Reductionism, for it is actually a call to those of us concerned with the health of physical science to face the truth that in most respects the reductionist ideal has reached its limits as a guiding principle. Rather than a Theory of Everything we appear to face a hierarchy of Theories of Things, each emerging from its parent and evolving into its children as the energy scale is lowered. The end of reductionism is, however, not the end of science, or even the end of theoretical physics. How do proteins work their wonders? Why do magnetic insulators superconduct? Why is 3He a superfluid? Why is the electron mass in some metals stupendously large? Why do turbulent fluids display patterns? Why does black hole formation so resemble a quantum phase transition? Why do galaxies emit such enormous jets? The list is endless, and it does not include the most important questions of all, namely those raised by discoveries yet to come. The central task of theoretical physics in our time is no longer to write down the ultimate equations but rather to catalogue and understand emergent behavior in its many guises, including potentially life itself. We call this physics of the next century the study of complex adaptive matter.
For better or worse, we are now witnessing a transition from the science of the past, so intimately linked to reductionism, to the study of complex adaptive matter, firmly based in experiment, with its hope for providing a jumping-off point for new discoveries, new concepts, and new wisdom.

    Tuesday, November 8, 2011

    DARPA & Pentagon - Satellite Launches from Airplanes






    U.S. military operations rely heavily upon satellites to spy on battlefields and coordinate friendly forces across the globe, but fast-changing ground conditions or enemy attacks on satellites can threaten to overwhelm the system. That's why the Pentagon has announced a $164 million program to turn airliners into airborne launch platforms that can send small satellites into orbit within 24 hours.
    An airplane-based launch means that the U.S. military could swiftly deploy satellites from any normal airfield, rather than rely upon expensive and possibly vulnerable ground-based launch pads. The Pentagon's research agency, the Defense Advanced Research Projects Agency (DARPA), also anticipates slashing small satellite payload costs from more than $30,000 per pound to less than $10,000 per pound — making such launches three times cheaper.

    Taking off from an airliner flying at 25,000 feet allows the theoretical space launch vehicle to start out above most of the atmosphere. It also adds a starting speed boost to the space launch vehicle, and allows designers to create a larger, more efficient rocket nozzle.
    DARPA wants the program to demonstrate at least 12 launches of 100-pound payloads to low Earth orbit, with each launch costing about $1 million. Launches could start as soon as 2015, according to DARPA's official announcement of the program on Nov. 4.
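The program's headline numbers are internally consistent: a roughly $1 million launch carrying a 100-pound payload works out to exactly the $10,000-per-pound target, about a third of the quoted current cost. A quick sanity check (all figures are the article's own):

```python
# Sanity check on DARPA's published targets (all figures from the article).
LAUNCH_COST_USD = 1_000_000   # target cost per launch
PAYLOAD_LB = 100              # payload per launch, in pounds

cost_per_lb = LAUNCH_COST_USD / PAYLOAD_LB
print(f"target: ${cost_per_lb:,.0f} per pound")        # $10,000 per pound

# Versus the quoted current cost of more than $30,000 per pound:
print(f"roughly {30_000 / cost_per_lb:.0f}x cheaper")  # 3x
```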
    The U.S. military has shown past interest in having the capability to quickly put new satellites into orbit. Its attempts to create flexible orbital spies include the reusable Air Force space plane, called the X-37B, which is currently on its second test flight above the Earth.
    Satellite replacements might also be needed in case the existing satellite network becomes disabled or compromised. Hackers have demonstrated their ability to interfere with U.S. government satellites, and countries such as Russia and China possess systems capable of shooting down or disabling satellites.
    But if the new program succeeds, the U.S. military could put new satellites or satellite replacements into any orbit without the limitations of fixed geographical launch pads. Anyone hoping to stop such launches would have to consider almost any airfield as a possible launch site.

    Thursday, November 3, 2011

    AEHF-1 (Advanced Extremely High Frequency) Satellite




    The U.S. Air Force has completed the initial activation of its first jam-resistant Advanced Extremely High Frequency military communications satellite and has begun on-orbit testing.
    Ground terminals at Schriever AFB, Colo. and MIT/Lincoln Labs, Mass. logged in to the Lockheed Martin-built satellite as part of its startup process. The Space and Missile Systems Center at Los Angeles AFB plans to transfer responsibility for the satellite to 14th Air Force early next year.
    “By the end of November we should have completed sufficient testing to confidently make the decision on whether to ship and subsequently launch SV-2 in April 2012,” says Dave Madden, director of SMC’s Milsatcom Systems Directorate.
    Worth more than $1 billion and built on Lockheed Martin’s A2100 satellite bus, AEHF-1 required 14 months to reach orbit more than 22,000 mi. above Earth, after its Aug. 14, 2010, launch.
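The "more than 22,000 mi." figure is the familiar geosynchronous altitude, which follows from Kepler's third law: a satellite whose orbital period matches one sidereal day must sit about 35,786 km (22,236 mi) above the equator. A back-of-the-envelope check (standard physical constants, not figures from the article):

```python
import math

MU_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
T_SIDEREAL = 86164.1        # one sidereal day, in seconds
R_EARTH = 6_378_137.0       # Earth's equatorial radius, m

# Kepler's third law: T^2 = 4*pi^2*a^3/mu  =>  a = (mu*T^2 / (4*pi^2))^(1/3)
a = (MU_EARTH * T_SIDEREAL**2 / (4 * math.pi**2)) ** (1 / 3)
altitude_km = (a - R_EARTH) / 1000.0
print(f"geosynchronous altitude ~ {altitude_km:,.0f} km "
      f"({altitude_km / 1.609344:,.0f} mi)")   # ~35,786 km (22,236 mi)
```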
    Though the launch was nominal, foreign object debris in the propulsion system was later found to be the culprit that prevented the liquid apogee engine from burning properly and propelling the satellite higher into orbit.
    These problems forced Lockheed to forfeit $15 million of its available remaining award fee at the time.
    The total cost of the recovery mission, including assembling a team comprising the nation’s top orbital scientists and conducting painstaking forensics and reviews of the satellite’s prelaunch processing, was estimated at $25 million.
    Total cost for development, which began in 2002, and buying six AEHF satellites is at least $13.5 billion.

    Friday, October 28, 2011

    National Polar-orbiting Operational Environmental Satellite System

    The NPP Project is a joint effort of the National Polar-orbiting Operational Environmental Satellite System (NPOESS) Integrated Program Office (IPO), the National Oceanic and Atmospheric Administration (NOAA) and NASA.
    The NPP spacecraft will be an Earth-observing satellite carrying four instruments into a polar, sun-synchronous, 824 km orbit. NPP will be launched on a Delta II launch vehicle. The design lifetime of the NPP spacecraft is 5 years.
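For context, an 824 km orbit implies a period of roughly 101 minutes, about 14 orbits per day, which is what lets a single polar orbiter build up near-global coverage. A rough estimate (standard constants; only the 824 km altitude comes from the document):

```python
import math

MU_EARTH = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2
R_EARTH_KM = 6378.137      # Earth's equatorial radius, km
ALT_KM = 824.0             # NPP orbit altitude, from the document

a_m = (R_EARTH_KM + ALT_KM) * 1000.0               # semi-major axis, m
period_s = 2 * math.pi * math.sqrt(a_m**3 / MU_EARTH)
orbits_per_day = 86400.0 / period_s
print(f"period ~ {period_s / 60:.1f} min, "
      f"~ {orbits_per_day:.1f} orbits per day")    # ~101 min, ~14.2/day
```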
    Instruments
    The following instruments will be part of the NPP spacecraft:
    The Visible-Infrared Imager Radiometer Suite (VIIRS) is a multispectral scanning radiometer with a 3000 km swath width. It derives its heritage from the Advanced Very High Resolution Radiometer (AVHRR), the Operational Linescan System (OLS), the Moderate Resolution Imaging Spectroradiometer (MODIS), and the Sea-viewing Wide Field-of-View Sensor (SeaWiFS).
    The Cross-track Infrared Sounder (CrIS) is a Michelson interferometer. Its heritage is the High Resolution Infrared Sounder (HIRS), the Atmospheric Infrared Sounder (AIRS), and the Infrared Atmospheric Sounding Interferometer (IASI). It is co-registered with the Advanced Technology Microwave Sounder (ATMS) and is designed to work in conjunction with it.
    The ATMS is a passive microwave radiometer with a swath width of 2300 km. Its heritage is the Advanced Microwave Sounding Unit (AMSU) A1/A2 and the Humidity Sounder for Brazil (HSB).
    The Ozone Mapping and Profiler Suite (OMPS) measures solar scattered radiation to map the vertical and horizontal distribution of ozone in the Earth’s atmosphere, using a nadir ultraviolet (UV) sensor and limb-scanning UV/visible (VIS) sensors. Its heritage is the Solar Backscatter Ultraviolet (SBUV)/2 radiometer, the Total Ozone Mapping Spectrometer (TOMS), the Shuttle Ozone Limb Scatter Experiment (SOLSE) and the Limb Ozone Retrieval Experiment (LORE).


    NPP Mission Success is determined by the mission's capability to provide continuation of a group of Earth system observations initiated by the Earth Observing System (EOS) Terra, Aqua and Aura missions. It is also judged by its ability to reduce the risks associated with its advanced observational capabilities as they are transitioned from the NASA research program into the NPOESS operational program in support of both the Department of Defense (DoD) and NOAA. These include pre-operational risk-reduction demonstration and validation for selected NPOESS instruments and algorithms, as well as ground data processing, archive and distribution. Together these data records will fulfill the U.S. Climate Change Research Program (CCRP) objectives of understanding the Earth’s climate system and its variability on a decadal basis.
    The specific NASA science criteria are:
    1. Continue vertical temperature and moisture profiles of the Earth’s atmosphere with accuracy, extent, and frequency consistent with those made with the Aqua satellite sensors.
    2. Continue a record of sea surface temperature with accuracy, extent and frequency consistent with those made with Terra and Aqua sensors.
    3. Continue a record of surface biophysical and climatic parameters with accuracy, extent and frequency consistent with those made with Terra and Aqua sensors.
    4. Continue a record of cloud and aerosol properties with accuracy, extent, and frequency consistent with those made with Terra and Aqua sensors.
    5. Continue a record of ozone total column abundance and vertical profile with accuracy, extent, and frequency consistent with those made with previous US spacecraft making comparable measurements.

    Tuesday, October 18, 2011

    Nano Satellite






    LOGAN, Utah — There is big news on the small satellite front. From super-secret agencies and the U.S. military to academia and private firms, as well as world space agencies and NASA, ultra-small satellites are the big thing.
    In sizing up "smallsats," there is a range of classifications in the less-than-500-kilogram department: minisatellites, microsatellites, nanosatellites, picosatellites, palm-size CubeSats, even the diminutive femtosatellite, weighing in at less than 100 grams.
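The article gives only the 500-kilogram ceiling and the sub-100-gram femtosat; the intermediate mass boundaries below are the commonly cited decade-per-class breakdown and are an assumption here, not something the article states:

```python
def classify_smallsat(mass_kg: float) -> str:
    """Classify a satellite under 500 kg by mass.

    The <500 kg ceiling and the femto (<100 g) figure come from the
    article; the intermediate thresholds are the commonly cited ones
    and are assumed here.
    """
    if mass_kg >= 500:
        return "not a smallsat"
    if mass_kg >= 100:
        return "minisatellite"
    if mass_kg >= 10:
        return "microsatellite"
    if mass_kg >= 1:
        return "nanosatellite"
    if mass_kg >= 0.1:
        return "picosatellite"
    return "femtosatellite"

print(classify_smallsat(1.33))   # a standard 1U CubeSat -> nanosatellite
print(classify_smallsat(0.05))   # 50 g -> femtosatellite
```

On this breakdown a 1U CubeSat, at roughly 1.33 kg, lands in the nanosatellite class, which is why CubeSats and nanosats are so often mentioned together.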
    Cornell University has begun to delve into a postage stamp-size "satellite on a chip" design, called Sprite, envisioning a swarm of these tiny probes exploring planetary atmospheres for organic compounds.

    Call them a powerful force in the universe. Smallsats have already shown their ability to monitor disasters, study Earth’s environment and support agriculture, cartography and earth science missions.



    Smallsats, once dismissed as a distraction, are now part of the solution, said Matt Bille, an associate with Booz Allen Hamilton in Colorado Springs, Colo.
    So, what does this foretell? "The knowledge of how to make and use smallsats has passed the tipping point," Bille told SPACE.com. "It exists worldwide and has fostered a global generation of satellite builders and engineers. It used to be only a few organizations could build a satellite. Now, a smart teenager with a CubeSat kit and a soldering iron is a space agency. We’ve only begun to grasp the implications of that."
    "What this means for the future is that use of smallsats and satellites in general will only increase. The proliferation of smallsat capabilities has unleashed the most powerful force in the universe — human creativity," Bille said.
    That was the message from Bille, one of about 1,100 participants who gathered here Aug. 8-11 at Utah State University to reflect upon 25 years of smallsat progress and what’s ahead, at a meeting of experts convened by Utah State University and the American Institute of Aeronautics and Astronautics.




    Low-cost high-tech
    Looking back over the last few decades and gazing forward was Siegfried Janson, a senior scientist at The Aerospace Corp. in Los Angeles.
    Janson flagged the onslaught of advances in micro- and nanoelectronics, microelectromechanical systems (MEMS), solar cell technologies, global positioning systems, and the Internet itself. Toss in personal computers for good measure, he said, stuffed with multiple processors and graphics cards and pepped up with more and more memory.
    All that low-cost high tech has allowed small teams to blueprint, build and fly progressively smaller satellites with ever-increasing capacity, Janson told the audience.
    Janson anticipates a wider diversity of missions by highly capable small satellites, such as formation flying to create large virtual antenna apertures, making possible enhanced imaging from space.
    Collaboration
    "Advancement of the technologies is no longer the primary issue," said Pat Patterson, chairman of the smallsat conference and director of the strategic and military space division at Utah State University’s Space Dynamics Laboratory. "It’s still the mission that matters. It has to give the customer some value," he told SPACE.com.
    “It’s kind of all coming together,” Patterson said, pointing to smallsat attitude-control devices, batteries and solar cells, new ways to beat the heat and cold of space, coupled with smaller, lower-cost launchers.
    Collaboration is the key, said Doug Sinclair, owner of Sinclair Interplanetary in Toronto, Canada. He advised that universities building CubeSats need to focus on what they do best and rely on other groups to supply other resources.
    "For instance, exchange a radio for a computer. Both groups end up with a CubeSat, but now they’ve got much better odds of succeeding," Sinclair said.
    Common utility
    As for what’s the future of smallsats, there will be growth, new missions, and new ways of working together, said Bille, expressing his own views and not speaking for Booz Allen Hamilton policy. "CubeSats are like the personal computer of this industry."
    "Perhaps just like personal computing and cell phones that have common utility among individual consumers today, smallsats will also follow that trend," said Tom Hunsaker, also an associate at Booz Allen Hamilton. Hunsaker’s personal crystal ball predicts networked satellites with individual IP addresses, controlled through the Internet and providing individualized positioning, communications, social and multimedia capability.
    Bille concluded: “The age of microspacecraft is on solid ground now. There’s a definite trend toward putting small things together to do big accomplishments.”

    Thursday, October 13, 2011

    Boeing - CST-100 & GEOINT



    According to a September 15th press release, Boeing is continuing to advance its design for the CST-100 spacecraft under NASA's Commercial Crew Development Space Act Agreement. In mid-July, Boeing released several artist's renderings of its CST-100 (Crew Space Transportation) spacecraft, which will deliver passengers to both the ISS and the Bigelow Aerospace Orbital Space Complex. The CST-100 is a bit larger than Apollo but smaller than Orion, and it is able to launch on several different rockets, including the Atlas, Delta and Falcon. The price hasn't quite been set for a seat on the CST-100, and Space Adventures co-founder Eric Anderson stated that the company isn't quite ready to talk about it yet. They did, however, state that the pricing would be competitive with the current Russian launches on the Soyuz spacecraft, which Space Adventures currently uses. To give you an idea, the Canadian billionaire Guy Laliberte ponied up about $40 million for his trip to the International Space Station, and had to undergo almost 200 days of intense training to qualify for the spaceflight -- you can find out more about that training by reading extracts from Guy's blog. If you navigate around the Space Adventures website, you can find lots of interesting videos, demonstrations and even video blogs and recordings from past passengers.
    So, although the pricing will most likely be far too much for an average individual to reserve a seat, it's like anything else: pricing eventually goes down as more players enter the game. Perhaps in a couple of decades, pricing might be a bit more reasonable, and we'll all be headed into orbit.

    The Company's newest technology 

    The Boeing Company [NYSE: BA] will demo its geospatial data management technologies for the Intelligence Community, defense and national security customers at the GEOINT Symposium, October 16-19 in San Antonio, Texas. The Boeing exhibit will introduce the company’s newest “Human Geography” technology. 

    “The Boeing Human Geography solution provides community data in categories such as political ideology, ethnicity, cultural habits, language, education and health care — and how these have contributed to the intelligence picture,” said Dewey Houck, Intelligence Systems Group vice president. “It offers historical trends and patterns to help give the analyst a holistic understanding of nations and regions by broadening and deepening their analytic expertise.” In addition to Human Geography, the Boeing exhibit at Booth 313 also will feature the following technologies: 
    • TAC – An analytical tool that enables real-time collaborative analysis through the persistent querying of streaming and stored data, giving users immediate access to data relevant to their topic of interest
    • 3-D Ladar – A mapping capability that uses laser light technology to produce a precise 3-D image of the terrain. The laser radar, or ladar, weighs less than 20 pounds (8 kg), enabling multi-platform use and supporting a variety of surveillance and sensing applications
    • SAR Agility – The Synthetic Aperture Radar (SAR) image analysis tool draws on the power of mass-market Graphic Processing Units (GPUs) to provide real-time processing and user interaction, resulting in fast and comprehensive extraction of actionable information from complex SAR imagery
    Boeing also will showcase its comprehensive, web-based GEOINT source-discovery solution. This solution allows online, on-demand access to search across internal and external data sources, as well as different classification levels, using Boeing eXMeritus HardwareWall and a variety of industry standard protocols and messaging formats.