This week’s episode deals with a rather pressing matter for astronomers and cosmologists. Shortly after Einstein revealed his Theory of General Relativity in 1916, scientists began pondering how it predicted that the Universe was either expanding or contracting. The debate was settled when Georges Lemaître and Edwin Hubble confirmed that it was expanding (in 1927 and 1929, respectively). In honor of their accomplishments, the rate at which the cosmos is expanding was named the “Hubble–Lemaître Constant” (or more commonly, the “Hubble Constant”).
As the field of astronomy expanded and telescopes improved, scientists were able to make distance measurements for objects located farther and farther away. However, these observations were restricted to objects within 4 billion light-years due to the way Earth’s atmosphere distorts light. Depending on the distances involved, astronomers relied on different methods, which came to be known as the “Cosmic Distance Ladder.” In addition to determining the age and size of the Universe, these measurements allowed astronomers to refine their estimates of the Constant.
The Hubble Space Telescope revolutionized astronomy by gradually pushing the boundaries of the “observable Universe” to less than 1 billion years after the Big Bang (13 billion light-years!). That’s when scientists noticed some discrepancies. Not only did they learn that the rate of expansion had accelerated over time, but distance measurements to objects located 13 billion light-years away (the earliest galaxies) yielded different values than local measurements. This came to be known as the “Hubble Tension” or the “Crisis in Cosmology.”
While it was hoped that the James Webb Space Telescope would resolve this crisis, its observations have only confirmed that Hubble was right on the money! The crisis endures, and scientists are seeking answers. Is Einstein’s Theory of General Relativity, which is foundational to our cosmological models, wrong? Or are there additional physics/forces at work that we haven’t yet accounted for? Once we know that, we’ll know how just about everything in the Universe works!
This week was a bit of a treat for me because it allowed me to talk about something that has been unfolding over at NASA for many years. As a science communicator, I have been privileged enough to watch it unfold and have had the opportunity to comment along the way. I am referring to NASA’s long-held plans for sending crewed missions to Mars in the next decade and the many decades’ worth of planning that went into it.
Since 2010, the plan has been to send missions back to the Moon in this decade, build habitats and other infrastructure there, and use it as a testing ground to prepare for Mars. The next step was to send the first crewed mission in 2033, followed by additional missions every 26 months (coinciding with Mars being at its closest to Earth). While there have been doubts for years that NASA could accomplish this goal in that timeline, it was announced this past summer that 2033 won’t happen.
While a flyby mission could occur that year, a crewed mission where astronauts land on the surface is not likely to happen until 2040 – and that would be an ambitious goal. Meanwhile, China is still hoping to make it there by 2033, and SpaceX wants to land there even sooner. But they aren’t in any position to promise that right now since they are dealing with similar challenges and delays. Check out the episode below to learn more about how we got here and what’s likely to follow:
This week’s episode is focused on the upcoming Mars Sample Return (MSR) mission, a joint NASA-ESA venture to send a robotic mission to Mars to collect the samples cached by the Perseverance rover. This will constitute the first sample-return mission from Mars, something that scientists have been planning for decades. In 2028, the mission will launch and is expected to return the samples to Earth no sooner than 2033. Unfortunately, due to recent budget cuts, the mission may be delayed or scaled back significantly.
Meanwhile, China is planning its own sample-return mission as part of the Tianwen-3 mission. This mission will launch in 2028 but is scheduled to return by July 2031. As we enter the new Space Race, it is clear that the brass ring is the Red Planet! But regardless of who secures samples from Mars and returns them to Earth first, the scientific returns will be immeasurable. Much like the Moon rocks returned by the Apollo astronauts, scientists will be able to study these samples for generations using the most cutting-edge instruments available.
These studies could finally answer questions that have remained unresolved since the days of the Viking missions. Was there ever life on Mars? Is there life there today (and where can it be found)? When did it go from being a warmer, wetter environment to the freezing, desiccated world we see today? Check out the episode to learn more:
It’s officially Launch Day! My podcast series, Stories from Space, just released its first episode. The topic, “We’re Going Back to the Moon!”, talks about Artemis and related programs that will send astronauts back to the lunar surface with the long-term goal of establishing a sustained human presence on the Moon. Mostly, the episode addresses the question: why did it take us over fifty years to go back?
Answering a question like that takes about half an hour (or the length of a podcast episode). You can check it out at the Stories from Space homepage (https://www.itspmagazine.com/stories-from-space), or just click on the play button below. The episode is also available for streaming on Spotify and Apple.
Good news, everyone! My services as a freelance writer were recently enlisted by the good folks who run HeroX and Universe Today. Thanks to my old friend and mentor, Fraser Cain (who incidentally got me started in the indie publishing biz), I’m going to be bringing the experience I’ve garnered writing my own blog to a more professional format – writing about space exploration, innovation and technological development.
As you can imagine, this means I’ll be doing less in the way of writing for this here website. But I promise I’ll still be around! After all, I’ve got lots more work to do on my stories, and there are always articles and headlines that I won’t get a chance to cover at those other sites. So rest assured, storiesbywilliams will be in operation for a long time to come.
For those unfamiliar, HeroX is a spinoff of the XPRIZE Foundation, the non-profit organization that runs public competitions intended to encourage technological development and innovation. Its directors include such luminaries as Elon Musk, Google’s Larry Page, director James Cameron, author and columnist Arianna Huffington, and businessman and philanthropist Ratan Tata. In short, they are kind of a big deal!
Fraser Cain, founder of Universe Today, began HeroX as a way of combining the best of the XPRIZE with a crowdfunding platform similar to Kickstarter. Basically, the site brings together people with ideas for new inventions, finds the people with the talent and resources to make them happen, and funnels contributions and donations to them to bankroll their research and development.
Universe Today, on the other hand, is kind of an old stomping ground for me. Years back, I did articles for them that dealt with a range of topics, including geology, natural science, physics, environmentalism, and astronomy. In both cases, I’ll be doing write-ups on news items that involve technological development and innovation, and doing interviews with some of the people in the business.
If possible, I’ll try to link articles done for these sources to this page so people can check them out. And stay tuned for more updates on the upcoming release of Flash Forward, Oscar Mike, and my various other projects. Peace out!
For decades, the Big Bang Theory has remained the accepted theory of how the universe came to be, beating out challengers like the Steady State Theory. However, many unresolved issues remain with this theory, the most notable of which is the question of what could have existed prior to the Big Bang. Because of this, scientists have been looking for ways to refine the theory.
Luckily, a group of theoretical physicists from the Perimeter Institute for Theoretical Physics (PI) in Waterloo, Ontario, has announced a new interpretation of how the universe came to be. Essentially, they postulate that the birth of the universe could have happened after a four-dimensional star collapsed into a black hole and began ejecting debris.
This represents a big revision of the current theory, which holds that the universe grew from an infinitely dense point, or singularity. But what was there before that remains unknown, and is one of a few limitations of the Big Bang model. In addition, it’s hard to explain why it would have produced a universe with an almost uniform temperature, because the age of our universe (about 13.8 billion years) does not allow enough time for it to reach temperature equilibrium.
Most cosmologists say the universe must have been expanding faster than the speed of light for this to happen. But according to Niayesh Afshordi, an astrophysicist with PI who co-authored the study, even that theory has problems:
For all physicists know, dragons could have come flying out of the singularity. The Big Bang was so chaotic, it’s not clear there would have been even a small homogenous patch for inflation to start working on.
The model Afshordi and his colleagues are proposing is basically a three-dimensional universe floating as a membrane (or brane) in a “bulk universe” that has four dimensions. If this “bulk universe” has four-dimensional stars, these stars could go through the same life cycles as the three-dimensional ones we are familiar with. The most massive ones would explode as supernovae, shed their skin and have the innermost parts collapse as a black hole.
The 4-D black hole would then have an “event horizon”, the boundary between the inside and the outside of a black hole. In a 3-D universe, an event horizon appears as a two-dimensional surface; but in a 4-D universe, the event horizon would be a 3-D object called a hypersphere. And when this 4-D star blows apart, the leftover material would create a 3-D brane surrounding a 3-D event horizon, and then expand.
To simplify it a little, they are postulating that the expansion of the universe was triggered by the motion of the universe through a higher-dimensional reality. While it may sound complicated, the theory does explain how the universe continues to expand and is indeed accelerating. Whereas previous theories have credited a mysterious invisible force known as “dark energy” with this, this new theory claims it is the result of the 3-D brane’s growth.
However, there is one limitation to this theory which has to do with the nearly uniform temperature of the universe. While the model does explain how this could be, the ESA’s Planck telescope recently mapped out the universe and discovered small temperature variations in the cosmic microwave background (CMB). These patches were believed to be leftovers of the universe’s beginnings, which were a further indication that the Big Bang model holds true.
The PI team’s own CMB readings differ from this highly accurate survey by about four percent, so now they too are going back to the drawing board to refine their theory. How ironic! However, the PI team still feels the model has worth: while the Planck observations show that inflation happened, they do not show why.
Needless to say, we are nowhere near to resolving how the universe came to be, at least not in a way that resolves all the theoretical issues. But that’s the thing about the Big Bang – it’s the scientific equivalent of a Hydra. No matter how many times people attempt to discredit it, it always comes back to reassert its dominance!
It occurs to me that I really haven’t given the ERB site its due over the years. They’ve provided me with endless hours of enjoyment and all I ever did was post about one of their videos. Granted, I have nowhere near the kind of following that would be needed to actually give their traffic a shot in the arm, but it’s the thought that counts!
And so I thought I’d post a little compilation here of some of their funniest, and most educational, videos. Whether it was the matchups between Steve Jobs and Bill Gates (made shortly after Jobs’ death as a tribute to his life), Einstein and Stephen Hawking, or Thomas Edison and Nikola Tesla, these guys have shown a real commitment to their art and are clearly willing to do their homework!
Enjoy!
Steve Jobs vs. Bill Gates:
Einstein vs. Stephen Hawking:
Thomas Edison vs. Nikola Tesla:
Note: Though I am well aware of their existence, I have assiduously avoided posting the videos of Darth Vader vs. Adolf Hitler. Though I found them hilarious, such material is bound to be offensive to some. Although, if people were willing to give me permission… 😉 😉
Yeah, that title might be a bit misleading. Technically, the news comes from Earth, but has everything to do with our study of the heavens. And this story comes to you from my own neck of the woods where – just a few kilometers from my house – the Dominion Astrophysical Observatory is about to shut down due to budget cuts.
It typically goes by the name of the Centre of the Universe – a national historic site and a hub for astronomy education in Victoria. And at the end of the summer, in what I can only describe as a tragedy, it will be closed to the public for good. The National Research Council (NRC) put the official closing date at the end of August, right after the last of the student summer camps ends.
In addition, the facility houses historical artifacts like the original 1.8 metre mirror from the Plaskett Telescope and runs historical tours, multimedia shows, and youth programs. Unfortunately, this all costs about $32,000 to operate and $245,000 in employee wages, and brings in only about $47,000 per year in revenue. This gives the NRC a deficit of about $230,000 a year for this facility alone.
Naturally, Charles Drouin, spokesman for the NRC in Ottawa, said that the decision did not come easily, but was necessary. He confirmed that the active astronomy facility and national historic site will have no public outreach come late August or early September, and locals and visitors will no longer be able to tour the Plaskett Telescope, in operation since May 6, 1918.
On the bright side, the historical artifacts and displays in the Centre of the Universe building will remain in place after the facility is closed. The NRC will also be working with local community groups to find volunteers to use the space, so it will remain in operation, though in a limited capacity. This much is good news, since the loss of the site in its entirety would be an immeasurable loss for this community.
Interestingly enough, Drouin also claimed that the decision to close the facility was unrelated to the federal government’s announcement in May to reorganize the NRC as an “industry-focused research and technology organization.” In short, the budget-driven decision is not being blamed on funding cuts or the desire to privatize. I wonder…
Personally, I am sad and ashamed to hear this news. The wife and I have been saying for ages that we need to go to this place and take a tour. Granted, that is not the easiest thing in the world to arrange, what with all the booked tours and the way the place seems to have an odd schedule. But you’d think we could have arranged something by now. It’s a national observatory, and right in my backyard for God’s sake! To think we might have missed our chance is just plain sad…
However, there is still time, and I strongly recommend that anybody in the Saanich, Victoria, or greater Vancouver Island region get their butts out and do what they can to see the place in operation before it shuts down. No telling what kind of hours and limited services it will be offering once it’s got only volunteers manning it. We need to take a gander at this star-gazing facility now before we lose the opportunity!
Back in January, National Geographic Magazine celebrated its 125th anniversary. In honor of this occasion, they released a special issue which commemorated the past 125 years of human exploration and looked ahead at what the future might hold. As I sat in the doctor’s office, waiting on a prescription for antibiotics to combat my awful cold, I found myself terribly inspired by the article.
So naturally, once I got home, I looked up the article and its source material and got to work. The issue of exploration, especially the future thereof, is not something I can ever pass up! So for the next few minutes (or hours, depending on how much you like to nurse a read), I present you with some possible scenarios about the coming age of deep space exploration.
Suffice it to say, National Geographic’s appraisal of the future of space travel was informative and hit on all the right subjects for me. When one considers the sheer distances involved, not to mention the amount of time, energy, and resources it would take to get people there, reaching into the next great frontier poses a great many questions and challenges.
Already, NASA, Earth’s various space agencies and even private companies have several ideas in the works for returning to the Moon, going to Mars, and reaching the Asteroid Belt. These include the SLS (Space Launch System), a heavy-lift successor to the Saturn V rocket that took the Apollo astronauts to the Moon. Years from now, it may even be taking crews to Mars, with missions slated for the 2030s.
And when it comes to settling the Moon and Mars, and turning the Asteroid Belt into our primary source of mineral extraction and manufacturing, these same agencies and a number of private corporations are all invested in getting it done. SpaceX is busy testing its reusable launch rocket, known as the Grasshopper, in the hopes of making spaceflight more affordable. And NASA and the ESA are perfecting a process known as “sintering” to turn Moon regolith into bases and asteroid ore into manufactured goods.
Meanwhile, Virgin Galactic, Reaction Engines and Golden Spike are planning to make commercial trips into space and to the Moon possible within a few years’ time. And with companies like Deep Space Industries and Google-backed Planetary Resources prospecting asteroids and planning expeditions, it’s only a matter of time before everything from Earth to the Jovian moons is being explored and claimed for our human use.
Space Colony by Stephan Martiniere
But when it comes to deep-space exploration, the stuff that would take us to the outer reaches of the Solar System and beyond, that’s where things get tricky and pretty speculative. Ideas have been on the table for some time, since the last great Space Race forced scientists to consider the long-term and come up with proposed ways of closing the gap between Earth and the stars. But to this day, they remain a scholarly footnote, conceptual and not yet realizable.
But as we embark on a renewed era of space exploration, where the stuff of science fiction is quickly becoming the stuff of science fact, these old ideas are being dusted off, paired up with newer concepts, and seriously considered. While they might not be feasible at the moment, who knows what tomorrow holds? From the issues of propulsion, to housing, to cost and time expenditures, the human race is once again taking a serious look at extra-Solar exploration.
And here are some of the top contenders for the “Final Frontier”:
Nuclear Propulsion: The concept of using nuclear bombs (no joke) to propel a spacecraft was first proposed in 1946 by Stanislaw Ulam, a Polish-American mathematician who participated in the Manhattan Project. Preliminary calculations were then made by F. Reines and Ulam in 1947, and the actual project – known as Project Orion – was initiated in 1958, led by Ted Taylor at General Atomics and physicist Freeman Dyson of the Institute for Advanced Study in Princeton.
In short, the Orion design involves a large spacecraft with a high supply of thermonuclear warheads achieving propulsion by releasing a bomb behind it and then riding the detonation wave with the help of a rear-mounted pad called a “pusher”. After each blast, the explosive force is absorbed by this pusher pad, which then translates the thrust into forward momentum.
Though hardly elegant by modern standards, the proposed design offered a way of delivering the explosive (literally!) force necessary to propel a rocket over extreme distances, and solved the issue of how to utilize that force without containing it within the rocket itself. However, the drawbacks of this design are numerous and noticeable.
For starters, the ship itself is rather staggering in size, weighing in anywhere from 2,000 to 8,000,000 tonnes, and the propulsion design releases a dangerous amount of radiation – and not just for the crew! If we are to rely on ships that utilize nuclear bombs to achieve thrust, we better find a course that will take them away from any inhabited or habitable areas. What’s more, the cost of producing a behemoth of this size (even the modest 2,000 tonne version) is also staggering.
Antimatter Engine: Most science fiction authors who write about deep space exploration (at least those who want to be taken seriously) rely on anti-matter to power ships in their stories. This is no accident, since antimatter is the most potent fuel known to humanity right now. While tons of chemical fuel would be needed to propel a human mission to Mars, just tens of milligrams of antimatter, if properly harnessed, would be able to supply the requisite energy.
Fission and fusion reactions convert just a fraction of 1 percent of their mass into energy. But by combining matter with antimatter, its mirror twin, a reaction of 100 percent efficiency is achieved. For years, physicists at the CERN Laboratory in Geneva have been creating tiny quantities of antimatter by smashing subatomic particles together at near-light speeds. Given time and considerable investment, it is entirely possible this could be turned into a form of advanced propulsion.
In an antimatter rocket, a dose of antihydrogen would be mixed with an equal amount of hydrogen in a combustion chamber. The mutual annihilation of a half pound of each, for instance, would unleash more energy than a 10-megaton hydrogen bomb, along with a shower of subatomic particles called pions and muons. These particles, confined within a magnetic nozzle similar to the type necessary for a fission rocket, would fly out the back at one-third the speed of light.
However, there are natural drawbacks to this design as well. While a top speed of one-third the speed of light is very impressive, there’s the question of how much fuel would be needed. For example, while it would be nice to be able to reach Alpha Centauri – a mere 4.5 light-years away – in 13.5 years instead of the 130 it would take using a nuclear rocket, the amount of antimatter needed would be immense.
No means exist to produce antimatter in such quantities right now, and the cost of building the kind of rocket required would be equally immense. Considerable refinements would therefore be needed and a sharp drop in the cost associated with building such a vessel before any of its kind could be deployed.
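The figures above are easy to sanity-check with E = mc². Here’s a back-of-the-envelope sketch (the constants are standard; the “half pound of each” scenario and the 4.5 light-year distance are the ones quoted above):

```python
# Back-of-the-envelope check on the antimatter figures, using E = mc^2.

C = 2.998e8               # speed of light, m/s
MEGATON_TNT = 4.184e15    # joules released per megaton of TNT

half_pound_kg = 0.2268    # ~0.5 lb
annihilated_mass = 2 * half_pound_kg   # matter AND antimatter both convert

energy_j = annihilated_mass * C**2
yield_mt = energy_j / MEGATON_TNT
print(f"Annihilation energy: {energy_j:.2e} J ≈ {yield_mt:.1f} megatons")
# comes out around 10 megatons, in line with the comparison above

# Travel time to Alpha Centauri (~4.5 ly) at the exhaust's one-third
# light speed, ignoring acceleration and deceleration phases:
travel_years = 4.5 / (1 / 3)
print(f"Alpha Centauri at c/3: {travel_years:.1f} years")
```

A pound of annihilated mass really does land in the ballpark of a 10-megaton hydrogen bomb, which is exactly why such tiny quantities of antimatter go such a long way – and why producing enough of it is the sticking point.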
Laser Sail: Thinking beyond rockets and engines, there are some concepts which would allow a spaceship to go into deep space without the need for fuel at all. In 1962, Robert Forward put a twist on the ancient technique of sailing – capturing wind in a fabric sail – to propose a new form of space travel. Much like how our world is permeated by wind currents, space is filled with cosmic radiation – largely in the form of photons and energy associated with stars – that can push a cosmic sail in the same way.
This was followed up in the 1970s, when Forward again proposed beam-powered propulsion schemes using either lasers or masers (microwave lasers) to push giant sails to a significant fraction of the speed of light. When photons in the laser beam strike the sail, they transfer their momentum and push the sail onward. The spaceship steadily builds up speed while the laser that propels it stays put in our solar system.
Much the same process would be used to slow the sail down as it neared its destination. This would be done by having the outer portion of the sail detach, which would then refocus and reflect the lasers back onto a smaller, inner sail. This would provide braking thrust to slow the ship down as it reached the target star system, eventually bringing it to a slow enough speed that it could achieve orbit around one of its planets.
Once more, there are challenges, foremost of which is cost. While the solar sail itself, which could be built around a central, crew-carrying vessel, would be fuel free, there’s the little matter of the lasers needed to propel it. Not only would these need to operate for years continuously at gigawatt strength, the cost of building such a monster would be astronomical, no pun intended!
A solution proposed by Forward was to use a series of enormous solar panel arrays on or near the planet Mercury to power the lasers. However, this just replaced one financial burden with another, as the mirror or Fresnel lens needed to keep the lasers focused on the sail over such distances would have to be planet-sized in scope. What’s more, a giant braking sail would have to be mounted on the ship as well, and it would have to focus the deceleration beam very precisely.
So while light sails do present a highly feasible means of sending people to Mars or around the Inner Solar System, they are not the best concept for interstellar space travel. While the design accomplishes certain cost savings by reaching high speeds without fuel, these are more than recouped by the power demands and apparatus needed to keep it moving.
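Those power demands are easy to appreciate with one formula: a photon carries momentum p = E/c, and a perfectly reflecting sail gets twice that, so a beam of power P produces thrust F = 2P/c. Here’s a rough sketch (the 1,000 kg sail-plus-probe mass is an assumed figure for illustration):

```python
# Why the laser must run "for years at gigawatt strength": photon
# momentum is p = E/c, and perfect reflection doubles the push, so
# thrust on the sail is F = 2P/c.

C = 2.998e8          # speed of light, m/s
P = 1e9              # laser power: 1 gigawatt
SAIL_MASS = 1000.0   # kg (assumed: a small probe plus its sail)

thrust = 2 * P / C               # newtons, for a perfect reflector
accel = thrust / SAIL_MASS       # m/s^2

year = 3.156e7                   # seconds in one year
v_after_year = accel * year      # ignoring relativistic corrections
print(f"Thrust: {thrust:.2f} N, speed after one year: "
      f"{v_after_year / C:.2%} of c")
```

A full gigawatt buys you only about seven newtons of thrust – a year of continuous firing gets this sail to well under a tenth of a percent of light speed, which is why serious interstellar proposals call for far bigger lasers, far lighter sails, or both.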
Generation/Cryo-Ship: Here we have a concept which has been explored extensively in fiction. Known as an Interstellar Ark, an O’Neill Cylinder, a Bernal Sphere, or a Stanford Torus, the basic philosophy is to create a ship that would be a self-contained world, traveling the cosmos at a slow pace and keeping the crew housed, fed, and sustained until they finally reached their destination. And one of the main reasons that this concept appears so much in science fiction literature is that many of the writers who made use of it were themselves scientists.
The first known written examples include Robert H. Goddard’s “The Last Migration” (1918), which describes an “interstellar ark” containing cryogenically frozen people that sets out for another star system after the death of the Sun. Konstantin E. Tsiolkovsky later wrote of a “Noah’s Ark” in his 1928 essay “The Future of Earth and Mankind.” Here, the crews were kept in wakeful conditions until they reached their destination thousands of years later.
By the latter half of the 20th century, with works like Robert A. Heinlein’s Orphans of the Sky, Arthur C. Clarke’s Rendezvous with Rama, and Ursula K. Le Guin’s Paradises Lost, the concept began to be explored as a distant possibility for interstellar space travel. And in 1964, Dr. Robert Enzmann proposed a concept for an interstellar spacecraft, known as the Enzmann Starship, that included detailed notes on how it would be constructed.
Powered by deuterium engines similar to those called for in the Orion design, Enzmann’s ship would measure some 600 meters (2,000 feet) long and support an initial crew of 200 people, with room for expansion. An entirely serious proposal, complete with a detailed assessment of how it would be constructed, the Enzmann concept began appearing in a number of science fiction and science fact magazines by the 1970s.
Despite the fact that this sort of ship frees its makers from the burden of coming up with a sufficiently fast or fuel-efficient engine design, it comes with its own share of problems. First and foremost, there’s the cost of building such a behemoth. Slow boat or no, the financial and resource burden of building a mobile world would exceed most countries’ annual GDP. Only through sheer desperation and global cooperation could anyone conceive of building such a thing.
Second, there’s the issue of the crew’s needs, which would require self-sustaining systems to ensure food, water, energy, and sanitation over a very long haul. This would almost certainly require that the crew remain aware of all its technical needs and continue to maintain it, generation after generation. And given that the people aboard the ship would be stuck in a comparatively confined space for so long, there’s the extreme likelihood of breakdown and degenerating conditions aboard.
Third, there’s the fact that the radiation environment of deep space is very different from that on the Earth’s surface or in low Earth orbit. The presence of high-energy cosmic rays would pose all kinds of health risks to a crew traveling through deep space, so the effects and preventative measures would be difficult to anticipate. And last, there’s the possibility that while the slow boat is taking centuries to get through space, another, better means of space travel will be invented.
Faster-Than-Light (FTL) Travel: Last, we have the most popular concept to come out of science fiction, but which has received very little support from the scientific community. Whether it was the warp drive, the hyperdrive, the jump drive, or the subspace drive, science fiction has sought to exploit the holes in our knowledge of the universe and its physical laws in order to speculate that one day, it might be possible to bridge the vast distances between star systems.
However, there are numerous science-based challenges to this notion that would make an FTL enthusiast want to give up before they even get started. For one, there’s Einstein’s Theory of Special Relativity, which establishes the speed of light (c) as the uppermost speed at which anything can travel. For massless particles like photons, which do not experience time, the speed of light is a given. But for stable matter, which has mass and is affected by time, the speed of light is a physical impossibility.
For one, the amount of energy needed to accelerate an object to such speeds is unfathomable, and the effects of time dilation – time slowing down as one approaches the speed of light – would be profound. What’s more, attempting to reach the speed of light would most likely cause our stable matter (i.e. our ships and bodies) to fly apart and become pure energy. In essence, we’d die!
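The “unfathomable energy” point can be made concrete with the Lorentz factor, γ = 1/√(1 − v²/c²), which governs both time dilation and the energy cost of getting a massive object up to speed:

```python
import math

# The Lorentz factor gamma = 1 / sqrt(1 - v^2/c^2) quantifies the
# relativistic effects described above: onboard clocks run slow by a
# factor of gamma, and kinetic energy per unit rest mass scales as
# (gamma - 1) * c^2. Gamma diverges as v approaches c.

def lorentz_factor(beta):
    """gamma for a speed given as a fraction of c (beta = v/c)."""
    return 1.0 / math.sqrt(1.0 - beta**2)

for beta in (0.5, 0.9, 0.99, 0.999):
    print(f"v = {beta:>5}c  ->  gamma = {lorentz_factor(beta):7.2f}")
# gamma -> infinity as beta -> 1: pushing matter all the way to c
# would take unbounded energy, which is why c is a hard ceiling.
```

At half light speed the penalty is modest (γ ≈ 1.15), but by 99.9% of c it has ballooned past 22 and keeps climbing without limit – no finite fuel supply can close that last gap.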
Naturally, there have been those who have tried to use General Relativity, which allows for the existence of wormholes, to postulate that it would be possible to move instantaneously from one point in the universe to another. These theories of “folding space”, or “jumping” through space-time, suffer from the same problems. Not only are they purely speculative, but they raise all kinds of questions about temporal mechanics and causality. If these wormholes are portals, why just portals in space and not time?
And then there’s the concept of a quantum singularity, which is often featured in talk of FTL. The belief here is that an artificial singularity could be generated, thus opening a corridor in space-time which could then be traversed. The main problem here is that such an idea is likely suicide. A quantum singularity – a.k.a. a black hole – is a point in space where the laws of physics as we know them break down; hence the term singularity.
Also, they are created by a gravitational force so strong that it tears a hole in space-time, and the resulting hole absorbs all things, including light itself, into its maw. It is therefore impossible to know what resides on the other side of one, and astronomers routinely observe black holes (most notably Sagittarius A* at the center of our galaxy) swallow entire planets and belch out X-rays – evidence of their destruction. How anyone could think these were a means of safe space travel is beyond me! But then again, they are a plot device, not a serious idea…
But before you go thinking that I’m dismissing FTL in its entirety, there is one possibility which has the scientific community buzzing and even looking into it. It’s known as the Alcubierre Drive, a concept proposed by physicist Miguel Alcubierre in his 1994 paper, “The Warp Drive: Hyper-Fast Travel Within General Relativity.”
The equations and theory behind his concept postulate that since space-time can be contracted and expanded, empty space behind a starship could be made to expand rapidly while space ahead of it contracts, pushing the craft forward. Passengers inside this “warp bubble” would perceive movement despite experiencing no local acceleration, and vast distances (i.e. light-years) could be crossed in a matter of days and weeks instead of decades. What’s more, this “warp drive” would allow for FTL travel while remaining consistent with Einstein’s Theory of General Relativity.
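In Alcubierre’s 1994 paper, the bubble’s geometry is controlled by a “shape function” that equals 1 where the ship sits, drops to 0 in flat space far away, and confines all the contraction/expansion to the thin bubble wall. The following is a minimal sketch of that function; the bubble radius and wall-steepness values are illustrative choices, not figures from the paper:

```python
import math

def shape_function(r_s, R=1.0, sigma=8.0):
    """Alcubierre's 'top-hat' shape function f(r_s).

    r_s   -- distance from the bubble's center
    R     -- bubble radius (illustrative value)
    sigma -- wall steepness: larger sigma means a thinner bubble wall

    Returns ~1 inside the bubble and ~0 far outside it.
    """
    return (math.tanh(sigma * (r_s + R)) - math.tanh(sigma * (r_s - R))) / (
        2.0 * math.tanh(sigma * R)
    )

# Space-time is flat where f ~ 0 and the ship rides where f ~ 1;
# the distortion is concentrated in the transition zone around r_s = R:
for r in (0.0, 0.5, 1.0, 1.5, 3.0):
    print(f"r_s = {r:.1f}  f = {shape_function(r):.4f}")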
In October 2011, physicist Harold White attempted to rework the equations while in Florida where he was helping to kick off NASA and DARPA’s joint 100 Year Starship project. While putting together his presentation on warp, he began toying with Alcubierre’s field equations and came to the conclusion that something truly workable was there. In October of 2012, he announced that he and his NASA team would be working towards its realization.
But while White himself claims its feasible, and has the support of NASA behind him, the mechanics behind it all are still theoretical, and White himself admits that the energy required to pull off this kind of “warping” of space time is beyond our means at the current time. Clearly, more time and development are needed before anything of this nature can be realized. Fingers crossed, the field equations hold, because that will mean it is at least theoretically possible!
Summary: In case it hasn’t been made manifestly obvious by now, there’s no simple solution. In fact, just about all possibilities currently under scrutiny suffer from the exact same problem: the means just don’t exist yet to make them happen. But even if we can’t reach for the stars, that shouldn’t deter us from reaching for objects that are significantly closer to our reach. In the many decades it will take us to reach the Moon, Mars, the Asteroid Belt, and Jupiter’s Moons, we are likely to revisit this problem many times over.
And I’m sure that in course of creating off-world colonies, reducing the burden on planet Earth, developing solar power and other alternative fuels, and basically working towards this thing known as the Technological Singularity, we’re likely to find that we are capable of far more than we ever thought before. After all, what is money, resources, or energy requirements when you can harness quantum energy, mine asteroids, and turn AIs and augmented minds onto the problems of solving field equations?
Yeah, take it from me, the odds are pretty much even that we will be making it to the stars in the not-too-distant future, one way or another. As far as probabilities go, there’s virtually no chance that we will be confined to this rock forever. Either we will branch out to colonize new planets and new star systems, or go extinct before we ever get the chance. I for one find that encouraging… and deeply disturbing!
A few times now, the website known as Envisioning Technology has snared me with their predictive posters. First there was their “Emerging Technologies” infographic for the year of 2012. That was followed shortly thereafter by “The future of health” and “The future of education“. They even took a look at popular dystopian and apocalyptic scenarios and asked the question “Should I be afraid“?
And now, in their latest infographic, they’ve tackled the future of finance. Looking at the financial industry as a whole, they attempt to gauge its readiness to technological change. While looking at trends that are likely to influence the very notion of value in the coming decades, they ask the question “are [organizations] paying enough attention to the imminent changes that will define the future of society or if they are running the risk of letting accelerating change vanquish existing business models?”
And as usual, the information is presented in an interconnected, multi-layered fashion. Dividing all aspects of the financial sector into the categories of Data, Automation, Security, Disintermediation (i.e. removing the “middle men”), Crowds (crowd-sourcing, crowd-funding), Mobile technology, Currencies, and Reputation, potential technologies are then listed based on whether or not they are under development, likely to be in development in the near future, or are currently being overlooked.
Take a gander and see what you think. As usual, its packed full of interesting concepts, speculative reasoning, and a ton of statistical data. And be sure to check out the website in case you have yet to see their other infographics.