It’s officially Launch Day! My podcast series, Stories from Space, just released its first episode. The episode, titled “We’re Going Back to the Moon!”, covers Artemis and the related programs that will send astronauts back to the lunar surface, with the long-term goal of establishing a sustained human presence on the Moon. Mostly, it addresses the question: why did it take us over fifty years to go back?
Answering a question like that takes about half an hour (or the length of a podcast episode). You can check it out at the Stories from Space homepage (https://www.itspmagazine.com/stories-from-space), or just click on the play button below. The episode is also available for streaming on Spotify and Apple Podcasts.
Good news, everyone! My services as a freelance writer were recently enlisted by the good folks who run HeroX and Universe Today. Thanks to my old friend and mentor, Fraser Cain (who, incidentally, got me started in the indie publishing biz), I’m going to be bringing the experience I’ve garnered writing my own blog to a more professional format – writing about space exploration, innovation, and technological development.
As you can imagine, this means I’ll be doing less in the way of writing for this here website. But I promise I’ll still be around! After all, I’ve got lots more work to do on my stories, and there are always stories and headlines worth writing about that I won’t get a chance to cover at those other sites. So rest assured, storiesbywilliams will be in operation for a long time to come.
For those unfamiliar, HeroX is a spinoff of the XPRIZE Foundation, the non-profit organization that runs public competitions intended to encourage technological development and innovation. Its board of directors includes such luminaries as Elon Musk, Google’s Larry Page, director James Cameron, author and columnist Arianna Huffington, and businessman and philanthropist Ratan Tata. In short, they are kind of a big deal!
Fraser Cain, founder of Universe Today, began HeroX as a way of combining the best of the XPRIZE with a crowdfunding platform similar to Kickstarter. Basically, the site brings together people with ideas for new inventions, finds the people with the talent and resources to make them happen, and funnels contributions and donations to them to bankroll their research and development.
Universe Today, on the other hand, is kind of an old stomping ground for me. Years back, I wrote articles for them on a range of topics, including geology, natural science, physics, environmentalism, and astronomy. In both cases, I’ll be doing write-ups on news items that involve technological development and innovation, and conducting interviews with some of the people in the business.
If possible, I’ll try to link articles done for these sources to this page so people can check them out. And stay tuned for more updates on the upcoming release of Flash Forward, Oscar Mike, and my various other projects. Peace out!
For decades, the Big Bang Theory has remained the accepted theory of how the universe came to be, beating out challengers like the Steady State Theory. However, many unresolved issues remain, the most notable of which is the question of what could have existed prior to the Big Bang. Because of this, scientists have been looking for ways to refine the theory.
Luckily, a group of theoretical physicists from the Perimeter Institute for Theoretical Physics (PI) in Waterloo, Ontario has announced a new interpretation of how the universe came to be. Essentially, they postulate that the birth of the universe could have happened when a four-dimensional star collapsed into a black hole and began ejecting debris.
This represents a big revision of the current theory, which holds that the universe grew from an infinitely dense point, or singularity. But what was there before that remains unknown, and is one of a few limitations of the Big Bang. In addition, it’s hard to explain why it would have produced a universe with an almost uniform temperature, because the age of our universe (about 13.8 billion years) does not allow enough time for it to have reached thermal equilibrium.
Most cosmologists say the universe must have been expanding faster than the speed of light for this to happen. But according to Niayesh Afshordi, an astrophysicist with PI who co-authored the study, even that theory has problems:
For all physicists know, dragons could have come flying out of the singularity. The Big Bang was so chaotic, it’s not clear there would have been even a small homogenous patch for inflation to start working on.
The model Afshordi and his colleagues are proposing is basically a three-dimensional universe floating as a membrane (or brane) in a “bulk universe” that has four dimensions. If this “bulk universe” has four-dimensional stars, those stars could go through the same life cycles as the three-dimensional ones we are familiar with. The most massive ones would explode as supernovae, shed their outer layers, and have their innermost parts collapse into black holes.
The 4-D black hole would then have an “event horizon”, the boundary between the inside and the outside of a black hole. In a 3-D universe, an event horizon appears as a two-dimensional surface; but in a 4-D universe, the event horizon would be a 3-D object called a hypersphere. And when this 4-D star blows apart, the leftover material would create a 3-D brane surrounding a 3-D event horizon, and then expand.
To simplify it a little, they are postulating that the expansion of the universe was triggered by the motion of the universe through a higher-dimensional reality. While it may sound complicated, the theory does explain how the universe continues to expand and is indeed accelerating. Whereas previous theories have credited a mysterious invisible force known as “dark energy” with this, this new theory claims it is the result of the 3-D brane’s growth.
However, there is one limitation to this theory, which has to do with the nearly uniform temperature of the universe. While the model does explain how this could be, the ESA’s Planck telescope recently mapped out the universe and discovered small temperature variations in the cosmic microwave background (CMB). These patches are believed to be leftovers of the universe’s beginnings, and a further indication that the Big Bang model holds true.
The PI team’s own CMB predictions differ from this highly accurate survey by about four percent, so now they too are going back to the drawing board and looking to refine their theory. How ironic! However, the PI team still feels the model has worth. While the Planck observations show that inflation happened, they do not show why it happened.
Needless to say, we are nowhere near resolving how the universe came to be, at least not in a way that resolves all the theoretical issues. But that’s the thing about the Big Bang – it’s the scientific equivalent of a Hydra. No matter how many times people attempt to discredit it, it always comes back to reassert its dominance!
It occurs to me that I really haven’t given the ERB (Epic Rap Battles of History) site its due over the years. They’ve provided me with endless hours of enjoyment and all I ever did was post about one of their videos. Granted, I have nowhere near the kind of following that would be needed to actually give their traffic a shot in the arm, but it’s the thought that counts!
And so I thought I’d post a little compilation here of some of their funniest, and most educational, videos. Whether it was the match-ups between Steve Jobs and Bill Gates (made shortly after Jobs’ death as a tribute to his life), Einstein and Stephen Hawking, or Thomas Edison and Nikola Tesla, these guys have shown a real commitment to their art and are clearly willing to do their homework!
Enjoy!
Steve Jobs vs. Bill Gates:
Einstein vs. Stephen Hawking:
Thomas Edison vs. Nikola Tesla:
Note: Though I am well aware of their existence, I have assiduously avoided posting the videos of Darth Vader vs. Adolf Hitler. Though I found them hilarious, such material is bound to be offensive to some. Although, if people were willing to give me permission… 😉 😉
Yeah, that title might be a bit misleading. Technically, the news comes from Earth, but it has everything to do with our study of the heavens. And this story comes to you from my own neck of the woods, where – just a few kilometers from my house – the Dominion Astrophysical Observatory is about to close its doors to the public due to budget cuts.
The site’s public outreach facility goes by the name of the Centre of the Universe, a national historic site and a hub for astronomy education in Victoria. And at the end of the summer, in what I can only describe as a tragedy, it will be closed to the public for good. The National Research Council (NRC) put the official closing date at the end of August, right after the last of the student summer camps ends.
In addition, the facility houses historical artifacts like the original 1.8 metre mirror from the Plaskett Telescope and runs historical tours, multimedia shows, and youth programs. Unfortunately, all of this costs about $32,000 a year to operate, plus $245,000 in employee wages, while bringing in only about $47,000 per year in revenue. That leaves the NRC with a deficit of about $230,000 a year for this facility alone.
Naturally, Charles Drouin, spokesman for the NRC in Ottawa, said that the decision did not come easily, but was necessary. He confirmed that the active astronomy facility and national historic site will have no public outreach come late August or early September, and locals and visitors will no longer be able to tour the Plaskett Telescope, in operation since May 6, 1918.
On the bright side, the historical artifacts and displays in the Centre of the Universe building will remain in place after the facility is closed. The NRC will also be working with local community groups to find volunteers to use the space, so it will remain in operation, though in a limited capacity. This much is good news, since the loss of the site in its entirety would be an immeasurable loss for this community.
Interestingly enough, Drouin also claimed that the decision to close the facility was unrelated to the federal government’s announcement in May that it would reorganize the NRC as an “industry-focused research and technology organization.” In short, the budget-driven decision is not being blamed on funding cuts or the desire to privatize. I wonder…
Personally, I am sad and ashamed to hear this news. The wife and I have been saying for ages that we need to go to this place and take a tour. Granted, that is not the easiest thing in the world to arrange, what with all the booked tours and the way the place seems to have an odd schedule. But you’d think we could have arranged something by now. It’s a national observatory, and right in my backyard, for God’s sake! To think we might have missed our chance is just plain sad…
However, there is still time, and I strongly recommend that anybody in the Saanich, Victoria, or greater Vancouver Island region get their butts out and do what they can to see the place in operation before it shuts down. There’s no telling what kind of hours and limited services it will be offering once it’s got only volunteers manning it. We need to take a gander at this star-gazing facility now, before we lose the opportunity!
Back in January, National Geographic Magazine celebrated its 125th anniversary. In honor of this occasion, they released a special issue which commemorated the past 125 years of human exploration and looked ahead at what the future might hold. As I sat in the doctor’s office, waiting on a prescription for antibiotics to combat my awful cold, I found myself terribly inspired by the article.
So naturally, once I got home, I looked up the article and its source material and got to work. The issue of exploration, especially the future thereof, is not something I can ever pass up! So for the next few minutes (or hours, depending on how much you like to nurse a read), I present you with some possible scenarios about the coming age of deep space exploration.
Suffice it to say, National Geographic’s appraisal of the future of space travel was informative and hit on all the right subjects for me. When one considers the sheer distances involved, not to mention the amount of time, energy, and resources it would take to get people there, reaching into the next great frontier poses a great many questions and challenges.
Already, NASA, Earth’s other space agencies, and even private companies have several ideas in the works for returning to the Moon, going to Mars, and reaching the Asteroid Belt. These include the SLS (Space Launch System), a heavy-lift successor to the Saturn V rocket that took the Apollo astronauts to the Moon. Years from now, it may even be taking crews to Mars, with missions slated for the 2030s.
And when it comes to settling the Moon and Mars, and turning the Asteroid Belt into our primary source of mineral extraction and manufacturing, these same agencies and a number of private corporations are all invested in getting it done. SpaceX is busy testing its reusable rocket, known as the Grasshopper, in the hopes of making space flight more affordable. And NASA and the ESA are perfecting a process known as “sintering” to turn Moon regolith into bases and asteroids into manufactured goods.
Meanwhile, Virgin Galactic, Reaction Engines, and Golden Spike are planning to make commercial trips into space and to the Moon possible within a few years’ time. And with companies like Deep Space Industries and Google-backed Planetary Resources prospecting asteroids and planning expeditions, it’s only a matter of time before everything from Earth to the Jovian system is being explored and claimed for human use.
Space Colony by Stephan Martiniere
But when it comes to deep-space exploration, the stuff that would take us to the outer reaches of the Solar System and beyond, that’s where things get tricky and pretty speculative. Ideas have been on the table for some time, since the last great Space Race forced scientists to consider the long-term and come up with proposed ways of closing the gap between Earth and the stars. But to this day, they remain a scholarly footnote, conceptual and not yet realizable.
But as we embark on a renewed era of space exploration, where the stuff of science fiction is quickly becoming the stuff of science fact, these old ideas are being dusted off, paired up with newer concepts, and seriously considered. While they might not be feasible at the moment, who knows what tomorrow holds? From the issues of propulsion, to housing, to cost and time expenditures, the human race is once again taking a serious look at extra-Solar exploration.
And here are some of the top contenders for the “Final Frontier”:
Nuclear Propulsion: The concept of using nuclear bombs (no joke) to propel a spacecraft was first proposed in 1946 by Stanislaw Ulam, a Polish-American mathematician who participated in the Manhattan Project. Preliminary calculations were then made by F. Reines and Ulam in 1947, and the actual project – known as Project Orion – was initiated in 1958, led by Ted Taylor at General Atomics and physicist Freeman Dyson of the Institute for Advanced Study in Princeton.
In short, the Orion design involves a large spacecraft carrying a large supply of thermonuclear warheads, achieving propulsion by releasing a bomb behind it and then riding the detonation wave with the help of a rear-mounted pad called a “pusher plate”. After each blast, the explosive force is absorbed by the pusher plate, which translates the thrust into forward momentum.
Though hardly elegant by modern standards, the proposed design offered a way of delivering the explosive (literally!) force necessary to propel a rocket over extreme distances, and solved the issue of how to utilize that force without containing it within the rocket itself. However, the drawbacks of this design are numerous and noticeable.
For starters, the ship itself is rather staggering in size, weighing in anywhere from 2,000 to 8,000,000 tonnes, and the propulsion design releases a dangerous amount of radiation – and not just for the crew! If we are to rely on ships that use nuclear bombs to achieve thrust, we had better find a course that takes them away from any inhabited or habitable areas. What’s more, the cost of producing a behemoth of this size (even the modest 2,000-tonne version) is also staggering.
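To get a rough sense of why anyone would entertain such an unwieldy design, it helps to plug numbers into the standard Tsiolkovsky rocket equation. The sketch below is purely illustrative – the exhaust velocities and masses are assumptions of mine, not figures from the Orion study – but it shows how a propulsion scheme with a much higher effective exhaust velocity buys far more velocity change for the same mass ratio.

```python
# Back-of-the-envelope delta-v using the Tsiolkovsky rocket equation:
#   delta_v = v_e * ln(m_initial / m_final)
# The exhaust velocities and masses below are illustrative assumptions,
# not figures from the original Project Orion study.
import math

def delta_v(exhaust_velocity_ms, initial_mass_t, final_mass_t):
    """Ideal velocity change (m/s) for a given exhaust velocity and mass ratio."""
    return exhaust_velocity_ms * math.log(initial_mass_t / final_mass_t)

chemical = delta_v(4_500, 2_000, 500)        # typical chemical exhaust, ~4.5 km/s
nuclear_pulse = delta_v(30_000, 2_000, 500)  # assumed effective exhaust, ~30 km/s

print(f"Chemical rocket:     {chemical / 1000:.1f} km/s")
print(f"Nuclear-pulse drive: {nuclear_pulse / 1000:.1f} km/s")
```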
Antimatter Engine: Most science fiction authors who write about deep space exploration (at least those who want to be taken seriously) rely on anti-matter to power ships in their stories. This is no accident, since antimatter is the most potent fuel known to humanity right now. While tons of chemical fuel would be needed to propel a human mission to Mars, just tens of milligrams of antimatter, if properly harnessed, would be able to supply the requisite energy.
Fission and fusion reactions convert just a fraction of one percent of their mass into energy. But by combining matter with antimatter, its mirror twin, a reaction of 100 percent efficiency is achieved. For years, physicists at the CERN laboratory in Geneva have been creating tiny quantities of antimatter by smashing subatomic particles together at near-light speeds. Given time and considerable investment, it is entirely possible this could be turned into a form of advanced propulsion.
In an antimatter rocket, a dose of antihydrogen would be mixed with an equal amount of hydrogen in a combustion chamber. The mutual annihilation of a half pound of each, for instance, would unleash more energy than a 10-megaton hydrogen bomb, along with a shower of subatomic particles called pions and muons. These particles, confined within a magnetic nozzle similar to the type necessary for a fission rocket, would fly out the back at one-third the speed of light.
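That comparison is easy to sanity-check with E = mc², since annihilation converts the full rest mass of both the matter and the antimatter into energy. Here is a quick, minimal calculation of my own, not something from the article:

```python
# Energy released when half a pound of antimatter annihilates with half a pound
# of matter, via E = m * c^2, compared against a 10-megaton hydrogen bomb.
C = 299_792_458           # speed of light, m/s
LB_TO_KG = 0.4536         # pounds to kilograms
MEGATON_TNT_J = 4.184e15  # joules per megaton of TNT

total_mass_kg = 2 * 0.5 * LB_TO_KG  # both the matter and the antimatter convert
energy_j = total_mass_kg * C**2

print(f"Energy released:  {energy_j:.2e} J")
print(f"Equivalent yield: {energy_j / MEGATON_TNT_J:.1f} megatons of TNT")  # ~10 megatons
```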
However, there are natural drawbacks to this design as well. While a top speed of one-third the speed of light is very impressive, there’s the question of how much fuel would be needed. For example, while it would be nice to be able to reach Alpha Centauri – a mere 4.5 light years away – in 13.5 years instead of the 130 it would take using a nuclear rocket, the amount of antimatter needed would be immense.
No means exist to produce antimatter in such quantities right now, and the cost of building the kind of rocket required would be equally immense. Considerable refinements would therefore be needed and a sharp drop in the cost associated with building such a vessel before any of its kind could be deployed.
Laser Sail: Thinking beyond rockets and engines, there are some concepts that would allow a spaceship to travel into deep space without the need for fuel at all. In the early 1960s, Robert Forward put a twist on the ancient technique of sailing – capturing wind in a fabric sail – to propose a new form of space travel. Much as our world is permeated by wind currents, space is filled with radiation – largely in the form of photons streaming from stars – that can push a cosmic sail in the same way.
Forward followed this up in the 1970s, proposing beam-powered propulsion schemes that use either lasers or masers (microwave lasers) to push giant sails to a significant fraction of the speed of light. When photons in the laser beam strike the sail, they transfer their momentum and push the sail onward. The spaceship steadily builds up speed while the laser that propels it stays put in our solar system.
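The physics of that push is simple to sketch: a photon reflected off the sail transfers twice its momentum, so a beam of power P exerts a force of roughly 2P/c on a perfectly reflective sail. The beam power below is an assumed, illustrative figure of mine, but it shows why the lasers would have to run at gigawatt levels for years on end – the thrust is tiny, and speed only builds up because it is applied continuously.

```python
# Thrust on a reflective light sail from a continuous laser beam.
# A reflected photon transfers twice its momentum, so F = (1 + r) * P / c,
# where r is the sail's reflectivity. The beam power is an assumed figure.
C = 299_792_458  # speed of light, m/s

def sail_thrust_newtons(beam_power_w, reflectivity=1.0):
    """Force on a sail that intercepts the full beam."""
    return (1 + reflectivity) * beam_power_w / C

power = 10e9  # assume a 10-gigawatt laser held on the sail
print(f"Thrust: {sail_thrust_newtons(power):.1f} N")  # roughly 67 N, applied continuously
```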
Much the same process would be used to slow the sail down as it neared its destination. This would be done by having the outer portion of the sail detach, which would then refocus and reflect the lasers back onto a smaller, inner sail. This would provide braking thrust to slow the ship down as it reached the target star system, eventually bringing it to a slow enough speed that it could achieve orbit around one of its planets.
Once more, there are challenges, foremost of which is cost. While the solar sail itself, which could be built around a central, crew-carrying vessel, would be fuel free, there’s the little matter of the lasers needed to propel it. Not only would these need to operate for years continuously at gigawatt strength, the cost of building such a monster would be astronomical, no pun intended!
A solution proposed by Forward was to use a series of enormous solar panel arrays on or near the planet Mercury to power the lasers. However, this just replaces one financial burden with another, since the mirror or Fresnel lens needed to keep the beam focused on the sail across interstellar distances would have to be planet-sized in scope. What’s more, a giant braking sail would have to be mounted on the ship as well, and it would have to very precisely focus the deceleration beam.
So while solar sails do present a highly feasible means of sending people to Mars or around the Inner Solar System, they are not the best concept for interstellar space travel. While the design saves costs by reaching high speeds without fuel, those savings are more than offset by the power demands and apparatus needed to keep it moving.
Generation/Cryo-Ship: Here we have a concept that has been explored extensively in fiction. Known variously as an Interstellar Ark, an O’Neill Cylinder, a Bernal Sphere, or a Stanford Torus, the basic philosophy is to create a ship that would be a self-contained world, travelling the cosmos at a slow pace and keeping the crew housed, fed, and sustained until they finally reached their destination. And one of the main reasons this concept appears so much in science fiction literature is that many of the writers who made use of it were themselves scientists.
The first known written examples include Robert H. Goddard’s “The Last Migration” (1918), which describes an “interstellar ark” containing cryogenically frozen people that sets out for another star system after the Sun has died. Konstantin E. Tsiolkovsky later wrote of a “Noah’s Ark” in his 1928 essay “The Future of Earth and Mankind”. Here, the crews were kept in wakeful conditions until they reached their destination thousands of years later.
By the latter half of the 20th century, with works like Robert A. Heinlein’s Orphans of the Sky, Arthur C. Clarke’s Rendezvous with Rama, and Ursula K. Le Guin’s Paradises Lost, the concept began to be explored as a distant possibility for interstellar space travel. And in 1964, Dr. Robert Enzmann proposed a concept for an interstellar spacecraft known as the Enzmann Starship that included detailed notes on how it would be constructed.
Powered by deuterium engines similar to those called for in the Orion spacecraft, Enzmann’s ship would measure some 600 meters (2,000 feet) long and would support an initial crew of 200 people, with room for expansion. An entirely serious proposal, with a detailed assessment of how it would be constructed, the Enzmann concept began appearing in a number of science fiction and fact magazines by the 1970s.
Despite the fact that this sort of ship frees its makers from the burden of coming up with a sufficiently fast or fuel-efficient engine design, it comes with its own share of problems. First and foremost, there’s the cost of building such a behemoth. Slow boat or no, the financial and resource burden of building a mobile space ship like this is beyond most countries’ annual GDP. Only through sheer desperation and global cooperation could anyone conceive of building such a thing.
Second, there’s the issue of the crew’s needs, which would require self-sustaining systems to ensure food, water, energy, and sanitation over a very long haul. This would almost certainly require that the crew remain aware of all its technical needs and continue to maintain it, generation after generation. And given that the people aboard the ship would be stuck in a comparatively confined space for so long, there’s the extreme likelihood of breakdown and degenerating conditions aboard.
Third, there’s the fact that the radiation environment of deep space is very different from that on the Earth’s surface or in low earth orbit. The presence of high-energy cosmic rays would pose all kinds of health risks to a crew traveling through deep space, so the effects and preventative measures would be difficult to anticipate. And last, there’s the possibility that while the slow boat is taking centuries to get through space, another, better means of space travel will be invented.
Faster-Than-Light (FTL) Travel: Last, we have the most popular concept to come out of science fiction, but one which has received very little support from the scientific community. Whether it was the warp drive, the hyperdrive, the jump drive, or the subspace drive, science fiction has sought to exploit the holes in our knowledge of the universe and its physical laws in order to speculate that one day, it might be possible to bridge the vast distances between star systems.
However, there are numerous science-based challenges to this notion that make an FTL enthusiast want to give up before they even get started. For one, there’s Einstein’s theory of Special Relativity, which establishes the speed of light (c) as the uppermost speed at which anything can travel. For massless particles like photons, which do not experience time, the speed of light is a given. But for stable matter, which has mass and is affected by time, reaching the speed of light is a physical impossibility.
For one, the amount of energy needed to accelerate an object to such speeds is unfathomable, and the effects of time dilation – time slowing down as one approaches the speed of light – would be profound. What’s more, achieving the speed of light would most likely cause our stable matter (i.e. our ships and bodies) to fly apart and become pure energy. In essence, we’d die!
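To put rough numbers on both problems, the Lorentz factor γ = 1/√(1 − v²/c²) governs both how much on-board time slows and how much kinetic energy, (γ − 1)mc², must be supplied to reach a given speed. The sketch below uses a ship mass that is purely an assumption for illustration; the point is how quickly both quantities blow up as v approaches c.

```python
# Lorentz factor and relativistic kinetic energy as speed approaches c.
#   gamma = 1 / sqrt(1 - (v/c)^2)      kinetic energy = (gamma - 1) * m * c^2
# The ship mass is an assumed figure for illustration only.
import math

C = 299_792_458           # speed of light, m/s
SHIP_MASS_KG = 1_000_000  # assume a modest 1,000-tonne ship

for v_over_c in (0.1, 0.5, 0.9, 0.99, 0.999):
    gamma = 1 / math.sqrt(1 - v_over_c**2)
    kinetic_j = (gamma - 1) * SHIP_MASS_KG * C**2
    print(f"v = {v_over_c:5.3f} c   gamma = {gamma:7.2f}   KE = {kinetic_j:.2e} J")
```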
Naturally, there have been those who have used General Relativity, which allows for the existence of wormholes, to postulate that it might be possible to instantaneously move from one point in the universe to another. These theories for “folding space”, or “jumping” through space-time, suffer from the same problems. Not only are they purely speculative, but they raise all kinds of questions about temporal mechanics and causality. If these wormholes are portals, why just portals in space and not time?
And then there’s the concept of a quantum singularity, which is often featured in talk of FTL. The belief here is that an artificial singularity could be generated, thus opening a corridor in space-time which could then be traversed. The main problem here is that such an idea is likely suicide. A quantum singularity – essentially a black hole – is a point in space where the known laws of nature break down; hence the term singularity.
Also, black holes are created by a gravitational force so strong that it tears a hole in space-time, and the resulting hole absorbs all things, including light itself, into its maw. It is therefore impossible to know what resides on the other side of one, and astronomers routinely observe black holes (most notably Sagittarius A* at the center of our galaxy) swallowing matter and belching out X-rays, evidence of its destruction. How anyone could think these were a means of safe space travel is beyond me! But then again, they are a plot device, not a serious idea…
But before you go thinking that I’m dismissing FTL in its entirety, there is one possibility which has the scientific community buzzing and even looking into it. It’s known as the Alcubierre Drive, a concept proposed by physicist Miguel Alcubierre in his 1994 paper, “The Warp Drive: Hyper-Fast Travel Within General Relativity.”
The equations and theory behind his concept postulate that since space-time can be contracted and expanded, empty space behind a starship could be made to expand rapidly (while space ahead of it contracts), pushing the craft in a forward direction. Passengers would perceive it as movement despite the complete lack of acceleration, and vast distances (i.e. light years) could be covered in a matter of days and weeks instead of decades. What’s more, this “warp drive” would allow for FTL while remaining consistent with Einstein’s theory of Relativity.
In October 2011, physicist Harold White attempted to rework the equations while in Florida, where he was helping to kick off NASA and DARPA’s joint 100 Year Starship project. While putting together his presentation on warp drive, he began toying with Alcubierre’s field equations and came to the conclusion that something truly workable was there. In October of 2012, he announced that he and his NASA team would be working towards its realization.
But while White himself claims it’s feasible, and has the support of NASA behind him, the mechanics behind it all are still theoretical, and White himself admits that the energy required to pull off this kind of “warping” of space-time is beyond our means at the current time. Clearly, more time and development are needed before anything of this nature can be realized. Fingers crossed the field equations hold, because that would mean it is at least theoretically possible!
Summary: In case it hasn’t been made manifestly obvious by now, there’s no simple solution. In fact, just about all the possibilities currently under scrutiny suffer from the exact same problem: the means just don’t exist yet to make them happen. But even if we can’t reach for the stars, that shouldn’t deter us from reaching for objects that are significantly closer to home. In the many decades it will take us to reach the Moon, Mars, the Asteroid Belt, and Jupiter’s moons, we are likely to revisit this problem many times over.
And I’m sure that in the course of creating off-world colonies, reducing the burden on planet Earth, developing solar power and other alternative fuels, and basically working towards this thing known as the Technological Singularity, we’re likely to find that we are capable of far more than we ever thought before. After all, what are money, resources, or energy requirements when you can harness quantum energy, mine asteroids, and turn AIs and augmented minds onto the problem of solving field equations?
Yeah, take it from me, the odds are pretty much even that we will be making it to the stars in the not-too-distant future, one way or another. As far as probabilities go, there’s virtually no chance that we will be confined to this rock forever. Either we will branch out to colonize new planets and new star systems, or go extinct before we ever get the chance. I for one find that encouraging… and deeply disturbing!
A few times now, the website known as Envisioning Technology has snared me with their predictive posters. First there was their “Emerging Technologies” infographic for the year of 2012. That was followed shortly thereafter by “The future of health” and “The future of education“. They even took a look at popular dystopian and apocalyptic scenarios and asked the question “Should I be afraid“?
And now, in their latest infographic, they’ve tackled the future of finance. Looking at the financial industry as a whole, they attempt to gauge its readiness to technological change. While looking at trends that are likely to influence the very notion of value in the coming decades, they ask the question “are [organizations] paying enough attention to the imminent changes that will define the future of society or if they are running the risk of letting accelerating change vanquish existing business models?”
And as usual, the information is presented in an interconnected, multi-layered fashion. Dividing all aspects of the financial sector into the categories of Data, Automation, Security, Disintermediation (i.e. removing the “middle men”), Crowds (crowd-sourcing, crowd-funding), Mobile technology, Currencies, and Reputation, potential technologies are then listed based on whether or not they are under development, likely to be in development in the near future, or are currently being overlooked.
Take a gander and see what you think. As usual, it’s packed full of interesting concepts, speculative reasoning, and a ton of statistical data. And be sure to check out the website in case you have yet to see their other infographics.
As I learned not long ago, today is the 540th birthday of the late, great man who definitively argued that the Earth revolves around the Sun. And so I thought I’d take some time out of my busy (not so much today!) schedule to honor this great man and the massive contribution he made to astronomy, science, and our understanding of the universe.
Given the importance of these contributions, I shall do my best to pay homage to him while at the same time being as brief and succinct as I possibly can. Ready? Here goes…
Background: Born in Toruń (Thorn), Poland on 19 February 1473, Mikolaj Kopernik was the youngest of four children to be born into his wealthy merchant family. Given his background, Copernicus’ family was able to provide an extensive education for their son, which took him from Thorn to Włocławek to Krakow, where he attended university. In this time, he learned to speak many languages – including Polish, Greek, Italian, German and Latin (the language of academia in his day) – and also showed himself to be adept at mathematics and science.
During this time, he also received a great deal of exposure to astronomy, since it was during his years in Krakow (1491-1495) that the Krakow astronomical-mathematical school was experiencing its heyday. He was also exposed to the writings of Aristotle and Averroes, and became very self-guided in his learning, collecting numerous books on the subject of astronomy for his personal library.
Leaving Krakow without taking a degree, Copernicus moved to Warmia (northern Poland) before turning to the study of canon law, perhaps in part because of his family’s strong Roman Catholic background. However, his love for the humanities and astronomy never left him, and he seemed to devote himself to these subjects even as he worked to obtain his doctorate in law. It was during his subsequent studies in Bologna, Italy, that he met the famous astronomer Domenico Maria Novara da Ferrara and became his disciple and assistant.
Under Novara, Copernicus began critiquing the logical contradictions in the two most popular systems of astronomy – Aristotle’s theory of homocentric spheres, and Ptolemy’s mechanism of eccentrics and epicycles – contradictions that would eventually lead him to doubt both models. In the early 1500s, while studying medicine at the University of Padua, he used the opportunity to pore over the library’s many ancient Greek and Latin texts in search of historic information about ancient astronomical, cosmological, and calendar systems.
In 1503, having finally earned his doctorate in canon law, Copernicus returned to Warmia where he would spend the remaining 40 years of his life. It was here that all of his observations about the movement of the planets, and the contradictions in the current astronomic models, would crystallize into his model for the heliocentric universe. However, due to fears that the publication of his theories would lead to official sanction from the church, he withheld his research until a year before he died.
It was only in 1542, after he had been seized with apoplexy and paralysis, that he sent his treatise, De revolutionibus orbium coelestium (On the Revolutions of the Heavenly Spheres), to Nuremberg to be published. It is said that on the day of his death, May 24th, 1543, at the age of 70, he was presented with an advance copy of his book.
Impact and Legacy: The immediate reaction of the Church to the publication of Copernicus’ theories was quite limited. In time, Dominican scholars would seek to refute them with logical arguments grounded in Aristotelian and Thomist doctrine, ranging from the positions of the planets in the sky to the very idea that the Earth could be in motion. However, in attempting to disprove Copernicus’ theory, his detractors merely fostered a debate which would provide the impetus for re-evaluating the field of physics and proving the heliocentric model correct.
And in time, with the help of such astronomers and mathematicians as Galileo, the debate would come to a head. Using the telescope, a technology he helped pioneer, Galileo was able to demonstrate that the apparent size of the planets at various times of the year did indeed conform to the heliocentric model, and that it was only distortion from naked-eye observation that made them seem larger (and hence closer to Earth) than they really were.
And although Galileo would eventually be forced to recant and spend his last few years on this Earth under house arrest, the Copernican system became the de facto model of astronomy henceforth, and would help to launch the Scientific Revolution, whereby several long-established theories would come to be challenged. These included the age of the Earth, the existence of other moons in our Solar System, Universal Gravitation, and the belief in the universe as a giant, rationalized clockwork mechanism.
Final Thoughts:
Naturally, there are those purists who would point out that he was not the first to propose a heliocentric planetary system. In fact, the concept of a universe with the Sun at its center dates back to Ancient Greece. However, Copernicus was the first astronomer to propose a comprehensive model, which would later be refined by Galileo Galilei.
Other purists would point out that his system, as he developed it, had numerous observational and mathematical flaws, and that it was only after Galileo’s observations of the heavens with his telescope that his theories were made to work. But it is precisely because he was able to realize the truth of our corner of the universe, sans a reliable telescope, that makes this accomplishment so meaningful.
In Copernicus’ time, the Aristotelian and Ptolemaic models were still seen by the majority of astronomers as the correct ones, regardless of church doctrine or religious bias. In purely mathematical terms, there was little reason to make an intuitive leap and suppose that the great minds on which Scholastic science was based had got it all wrong.
So when it comes right down to it, Copernicus was an intuitive genius the likes of which is seen only once in a lifetime. What’s more, his discoveries and the publication thereof helped bring humanity out of the Dark Ages – a time where learning and the hearts and minds of men were still under the iron grip of the Church – and helped usher in the modern age of science.
And if I could get a bit polemic for a second, I would like to say that it is unfortunate that much of what Copernicus helped to overcome is once again prevalent in society today. In recent years, long-established scientific findings on matters like evolution, global warming, and homosexuality have been challenged by individuals who claim they are lies or merely “theories” that have yet to be proven. In all cases, it is clear what the agenda is, and once again faith and God are being used as justification.
In fact, despite the monumental growth in learning and the explosion in information sharing that has come with the digital age, it seems that misinformation is being spread like never before. Whereas previous generations could always blame ignorance or lack of education, we few who are privileged enough to live in a modern, secular, democratic and industrialized nation have no such excuses.
And yet, it seems that some decidedly medieval trends are determined to persist. Despite living in a time when the vast and infinite nature of the universe is plain to see, there are still those who would insist on making it smaller just so they can sleep soundly in their beds. As if that’s not enough, they feel the need to vilify that which they don’t understand, or openly threaten to kill those who preach it.
Sorry, like I said, polemic! And on this day of days, we can’t help but remember the lessons of history and how often they are ignored. So if I might offer a suggestion to everyone on this day, it would be to choose a subject you feel uninformed about and learn what you can about it. And do not trust just any source; consider the built-in biases and political slants of whatever it is you are reading. And if possible, go out and hug a scientist! Tell them you accept them, do not fear what they have to say, and will not be sending them death threats for doing what they do.
Well, it seems that science and pop culture are coming together once again, thanks to the hit show The Big Bang Theory. Only this time, things are flowing in the other direction, with scientists paying homage to the show that has made being a geek cool in the eyes of so many. And it all began recently, when a Brazilian biologist discovered a new species of bee that had been eluding scientists for years.
According to Andre Nemesio of the Universidade Federal de Uberlandia, this new species closely resembles Euglossa ignita, a more common western Brazilian orchid bee. Because of this, it remained unrecognized as a separate species by biologists until very recently. In essence, the bee managed to trick scientists, which is why Nemesio decided to name it “Euglossa bazinga”, in honor of Sheldon Cooper.
In a recent paper, he explained his decision: “The specific epithet honors the clever, funny, captivating ‘nerd’ character Sheldon Cooper… Sheldon Cooper’s favorite comic word ‘bazinga,’ used by him when tricking somebody, was here chosen to represent the character. Euglossa bazinga sp. n. has tricked us for some time due to its similarity to E. ignita, what led us to use ‘bazinga.'”
In response, Steven Molaro – an executive producer of “The Big Bang Theory” – said “we are always extremely flattered when the science community embraces our show. Sheldon would be honored to know that Euglossa bazinga was inspired by him. In fact, after ‘Mothra’ and griffins, bees are his third-favorite flying creatures.”
Kudos Sheldon, you weird, annoying, but always entertaining and brilliantly acted nerd! I do hope they write this into the show, it would be comical to see his reaction to the news! And while we’re at it, here’s some of his greatest hits from over the years:
Medimachine: noun, a nanotechnological device used for medical applications. Granted, that’s not a working definition, but it does encompass what the technology is all about. And, as it happens, just this past year researchers at Stanford created the world’s first device capable of traveling through the human bloodstream while being controlled and powered wirelessly.
This development came in the midst of a similarly significant announcement over at MIT. In January of this year, they revealed that they had developed the world’s first implantable microchip that can deliver drugs directly into the bloodstream. This chip is also controlled wirelessly, and is the first step towards remote implants that could contain an entire pharmacy.
According to Ada Poon, the lead developer on the Stanford team, the next step in the development of this device will be creating models that incorporate sensors and drug-delivery systems for the ultimate in pinpoint-accurate medicine. If successful, Poon and her team could very well be responsible for creating the prototype that inspires entire generations of medical machines – machines that conduct exploratory exams, clean our arteries, remove tumors, destroy pathogens and viruses, and even repair internal injuries.
And just think, if this development triggers further research and development, it could very well lead to nanomachines which are capable of making even tinier nanomachines. These devices could in turn manipulate matter on the mitochondrial level, correcting faults in our DNA and turning harmful or unwanted cells into something more useful for our bodies.
Just another step on the road to transhumanism and post-mortality!