A responsibility to reach out
11 Aug 2015 by Evoluted New Media
In the age of the science press release, it’s tempting to think the public are engaged with scientists’ work. But we can’t rest on our laurels, says Alex Codoreanu – scientists have a responsibility to communicate their work in an accessible way to the public at large
In the era of science news releases, and with the momentum of Twitter, the general public is at once more connected to and more separated from the scientific process. People can find themselves unable to process and absorb new discoveries, so it is now more important than at any other time in history for us to communicate effectively and educate the general public.
Many of us who have had the opportunity to teach introductory courses have seen incoming students struggle to understand the context of what they learn. Overwhelmed by new information, they don’t see the connection to the historical past of the material. In my case, introductory courses in physics and astronomy were more than just an opportunity to teach how to define the equation of motion for a system or to grasp the size of the Milky Way galaxy. These courses are a de facto introduction to the history of knowledge acquisition by humankind. A first-year student quickly goes through centuries of information that has been compacted into laws and formulas. To even get to that point, they have already mastered the majority of Greek mathematical knowledge and have reached the 17th century by tackling the mathematical and physical discoveries of Sir Isaac Newton.
As they progress through the material they slowly approach the 20th century, and a new physical paradigm is introduced: the probabilistic nature of quantum mechanics. This shift away from deterministic Newtonian physics does not invalidate their previous classes but rather opens a new parameter space into which they can apply their problem-solving skills. This transition exemplifies one of the fundamental aspects of how science, or rather the collection of human knowledge, works. We do the best with the tools available to us and, as new tools and ideas become available, we push those old boundaries in order to reach new discoveries.
A great example of this process can be glimpsed in the evolution and interpretation of a simple concept: parallax. For those not familiar, discovering and understanding parallax is quite simple. Extend your index finger at arm’s length, hold your arm still, and alternately close each eye. The apparent shift of your index finger against the background is parallax at work. This simple concept can be used as a distance measurement technique for nearby stars, out to thousands of light-years from us. One of the fundamental assumptions underlying this technique is that the Earth revolves around the Sun; this orbital baseline allows one to measure the distance to more distant objects by taking measurements at different time intervals and seeing how the apparent position of the object changes with respect to the background.
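To put numbers on this, astronomers quote parallax distances through a simple reciprocal relation: a star whose annual parallax is p arcseconds lies 1/p parsecs away. A minimal sketch in Python, using the textbook parallax of Proxima Centauri (roughly 0.7685 arcsec, a standard value assumed here rather than taken from this article):

```python
# Parallax distance rule: d [parsecs] = 1 / p [arcseconds].
def parallax_distance_pc(parallax_arcsec):
    """Distance in parsecs from an annual parallax angle in arcseconds."""
    return 1.0 / parallax_arcsec

LY_PER_PARSEC = 3.2616  # light-years in one parsec

# Proxima Centauri, the nearest star, has a parallax of about 0.7685 arcsec
# (textbook value, assumed for illustration).
d_pc = parallax_distance_pc(0.7685)
print(f"{d_pc:.2f} pc = {d_pc * LY_PER_PARSEC:.2f} light-years")
# prints: 1.30 pc = 4.24 light-years
```

The tiny size of that angle, well under a thousandth of a degree even for the nearest star, is exactly why the effect was beyond the reach of ancient instruments.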
The Greeks and present-day astronomers share this theoretical understanding, but our respective abilities to measure angular separation led to very different results. The lack of an observed parallactic effect led the Greeks to conclude that the Earth was not revolving around the Sun but rather that the Sun was revolving around the Earth – a result consistent with their hypothesis-testing method, yet one that we now know to be wrong. So what was the problem? The Greeks fundamentally underestimated the distance to the stars and, because they lacked the instrumentation, they failed to measure the minute changes in position due to parallax.
This connection between fundamental assumptions, experimental capabilities and theoretical understanding leads to a very natural question: what exactly is science, or the scientific method – or, more simply put, what is knowledge? Of course the acquisition of knowledge has to be reproducible and repeatable, but there is a more subtle definition. It is the intersection of fundamental assumptions and theoretical propositions tested with our best experimental methods. It is not the most advanced option in just one of those branches.
How then does this general definition fit into one of the most complicated questions that we’re trying to tackle: what is the fate of our Universe? Will there be a Big Crunch, or will our dark-energy-dominated Universe continue expanding until the night sky becomes a blank canvas? The unfortunate answer is that it might be a while until such a big question can have a satisfactory answer. While sophisticated cosmological simulations can provide us with many possible scenarios, it is not until these simulations provide observable predictions that can be tested by current and next-generation science facilities that an answer can be presented. Until then, we have only possibilities and very coarse constraints. This long process simply underlines the complexity of the questions currently being asked.
This limiting factor resulting from the current state of facilities and experimental methods is not a new problem, but rather one that has been faced by every generation of scientists. It is also one of the many ways that the scientific community interacts with and influences the private industry sector. Whether it is the hand-held telescopes built by Galileo, CCD technology making its way into consumer products, or the construction of a new-generation supercomputer for the Square Kilometre Array radio telescope, the experimental requirements of each generation challenge and influence the state of its technology.
However, we are not always limited in our acquisition of new information by technology and experimental methods. Einstein did not use a supercomputer or a new-generation telescope to develop his theories. He challenged hundreds of years of Newtonian mechanics by tackling fundamental frame-of-reference assumptions and managed to revolutionise our understanding of the Universe. Still, it was not until Eddington’s observational evidence that the community at large was able to incorporate these new theories into the breadth of human knowledge. This combination of great theoretical hypotheses with experimental results is the very definition of the scientific process.
How does the evolution of new perspectives impact past theories? Do apples still fall in the new realm of Special and General Relativity as described by Einstein’s theories? The simple answer is that of course they do, but should that apple fall at relativistic speeds with an ant riding it, we now have the tools to describe the outside perspective of the falling apple as well as the perspective of the ant. Is this a real concern? Well, were it not for Special and General Relativity, our Global Positioning System would not work. Any time a satellite transmits information to a ground station, relativistic effects have to be considered.
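To put a number on those relativistic effects, a back-of-the-envelope estimate can be made from textbook formulas: special relativity slows the orbiting clock by roughly v²/2c², while the weaker gravity at altitude speeds it up by GM(1/R_Earth − 1/R_orbit)/c². A minimal sketch, assuming a circular GPS orbit of about 26,560 km radius (a standard figure, not taken from this article):

```python
import math

# Back-of-the-envelope daily clock drift for a GPS satellite relative to the
# ground, from first-order relativistic formulas. All constants are textbook
# values assumed for illustration.
GM = 3.986004e14        # m^3/s^2, Earth's gravitational parameter
C = 2.99792458e8        # m/s, speed of light
R_EARTH = 6.371e6       # m, mean Earth radius
R_ORBIT = 2.656e7       # m, approximate GPS orbital radius (~26,560 km)

v = math.sqrt(GM / R_ORBIT)                      # circular orbital speed
sr_rate = -v**2 / (2 * C**2)                     # special relativity: runs slow
gr_rate = GM * (1/R_EARTH - 1/R_ORBIT) / C**2    # general relativity: runs fast

net_us = (sr_rate + gr_rate) * 86400 * 1e6       # microseconds gained per day
print(f"Net satellite clock drift: {net_us:+.1f} microseconds per day")
# prints: Net satellite clock drift: +38.5 microseconds per day
```

Left uncorrected, a drift of tens of microseconds per day would translate into positioning errors of several kilometres within a single day, which is why the satellite clocks are deliberately offset before launch.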
The fact that a blackboard exploration in the early 20th century proved to be the solution to a technological problem that did not even exist at the time of its inception leads to another important aspect of science. We can’t really anticipate where the next paradigm shift will be and, as such, we must explore many theoretical possibilities. Unfortunately, until a testable hypothesis can be drawn from these theoretical approaches, they will be relegated to the realm of possibilities. It is not until technology and experimental methods evolve to catch up with the theoretical work that these past possibilities can be investigated. The more complex the theoretical question, the longer the expected wait – but also the greater the reward to be had.
The construction of the Large Hadron Collider, the LHC, is such an example. Beyond the construction effort and the political influence that had to be leveraged before the first workman was on site, the analysis of its data ushered in a global analysis network, the Worldwide LHC Computing Grid – built at CERN, the same laboratory whose data-sharing needs had earlier given birth to the World Wide Web. The fact that the LHC was a success before even taking data is an incredible feat. Its contribution to our understanding of the fundamental structure of the space-time fabric will be an immeasurable contribution to the whole of human knowledge for upcoming generations, in ways that are not imaginable now.
So now, with the advent of almost instantaneous science communication, how does one create a context for one’s results? Media releases are designed to focus on a very specific result, with minimal distractions, and are filled with very digestible morsels of information. In this format there is little room for additional information, but creating context can also lead to a more powerful impact statement. Introducing a more comprehensive yet still digestible perspective can shape the direction of media follow-ups by informing journalists of the bigger picture before they ask their questions.
Understanding the intrinsic connection between research and everyday technology, as well as clarifying the scientific process to the general public, is an extra burden on the shoulders of today’s researchers and today’s media, but it is an integral part of how effective public science communication can survive in the new digital age.
The author:
Alex Codoreanu, PhD Candidate, Swinburne University Centre for Astrophysics and Supercomputing
Our new outreach platform Shout It Out aims to support and showcase science communication talent in the scientific community. Even if you don’t know you have communication talent yet, we want you to try anyway. Why? Because the public are ravenous for scientific insight, and it should be the people doing the science who give it to them