Jason Lisle’s new book “The Physics of Einstein”

Today I have nearly finished reading Jason Lisle’s new book “The Physics of Einstein”. I highly recommend it to you if you have ever wondered about any of the major questions it deals with:

  1. Does light from distant galaxies really take billions of years to reach Earth? 
  2. Is time-travel possible? 
  3. Are black holes real? 
  4. What are some of the weird effects of travelling at near the speed of light? 
  5. And how do we really know?  

The physics discovered by Albert Einstein allows us to answer all of these questions.  In this easy-to-read book, we learn how Einstein was able to deduce what happens when an object approaches the speed of light.  The results are as amazing as they are strange.  Designed for readers with no background in physics, this book explores one of the strangest and most fascinating branches of science.

Soon I will write a review of the book, but before that I strongly recommend you buy it and read it. It can be ordered from the shop on Jason’s website.

It is written for the layman, and the sections that involve any mathematics are set apart in boxes and can be skipped without losing the flow of the points being made.

The book explains in extensive detail, at a level a non-specialist can understand, the simplest solution to the biblical creationist starlight travel-time problem.

By making the reasonable assumption, based on textual evidence, that the language used in the Bible for the timing of events, especially the creation of the stars, implicitly involves the scientifically valid Anisotropic Synchrony Convention (ASC), the starlight travel-time problem disappears.

If the question of how we see distant stars in an enormously large universe, billions of light-years in extent, has been a big problem for you, this book is a must-read. Even if you only read the last 4 chapters of the book, which deal with this question, you would be greatly enlightened. If you are pressed for time, start with chapter 17, “The Curious Case of the One-Way Speed of Light”. But really you should read the whole book. The preceding chapters clearly explain the physics discovered by Einstein, which builds the case for the arguments presented and for the refutations of the criticisms levelled against his main thesis.

New cosmologies converge on the ASC model

— a review of two cosmology papers presented at the International Conference on Creationism in 2018  (to be published in Journal of Creation)

Introduction

In 2001 Jason Lisle (under the pen name Robert Newton) introduced the idea of Anisotropic Synchrony Convention (ASC) into the discussion amongst biblical creationists to solve the starlight travel-time problem.1 The ASC is a convention on clock synchronisation, or put another way, the conventionality of the simultaneity of distant events in spacetime.

This topic is relevant to the discussion of the creation of the stars in the universe on Day 4 of Creation Week, six thousand years ago. The ASC posits that an event occurs when an Earth observer sees, or could have seen, the event happen. And Lisle proposed that the ASC is the language used in the Bible. As such, it leads to the initial simultaneous2 creation of all stars in the universe on Day 4, where, in principle, the event is timestamped3 as occurring when the starlight from all stars arrived on Earth for the first time. This means there is no light travel-time problem, because the events were seen to occur (on Earth) simultaneously (or at least within the period of one Earth day, that is, on Day 4).

In 2010 Lisle strengthened his original arguments with a discussion of the past light cone and Special Relativity.4 In that paper he introduced the ASC model, a model that uses the ASC. And his ASC model makes testable predictions.5

Lisle also carried the notion of the one-way speed of light further. Since the one-way speed of light cannot be measured, it really has no physical meaning in the universe.6 Thus there is a free choice of its value. And by Lisle’s choice of the ASC it follows that the incoming speed of light is infinite, and thus the outgoing speed must be ½ c (where c ≡ 299 792 458 m/s is the canonical isotropic—i.e. two-way—speed of light that we are very familiar with).7
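To see that this choice leaves ordinary (round-trip) measurements untouched, here is a worked example of my own (d is an arbitrary distance, not a figure from Lisle’s papers). For a light pulse sent from Earth to a mirror at distance d and reflected back:

  time out = d ÷ (½ c) = 2d/c,  time back = d ÷ ∞ = 0,

so the round trip takes 2d/c and the measured two-way speed is 2d ÷ (2d/c) = c, exactly the familiar value.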

Many people, biblical creationists included, have expressed disbelief, concern, and other emotions over the concept of the one-way speed of light being any different from the usually assumed isotropic speed c. Nevertheless it is important to note that concepts around the one-way speed of light are based on real physics.

The choice of a timing convention in no way affects any underlying physics; the physics is always the same no matter what convention one may choose.8 Einstein chose a value of the clock synchronisation parameter, known as the Reichenbach synchronisation parameter (ε), in his equations for Special Relativity that defines the one-way speed of light as being equal to the two-way speed.9 Any value of ε between 0 and 1 may be chosen: nature itself neither chooses nor imposes any requirement on its value within this domain. The parameter represents our free choice of a timing convention. Einstein chose ε = ½ (ESC) and Lisle chose ε = 1 (ASC). Choosing a value for this parameter is no different from choosing a particular coordinate system, and regardless of which coordinate system one may choose, the underlying physics is unaffected. What differs is only how we represent the physics in the chosen coordinate system. The equations of motion may be more complex in one coordinate system than in another, but in all cases the physics is unaffected.10
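As a sketch of the standard textbook relations (my summary, not a quotation from the papers cited), the one-way speeds implied by a given ε, for light leaving and returning to the observer whose clock defines the synchronisation, are

  c(outgoing) = c/(2ε)  and  c(incoming) = c/(2(1 − ε)),

so ε = ½ gives c(outgoing) = c(incoming) = c (ESC), while ε → 1 gives c(outgoing) → ½ c and c(incoming) → ∞ (ASC); in every case the round-trip speed works out to c.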

Thus no amount of appealing to Maxwell’s equations (derived pre-Einstein)11 or any other well-known physics can refute the notion of free choice for the one-way speed of light, or more precisely, the conventionality thesis of distant simultaneity.

Update on the ASC model and the one-way speed of light

In 2001 Jason Lisle (under the pen name Robert Newton) introduced the idea of the Anisotropic Synchrony Convention (ASC) into the discussion amongst biblical creationists to solve the starlight travel-time problem. For a full understanding of those issues read here, here and/or watch this. With that came the notion of the one-way speed of light. Many people, creationists included, have since expressed disbelief, concern, and other emotions over the concept, but it is important to note that it is based on real physics. The point is that the one-way speed of light cannot be measured, and as a result it really has no physical meaning in the universe. This might sound crazy, but it means we are free to choose its value. In the ASC model, proposed by Lisle and supported by myself, the incoming speed of light is chosen as infinite and the outgoing speed as ½ c (where c ≡ 299 792 458 m/s is the canonical speed of light with which we are familiar).

I note that at the 2018 International Conference on Creationism (ICC) two papers were presented that largely boil down to the same model that Lisle originally presented. Those papers are

  1. T.G. Tenev, J. Baumgardner, M.F. Horstemeyer, A solution for the distant starlight problem using Creation Time Coordinates. (PDF available here)
  2. P.W. Dennis, Consistent young earth relativistic cosmology  (PDF available here)

This is all quite significant because, since 2001, I have largely supported the ideas that Dr Lisle has presented, while others within the creationist community have ridiculed them. Personally I now take the position that a biblical creationist model based on the ASC, or at least on the concept of defining an initial creation scenario that involves the ASC or a variant of it, such as Tenev et al. have suggested in their paper, is the best solution to the creationist starlight travel-time problem. In such a case, there is no problem.

Many months ago I received a paper wherein the authors attempted to show that the one-way speed of light could be measured by an experiment sending a light signal around a ring, bouncing it off a few mirrors. (See the figure to the right.) But any experimenter who thinks such a setup achieves this assumes the conclusion (begs the question), by not properly understanding the physics and the underlying assumptions of such an experiment. There are components (relative to the Source, measured at the Timer) of the outbound and inbound light vectors that must be considered. So no such experiment is ever only one-way; it is always two-way, and as such it can never measure the one-way speed of light. (Besides, the ASC is a convention; it is not something that can be refuted. We use a convention to define the basis under which we make a measurement, not the reverse.)
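One way to make this explicit is the standard closed-path argument (a sketch in my own notation, not taken from the paper in question). Suppose the one-way speed in a direction making angle θ with some preferred direction is c(θ), with 1/c(θ) = (1 + κ cos θ)/c and −1 ≤ κ ≤ 1; this keeps the two-way speed on every leg equal to c. For a circuit of straight legs of lengths L1, …, Ln (total length L) timed by the single clock at the Source/Timer, the total travel time is

  Σ Li/c + (κ/c) Σ Li cos θi = L/c,

because Σ Li cos θi is the net displacement along the preferred direction, which is zero for a closed path. The apparatus therefore measures L ÷ (L/c) = c regardless of κ, i.e. only the two-way speed.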

The authors of the same paper(s) must also have sent it to Dr Lisle for review. He sent me his response to their paper(s), and I publish it below with his permission.

Sapphire clock wins the Eureka prize!

Last night the sapphire clock, which I have worked on for the past 15 years, won one of the categories of the Australian Museum Eureka Prizes.

The category is the DST Eureka Prize for Outstanding Science in Safeguarding Australia, and the prize was awarded to the Sapphire Clock Team at the University of Adelaide and my company, Cryoclock Pty Ltd. I did not attend the Eureka Prize dinner and award night due to ill health.

The following is a media release from the Australian Government Minister for Defence.



THE HON CHRISTOPHER PYNE MP

Minister for Defence

Leader of the House

Federal Member for Sturt

 

MEDIA RELEASE

30 August 2018

ADVANCE IN TIME-KEEPING CLOCKS UP EUREKA WIN 

Technology using a pure sapphire crystal to accurately measure time has taken out the 2018 Defence Science and Technology (DST) Eureka Prize for Outstanding Science in Safeguarding Australia.

Minister for Defence, the Hon Christopher Pyne MP, today congratulated Professor Andre Luiten and his team from the University of Adelaide on developing the sapphire clock, a device so accurate it can keep time within one second over tens of millions of years.

The sapphire clock team is working closely with Defence scientists to use the technology for upgrades and enhancements to the Jindalee Operational Radar Network (JORN).

The sapphire clock has the potential to produce the purest of signals which, when fed into JORN, could generate high quality surveillance data.

“This innovation delivers a step change in radar frequency purity and overall performance over conventional devices, giving Defence a significant capability edge,” Minister Pyne said.

“This is an example of world-leading research with a positive impact on Australia’s defence and national security.

“It is a fantastic result which will be a game-changer for Defence capability.”

The DST Eureka Prize for Outstanding Science in Safeguarding Australia is awarded annually for outstanding science or technology that has developed, or has potential to develop, innovative solutions for Australia’s defence or national security.


See links

ABC TV interview with Prof. Andre Luiten, Director of IPAS at the University of Adelaide and my business partner in Cryoclock Pty Ltd.

Confirmed: Physical association between parent galaxies and quasar families

In a paper, just published, that looked for an association between putative parent galaxies and pairs of quasars, the authors found many such quasar families, suggesting that the association is real and not just coincidental. They used the Sloan Digital Sky Survey (SDSS) data release 7 and the 2MASS (Two Micron All Sky Survey) Redshift Survey (2MRS) Ks ≤ 11.75 mag data release to test for the physical association of candidate companion quasars with putative parent galaxies by virtue of the Karlsson periodicity in quasar redshifts.

Karlsson proposed that quasars have an intrinsic non-cosmological redshift component which comes in discrete values (z = 0.060, 0.302, 0.598, 0.963, 1.410, …). However, to properly detect any physical association, the candidate quasar redshift must be transformed into the rest frame of its putative parent galaxy. (This assumes either that the parent galaxy redshift is cosmological or, if not, that it is Hubble-law related but not due to expansion of the universe.) Then the transformed redshift of the candidate companion quasar is associated with the closest Karlsson redshift, zK, so that the remaining redshift velocity component—the putative velocity of ejection away from the parent object—can be obtained. In this manner it is possible to detect a physical association even in the case where parent galaxies have high redshift values. If this step is neglected no association may be found, which is what happened in several papers, applied to large galaxy/quasar surveys, that claimed to debunk the Arp hypothesis.
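The full detection algorithm is described in Fulton and Arp (2012), but the redshift bookkeeping just described can be illustrated with a short Python sketch. This is my own illustration, not code from the paper; the function names and the example redshifts are hypothetical, and it assumes the standard multiplicative composition of redshift components and a simple radial relativistic Doppler relation for the residual.

```python
# A minimal sketch (my own illustration, not the authors' code) of the
# transformation described above. It assumes the multiplicative composition
#   (1 + z_quasar) = (1 + z_parent) * (1 + z_K) * (1 + z_ejection)

KARLSSON_VALUES = [0.060, 0.302, 0.598, 0.963, 1.410]  # intrinsic redshift peaks
C_KM_S = 299_792.458  # two-way speed of light in km/s


def to_parent_rest_frame(z_quasar: float, z_parent: float) -> float:
    """Transform the quasar's observed redshift into the rest frame of
    its putative parent galaxy."""
    return (1.0 + z_quasar) / (1.0 + z_parent) - 1.0


def nearest_karlsson(z_rest: float) -> float:
    """Associate the transformed redshift with the closest Karlsson value z_K."""
    return min(KARLSSON_VALUES, key=lambda z_k: abs(z_k - z_rest))


def ejection_velocity_km_s(z_rest: float, z_k: float) -> float:
    """Residual redshift after removing z_K, read as a radial (relativistic
    Doppler) ejection velocity relative to the parent object, in km/s."""
    one_plus_zv = (1.0 + z_rest) / (1.0 + z_k)
    beta = (one_plus_zv**2 - 1.0) / (one_plus_zv**2 + 1.0)
    return beta * C_KM_S


# Example with made-up redshifts (purely illustrative values):
z_rest = to_parent_rest_frame(z_quasar=1.05, z_parent=0.02)
z_k = nearest_karlsson(z_rest)
print(z_rest, z_k, ejection_velocity_km_s(z_rest, z_k))
```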

Figure 1: Detected families in a 4 square degree area centered at 09h00m00s+11d00m00s. The open circles are galaxies, the filled diamonds are quasars, with lines connecting each galaxy to its detected quasar family members. The object colours indicate stepped redshift increase from black to red over the redshift range 0.0 ≤ z ≤ 5.5. The central unshaded area shows the galaxies under examination and the entire area shows the candidate companion quasars.

In this new paper, the authors used the method described above, and the detected correlation was demonstrated to be much higher than expected from a random association. Many such associations were found. As an example, in one 4 square degree area on the sky, 7 quasar families were found to be statistically correlated with parent galaxies. See Fig. 1 (right). The probability of this occurring by random chance was calculated as follows.

For a binomial distribution … the probability of 7 hits for one 4 square degree area is … = 1.089 × 10⁻⁹. Under these conditions, the detection of 7 families with this particular constraint set is extraordinary. [emphasis added]
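The paper’s inputs are elided in the quotation above, so the number cannot be reproduced here, but for context the general form of such a calculation, with n candidate quasars in the field and a per-candidate chance probability p (placeholders, not the paper’s values), is

  P(exactly 7 hits) = C(n, 7) p⁷ (1 − p)ⁿ⁻⁷,

and it is a figure of this kind that works out to 1.089 × 10⁻⁹.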

Generally, the results of this paper are a confirmation of the quasar family detection algorithm described in Fulton and Arp (Astrophys. J. 754:134, 2012), which was used to analyze the 2dF Galaxy Redshift Survey (2dFGRS) and the 2dF Quasar Redshift Survey (2QZ) data sets. This means that using the SDSS and 2MRS data sets the correlation found in Fulton and Arp (2012) is further strengthened.

This means that, with a very high probability, much higher than that of a random association, certain quasars are physically associated with lower-redshift galaxies. The quasars are found in pairs or higher multiples of two. The results further imply that these quasar redshifts indicate a real ejection velocity component and a large intrinsic non-velocity, or non-cosmological, redshift component.

Has light from the first stars after the big bang been detected?

“Astronomers detect light from the Universe’s first stars” is the headline of a Nature news article, which appeared February 28, 2018.1  It relates to observations made by a team of astronomers led by Judd Bowman of Arizona State University in Tempe. The team published their results in Nature the same week.2 According to Bowman,

“This is the first time we’ve seen any signal from this early in the Universe, aside from the afterglow of the Big Bang.”

They used a small radio telescope situated in the Western Australian desert, far away from human settlement, to minimise interference from radio signals generated by human technology. (See Fig. 1.) The antenna was tuned to a waveband around 78 MHz, which is near the low end of the FM radio band, so isolation from human-generated radio signals was essential.

Figure 1: The small radio telescope in Western Australia used to detect evidence of light allegedly from the Universe’s first stars. Credit: CSIRO

To understand what the astronomers infer from this research, I quote an editorial summary from Nature:3

“As the first stars heated hydrogen in the early Universe, the 21-cm hyperfine line—an astronomical standard that represents the spin-flip transition in the ground state of atomic hydrogen—was altered, causing the hydrogen gas to absorb photons from the microwave background. This should produce an observable absorption signal at frequencies of less than 200 megahertz (MHz). Judd Bowman and colleagues report the observation of an absorption profile centred at a frequency of 78 MHz that is about 19 MHz wide and 0.5 kelvin deep. The profile is generally in line with expectations, although it is deeper than predicted. An accompanying paper by Rennan Barkana suggests that baryons were interacting with cold dark-matter particles in the early Universe, cooling the gas more than had been expected.”
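To connect the quoted numbers: the 21-cm hyperfine line has a rest frequency of about 1420.4 MHz, so, on the conventional big bang interpretation, an absorption profile centred at 78 MHz corresponds to a redshift of roughly

  z ≈ 1420.4/78 − 1 ≈ 17,

which is why the signal is attributed to the era of the alleged first stars.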

Let’s look at this in two stages: what was observed, and what is the interpretation of the recorded data.

Cosmology’s fatal weakness—underdetermination

Can we definitively know the global structure of spacetime? This is a good question. It is one that is actively discussed in the area of the philosophy of modern physics.1,2

However, it is a question that highlights the fundamental weakness of cosmology and hence of cosmogony. (Cosmology is the study of the structure of the cosmos, whereas cosmogony is the study of the origin of the universe.) That weakness is the inherent inability to accurately construct any global cosmological model, i.e. a model that accurately represents the structure of the universe at all times and locations. The reason for this is underdetermination.3

“There seems to be a robust sense in which the global structure of every cosmological model is underdetermined.”1

In the philosophy of science, underdetermination means that the available evidence is insufficient to determine which belief one should hold about that evidence. That means that no matter what cosmological model one might conceive of, in an attempt to describe the structure of the universe, every such model will be underdetermined. Put another way, no matter how much observational data one might ever (even in principle) gather, the cosmological evidence does not force one particular model upon us. And this underdetermination has been rigorously proven.1