Poster Abstracts (Alphabetical by last name of presenter)
Stefan Aimet (PhD student in Physics, Freie Universität Berlin)
"Maxwell’s demon and beyond: A short theistic reflection on the thermodynamics of information"
Maxwell's demon is a thought experiment challenging the second law of thermodynamics by proposing a being capable of decreasing entropy without expending work. This led to heated debates in the field, culminating in Landauer's principle and the phrase "Information is physical," which raised philosophical questions. In this study, we aim to explore the potential misuse of these concepts as evidence for the philosophical standpoint of physicalism, and to counter such claims. Our article offers a concise theistic reflection on the thermodynamics of information. Additionally, we delve into how Aristotelian teleology may establish a relationship between information theory, thermodynamics, and teleology itself. Viewed through a Christian lens, our reflections integrate quantum information theory, thermodynamics, philosophy, and theology through captivating paintings, visually illustrating the harmonious interplay between these disciplines.
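For reference, Landauer's principle (stated here in its standard textbook form rather than quoted from the abstract) gives the minimum energy that must be dissipated to erase one bit of information at temperature T:

\[ E_{\min} = k_{B} T \ln 2 \]

where k_B is Boltzmann's constant; it is this link between information erasure and thermodynamic cost that underlies the phrase "Information is physical."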
Carlo Avignolo (UNITRE – Università delle Tre Età, Loano, Italy)
“Environment and Health”
The study of the health of planet Earth is a multifaceted effort that encompasses both disclosing the secrets of nature and addressing moral questions about our relationship with creatures. Ethical principles to be taken as guidelines in elaborating these issues include care for creation, human dignity and human rights, and the common good. Some questions are presented here.
- Can science and religion enter a dialogue fruitful to both?
The existence of nature is totally independent of man. Recognizing common fundamental plans in the organization of living and non-living objects implies recognizing the existence of an origin of this order. Contemplation of its magnificence imparts peace and serenity and is the ground of inspiration for scientific achievements.
- Are faith and science a matter of reasoning or experience?
Looking at the common elements that bridge faith and science leads one to consider both as paths of a spiritual endeavor. Wonder and the desire for truth play significant roles in both the scientific process and faith.
- Do Environmental Sciences contribute to stimulating a healing process?
Our very contact with nature has a deep restorative power. The effects of nature therapy, through visual, tactile, and olfactory stimulation, can be assessed by a number of physiological indicators, such as the activity of the brain as well as the endocrine, immune, and autonomic nervous systems.
References
Saint Bonaventure, Itinerarium mentis in Deum, Prologue, par. 3.
E. Cantore, Scientific Man: The Humanistic Significance of Science.
C. Song, H. Ikei, and Y. Miyazaki, “Physiological Effects of Nature Therapy: A Review of the Research in Japan,” Int. J. Environ. Res. Public Health (2016).
Katherine V. Bulinski (Professor of Geosciences, Bellarmine University)
“Critical Conversations: Science and Faith in the Undergraduate Classroom”
In the Spring semester of 2023, a new course entitled “Exploring Scientific Controversies” was offered for first-year honors students at Bellarmine University, a Catholic liberal arts institution in Louisville, Kentucky. Taught by a Catholic paleontologist and featuring guest speakers including an astronomer from the Vatican Observatory, this course examined intersections of faith, science, and society through a variety of different lenses: philosophy, history, theology, psychology, culture and more. Case studies included explorations of the Galileo Affair, evolution and creationism, vaccine hesitancy, and climate change.
While not explicitly focused on faith and science, the course required exploration of how science and faith intersect within different contexts. Many of the most prominent scientific controversies are connected to the ways in which people of faith perceive scientific concepts and how they relate to their beliefs and ethical frameworks.
As the semester progressed, it became clear that students held deeply entrenched misconceptions about how science works and how faith is understood. Notably, this included misconceptions about what the Catholic Church specifically teaches about the compatibility of faith and science. This was true even for students who benefitted from prior private Catholic schooling and chose to attend a Catholic university.
As revealed in journaling assignments and class discussions, two students who self-identified as atheists at the start of the semester later began identifying as agnostic, and others began to examine their own prior assumptions about faith and reason. Changing student attitudes or beliefs was a surprising and unintended outcome of the course, and it speaks to the benefits of engaging with the topic of science and faith as a vehicle for self-exploration that reaches even the most skeptical student.
Wojciech Chrosny (Chief Scientific Officer, TreeAge Software)
“Epistemic boundaries as guideposts to transcendence”
I will present my personal insights into how different scientific domains can benefit from deeper consideration of “epistemic boundaries”. Two kinds of epistemic boundaries will be briefly considered. The first kind is internal to specific scientific fields: in physics, Heisenberg's uncertainty principle; in mathematical logic, Gödel's incompleteness theorems; in the theory of computation, Turing's halting problem. Each is more or less widely known and often dismissed as a curiosity to be avoided. The second kind comprises problems of irreducibility from one scientific domain to another, for example, the problems of reducing biology to chemistry, or chemistry to physics. The nature of these boundaries, whether epistemic or ontological, will be discussed.
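As one concrete instance, the halting-problem boundary can be sketched in a few lines of code (the standard diagonal argument; the function halts below is the hypothetical decider whose existence is being refuted, not a real API):

# Sketch of Turing's halting-problem argument. Suppose a total function
# halts(f, x) existed that always correctly reports whether f(x) halts.
def halts(f, x) -> bool:
    raise NotImplementedError("no such total decider can exist")

def paradox(f):
    # Do the opposite of whatever the decider predicts about f run on itself.
    if halts(f, f):
        while True:      # loop forever if f(f) is predicted to halt
            pass
    return None          # halt if f(f) is predicted to loop

# paradox(paradox) halts if and only if halts(paradox, paradox) reports that
# it does not: a contradiction, so no general halting decider can exist.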
How can Catholic Faith shed light on these boundaries and open doors for scientists to seriously consider pursuits of real relationships between the transcendent and scientific domains?
A brief sketch of my PhD work on genetic programming in artificial intelligence will be provided, along with an account of how that work led me to reassess my relationship to my artificially created world and launched me on a pursuit of Thomas Aquinas’ philosophy and Theology of Being.
I will attempt to extrapolate how a similar process might work in other scientific fields. Finally, I will pose some potential future questions regarding the possibility of assessing the epistemic or ontological nature of different boundary problems. One example is the quest of physicists to understand the nature of Heisenberg's uncertainty principle (is it part of “reality” or just a limitation of our observational toolkit?). I will draw attention to the need to avoid two pitfalls: the “God of the gaps” and its related counterpart, “given more time we will understand everything about the human being”.
Patrick Duffley (Prof. of Linguistics, Université Laval)
“The brain/mind distinction and the question of whether linguistics can be a science”
It is argued in this paper that linguistic explanation cannot be done without reference to meaning and that meaning is an immaterial object not subject to quantitative measurement. This raises the question as to whether meaning can constitute scientific data and whether linguistics can be a science. Following Artigas (2000), four basic conditions are taken as necessary to guarantee science’s reliability. The most basic is that the object under study must be intersubjectively observable, i.e. there must be a consensus of all competent observers as to what it is. Linguistic meaning is argued to satisfy this first criterion with flying colors, since for language to serve as an instrument of communication there must necessarily be intersubjective consensus about the meanings of words. Artigas’ other three criteria are also applicable to language in a qualified way. The second one, empirical control, implies the ability to perform repeatable controlled experiments on the object, which is not the case for human speakers because they are free and therefore not manipulable in this way. However, with a very broad database one can observe whether people placed in similar situations generate similar linguistic outputs. The fact that speakers possess free will also limits the applicability of the third criterion, predictive power, as it is not possible to predict precisely what someone is going to say or not say in any given situation. One can, however, make certain conditional predictions, such as that if the speaker chooses to use the verb enjoy followed by a verbal complement, that complement will take the form of the gerund (a toy check of this prediction is sketched below). The fourth criterion, cumulative progress in knowledge, is also applicable in linguistics, as illustrated by the explanation for the use of the bare infinitive after the wh-word why. The upshot is that under certain conditions non-material reality can be the object of scientific enquiry.
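A toy illustration of such a conditional prediction (hypothetical mini-corpus and pattern, offered only to make "conditional prediction" concrete):

# Check the prediction that a verbal complement of "enjoy" surfaces as a gerund.
import re

corpus = [
    "I enjoy swimming in the lake.",
    "They enjoyed watching the game.",
    "She wants to leave early.",   # 'want' takes a to-infinitive instead
]

pattern = re.compile(r"\benjoy(?:s|ed|ing)?\s+(\w+ing)\b", re.IGNORECASE)
for sentence in corpus:
    match = pattern.search(sentence)
    if match:
        print(f"gerund complement {match.group(1)!r} in {sentence!r}")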
Kin Tung Michael Ho (PhD student, Imperial College London)
“Ethics of Quantum Computing: A Catholic Perspective”
Quantum computing (QC) has made remarkable advancements in recent years and has attracted significant investments due to its potential for computational speedup. However, its ethical challenges from a Catholic perspective remain largely unaddressed. This work explores how Catholics should approach this emerging technology by considering its usage intention, accessibility, and environmental impact.
First, users of QC should be mindful of their intentions. QC may threaten the security of the existing cryptographic protocols, leading to potential data-privacy violations. For example, Shor’s algorithm can exponentially accelerate factorization, potentially undermining current public-key encryption schemes. Therefore, stakeholders involved in QC should ensure the technology is used for the benefit of humanity and for the common good, safeguarding human dignity (CCC 1926).
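To make the threat concrete, here is a classical sketch of the number-theoretic core of Shor's algorithm (the quantum speedup lies entirely in the period-finding step, which is brute-forced below; this is an illustration, not an implementation of the quantum circuit):

from math import gcd

def find_period(a: int, N: int) -> int:
    """Smallest r > 0 with a**r % N == 1, found by brute force (slow)."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical(N: int, a: int):
    """Factor N using the order of a modulo N, as Shor's algorithm does."""
    if gcd(a, N) != 1:
        return gcd(a, N), N // gcd(a, N)   # lucky guess already shares a factor
    r = find_period(a, N)                  # the step a quantum computer speeds up
    if r % 2 == 1 or pow(a, r // 2, N) == N - 1:
        return None                        # this choice of a fails; try another
    p = gcd(pow(a, r // 2) - 1, N)
    return p, N // p

print(shor_classical(15, 7))  # (3, 5)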
Second, there is a risk of monopolization of QC. High costs and expertise requirements may grant greater access to QC to countries and companies with more financial resources, widening the global socioeconomic gap. This risk can be addressed through solidarity, which is ‘a direct demand of human and Christian brotherhood’ (CCC 1939-41). As such, international efforts should be made to distribute the benefits of QC equitably.
Lastly, the development of QC hardware can have negative environmental implications. The manufacturing of chips often involves the extraction of scarce materials, and the operation of QC may consume an extensive amount of energy. Future research should prioritize minimizing the technology's carbon footprint for the care of our common home as outlined by Pope Francis in his encyclical Laudato Si’.
Although QC has not reached maturity, its ethical concerns must not be overlooked. As with all scientific and technological advancements, QC should be harnessed ‘at the service of the human person, of his inalienable rights, of his true and integral good, in conformity with the plan and the will of God’ (CCC 2294).
Jeffrey W. Herrmann (Professor, Mechanical Engineering, Univ. of Maryland)
“Metareasoning for Robots”
Because an autonomous robot has limited onboard computational resources (due to size, weight, and power limits), maximizing the robot’s performance, efficiency, safety, and reliability requires optimizing the use of those resources. Metareasoning, a branch of artificial intelligence, enables a robot to monitor and control its perception, mapping, planning, and other reasoning processes in response to changes in the robot and its environment. Metareasoning is implemented in a meta-level that is logically separate from the object level (which performs the reasoning processes).
This poster describes the motivation for metareasoning, defines metareasoning, and presents an approach for synthesizing metareasoning policies. The synthesis approach requires defining the problem, characterizing the performance of the reasoning options, and specifying a metareasoning policy. The metareasoning policy selects a reasoning option based on the current values of the state variables.
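A minimal sketch of what such a policy could look like (state variables, thresholds, and planner names here are hypothetical, not taken from the poster):

from dataclasses import dataclass

@dataclass
class RobotState:
    cpu_load: float        # fraction of onboard compute in use
    battery: float         # fraction of charge remaining
    planner_stuck: bool    # object-level planner failed to make progress

def metareasoning_policy(s: RobotState) -> str:
    """Meta-level: select an object-level reasoning option from the state."""
    if s.planner_stuck:
        return "switch_planner"       # recover from a path-planning failure
    if s.cpu_load > 0.9 or s.battery < 0.2:
        return "greedy_planner"       # cheaper option to conserve resources
    return "optimal_planner"          # default high-quality option

print(metareasoning_policy(RobotState(cpu_load=0.95, battery=0.5, planner_stuck=False)))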
This poster will also highlight recent work on characterizing metareasoning approaches that can recover from a path planning failure (such as becoming stuck behind a barrier or obstacle). These approaches switch the robot’s path planning algorithms when a failure occurs. The results of our experiments with an autonomous ground vehicle show that adding metareasoning can increase the likelihood of mission success and reduce the time needed to complete its mission, which conserves resources. Ultimately, comprehensive metareasoning that can control the most important aspects of object level reasoning will enable an autonomous robot to deploy its limited computational resources more effectively and complete its mission more reliably.
John G.W. Kelley (Affiliate Research Professor, Institute for the Study of Earth, Oceans, and Space, The University of New Hampshire)
“Weather during the 50 Years of the March for Life in Washington, DC”
The March for Life is an annual outdoor rally and march to protest abortion and celebrate the right to life, held in Washington, DC, on or around the anniversary of the U.S. Supreme Court’s 1973 Roe v. Wade ruling. The first March for Life was organized by Nellie Gray in 1974, and the event has continued for the past 50 years. The March is considered the largest annual human-rights demonstration in the world. Organizers have estimated the number of marchers to range from tens of thousands to hundreds of thousands. People from across the U.S. and beyond travel via cars, buses, trains, and airplanes to attend the rally on the National Mall and the march. As for any outdoor event, weather is an important factor for both the organizers of the March and participants who travel hundreds of miles to attend. This is especially true because of the possibility of multiple types of severe and extreme weather in Washington at the time of year the March occurs. The 50th anniversary of the March in 2023 provided an opportunity to look back at the weather and its impact on the March since 1974.
Denis Larrivee (Visiting Scholar, Mind and Brain Institute, Univ. de Navarra Medical School)
“Insights from Cognitive Diseases on Agent Representation in Action Execution”
Impaired self-regulation features prominently in studies of schizophrenia - often characterized as a disease of agency - as well as other cognitive diseases. Because self-regulatory mechanisms entail not just top-down processes required for executing decisions, but also the neural representation of the self/agent - generally regarded as the source of decision-making capacity - impairments of the latter can be expected to weaken self-initiated action execution. The etiological basis of these diseases has remained elusive, however. Genomic studies have thus far failed to identify gene candidates that exert more than a marginal influence on behavioral symptoms, and pool sizes of risk alleles potentially run into the thousands, indicating that most if not all possible genetic players have been interrogated. The apparent lack of genetic specificity suggested by these studies implicates a higher-order, organizational impairment in which the individual risk-allele products are elements but not essential constituents of the higher-order organization. If so, this is likely to mean that the neural substrate impaired in these dysfunctions is chiefly determined by higher-order features of the nervous system and only non-specifically affected via genetic influences. Such a substrate could be embedded within the interactive properties of large cell clusters like those comprising neural circuits, or even the large-scale networks that are known to occur in cases of memory or motor behaviors. Given that impaired body representations and the inability to attribute actions to oneself are hallmarks of cognitive diseases like schizophrenia, several leading proposals have linked the sensorial representation of the body known as the peripersonal space (PPS) to the self/agent. As currently understood, the PPS is constructed through a process of multisensory integration originating from multiple sensory modalities distributed throughout the body. The failure of the PPS to explain impairments of goal-directed activity, however - seen in mirror-system defects - suggests that the relationship between the body and the agent in the performance of intentional actions is more profound than that implicated by a neural representation such as the PPS. This poster will consider the physical basis of this relationship through the lens of cognitive dysfunctions that impair goal-directed actions.
Kevin McGouldrick (Research Scientist, Univ. of Colorado at Boulder)
“Modeling Aerosols in the Clouds of Venus”
With very little undisputed information about the characteristics (i.e., composition, size, vertical structure) of the aerosols in the Venus atmosphere, we turn to the influence of microphysical properties on the cloud structure as a means of constraining the identities of the venusian aerosols. The microphysics of cloud formation is a highly coupled process, and one in which differences in the physical properties of one constituent – perhaps one that is not easily observed or measured – can lead to observable changes in another constituent. Here, we present results that build upon recently published work that demonstrates how changes in the coagulation efficiency of putative, photochemically produced, involatile aerosols can bring about large-scale changes in the total aerosol opacity in the clouds of Venus. Specifically, we show that a suppression of self-coagulation of the involatile constituent in the Venus Upper Clouds can lead to total cloud opacity variations on the order of 10 optical depths over time scales on the order of 100 days.
This work utilizes PlanetCARMA, a microphysical and radiative transfer model developed from the Community Aerosol and Radiation Model for Atmospheres (CARMA). The microphysics model treats transport by eddy diffusion, particle fall velocity, and vertical winds; it also treats particle growth by condensation (and evaporation) and coagulation/coalescence. Pending work utilizing this model includes an investigation into the sustainability of microbial life in the cloudy environments of early Earth and modern-day Venus.
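For orientation, the coagulation/coalescence growth that such microphysics models treat is governed by the Smoluchowski coagulation equation (quoted here in its standard continuous form; CARMA's discretized implementation is not reproduced):

\[ \frac{\partial n(v,t)}{\partial t} = \frac{1}{2}\int_0^{v} K(v-v',v')\,n(v-v',t)\,n(v',t)\,dv' \;-\; n(v,t)\int_0^{\infty} K(v,v')\,n(v',t)\,dv' \]

where n(v,t) is the number density of particles of volume v and K is the coagulation kernel. A coagulation efficiency enters as a multiplicative factor on K, which is why suppressing self-coagulation of one constituent can reshape the size distribution and hence the cloud opacity described above.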
David Poister (Professor of Chemistry, St. Norbert College)
“The Power of Catholicism in an Evolving World”
Catholicism and other organized religions can play a critical role in the continued improvement of the human condition. When viewed correctly, religious moral codes and practices can guide evolutionary processes toward increased peace and cooperation. Because much of this positive evolutionary change is occurring at the symbolic, behavioral, and epigenetic levels, maintaining the momentum of the evolutionary change requires continued reinforcement on a large scale, and organized religions are well-suited to provide this reinforcement.
In this presentation, scholarship from evolutionary biology, neuroscience, and history will be combined with the theological writings of Pierre Teilhard de Chardin and John Haught to develop a view of Catholicism as an evolutionary force containing specific mechanisms to foster human progress. The historical effectiveness of this transformative force will be evaluated. In addition, the evolutionary view will be explored as a tool for fostering spiritual growth, enhancing religious experience, and shaping change within religious institutions.
Joe Rice (Leader, Remote Sensing Group, National Institute of Standards and Technology (NIST))
“Let There Be Calibrated Light: Celestial Flux Standards”
The light from the Sun, Moon, and standard stars is used for flux calibration, but improved uncertainties are needed. Accurate measurements of the solar and lunar irradiance have applications in Earth climate monitoring. Daily satellite measurements of the exo-atmospheric solar spectrum since 2018 have reached uncertainties of 0.3 % in the visible-near-infrared. This improvement, coupled with the extremely stable lunar surface reflectance, enables the Moon to serve as a stable celestial flux standard for long-term satellite-based Earth-climate monitoring. After correction for the regular phase and libration cycles, lunar images captured by satellites have been used to correct for the response degradation of Earth-viewing satellites used for climate monitoring. NIST is working with collaborators to improve the absolute lunar irradiance uncertainty from several percent to below 1 %, to establish the Moon as an absolute flux reference standard in space.
The set of standard stars known to be stable enough for use in astronomical observatory flux calibration has been calibrated on scales with uncertainties at the several-percent level, is only weakly traceable to NIST flux scales, and is mostly based on models that disagree spectrally at greater-than-percent levels. However, future astrophysics, such as dark-energy studies based on supernova cosmology and exoplanet habitability research, requires flux calibration with uncertainty better than 0.5 % for ground telescopes such as the Vera C. Rubin Observatory and space telescopes such as JWST and the future Nancy Grace Roman Space Telescope. NIST laboratory flux standards provide uncertainty well below this level, so we are setting up an observatory in Chile to recalibrate the standard stars. We are also developing an artificial star to be launched into space, CANDLE (Calibration using an Artificial Star with NIST-traceable Distribution of Luminous Energy), which will provide an alternative for astronomical flux calibration.
Fr. Javier Sánchez-Cañizares (Professor, Mind-Brain Group, Institute for Culture and Society and CRYF, University of Navarra)
“Exploring the Role of the Maximum Entropy Production Principle in Understanding Causation and the Mind-Brain Problem”
The brain is a highly intricate and dynamic system that has yet to be fully understood. Unfortunately, there is currently no specific theory of brain dynamics that can accurately predict its emergent features. Researchers must instead rely on phenomenological approaches, which have limited scope and may not integrate coherently into a broader picture. This is largely due to the diverse range of physical scales that are relevant to the brain’s functioning. Despite these limitations, it is widely believed that brain dynamics comply with the Maximum Entropy Production Principle (MEPP). This principle is a leading candidate for governing non-equilibrium processes and has the potential to explain brain-dependent phenomena, such as cognition, as particularizations of the MEPP in nature.
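For orientation, the quantity at issue can be stated in the standard language of non-equilibrium thermodynamics (one common formulation among several in the MEPP literature):

\[ \sigma \;=\; \sum_i J_i X_i \;\ge\; 0 \]

where the J_i are thermodynamic fluxes and the X_i their conjugate forces; the MEPP conjectures that, among the steady states compatible with the imposed constraints, the system realizes the one that maximizes the entropy production rate σ.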
However, the MEPP’s status as a governing principle is controversial. While recent conjectures suggest that biological systems maximize entropy production over maximal spatial and temporal scales, the MEPP has mainly been applied to non-living systems. Additionally, the MEPP’s theoretical foundations and range of applicability require further assumptions, which can limit the rigor of its predictions and increase the potential for confusion in its interpretations. More interestingly, the MEPP’s epistemic status is uncertain. Some view the MEPP as a heuristic guide for Bayesian inference, while others believe that it goes beyond heuristics and selects the physical solution in multistable systems, ensuring its correspondence with reality.
To address these controversies, this contribution aims to provide a more fundamental philosophical perspective. First, it is essential to recognize that heuristics and ontology are not mutually exclusive; the former can help scientists and philosophers build the latter. Second, while the MEPP is primarily valued for its ability to inspire and spawn effective models of non-equilibrium steady states, its power may also derive from its ability to single out new forms of causation. Finally, the discussion on the MEPP's ontology is contingent on the metaphysical problem of determination in nature and on whether natural causation encompasses microphysically efficient causes together with formally informative ones. If the latter is true, new possibilities would emerge for understanding cognition and the mind-brain problem.
Mark R. Scafonas (Assistant Professor of Physics and Mathematics, Gwynedd Mercy University)
“Ignatian Pedagogy in the Introductory Physics Laboratory”
Ignatian pedagogy is a teaching process, derived from St. Ignatius of Loyola’s Spiritual Exercises, that integrates the Christian worldview into education and, as a result, is directed towards the good of the learner. The focus on the learner emphasizes the human-centeredness of education and guides the teacher away from approaching pedagogy and methodology in purely utilitarian terms. The Ignatian pedagogical model is a continuous cycle consisting of five steps: Context, Experience, Reflection, Action, and Evaluation. This paradigm is applied to the university-level General Physics laboratory, where existing pre-lab exercises were not adequately preparing students for experiments in the laboratory. Using this paradigm, as well as including video media and physics education technology in the pre-lab assignments, the goal is to provide the student with an affective and exploratory experience that guides the student towards developing a physical intuition prior to entering the lab and constructing a viable hypothesis through reflection. Preliminary feedback from a small group of students has been positive, with students preferring the redesigned pre-lab materials to the existing assignments based only on the written lab background and procedure.
Thomas P. Sheahen (Science Advisor, Institute for Theological Encounter with Science and Technology)
“Why is There Something Instead of Nothing?”
Scientists agree that questions of the form “Why …?” cannot be answered from within a science. Such answers derive from sources beyond the limits of that science. A long-enduring question among physicists is “Why is there something rather than nothing?” The question pre-dates Galileo. When we marvel at the stunning symmetry and beauty of a set of equations that govern a domain of physics, we are drawn to ask “Why?”
We learned from St. Augustine that God created Space and Time together; the entire coordinate system is a created entity (although people routinely take it for granted). It has also gone unnoticed that God created much more, notably the symmetry principles that underlie the laws of physics. In fact, mathematics, logic, thought and reasoning are all part of a package that God created -- a universe which makes sense, which is subject to rational study and investigation. There is such a thing as science.
The most elegant explanation we have so far says that, below the limits of observation, strings form, guided by symmetry principles. Subsequently those form quarks, leptons, baryons, nuclei, atoms, and molecules; along the way, their interactions are constrained by the laws of physics, which themselves come from symmetry principles. Our science points strongly toward the origin of all this being a transcendent Creator of supreme intelligence. Alternative hypotheses have been shown to be incoherent, self-contradictory and totally unsatisfactory.
Appreciating the story of evolution, we look back across enormous time and wonder “why?” Acknowledging the symmetry principles that underlie the laws of nature, and following the mathematics, scientists see that it all makes sense. But the series of steps points directly toward the astonishing brilliance of the Creator who gave us such an intelligible universe. We conclude that God had a specific goal in mind.
It is humbling to realize that in this way, an all-loving God could actually create an intelligent being that is capable of loving God in return. Perhaps this is why there is something instead of nothing.
Andrew Sicree (Adj. Prof. of Geology, Penn State University)
“Teaching the Connection of Science and Faith in Science Classes at Secular Universities”
Teaching science at secular universities presents a unique set of challenges to the faith of Catholic scientists. Some authors argue that the late paleontologist Stephen Jay Gould’s “NOMA” (non-overlapping magisteria) concept violates the concept of the unity of truth (cf. Lineweaver, C. H., “Increasingly Overlapping Magisteria of Science and Religion,” pp. 155-170 in Seckbach, J., Divine Action and Natural Selection: Science, Faith, and Evolution, World Scientific, 2009). But many Catholic professors, even while recognizing the unity of truth, feel that they cannot talk about their Christian faith on campus, nor bring anything “Catholic” or “Christian” into the science classroom. Materialism is accepted uncritically as necessary to the scientific enterprise.
Christianity in general, however, and Catholicism in particular have deep historical connections with sciences ranging from astronomy and physics to geology and biology. Studies of the lives of Catholic scientists support this thesis. If we accept that teaching students some parts of the biographies of important scientists is a valid and important facet of science education, then one cannot escape teaching about some aspects of the faith if one presents honest and full portraits of many important scientists. One need only mention Galileo, Copernicus, Mendel, Becquerel, Descartes, Ampère, Steno, Agricola, Marconi, Pasteur, or Lemaître to begin a discussion of the connections between science and men of faith.
Explication of some of the connections between faith and scientific concepts is thus not an invalid aspect of the teaching of science at the university level. It is possible to teach these interconnections in ways that are respectful of students’ differing (or non-existent) religious positions. Teaching these aspects of science is most relevant in general-education science courses, where it can constitute a small but not unimportant portion of the syllabus. I will offer some real examples of how Catholic professors have introduced Catholic concepts honestly in their classrooms and some discussion of the results of integrating the unity-of-truth approach.
Chris Stoughton (Senior Scientist, Fermi National Accelerator Laboratory)
“Measuring Magnetic Moments to Test the Standard Model of Particle Physics”
The “Standard Model of Particle Physics” is a physico-mathematical model of ordinary matter and accounts for wide-ranging, detailed measurements. Magnetic moments hold a special place in the history of testing models of fundamental particles. For protons and neutrons, anomalous magnetic moments were early clues that these nucleons are more than simple fundamental particles. The measurement of the magnetic moment of the electron is a triumph of experimental and theoretical techniques; it yields the most precise agreement in all physical science between model and measurement, to ten significant digits!
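For reference, the quantity in question is the anomalous magnetic moment (value quoted from the recent literature, not from the abstract):

\[ a_e \equiv \frac{g_e - 2}{2} \approx 1.159\,652\,180\,59(13) \times 10^{-3} \]

where the parenthesized digits give the experimental uncertainty; Standard Model calculations reproduce this value at the precision described above.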
Recent work on the magnetic moment of the muon continues this adventure. I review the history and current state of the muon g-2 experiment at Fermilab and our plan to complete the project, along with related experimental and theoretical projects. This instance illustrates how we can evaluate the trustworthiness of claims in the physical sciences.
Mark Temple-Raston (Co-Founder, CIO and Chief Data Scientist, Precision Alpha)
“Faith and Scientific Machine Learning”
Scientific machine learning (ML) integrates scientific reasoning (inference) with evidence (field data) by constraining Maximum Entropy to exactly what is observed. Equivalently, the adage “I believe what I measure” is manifest in scientific ML. Constraints with field data suggest that Maximum Entropy must generalize the “indifference principle” of statistics (the science of equilibrium), because when the field data behaves statistically, scientific machine learning must reduce to statistics. Using maximum entropy instead of indifference, scientific ML leads to the following generalizations:
• Statistics to probability theory,
• Closed system to open (energy can enter and exit the system),
• Equilibrium to non-equilibrium,
• Objective and subjective (unity with human involvement),
• Direct measurement of “emotions”.
The above is made intelligible by looking at an important special case of scientific ML that applies to many common real-world situations: the Science of Counting. The Science of Counting is exactly solvable, so that unique, closed-form solutions can be deduced and implemented as web services. We show how the Science of Counting web service demonstrates each of the points above.
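A toy illustration of the constrained-Maximum-Entropy idea for a binary counting process (hypothetical data and code, not Precision Alpha's actual web service): with no constraint, maximizing entropy returns the indifference value p = 1/2; constrained to the observed mean, it returns the empirical frequency, recovering statistics as the equilibrium special case.

import numpy as np
from scipy.optimize import minimize

data = np.array([1, 0, 1, 1, 0, 1, 1, 1, 0, 1])  # observed up(1)/down(0) counts
mean_obs = data.mean()

def neg_entropy(p):
    # Negative Shannon entropy of a Bernoulli(p) distribution.
    return p[0] * np.log(p[0]) + (1 - p[0]) * np.log(1 - p[0])

bounds = [(1e-9, 1 - 1e-9)]

# Unconstrained: maximum entropy gives the indifference principle, p = 1/2.
p_indifference = minimize(neg_entropy, x0=[0.3], bounds=bounds).x[0]

# Constrained to what is observed: maximum entropy reduces to the data's frequency.
p_maxent = minimize(neg_entropy, x0=[0.5], bounds=bounds,
                    constraints=[{"type": "eq",
                                  "fun": lambda p: p[0] - mean_obs}]).x[0]

print(p_indifference, p_maxent, mean_obs)  # ~0.5, ~0.7, 0.7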
Cristina Ventura (PhD student in Chemistry and Biochemistry, Seton Hall University)
“The Illuminative Photobuforin II”
Buforin II is a highly potent cationic antimicrobial peptide. Its proposed mechanism of action is translocation across the cell membrane and subsequent binding to DNA. The sequence of buforin II is identical to a portion of the core histone protein H2A. The central proline in buforin II creates a helix-hinge-helix motif that has been found to play an important role in its ability to translocate across the cell membrane. To study the structure-function relationship of this proline residue, we replaced P11 with a meta-substituted azobenzene amino acid (Z). The resultant peptide, Photobuforin II, retained the secondary structure and membrane activity of the naturally occurring peptide while gaining new spectroscopic properties. Photobuforin II can be isomerized from its trans to its cis isomer upon irradiation with ultraviolet (UV) light and from its cis to its trans isomer upon irradiation with visible light (VL). Photobuforin II is also fluorescent, with an emission peak at 390 nm. The intrinsic fluorescence of the peptide was used to determine binding to the membrane and to DNA. Photobuforin II provides insights into the importance of structure-function relationships in membrane-active peptides while also demonstrating that azobenzene can be used in certain peptide sequences to produce intrinsic fluorescence.
Amanda Waelde (Ph.D. student in Biological Sciences, Univ. of Notre Dame)
“Optimization of Tryp-N for Pyroglutamic Acid Avoidance and Design of a Novel Colorimetric Substrate for N-Terminal Proteases”
Authors: Amanda Waelde, Daniel Hu, Matthew M. Champion
Mycobacterium tuberculosis is a leading cause of death worldwide, responsible for about 1.2 million deaths from tuberculosis each year. Post-translational modifications, including N-terminal acetylation (NTA) of proteins, are a major means by which M. tuberculosis regulates its virulence. Our lab has developed a method for efficient quantification of NTA that can be used in M. tuberculosis research. The method uses trypsin, a common protease that cleaves C-terminal to lysine and arginine residues, to digest proteins into individual peptides. However, if the first amino acid residue after a trypsin cleavage site is glutamine or glutamic acid, the N-terminus exposed after tryptic digestion can cyclize into N-terminal pyroglutamic acid (pyro-Glu), yielding a contaminant in our NTA quantification method. The pyro-Glu can be removed, but the procedure is time-consuming and alters the amino acid sequence, complicating computation-based sequencing of the peptide. We are evaluating Tryp-N, an N-terminal protease that cleaves before arginine and lysine, yielding peptides beginning with one of those two residues. Since the N-terminal amino acid residue cannot be glutamine or glutamic acid, the cyclization of those residues into pyro-Glu and the resulting contamination of our quantification method will be eliminated. Digestion conditions for Tryp-N were previously unknown and are not readily investigable due to the lack of model substrates for N-terminal proteases. Most common proteases have colorimetric substrates that facilitate optimization of enzymology and determination of steady-state kinetic parameters. Here, we show optimization of Tryp-N using a dye-bound protein substrate and the design and synthesis of a novel colorimetric substrate specific for N-terminal proteases. Establishing digestion conditions for Tryp-N will enable us to demonstrate its ability to prevent pyro-Glu formation and to use it instead of trypsin in our quantification of NTA, preventing contamination, eliminating the need for a lengthy pyro-Glu removal process, and aiding research efforts into M. tuberculosis.
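A minimal in-silico sketch of the difference between the two digestions (hypothetical sequence; the not-before-proline exception for trypsin is ignored for simplicity):

import re

def digest_trypsin(seq: str):
    # Trypsin cleaves C-terminal to K/R: split *after* each K or R.
    return [p for p in re.split(r"(?<=[KR])", seq) if p]

def digest_tryp_n(seq: str):
    # Tryp-N cleaves N-terminal to K/R: split *before* each K or R.
    return [p for p in re.split(r"(?=[KR])", seq) if p]

protein = "MAKQETGFRQSVLKDNA"  # hypothetical sequence
print(digest_trypsin(protein))  # ['MAK', 'QETGFR', 'QSVLK', 'DNA']
print(digest_tryp_n(protein))   # ['MA', 'KQETGF', 'RQSVL', 'KDNA']

The tryptic peptides beginning with Q (or E) can cyclize to pyro-Glu, while every internal Tryp-N peptide begins with K or R, which is the basis of the avoidance strategy described above.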
Matthew P. Wiesner (Prof. of Physics, Benedictine University)
“Fides et Ratio at 25: Reflections from a Catholic Scientist”
Twenty-five years ago, on September 14, 1998, Pope St. John Paul II released his 13th encyclical, entitled Fides et Ratio, or Faith and Reason. Its focus is on the interface between faith and reason in the modern world. It begins with the famous words, “Faith and reason are like two wings on which the human spirit rises to the contemplation of truth…” This encyclical gives a blueprint for the life of a Catholic scientist, a life spent pursuing reason based on empirical observations, but reason conducted in harmony with a consistent philosophy and the theology of the Catholic Church. In this presentation I will give a brief history of this encyclical, summarize its main points and reflect on how to live a scientific life following the guide of Fides et Ratio. In one of the final paragraphs, St. John Paul II exhorts that, “Scientists are well aware that ‘the search for truth, even when it concerns a finite reality of the world or of man, is never-ending, but always points beyond to something higher than the immediate object of study, to the questions which give access to Mystery.’” I will consider how to continue this search for truth in nature while keeping an eye on the transcendent.
Geoffrey Woollard (Ph.D. student in Structural Biology, University of British Columbia)
“Discerning "Myth" and "the Sacred": how theology can help science and indigenous wisdom benefit from each other”
Theology, as queen of the sciences, can oblige science with the rationality (logos) it appropriately demands (cf. Giuseppe Tanzella-Nitti), while also welcoming wisdom alive in the teachings and stories (mythos) of indigenous cultures throughout the globe. Historically, the rise of scientific rationality in the West also brought a critique of myth and a desacralization of nature, in which theology played a decisive role (cf. Stanley Jaki). However, the French Dominican theologian Jean-Michel Maldamé has argued that myth survives triumphantly even in science, through language related to myth in physics, some narratives of biological evolution, and the return to or arrival at a utopia that is technologically saturated or void. In these areas, theology can mediate dialogue between science and indigenous wisdom and welcome both logos and mythos, which complement each other in a wider rationality. Secondly, the loss of the "sense of the sacred" has been experienced as an alienation by many indigenous cultures, for instance in the Americas and Oceania, that continue to struggle today (cf. Robin Wall Kimmerer). I argue that there were fumbling and tragic historical interactions between indigenous cultures and technocratic paradigms, in which questionable ideology and particular perspectives were unfairly disguised under a mirage of 'scientific progress' and universal comprehensiveness. Interestingly, a similar loss of the sense of the sacred has also challenged the (Western) Catholic intellectual tradition, as articulated by Louis Bouyer in Cosmos: The World and the Glory of God (1982). So both theology and indigenous wisdom share a certain solidarity in alienation, with a shared history of how science was philosophically digested. Thus, inroads to recovering the sense of the sacred from theology and from indigenous wisdom help each other, and reassure science that its unique gifts and perspective have an important role to play in human development and in living a well-examined life in accord with reason (cf. Anthony Rizzi), though as emissary and not master (cf. Iain McGilchrist).
Theology, as queen of the sciences, can oblige science with the rationality (logos) it appropriately demands (cf. Giuseppe Tanzella-Nitti), while also welcoming wisdom alive in the teachings and stories (mythos) of indigenous cultures throughout the globe. Historically, the rise of scientific rationality in the West also brought a critique of myth and desacralization of nature, where theology also played a decisive role (cf. Stanley Jaki). However, the French Dominican theologian Jean-Michel Maldamé has argued that myth survives triumphantly even in science, through language related to myth in physics, some narratives of biological evolution, and the return to or arrival at a utopia that is technologically saturated or void. In these areas, theology can mediate dialogue between science and indigenous wisdom and welcome both logos and mythos, which complement each other in a wider rationality. Secondly, the loss of the "sense of the sacred" has been experienced as an alienation by many indigenous cultures, for instance in the Americas and Oceania, that continue to struggle today (cf. Robin Wall Kimmerer). I argue that there were fumbling and tragic historical interactions between indigenous cultures and technocratic paradigms where questionable ideology and particular perspectives were unfairly disguised under a mirage of 'scientific progress' and universal comprehensiveness. Interestingly, a similar loss of the sense of the sacred has also challenged the (Western) Catholic intellectual tradition, as articulated by Louis Bouyer in Cosmos: The World and the Glory of God (1982). So both theology and indigenous wisdom have a certain solidarity in alienation, with a shared history of how science was philosophically digested. Thus, inroads to recovering the sense of the sacred from theology and from indigenous wisdom help each other, and reassure science that its unique gifts and perspective have an important role to play in human development and for a living a well examined life in accord with reason (cf. Anthony Rizzi), although as emissary and not master (cf. Iain McGilchrist).