
Our universe will freeze like a supercooled beer…


Finding the Higgs? Good news. Finding its mass? Not so good.

“Fireballs of doom” from a quantum phase change would wipe out present Universe.

by  – Feb 19 2013, 8:55pm HB

A collision in the LHC’s CMS detector.

Ohio State’s Christopher Hill joked he was showing scenes of an impending i-Product launch, and it was easy to believe him: young people were setting up mats in a hallway, ready to spend the night to secure a space in line for the big reveal. Except the date was July 3 and the location was CERN—where the discovery of the Higgs boson would be announced the next day.

It’s clear the LHC worked as intended and has definitively identified a Higgs-like particle. Hill put the chance of the ATLAS detector having registered a statistical fluke at less than 10^-11, and he noted that wasn’t even considering the data generated by its partner, the CMS detector. But is it really the one-and-only Higgs and, if so, what does that mean? Hill was part of a panel that discussed those questions at the meeting of the American Association for the Advancement of Science.

As theorist Joe Lykken of Fermilab pointed out, the answers matter. If current results hold up, they indicate the Universe is currently inhabiting what’s called a false quantum vacuum. If it were ever to reach the real one, its existing structures (including us) would go away in what Lykken called “fireballs of doom.”

We’ll look at the less depressing stuff first, shall we?

Zeroing in on the Higgs

Thanks to the Standard Model, we were able to make some very specific predictions about the Higgs. These include the frequency with which it will decay via different pathways: two gamma-rays, two Z bosons (which further decay to four muons), etc. We can also predict the frequency of similar looking events that would occur if there were no Higgs. We can then scan each of the decay pathways (called channels), looking for energies where there is an excess of events, or bump. Bumps have shown up in several channels in roughly the same place in both CMS and ATLAS, which is why we know there’s a new particle.
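As a toy illustration of that logic, here is a minimal Python sketch of a one-channel bump hunt: count the events in a mass window, compare with the background expected there, and ask how improbable the excess is. The event counts are invented purely for illustration and have nothing to do with the actual LHC data.

from scipy.stats import poisson, norm

# Toy bump hunt in one channel: the numbers are illustrative, not LHC data.
expected_background = 100.0   # events predicted in a mass window with no Higgs
observed = 140                # events actually counted in that window

# Local p-value: probability of seeing >= observed events from background alone.
p_local = poisson.sf(observed - 1, expected_background)
significance = norm.isf(p_local)  # convert the p-value to a one-sided Gaussian "sigma"

print(f"p-value = {p_local:.2e}, significance = {significance:.1f} sigma")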

But we still don’t know precisely what particle it is. The Standard Model Higgs should have a couple of properties: it should be scalar and should have a spin of zero. According to Hill, the new particle is almost certainly scalar; he showed a graph where the alternative, pseudoscalar, was nearly ruled out. Right now, spin is less clearly defined. It’s likely to be zero, but we haven’t yet ruled out a spin of two. So far, so Higgs-like.

The Higgs is the particle form of a quantum field that pervades our Universe (it’s a single quantum of the field), providing other particles with mass. In order to do that, its interactions with other particles vary—particles are heavier if they have stronger interactions with the Higgs. So, teams at CERN are sifting through the LHC data, checking for the strengths of these interactions. So far, with a few exceptions, the new particle is acting like the Higgs, although the error bars on these measurements are rather large.

As we said above, the Higgs is detected in a number of channels and each of them produces an independent estimate of its mass (along with an estimated error). As of the data Hill showed, not all of these estimates had converged on the same value, although they were all consistent within the given errors. These can also be combined mathematically for a single estimate, with each of the two detectors producing a value. So far, these overall estimates are quite close: CMS has the particle at 125.8 GeV, ATLAS at 125.2 GeV. Again, the error bars on these values overlap.
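For a sense of how two such values are combined, here is a minimal inverse-variance-weighting sketch in Python; the central values are the ones quoted above, but the error bars are assumed purely for illustration and are not the experiments’ numbers.

# Inverse-variance weighted combination of two Higgs mass estimates.
# Central values are from the text; the uncertainties are illustrative assumptions.
cms_mass, cms_err = 125.8, 0.5      # GeV (error bar assumed)
atlas_mass, atlas_err = 125.2, 0.6  # GeV (error bar assumed)

weights = [1 / cms_err**2, 1 / atlas_err**2]
combined = (cms_mass * weights[0] + atlas_mass * weights[1]) / sum(weights)
combined_err = sum(weights) ** -0.5

print(f"combined mass = {combined:.2f} +/- {combined_err:.2f} GeV")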

Oops, there goes the Universe

That specific mass may seem fairly trivial—if it were 130GeV, would you care? Lykken made the argument you probably should. But he took some time to build to that.

Lykken pointed out that, as the measurements mentioned above get more precise, we may find the Higgs isn’t decaying at precisely the rates we expect it to. This may be because we have some details of the Standard Model wrong. Or it could be a sign the Higgs is also decaying into some particles we don’t know about—particles that are dark matter candidates would be a prime choice. The behavior of the Higgs might also provide some indication of why there’s such a large excess of matter in the Universe.

But much of Lykken’s talk focused on the mass. As we mentioned above, the Higgs field pervades the entire Universe; the vacuum of space is filled with it. And, with a value for the Higgs mass, we can start looking into the properties of the Higgs field and thus the vacuum itself. “When we do this calculation,” Lykken said, “we get a nasty surprise.”

It turns out we’re not living in a stable vacuum. Eventually, the Universe will reach a point where the contents of the vacuum are the lowest energy possible, which means it will reach the most stable state possible. The mass of the Higgs tells us we’re not there yet, but are stuck in a metastable state at a somewhat higher energy. That means the Universe will be looking for an excuse to undergo a phase transition and enter the lower state.

What would that transition look like? In Lykken’s words, again, “fireballs of doom will form spontaneously and destroy the Universe.” Since the change would alter the very fabric of the Universe, anything embedded in that fabric—galaxies, planets, us—would be trashed during the transition. When an audience member asked “Are the fireballs of doom like ice-9?” Lykken replied, “They’re even worse than that.”

Lykken offered a couple of reasons for hope. He noted the outcome of these calculations is extremely sensitive to the values involved. Simply shifting the top quark’s mass by two percent, to a value that’s still within the error bars of most measurements, would make for a far more stable Universe.

And then there’s supersymmetry. The news for supersymmetry out of the LHC has generally been negative, as various models with low-mass particles have been ruled out by the existing data (we’ll have more on that shortly). But supersymmetry actually predicts five Higgs particles. (Lykken noted this by showing a slide with five different photos of Higgs taken at various points in his career, in which he was “differing in mass and other properties, as happens to all of us.”) So, when the LHC starts up at higher energies in a couple of years, we’ll actually be looking for additional, heavier versions of the Higgs.

If those are found, then the destruction of our Universe would be permanently put on hold. “If you don’t like that fate of the Universe,” Lykken said, “root for supersymmetry.”

Extrasolar planets, Kepler 62, and the local Fermi Paradox

As the number of extrasolar planets discovered grows, so do the constraints on the predictions of the galactic percolation model (the local Fermi Paradox).
The prediction is that, if we assume that Memetic Biospheres (cultural biospheres, or technospheres) are a likely outcome of Genetic Biospheres, then we should find ourselves in a region with few habitable planets. For if there were planets nearby inhabited by intelligent beings, they would very probably be far more advanced than we are and would already have colonized us.
Since this has not yet happened (unless one believes ufologists' conspiracy theories, the Jesus-as-ET theories, ancient astronauts, and so on), it follows that the more data astronomers gather, the more evident it will become that our solar system is an anomaly within our cosmic neighborhood (1,000 light-years?). In other words, we cannot assume the Copernican Principle for the solar system: our solar system is not typical of our neighborhood. Well, at least this conclusion agrees with the data collected so far…
Thus, one can predict that further analysis of the planets Kepler 62-e and Kepler 62-f will reveal that they lack an atmosphere containing oxygen or methane, the signatures of a planet with a biosphere.

Persistence solves Fermi Paradox but challenges SETI projects

Osame Kinouchi (DFM-FFCLRP-Usp)
(Submitted on 8 Dec 2001)

Persistence phenomena in colonization processes could explain the negative results of SETI searches while preserving the possibility of a galactic civilization. However, persistence phenomena also indicate that searching for technological civilizations among stars in the neighbourhood of the Sun is a misdirected SETI strategy. This last conclusion is also suggested by a weaker form of the Fermi paradox. A simple model of branching colonization, which includes the emergence, decay, and branching of civilizations, is proposed. The model could also be used in the context of ant nest diffusion.

03/05/2013 – 03h10

Possibility of life is not limited to Earth-like planets, study says

SALVADOR NOGUEIRA
CONTRIBUTING TO FOLHA

Given the range of compositions, masses, and orbits possible for planets outside the Solar System, life may not be limited to worlds similar to Earth in Earth-like orbits.


That is one of the conclusions presented by Sara Seager, of MIT (the Massachusetts Institute of Technology) in the US, in a review article published in the journal “Science”, based on a statistical analysis of the roughly 900 worlds already detected around more than 400 stars.

Seager highlights the possible existence of planets whose atmospheres would be dense enough to keep water liquid at the surface even at temperatures far lower than Earth's.

Lecture at the Instituto de Estudos Avançados (RP) on Science and Religion

 

Friday, November 9, 2012

Science and Religion: four perspectives

Written by 

Date and time: 26/11 at 2:30 pm
Location: Events Hall of the Centro de Informática de Ribeirão Preto – CIRP/USP (location)

The event, presented by Osame Kinouchi, will discuss four different views of the interaction between Science and Religion: conflict, separation, dialogue, and integration. Examining the sources of recent conflict (the Culture Wars), the professor suggests they originate in Anti-scientific Romanticism, whether religious or secular.

According to Osame, the idea of a separation between the Religious and Scientific domains no longer seems viable, given the advances of Science on topics once considered metaphysical, such as the origins of the Universe (Cosmology), of Life (Astrobiology), of the Mind (Neuroscience), and even of Religions themselves (Neurotheology, Evolutionary Psychology, and Religious Studies).
The talk will also show that attempts at forced or premature integration of Religion and Science risk drifting into Pseudoscience. In the professor's view, then, a more academic stance of high-level dialogue can be an antidote to a naive cultural polarization between Atheism and Religiosity.

Video of the event

Cosmological Artificial Selection: first references

I had the same idea in 1995 but never published it. Last Friday I found, in an abandoned folder, the notes that are digitized here. Through a slip of memory, I had confused Lee Smolin (in English, and more complete here) with Sidney Coleman.

Meduso-anthropic principle

The meduso-anthropic principle is a quasi-organic universe theory originally proposed by mathematician and quantum gravity scholar Louis Crane in 1994.


Universes and black holes as potential life cycle partners

Crane’s MAP is a variant of the hypothesis of cosmological natural selection (fecund universes), originally proposed by cosmologist Lee Smolin (1992). It is perhaps the first published hypothesis of cosmological natural selection with intelligence (CNS-I), in which intelligence plays some proposed functional role in universe reproduction. It is also an interpretation of the anthropic principle (fine-tuning problem). The MAP suggests the development and life cycle of the universe is similar to that of corals and jellyfish, in which the dynamic medusae are analogs for universal intelligence, in co-evolution and co-development with sessile polyp generations, which are analogs for both black holes and universes. In the proposed life cycle, the Universe develops intelligent life, and intelligent life produces new baby universes. Crane further speculates that our universe may also exist as a black hole in a parallel universe, and that extraterrestrial life there may have created that black hole.

Crane’s work was published in 1994 as a preprint on arXiv.org. In 1995, in an article in QJRAS, emeritus cosmologist Edward Harrison (1919-2007) independently proposed that the purpose of intelligent life is to produce successor universes, in a process driven by natural selection at the universal scale. Harrison’s work was apparently the first CNS-I hypothesis to be published in a peer-reviewed journal.

Why future civilizations might create black holes

Crane speculates that successful industrial civilizations will eventually create black holes, perhaps for scientific research, for energy production, or for waste disposal. After the hydrogen of the universe is exhausted civilizations may need to create black holes in order to survive and give their descendants the chance to survive. He proposes that Hawking radiation from very small, carefully engineered black holes would provide the energy enabling civilizations to continue living when other sources are exhausted.

Philosophical implications

According to Crane, Harrison, and other proponents of CNS-I, mind and matter are linked in an organic-like paradigm applied at the universe scale. Natural selection in living systems has given organisms the imperative to survive and reproduce, and directed their intelligence to that purpose. Crane’s MAP proposes a functional purpose for intelligence with respect to universe maintenance and reproduction. Universes of matter produce intelligence, and intelligent entities are ultimately driven to produce new universes.


The gods of Richard Dawkins

My personal theology is described in the Gifford lectures that I gave at Aberdeen in Scotland in 1985, published under the title, Infinite In All Directions. Here is a brief summary of my thinking. The universe shows evidence of the operations of mind on three levels. The first level is elementary physical processes, as we see them when we study atoms in the laboratory. The second level is our direct human experience of our own consciousness. The third level is the universe as a whole. Atoms in the laboratory are weird stuff, behaving like active agents rather than inert substances. They make unpredictable choices between alternative possibilities according to the laws of quantum mechanics. It appears that mind, as manifested by the capacity to make choices, is to some extent inherent in every atom. The universe as a whole is also weird, with laws of nature that make it hospitable to the growth of mind. I do not make any clear distinction between mind and God. God is what mind becomes when it has passed beyond the scale of our comprehension. God may be either a world-soul or a collection of world-souls. So I am thinking that atoms and humans and God may have minds that differ in degree but not in kind. We stand, in a manner of speaking, midway between the unpredictability of atoms and the unpredictability of God. Atoms are small pieces of our mental apparatus, and we are small pieces of God’s mental apparatus. Our minds may receive inputs equally from atoms and from God. This view of our place in the cosmos may not be true, but it is compatible with the active nature of atoms as revealed in the experiments of modern physics. I don’t say that this personal theology is supported or proved by scientific evidence. I only say that it is consistent with scientific evidence. – Freeman Dyson

It seems that Dawkins is heading toward a position similar to that of Gardner, Clément Vidal, and others in the Evo-Devo Universe community.

Human Gods

After two hours of conversation, Professor Dawkins walks far afield. He talks of the possibility that we might co-evolve with computers, a silicon destiny. And he’s intrigued by the playful, even soul-stirring writings of Freeman Dyson, the theoretical physicist.

In one essay, Professor Dyson casts millions of speculative years into the future. Our galaxy is dying and humans have evolved into something like bolts of superpowerful intelligent and moral energy.

Doesn’t that description sound an awful lot like God?

“Certainly,” Professor Dawkins replies. “It’s highly plausible that in the universe there are God-like creatures.”

He raises his hand, just in case a reader thinks he’s gone around a religious bend. “It’s very important to understand that these Gods came into being by an explicable scientific progression of incremental evolution.”

Could they be immortal? The professor shrugs.

“Probably not.” He smiles and adds, “But I wouldn’t want to be too dogmatic about that.”

The best popular-science book I have come across in forty years of reading

I will write my review later…

A REALIDADE OCULTA – Universos paralelos e as leis profundas do cosmo
Brian Greene

Half a century ago, scientists regarded with irony the possibility that other universes might exist beyond the one we inhabit. Such a hypothesis was nothing more than a delirium worthy of Alice in Wonderland – and one that, in any case, could never be confirmed experimentally. The challenges posed by the Theory of Relativity and by quantum physics for the understanding of our own universe were already complex enough to occupy generation after generation of researchers. Nevertheless, several mutually independent studies, conducted by scientists respected in their fields – string theory, quantum electrodynamics, information theory – began to converge on the same point: the existence of parallel universes – the multiverse – is not only probable, it has become the most plausible explanation for several cosmological puzzles.
In A realidade oculta, Brian Greene – one of the world's foremost specialists in cosmology and particle physics – lays out the fantastic development of multiverse physics over the last few decades. The author of O universo elegante reviews the different theories of parallel universes, starting from the foundations of relativity and quantum mechanics. In accessible language, and with the help of numerous explanatory figures, Greene guides the reader through the labyrinths of the deepest reality of matter and thought.

“If extraterrestrials showed up tomorrow and asked to learn the capabilities of the human mind, we could do nothing better than offer them a copy of this book.” – Timothy Ferris, New York Times Book Review

Determining whether we live inside the Matrix

The Measurement That Would Reveal The Universe As A Computer Simulation

If the cosmos is a numerical simulation, there ought to be clues in the spectrum of high energy cosmic rays, say theorists


THE PHYSICS ARXIV BLOG

Wednesday, October 10, 2012

One of modern physics’ most cherished ideas is quantum chromodynamics, the theory that describes the strong nuclear force, how it binds quarks and gluons into protons and neutrons, how these form nuclei that themselves interact. This is the universe at its most fundamental.

So an interesting pursuit is to simulate quantum chromodynamics on a computer to see what kind of complexity arises. The promise is that simulating physics on such a fundamental level is more or less equivalent to simulating the universe itself.

There are one or two challenges of course. The physics is mind-bogglingly complex and operates on a vanishingly small scale. So even using the world’s most powerful supercomputers, physicists have only managed to simulate tiny corners of the cosmos just a few femtometers across. (A femtometer is 10^-15 metres.)

That may not sound like much but the significant point is that the simulation is essentially indistinguishable from the real thing (at least as far as we understand it).

It’s not hard to imagine that Moore’s Law-type progress will allow physicists to simulate significantly larger regions of space. A region just a few micrometres across could encapsulate the entire workings of a human cell.

Again, the behaviour of this human cell would be indistinguishable from the real thing.

It’s this kind of thinking that forces physicists to consider the possibility that our entire cosmos could be running on a vastly powerful computer. If so, is there any way we could ever know?

Today, we get an answer of sorts from Silas Beane, at the University of Bonn in Germany, and a few pals.  They say there is a way to see evidence that we are being simulated, at least in certain scenarios.

First, some background. The problem with all simulations is that the laws of physics, which appear continuous, have to be superimposed onto a discrete three dimensional lattice which advances in steps of time.

The question that Beane and co ask is whether the lattice spacing imposes any kind of limitation on the physical processes we see in the universe. They examine, in particular, high energy processes, which probe smaller regions of space as they get more energetic.

What they find is interesting. They say that the lattice spacing imposes a fundamental limit on the energy that particles can have. That’s because nothing can exist that is smaller than the lattice itself.

So if our cosmos is merely a simulation, there ought to be a cut off in the spectrum of high energy particles.

It turns out there is exactly this kind of cut off in the energy of cosmic ray particles,  a limit known as the Greisen–Zatsepin–Kuzmin or GZK cut off.

This cut-off has been well studied and comes about because high energy particles interact with the cosmic microwave background and so lose energy as they travel  long distances.

But Beane and co calculate that the lattice spacing imposes some additional features on the spectrum. “The most striking feature…is that the angular distribution of the highest energy components would exhibit cubic symmetry in the rest frame of the lattice, deviating significantly from isotropy,” they say.

In other words, the cosmic rays would travel preferentially along the axes of the lattice, so we wouldn’t see them equally in all directions.

That’s a measurement we could do now with current technology. Finding the effect would be equivalent to being able to ‘see’ the orientation of the lattice on which our universe is simulated.

That’s cool, mind-blowing even. But the calculations by Beane and co are not without some important caveats. One problem is that the computer lattice may be constructed in an entirely different way to the one envisaged by these guys.

Another is that this effect is only measurable if the lattice cut off is the same as the GZK cut off. This occurs when the lattice spacing is about 10^-12 femtometers. If the spacing is significantly smaller than that, we’ll see nothing.
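As a rough check on that number: treating the cutoff energy simply as E ~ ħc/a (a simplification of the lattice argument, good for the order of magnitude), the GZK energy of roughly 5 × 10^19 eV corresponds to a spacing of a few times 10^-12 femtometres, consistent with the figure above.

import math

# Order-of-magnitude check: lattice spacing corresponding to the GZK cutoff,
# treating the cutoff energy as E ~ hbar*c / a (constants in SI units).
hbar = 1.054571817e-34   # J*s
c = 2.99792458e8         # m/s
eV = 1.602176634e-19     # J
E_gzk = 5e19 * eV        # GZK cutoff, ~5 x 10^19 eV

a = hbar * c / E_gzk     # metres
print(f"lattice spacing ~ {a:.1e} m = {a / 1e-15:.1e} femtometres")
# ~4e-27 m, i.e. a few times 10^-12 fm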

Nevertheless, it’s surely worth looking for, if only to rule out the possibility that we’re part of a simulation of this particular kind but secretly in the hope that we’ll find good evidence of our robotic overlords once and for all.

Ref: arxiv.org/abs/1210.1847: Constraints on the Universe as a Numerical Simulation

A mathematical proof that the Universe had a beginning?

Mathematics of Eternity Prove The Universe Must Have Had A Beginning — Part II

Heavyweight cosmologists are battling it out over whether the universe had a beginning. And despite appearances, they may actually agree


THE PHYSICS ARXIV BLOG

Friday, April 27, 2012

Earlier this week, Audrey Mithani and Alexander Vilenkin at Tufts University in Massachusetts argued that the mathematical properties of eternity prove that the universe must have had a beginning.

Today, another heavyweight from the world of cosmology weighs in with an additional argument. Leonard Susskind at Stanford University in California, says that even if the universe had a beginning, it can be thought of as eternal for all practical purposes.

Susskind is good enough to give a semi-popular version of his argument:

“To make the point simply, imagine Hilbertville, a one-dimensional semi-infinite city, whose border is at x = 0. The population is infinite and uniformly fills the positive axis x > 0. Each citizen has an identical telescope with a finite power. Each wants to know if there is a boundary to the city. It is obvious that only a finite number of citizens can see the boundary at x = 0. For the infinite majority the city might just as well extend to the infinite negative axis.

Thus, assuming he is typical, a citizen who has not yet studied the situation should bet with great confidence that he cannot detect a boundary. This conclusion is independent of the power of the telescopes as long as it is finite.”

He goes on to discuss various thermodynamic arguments that suggest the universe cannot have existed for ever. The bottom line is that the inevitable increase of entropy over time ensures that a past eternal universe ought to have long since lost any semblance of order. Since we can see order all around us, the universe cannot be eternal in the past.

He finishes with this: “We may conclude that there is a beginning, but in any kind of inflating cosmology the odds strongly (infinitely) favor the beginning to be so far in the past that it is effectively at minus infinity.”

Susskind is a big hitter: a founder of string theory and one of the most influential thinkers in this area. However, it’s hard to agree with his statement that this argument represents the opposing view to Mithani and Vilenkin’s.

His argument is equivalent to saying that the cosmos must have had a beginning even if it looks eternal in the past, which is rather similar to Mithani and Vilenkin’s view. The distinction that Susskind does make is that his focus is purely on the practical implications of this–although what he means by ‘practical’ isn’t clear.

That the universe did or did not have a beginning is profoundly important from a philosophical point of view, so much so that a definitive answer may well have practical implications for humanity.

But perhaps the real significance of this debate lies elsewhere. The need to disagree in the face of imminent agreement probably tells us more about the nature of cosmologists than about the cosmos itself.

Ref: arxiv.org/abs/1204.5385: Was There a Beginning?

One more step toward Cosmological Darwinism

Why Our Universe Must Have Been Born Inside a Black Hole


Posted: 12 Jul 2010 09:10 PM PDT

A small change to the theory of gravity implies that our universe inherited its arrow of time from the black hole in which it was born.

“Accordingly, our own Universe may be the interior of a black hole existing in another universe.” So concludes Nikodem Poplawski at Indiana University in a remarkable paper about the nature of space and the origin of time.

The idea that new universes can be created inside black holes and that our own may have originated in this way has been the raw fodder of science fiction for many years. But a proper scientific derivation of the notion has never emerged.

Today Poplawski provides such a derivation. He says the idea that black holes are the cosmic mothers of new universes is a natural consequence of a simple new assumption about the nature of spacetime.

Poplawski points out that the standard derivation of general relativity takes no account of the intrinsic momentum of spin half particles. However there is another version of the theory, called the Einstein-Cartan-Kibble-Sciama theory of gravity, which does.

This predicts that particles with half integer spin should interact, generating a tiny repulsive force called torsion. In ordinary circumstances, torsion is too small to have any effect. But when densities become much higher than those in nuclear matter, it becomes significant. In particular, says Poplawski, torsion prevents the formation of singularities inside a black hole.

That’s interesting for a number of reasons. First, it has important implications for the way the Universe must have grown when it was close to its minimum size.

Astrophysicists have long known that our universe is so big that it could not have reached its current size given the rate of expansion we see now. Instead, they believe it grew by many orders of magnitude in a fraction of a second after the Big Bang, a process known as inflation.

The problem with inflation is that it needs an additional theory to explain why it occurs and that’s ugly. Poplawski’s approach immediately solves this problem. He says that torsion caused this rapid inflation.

That means the universe as we see it today can be explained by a single theory of gravity without any additional assumptions about inflation.

Another important by-product of Poplawski’s approach is that it makes it possible for universes to be born inside the event horizons of certain kinds of black hole. Here, torsion prevents the formation of a singularity but allows a HUGE energy density to build up, which leads to the creation of particles on a massive scale via pair production followed by the expansion of the new universe.

This is a Big Bang type event. “Such an expansion is not visible for observers outside the black hole, for whom the horizon’s formation and all subsequent processes occur after infinite time,” says Poplawski.

For this reason, the new universe is a separate branch of space time and evolves accordingly.

Incidentally, this approach also suggests a solution to another of the great problems of cosmology: why time seems to flow in one direction but not in the other, even though the laws of physics are time symmetric.

Poplawski says the origin of the arrow of time comes from the asymmetry of the flow of matter into the black hole from the mother universe. “The arrow of cosmic time of a universe inside a black hole would then be fixed by the time-asymmetric collapse of matter through the event horizon,” he says.

In other words, our universe inherited its arrow of time from its mother.

He says that daughter universes may inherit other properties from their mothers, implying that it may be possible to detect these properties, providing an experimental proof of his idea.

Theories of everything don’t get much more ambitious than this. Entertaining stuff!

Ref: arxiv.org/abs/1007.0587: Cosmology With Torsion – An Alternative To Cosmic Inflation


Computational and Biological Analogies for Understanding Fine-Tuned Parameters in Physics

(Submitted on 20 Feb 2010)

In this philosophical paper, we explore computational and biological analogies to address the fine-tuning problem in cosmology. We first clarify what it means for physical constants or initial conditions to be fine-tuned. We review important distinctions such as the dimensionless and dimensional physical constants, and the classification of constants proposed by Levy-Leblond. Then we explore how two great analogies, computational and biological, can give new insights into our problem. This paper includes a preliminary study to examine the two analogies. Importantly, analogies are both useful and fundamental cognitive tools, but can also be misused or misinterpreted. The idea that our universe might be modelled as a computational entity is analysed, and we discuss the distinction between physical laws and initial conditions using algorithmic information theory. Smolin introduced the theory of “Cosmological Natural Selection” with a biological analogy in mind. We examine an extension of this analogy involving intelligent life. We discuss if and how this extension could be legitimated. 
Keywords: origin of the universe, fine-tuning, physical constants, initial conditions, computational universe, biological universe, role of intelligent life, cosmological natural selection, cosmological artificial selection, artificial cosmogenesis.

Comments: 25 pages, Foundations of Science, in press
Subjects: General Physics (physics.gen-ph)
Cite as: arXiv:1002.3905v1 [physics.gen-ph]

Milk Cosmology

Visualizing Cosmological Concepts Using the Analog of a Hot Liquid

E. Yusofi, M. Mohsenzadeh
(Submitted on 26 Jun 2010)

We have used the expansion process of hot milk, which has similarities with the cosmic expansion, to facilitate easier and better visualization and teaching of cosmological concepts. Observations of the milk are used to illustrate phenomena related to the Planck era, the standard hot big bang model, cosmic inflation, problems with the formation of structure, and other subjects. This innovative and easily implemented demonstration can enhance the learning of cosmological concepts.

Comments: 12 pages, 5 figures
Subjects: Popular Physics (physics.pop-ph); Cosmology and Extragalactic Astrophysics (astro-ph.CO)
Journal reference: AER (Astronomical Education Review), 2010
Cite as: arXiv:1006.5159v1 [physics.pop-ph]

How long will the Universe last?

From Wikipedia; click to enlarge.
Given that star formation ceases after 100 trillion years, the Universe is still a newborn baby: if those 100 trillion years are mapped onto the 100 years of a human life, the Universe today would be about 5 days old…
Unless the Big Rip happens, of course…
Interestingly, the theory of Strong Cosmological Darwinism (that technological civilizations create baby universes) predicts that the lifetime of the Universe maximizes the number of civilizations (other things being equal…). In other words, we should have w = -1; see below. Empirically, w does indeed seem to be equal to -1. Curious…
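The “5 days old” figure above is just a linear rescaling; a quick check, assuming an age of 13.8 billion years for the Universe:

# Rescale the Universe's age to a 100-year "human life" that ends when
# star formation ceases (10^14 years). Age of the Universe assumed: 13.8 Gyr.
age_universe_yr = 13.8e9
star_formation_end_yr = 1e14

equivalent_age_yr = 100 * age_universe_yr / star_formation_end_yr
print(f"equivalent age: {equivalent_age_yr * 365:.1f} days")  # ~5 days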

Big Rip

From Wikipedia, the free encyclopedia


The Big Rip is a cosmological hypothesis first published in 2003, about the ultimate fate of the universe, in which the matter of the universe, from stars and galaxies to atoms and subatomic particles, is progressively torn apart by the expansion of the universe at a certain time in the future. Theoretically, the scale factor of the universe becomes infinite at a finite time in the future.

The hypothesis relies crucially on the type of dark energy in the universe. The key value is the equation of state parameter w, the ratio between the dark energy pressure and its energy density. At w < −1, the universe will eventually be pulled apart. Such energy is called phantom energy, an extreme form of quintessence.

In a phantom-energy dominated universe, the universe expands at an ever-increasing rate. However, this implies that the size of the observable universe is continually shrinking; the distance to the edge of the observable universe, which is moving away at the speed of light from any point, gets ever closer. When the size of the observable universe is smaller than any particular structure, then no interaction between the farthest parts of the structure can occur, neither gravitational nor electromagnetic (nor weak or strong), and when they can no longer interact with each other in any way they will be “ripped apart”. The model implies that after a finite time there will be a final singularity, called the “Big Rip”, in which all distances diverge to infinite values.

The authors of this hypothesis, led by Robert Caldwell of Dartmouth College, calculate the time from now to the end of the universe as we know it for this form of energy to be

t_{\mathrm{rip}} - t_0 \approx \frac{2}{3\,|1+w|\,H_0\sqrt{1-\Omega_m}}

where w is a measure of the repulsive force of dark energy, H0 is Hubble’s constant and Ωm is the present-day value of the density of all the matter in the universe.

In their paper they consider an example with w = -1.5, H0 = 70 km/s/Mpc and Ωm = 0.3, in which case the end of the universe would come approximately 22 billion years from now. This is not considered a prediction, but a hypothetical example. The authors note that evidence indicates w is very close to -1 in our universe, which makes w the dominant term in the equation. The closer (1 + w) is to zero, the closer the denominator is to zero and the more distant (in time) the Big Rip. If w were exactly equal to -1, the Big Rip could not happen, regardless of the values of H0 or Ωm.

In their scenario for w = -1.5, galaxies would first be separated from each other. About 60 million years before the end, gravity would be too weak to hold the Milky Way and other individual galaxies together. Approximately three months before the end, the Solar System would become gravitationally unbound. In the last minutes, stars and planets would be torn apart, and an instant before the end, atoms would be destroyed.[1]


References

  1. ^ Caldwell, Robert R.; Kamionkowski, Marc; Weinberg, Nevin N. (2003). “Phantom Energy and Cosmic Doomsday”. Physical Review Letters 91: 071301. arXiv:astro-ph/0302506.
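Plugging the example numbers from the excerpt into the formula above reproduces the quoted figure; a minimal check in Python:

import math

# Time to the Big Rip for the example in the text: w = -1.5, H0 = 70 km/s/Mpc,
# Omega_m = 0.3, using t_rip - t0 = 2 / (3 |1+w| H0 sqrt(1 - Omega_m)).
w, H0_km_s_Mpc, omega_m = -1.5, 70.0, 0.3

Mpc_km = 3.0857e19                      # kilometres in a megaparsec
H0 = H0_km_s_Mpc / Mpc_km               # Hubble constant in 1/s
t_rip_s = 2.0 / (3.0 * abs(1 + w) * H0 * math.sqrt(1 - omega_m))

yr = 3.156e7                            # seconds in a year
print(f"t_rip - t0 = {t_rip_s / yr / 1e9:.1f} billion years")  # ~22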

Life in the Multiverse


The Drake Equation For The Multiverse

Posted: 09 Feb 2010 09:10 PM PST

The famous Drake equation estimates the number of intelligent civilisations in the Milky Way. Now a new approach asks how many might exist in the entire multiverse

In 1960, the astronomer Frank Drake devised an equation for estimating the number of intelligent civilisations in our galaxy. He did it by breaking down the problem into a hierarchy of various factors.

He suggested that the total number of intelligent civilisations in the Milky Way depends first on the rate of star formation. He culled this number by estimating the fraction of these stars with rocky planets, the fraction of those planets that can and do support life and the fraction of these that go on to support intelligent life capable of communicating with us. The result is this equation:

which is explained in more detail in this Wikipedia entry.
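The equation referred to above appeared as an image in the original post; it is the standard Drake equation,

N = R_{\ast} \cdot f_p \cdot n_e \cdot f_{\ell} \cdot f_i \cdot f_c \cdot L

where R_* is the rate of star formation in the galaxy, f_p the fraction of stars with planets, n_e the number of potentially habitable planets per such star, f_ℓ, f_i and f_c the fractions of those on which life, intelligence and communicating civilisations respectively arise, and L the average lifetime of such a civilisation.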

Today, Marcelo Gleiser at Dartmouth College in New Hampshire points out that cosmology has moved on since the 1960s. One of the most provocative new ideas is that the universe we see is one of many, possibly one of an infinite number. One line of thinking is that the laws of physics may be very different in these universes and that carbon-based life could only have arisen in those where conditions were fine-tuned in a particular way. This is the anthropic principle.

Consequently, says Gleiser, the Drake Equation needs updating to take the multiverse and the extra factors it introduces into account.

He begins by considering the total set of universes in the multiverse and defines the subset in which the parameters and fundamental constants are compatible with the anthropic principle. This is the subset {c-cosmo}.

He then considers the subset of these universes in which astrophysical conditions are ripe for star and galaxy formation {c-astro}. Next he looks at the subset of these in which planets form that are capable of harbouring life {c-life}. And finally he defines the subset of these in which complex life actually arises {c-complex life}.

Then the conditions for complex life to emerge in a particular universe in the multiverse must satisfy the statement at the top of this post (where the composition symbol denotes ‘together with’).

But there’s a problem: this is not an equation. To form a true Drake-like argument, Gleiser would need to assign probabilities to each of these sets allowing him to write an equation in which the assigned probabilities multiplied together, on one side of the equation, equal the fraction of universes where complex life emerges on the other side.
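Schematically, the Drake-like statement Gleiser would need is a product of conditional probabilities over these nested subsets, something like the sketch below (the notation is mine, not Gleiser's):

f_{\mathrm{complex\ life}} = P(c_{\mathrm{cosmo}}) \times P(c_{\mathrm{astro}} \mid c_{\mathrm{cosmo}}) \times P(c_{\mathrm{life}} \mid c_{\mathrm{astro}}) \times P(c_{\mathrm{complex\ life}} \mid c_{\mathrm{life}})

Each factor is precisely the kind of quantity the following paragraphs argue we cannot yet estimate.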

Here he comes up against one of the great problems of modern cosmology–that without evidence to back up their veracity, many ideas in modern cosmology are little more than philosophy. So assigning a probability to the fraction of universes in the multiverse in which the fundamental constants and laws satisfy the anthropic principle is not just hard, but almost impossible to formulate at all.

Take {c-cosmo} for example. Gleiser points out a few of the obvious parameters that would need to be taken into account in deriving a probability. These are the vacuum energy density, matter-antimatter asymmetry, dark matter density, the couplings of the four fundamental forces and the masses of quarks and leptons so that hadrons and then nuclei can form after electroweak symmetry breaking. Try assigning a probability to that lot.

Neither is it much easier for {c-astro}. This needs to take into account the fact that heavy elements seem to be important for the emergence of life which only seem to occur in galaxies above a certain mass and in stars of a certain type and age. Estimating the probability of these conditions occurring is still beyond astronomers.

At first glance, the third set {c-life} ought to be easier to handle. This must take into account the planetary and chemical constraints on the formation of life. The presence of liquid water and various elements such as carbon, oxygen and nitrogen seem to be important as do more complex molecules. How common these conditions are, we don’t yet know.

Finally there is {c-complex life}, which includes all the planetary factors that must coincide for complex life to emerge. These may include long term orbital stability, the presence of a magnetic field to protect delicate biomolecules, plate tectonics, a large moon and so on. That’s not so easy to estimate either.

Many people have tried to put the numbers into Drake’s equation. The estimates for the number of intelligent civilisations in the Milky Way ranges from one (ours) to countless tens of thousands. Drake himself put the number at 10.

Gleiser’s take on the Drake equation for the Multiverse is an interesting approach. What it tells us, however, is that our limited understanding of the universe today does not allow us to make any reasonable estimate of the number of intelligent lifeforms in the multiverse (more than one). And given the limits on what we can ever know about other universes, it’s likely that we’ll never be able to do much better than that.

Ref: arxiv.org/abs/1002.1651: Drake Equation For the Multiverse: From String Landscape to Complex Life

A revolution in the theory of gravitation?

Gravity as an entropic force

From Wikipedia, the free encyclopedia

Verlinde’s statistical description of gravity as an entropic force leads to the correct inverse square distance law of attraction between classical bodies.

The hypothesis of gravity being an entropic force has a history that goes back to research on black hole thermodynamics by Bekenstein and Hawking in the mid-1970s. These studies suggest a deep connection between gravity and thermodynamics. In 1995 Jacobson demonstrated that the Einstein equations describing relativistic gravitation can be derived by combining general thermodynamic considerations with the equivalence principle.[1] Subsequently, other physicists have further explored the link between gravity and entropy.[2]

In 2009, Erik Verlinde disclosed a conceptual theory that describes gravity as an entropic force.[3] This theory combines the thermodynamic approach to gravity with Gerardus ‘t Hooft‘s holographic principle. If proven correct, gravity is not a fundamental interaction, but an emergent phenomenon which arises from the statistical behaviour of microscopic degrees of freedom encoded on a holographic screen.[4]

Verlinde’s suggestion of gravity being an entropic phenomenon attracted considerable media[5][6] exposure, and led to immediate follow-up work in cosmology,[7][8] the dark energy hypothesis,[9] cosmological acceleration,[10][11] cosmological inflation,[12] and loop quantum gravity.[13] Also, a specific microscopic model has been proposed that indeed leads to entropic gravity emerging at large scales.[14]
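A compressed sketch of how the entropic argument recovers the inverse square law mentioned at the top of this excerpt, following the broad structure of Verlinde's paper [4] (the individual relations are quoted without derivation):

\Delta S = 2\pi k_B \frac{mc}{\hbar}\,\Delta x, \qquad F\,\Delta x = T\,\Delta S, \qquad N = \frac{4\pi R^2 c^3}{G\hbar}, \qquad E = \frac{1}{2} N k_B T = M c^2

Eliminating T between the last two relations gives T = \frac{G\hbar M}{2\pi k_B c R^2}, and substituting into F\,\Delta x = T\,\Delta S yields F = \frac{GMm}{R^2}, the Newtonian result.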


References

  1. ^ Thermodynamics of Spacetime: The Einstein Equation of State, Ted Jacobson, 1995
  2. ^ Thermodynamical Aspects of Gravity: New insights, Thanu Padmanabhan, 2009
  3. ^ http://www.volkskrant.nl/wetenschap/article1326775.ece/Is_Einstein_een_beetje_achterhaald Dutch newspaper ‘Volkskrant‘, 9 December 2009
  4. ^ On the Origin of Gravity and the Laws of Newton, Erik Verlinde, 2010
  5. ^ The entropy force: a new direction for gravity, New Scientist, 20 January 2010, issue 2744
  6. ^ Gravity is an entropic form of holographic information, Wired Magazine, 20 January 2010
  7. ^ Equipartition of energy and the first law of thermodynamics at the apparent horizon, Fu-Wen Shu, Yungui Gong, 2010
  8. ^ Friedmann equations from entropic force, Rong-Gen Cai, Li-Ming Cao, Nobuyoshi Ohta, 2010
  9. ^ It from Bit: How to get rid of dark energy, Johannes Koelman, 2010
  10. ^ Entropic Accelerating Universe, Damien Easson, Paul Frampton, George Smoot, 2010
  11. ^ Entropic cosmology: a unified model of inflation and late-time acceleration, Yi-Fu Cai, Jie Liu, Hong Li, 2010
  12. ^ Towards a holographic description of inflation and generation of fluctuations from thermodynamics, Yi Wang, 2010
  13. ^ Newtonian gravity in loop quantum gravity, Lee Smolin, 2010
  14. ^ Notes concerning “On the origin of gravity and the laws of Newton” by E. Verlinde, Jarmo Makela, 2010


Entropy and Gravity

Gravity Emerges from Quantum Information, Say Physicists

Posted: 25 Mar 2010 09:10 PM PDT

The new role that quantum information plays in gravity sets the scene for a dramatic unification of ideas in physics

One of the hottest new ideas in physics is that gravity is an emergent phenomenon; that it somehow arises from the complex interaction of simpler things.

A few months ago, Erik Verlinde at the University of Amsterdam put forward one such idea which has taken the world of physics by storm. Verlinde suggested that gravity is merely a manifestation of entropy in the Universe. His idea is based on the second law of thermodynamics, that entropy always increases over time. It suggests that differences in entropy between parts of the Universe generate a force that redistributes matter in a way that maximises entropy. This is the force we call gravity.

What’s exciting about the approach is that it dramatically simplifies the theoretical scaffolding that supports modern physics. And while it has its limitations–for example, it generates Newton’s laws of gravity rather than Einstein’s–it has some advantages too, such as the ability to account for the magnitude of dark energy which conventional theories of gravity struggle with.

But perhaps the most powerful idea to emerge from Verlinde’s approach is that gravity is essentially a phenomenon of information.

Today, this idea gets a useful boost from Jae-Weon Lee at Jungwon University in South Korea and a couple of buddies. They use the idea of quantum information to derive a theory of gravity and they do it taking a slightly different tack to Verlinde.

At the heart of their idea is the tricky question of what happens to information when it enters a black hole. Physicists have puzzled over this for decades with little consensus. But one thing they agree on is Landauer’s principle: that erasing a bit of quantum information always increases the entropy of the Universe by a certain small amount and requires a specific amount of energy.
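That “specific amount of energy” is the Landauer bound of k_B T ln 2 per erased bit; at room temperature it is tiny but nonzero. A minimal check, assuming T = 300 K:

import math

# Landauer bound: minimum energy dissipated when one bit is erased, E = k_B * T * ln 2.
k_B = 1.380649e-23        # Boltzmann constant, J/K
T = 300.0                 # room temperature in kelvin (illustrative choice)

E_min = k_B * T * math.log(2)
print(f"Landauer bound at {T:.0f} K: {E_min:.2e} J per bit")  # ~2.9e-21 J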

Jae-Weon and co assume that this erasure process must occur at the black hole horizon. And if so, spacetime must organise itself in a way that maximises entropy at these horizons. In other words, it generates a gravity-like force.

That’s intriguing for several reasons. First, Jae-Weon and co assume the existence of spacetime and its geometry and simply ask what form it must take if information is being erased at horizons in this way.

It also relates gravity to quantum information for the first time. Over recent years many results in quantum mechanics have pointed to the increasingly important role that information appears to play in the Universe.

Some physicists are convinced that the properties of information do not come from the behaviour of information carriers such as photons and electrons but the other way round. They think that information itself is the ghostly bedrock on which our universe is built.

Gravity has always been a fly in this ointment. But the growing realisation that information plays a fundamental role here too, could open the way to the kind of unification between the quantum mechanics and relativity that physicists have dreamed of.

Ref: arxiv.org/abs/1001.5445: Gravity from Quantum Information

Gravity as an emergent entropic force?

How Duality Could Resolve Dark Matter Dilemma

Posted: 23 May 2010 09:10 PM PDT

Astrophysicists need to choose between dark matter or modified gravity to explain the Universe. But a strange new duality may mean they can have both



The debate over the wave or particle-like nature of light consumed physicists for 300 years after Isaac Newton championed particles and Christiaan Huygens backed the idea of waves. The resolution, that light can be thought of as both a wave and a particle, would have astounded these giants of physics, as indeed, it does us.


What shouldn’t surprise us, though, is that other seemingly intractable arguments might be similarly resolved.


But exactly this may be in store for the dark matter conundrum, which has puzzled astrophysicists for almost 80 years, according to Chiu Man Ho at Vanderbilt University in Nashville and a couple of buddies.


The problem is that galaxies rotate so fast that the matter they contain ought to fly off into space. Similarly, clusters of galaxies do not seem to contain enough mass to bind them together and so ought to fly apart. Since this manifestly doesn’t happen, some force must be holding these masses in place.


Astrophysicists have put forward two explanations. The first is that these galaxies are filled with unseen mass and this so-called dark matter provides the extra gravitational tug. The second is that gravity is stronger at these intergalactic scales and so does the job by itself, an idea called modified Newtonian dynamics or MOND.


There is no love lost between the dark matter proponents and their MONDian counterparts: both say the other is wrong and scour the Universe in search of evidence to damn their opponents. Neither side has convincingly crushed the other’s argument so far but all concerned seem to agree that when one triumphs, the other will be ground underfoot.


Perhaps there’s another possibility, however: that they’re both right.


What makes this possible is a new approach to gravity in which it is an emergent phenomenon related to entropy. We looked at this a few months ago here.


The basic idea is that parts of the Universe have different levels of entropy and this creates a force that redistributes matter in a way that maximises entropy. This force is what we call gravity.


So far, this approach has assumed a simple Universe. But cosmologists know that our Universe is not only expanding but accelerating away from us. What Chiu and co have done is derive gravity as an emergent force using the same entropic approach but this time in a Universe that is accelerating.


The result is a form of gravity in which the parameters for acceleration and mass share a strange kind of duality: either the acceleration term can be thought of as modified, as in MOND; or the mass term can be thought of as modified, as in the dark matter theory.


In effect, Chiu and co are saying that dark matter and MOND are two sides of the same coin.

Interestingly, the effect of each type of modification seems to be scale dependent. In this theory, the MONDian interpretation works at the galactic scale while the dark matter interpretation works best at the scale of galactic clusters.


That’s actually how the observational evidence pans out too. MOND seems to better explain the real behaviour of galaxies while the dark matter approach better explains the structure of galaxy clusters.


Could it be that both are manifestations of the same thing? Only the brave or foolish would rule it out. And stranger things have happened in physics, as Newton and Huygens would surely attest to.


Ref: arxiv.org/abs/1005.3537: MONDian Dark Matter

Boltzmann Brains

Boltzmann brains and the scale-factor cutoff measure of the multiverse

Andrea De Simone, Alan H. Guth, Andrei Linde, Mahdiyar Noorbala, Michael P. Salem, Alexander Vilenkin
(Submitted on 28 Aug 2008 (v1), last revised 10 May 2010 (this version, v2))

To make predictions for an eternally inflating “multiverse”, one must adopt a procedure for regulating its divergent spacetime volume. Recently, a new test of such spacetime measures has emerged: normal observers – who evolve in pocket universes cooling from hot big bang conditions – must not be vastly outnumbered by “Boltzmann brains” – freak observers that pop in and out of existence as a result of rare quantum fluctuations. If the Boltzmann brains prevail, then a randomly chosen observer would be overwhelmingly likely to be surrounded by an empty world, where all but vacuum energy has redshifted away, rather than the rich structure that we observe. Using the scale-factor cutoff measure, we calculate the ratio of Boltzmann brains to normal observers. We find the ratio to be finite, and give an expression for it in terms of Boltzmann brain nucleation rates and vacuum decay rates. We discuss the conditions that these rates must obey for the ratio to be acceptable, and we discuss estimates of the rates under a variety of assumptions.

Comments: 32 pp, 2 figs. The work has been significantly improved and extended. In discussing the Boltzmann Brain (BB) nucleation rate, we corrected the statement and the implications of the Bekenstein bound. Other additions include a toy model based on an ideal gas, discussions of BBs in Schwarzschild-de Sitter space and the stability of BBs against expansion, and the generalization of dominant vacua
Subjects: High Energy Physics – Theory (hep-th); Cosmology and Extragalactic Astrophysics (astro-ph.CO); General Relativity and Quantum Cosmology (gr-qc)
Report number: MIT-CTP-3975, SU-ITP-08/20
Cite as: arXiv:0808.3778v2 [hep-th]

More matter than antimatter

A New Clue to Explain Existence

By DENNIS OVERBYE, Times

Published: May 17, 2010


Physicists at the Fermi National Accelerator Laboratory are reporting that they have discovered a new clue that could help unravel one of the biggest mysteries of cosmology: why the universe is composed of matter and not its evil-twin opposite, antimatter. If confirmed, the finding portends fundamental discoveries at the new Large Hadron Collider outside Geneva, as well as a possible explanation for our own existence.

The results have now been posted on the Internet and submitted to the Physical Review.

Dark energy and Congress

How to Build a Dark Energy Detector

Posted: 25 Jan 2010 09:10 PM PST

All the evidence for dark energy comes from the observation of distant galaxies. Now physicists have worked out how to spot it in the lab

The notion of dark energy is peculiar, even by cosmological standards.

Cosmologists have foisted the idea upon us to explain the apparent accelerating expansion of the Universe. They say that this acceleration is caused by energy that fills space at a density of 10^-10 joules per cubic metre.

What’s strange about this idea is that as space expands, so too does the amount of energy. If you’ve spotted the flaw in this argument, you’re not alone. Forgetting the law of conservation of energy is no small oversight.

What we need is another way of studying dark energy, ideally in a lab on Earth. Today, Martin Perl at Stanford University and Holger Mueller down the road at the University of California, Berkeley, suggest just such an experiment.

The dark energy density might sound small but Perl and Mueller point out that physicists routinely measure fields with much smaller energy densities. For example an electric field of 1 Volt per metre has an energy density of 10^-12 joules per cubic metre. That’s easy to measure on Earth.
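Both densities quoted above are easy to check. A minimal sketch, assuming a dark-energy fraction of about 0.7 and H0 = 70 km/s/Mpc (standard round numbers, not values taken from Perl and Mueller):

import math

# Check the two energy densities quoted in the post.
G = 6.674e-11            # m^3 kg^-1 s^-2
c = 2.998e8              # m/s
eps0 = 8.854e-12         # vacuum permittivity, F/m
H0 = 70e3 / 3.0857e22    # Hubble constant, 70 km/s/Mpc in 1/s
omega_lambda = 0.7       # assumed dark-energy fraction

rho_crit = 3 * H0**2 / (8 * math.pi * G)          # critical mass density, kg/m^3
u_dark = omega_lambda * rho_crit * c**2           # dark-energy density, J/m^3
u_field = 0.5 * eps0 * 1.0**2                     # 1 V/m electric field, J/m^3

print(f"dark energy   ~ {u_dark:.1e} J/m^3")      # ~6e-10, order 10^-10
print(f"1 V/m E-field ~ {u_field:.1e} J/m^3")     # ~4e-12, order 10^-12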

Of course there are some important differences between an electric field and the dark energy field that make measurements tricky. Not least of these is that you can’t turn off dark energy. Another is that there is no known reference against which to measure it.

That leaves the possibility of a gradient in the dark energy field. If there is such a gradient, then it ought to be possible to measure its effect and the best way to do this is with atom interferometry, say Perl and Mueller.

Atom interferometry measures the phase change caused by the difference in two trajectories of an atom in space. So if a gradient in this field exists, it should be possible to spot it by cancelling out the effects of all other forces. Perl and Mueller suggest screening out electromagnetic forces with conventional shields and using two atom interferometers to cancel out the effect of gravitational forces.

That should allow measurements with unprecedented accuracy. Experiments with single atom interferometers have already measured the Earth’s gravitational pull to an accuracy of one part in 10^9. The double-interferometer technique should improve this to at least one part in 10^17.
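For a sense of where numbers like that come from: in a standard light-pulse atom interferometer, an acceleration a along the laser axis produces a phase shift Δφ = k_eff·a·T², with k_eff the effective wave number of the pulses and T the time between them. The values below are generic, illustrative choices (a rubidium-wavelength laser, 0.1 s between pulses, a milliradian phase readout), not parameters from Perl and Mueller’s paper:

```python
# Illustrative sensitivity estimate for a light-pulse (Mach-Zehnder) atom
# interferometer: delta_phi = k_eff * a * T^2. All numbers below are generic
# textbook-style choices, not Perl and Mueller's parameters.
import math

LAMBDA = 780e-9                      # m, Rb D2 line, a common choice
K_EFF = 2 * (2 * math.pi / LAMBDA)   # 1/m, two-photon effective wave number
T = 0.1                              # s, time between pulses
G = 9.81                             # m/s^2, local gravity

phi_gravity = K_EFF * G * T ** 2     # phase picked up from Earth's gravity
phi_resolution = 1e-3                # rad, an assumed phase readout resolution

# Smallest acceleration a single interferometer of this kind could resolve.
a_min = phi_resolution / (K_EFF * T ** 2)

print(f"gravity phase       : {phi_gravity:.2e} rad")
print(f"resolvable fraction : {a_min / G:.1e} of g")  # roughly the one part in 10^9 quoted
```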

That’s a very exciting experiment which looks to be within reach with today’s technology.

There are two potential flies in Perl and Mueller’s ointment. The first is that the nature of dark energy is entirely unknown. Even if it exists and has a gradient, it is by no means certain that dark energy will exert a force on atoms at all. That would leave them with the endless task of placing tighter and tighter limits on the size of a non-existent force.

The second is that some other unknown force will rear its head in this regime and swamp the measurements. If that happens, it’s hard to imagine Perl and Mueller being too upset. That’s the kind of discovery that ought to put a smile on any physicist’s face.

Ref: arxiv.org/abs/1001.4061: Exploring The Possibility Of Detecting Dark Energy In A Terrestrial Experiment Using Atom Interferometry

To Understand Congress, Just Watch the Sandpile

Posted: 24 Jan 2010 09:10 PM PST

The behavior of Congress can be modeled by the same process that causes avalanches in sandpiles.

What does it take for a resolution in Congress to achieve sizeable support? It’s easy to imagine that the support of certain influential representatives is crucial because of their skill in the cut and thrust of political bargaining.

Not so, say Mikhail Simkin and Vwani Roychowdhury at the University of California, Los Angeles. It turns out that the way a particular resolution gains support can be accurately simulated by the avalanches that occur when grains of sand are dropped onto each other to form a pile.

Simkin and Roychowdhury begin their analysis with a study of resolution HR1207 and a plot of the number of co-sponsors it received against time early last year. This plot is known in mathematics as a Devil’s staircase: it consists of long periods without the addition of any new co-sponsors followed by jumps when many new co-sponsors join during a single day. “One might have suspected that the biggest steps of the staircase are due to joining of a highly influential congressman bringing with himself many new co-sponsors which he had influenced,” say Simkin and Roychowdhury.

That’s uncannily similar to the way in which avalanches proceed in a model of sandpiles developed by Per Bak, Chao Tang and Kurt Wiesenfeld in 1988. Perhaps Congress can be modeled in a similar way, reason Simkin and Roychowdhury.

Their model assumes that the role of sand grains is played by units of political pressure. They assume that there is a network of influence in Congress through which representatives exert political pressure on each other (just as sand grains exert forces on each other through the network of contacts between them in the pile). When the pressure on a representative reaches a threshold, they co-sponsor the resolution and this, in turn, puts pressure on other members of Congress to sign.

This is like the pressure that builds up in a sandpile as grains are dropped onto it. When a threshold is reached at a certain point on the pile, an avalanche occurs which redistributes the pressure to other places.

In addition, the representatives are pressured by their constituents, which is analogous to dropping grains of sand onto the pile at random.

There is a difference between sandpiles and Congress, however. Once a representative has signed, he or she cannot do it again and so takes no further part in the process. Any further pressure on them is simply dissipated. So representatives cannot topple more than once, unlike sand grains, which can keep on toppling as the pile gets bigger.
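That description is enough to sketch a toy version of the model. The snippet below is not Simkin and Roychowdhury’s code: the random influence network, the threshold, and the size of the pressure units are illustrative assumptions. Only the mechanics follow the description above: random external pressure from constituents, a signing threshold, one-shot toppling that passes pressure to neighbours, and dissipation of any pressure on someone who has already signed.

```python
# Toy version of the Congress-as-sandpile model described above. The network,
# threshold, and pressure units are made up for illustration; the rules follow
# the text: random external pressure, a co-sponsoring threshold, one-shot
# toppling that passes pressure to neighbours, and dissipation afterwards.
import random

def simulate(n_reps=435, n_neighbours=10, threshold=5, steps=3000, seed=1):
    random.seed(seed)
    influence = {i: random.sample(range(n_reps), n_neighbours) for i in range(n_reps)}
    pressure = [0] * n_reps
    signed = [False] * n_reps
    cosponsors = []                          # number of co-sponsors after each step

    for _ in range(steps):
        stack = [random.randrange(n_reps)]   # one grain from the constituents
        while stack:
            rep = stack.pop()
            if signed[rep]:
                continue                     # pressure on a signer just dissipates
            pressure[rep] += 1
            if pressure[rep] >= threshold:
                signed[rep] = True           # co-sponsors: topples exactly once
                stack.extend(influence[rep]) # and pressures those they influence
        cosponsors.append(sum(signed))
    return cosponsors

series = simulate()
print(series[::300])   # long flat stretches broken by jumps: a Devil's staircase
```

Restricting each representative to a single topple is the only place these rules depart from the classic Bak-Tang-Wiesenfeld sandpile.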

This is a pretty simple model but when Simkin and Roychowdhury ran it, they found that it generates a Devil’s staircase that is uncannily similar to the one generated by representatives for HR1207.

Perhaps the most interesting feature is that the model assumes that all representatives have equal influence. “In our model, big steps are a result of evolution of Congress to a sort of critical state, where any congressman can trigger an avalanche of co-sponsors,” say Simkin and Roychowdhury.

The pair suggest some interesting ways to follow up their work. They point out that not all resolutions in Congress get the same level of support. In their model, this is due to the amount of public pressure, i.e. the number of units of political pressure dropped onto the pile at random. If there is no outside pressure, the resolution will not get sizeable support in a reasonable amount of time.

“An obvious extension to the model is to introduce political pressure against the resolution,” they say, pointing out that an interesting case would be when the negative pressure exactly balances the positive. “It could explain the cases when a resolution quickly gains some support, which, however, never becomes overwhelming.”

So representatives are not as important as they might imagine. Perhaps the next stage should be to replace them with actual grains of sand; by Simkin and Roychowdhury’s reckoning, it wouldn’t make much difference.

Ref: arxiv.org/abs/1001.3732: Stochastic modeling of Congress

Exploring the entropic landscape of the multiverse


Deriving the Properties of the Universe

Posted: 17 Jan 2010 09:10 PM PST

The properties of the universe can be derived by thinking about the origin of complexity, says a new theory.


Physicists and cosmologists have long noted that the laws of physics seem remarkably well tuned to allow the existence of life, an idea known as the anthropic principle.

It is sometimes used to explain why the laws of physics are the way they are. Answer: because if they were different, we wouldn’t be here to see them.


To many people, that looks like a cop out. One problem is that this way of thinking is clearly biased towards a certain kind of carbon-based life that has evolved on a pale blue dot in an unremarkable corner of the cosmos. Surely there is a more objective way to explain the laws of physics.


Enter Raphael Bousso and Roni Harnik at the University of California, Berkeley and Stanford University respectively. They point out that the increase in entropy in any part of the Universe is a decent measure of the complexity that exists there. Perhaps the anthropic principle can be replaced with an entropic one?


Today, they outline their idea and it makes a fascinating read. By thinking about the way entropy increases, Bousso and Harnik derive the properties of an average Universe in which the complexity has risen to a level where observers would have evolved to witness it.


They make six predictions about such a Universe. They say “typical observers find themselves in a flat universe, at the onset of vacuum domination, surrounded by a recently produced bath of relativistic quanta. These quanta are neither very dilute nor condensed, and thus appear as a roughly thermal background.”


Sound familiar? It so happens that we live in a (seemingly) flat universe, not so long after it has become largely a vacuum, and we’re bathed in photons that form a thermal background. That’s the cosmic infrared background, emitted by galactic dust heated by starlight (not to be confused with the cosmic microwave background, which has a different origin).


That’s a remarkably accurate set of predictions from a very general principle. The question, of course, is how far you can run with a theory like this.


It certainly has the feel of a powerful idea. But, just like the anthropic principle, it also has the scent of circular reasoning about it: the universe is the way it is because if it were different, the complexity necessary to observe it wouldn’t be here to see it.


That may not be so hard to stomach, given the power of the new idea. Even a hardened physicist would have to accept that Bousso and Harnik have a remarkably elegant way of capturing the state of the universe.


Ref: arxiv.org/abs/1001.1155: The Entropic Landscape


The Entropic Landscape

(Submitted on 8 Jan 2010 (v1), last revised 14 Jan 2010 (this version, v2))

We initiate a quantitative exploration of the entire landscape. Predictions thus far have focused on subsets of landscape vacua that share most properties with our own. Using the entropic principle (the assumption that entropy production traces the formation of complex structures such as observers), we derive six predictions that apply to the whole landscape. Typical observers find themselves in a flat universe, at the onset of vacuum domination, surrounded by a recently produced bath of relativistic quanta. These quanta are neither very dilute nor condensed, and thus appear as a roughly thermal background. Their characteristic wavelength is of order the inverse fourth root of the vacuum energy. These predictions hold for completely arbitrary observers, in arbitrary vacua with potentially exotic particle physics and cosmology. They agree with observation: We live in a flat universe at the onset of vacuum domination, whose dominant entropy production process (the glow of galactic dust) has recently produced a radiation bath (the cosmic infrared background). This radiation is marginally dilute, relativistic, and has a wavelength of order 100 microns, as predicted.

Comments: 40 pages and 3 figures, references added
Subjects: High Energy Physics – Theory (hep-th); Cosmology and Extragalactic Astrophysics (astro-ph.CO); General Relativity and Quantum Cosmology (gr-qc); High Energy Physics – Phenomenology (hep-ph)
Cite as: arXiv:1001.1155v2 [hep-th]
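The “inverse fourth root of the vacuum energy” prediction from the abstract above is easy to check at the order-of-magnitude level. A rough sketch, taking the ~10^-10 J/m³ vacuum energy density quoted earlier in this post and converting it to a wavelength:

```python
# Rough check of the predicted wavelength, lambda ~ hbar*c / rho_vac^(1/4)
# (with rho_vac converted to an energy scale via factors of hbar*c).
# Uses the ~1e-10 J/m^3 vacuum energy density quoted earlier; order of
# magnitude only.
HBAR_C = 3.16e-26   # J*m
RHO_VAC = 1e-10     # J/m^3

energy_scale = (RHO_VAC * HBAR_C ** 3) ** 0.25   # J; rho_vac^(1/4) in natural units
wavelength = HBAR_C / energy_scale               # m

print(f"energy scale : {energy_scale:.1e} J (~{energy_scale / 1.602e-19 * 1e3:.1f} meV)")
print(f"wavelength   : {wavelength * 1e6:.0f} microns")   # comes out near 100 microns
```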

Analogies in Physics and Cosmology

Still mulling over analogies and metaphors…

Physical analogies are stronger than linguistic analogies and metaphors because they rest on a mathematical mapping, or equivalence, between the equations that govern two different systems. The simplest case is the analogy between the harmonic oscillator and the RLC circuit.
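For the record, the mapping in that simplest case is a term-by-term correspondence between the two equations of motion:

```latex
% Damped harmonic oscillator vs. series RLC circuit: the same equation with different symbols.
m\,\ddot{x} + b\,\dot{x} + k\,x = 0
\qquad\longleftrightarrow\qquad
L\,\ddot{q} + R\,\dot{q} + \frac{q}{C} = 0
% Dictionary: x <-> q (charge), m <-> L, b <-> R, k <-> 1/C,
% so the resonance frequency omega_0 = sqrt(k/m) maps to 1/sqrt(LC).
```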

Acoustic analogues of black holes, which could vindicate the mechanism of Hawking radiation, show the strength of physical analogies.

As a great mathematician once said (I don’t remember who; I need to search Google for it):

“Good mathematicians find analogies between concepts. Great mathematicians see analogies between analogies.”


Wednesday, June 10, 2009

Acoustic Black Hole Created in Bose-Einstein Condensate

The creation of an acoustic black hole leaves the way open for the discovery of Hawking radiation.

One of the many curious properties of Bose-Einstein Condensates (BECs) is that the flow of sound through them is governed by the same equations that describe how light is bent by a gravitational field. That sets up the possibility of all kinds of fun and games: in theory, physicists can reproduce with sound and BECs whatever wicked way gravity has with light.

Today, Oren Lahav and his mates at the Israel Institute of Technology, in Haifa, say that they’ve created the sonic equivalent of a black hole in a BEC. That’s some achievement, given that physicists have wondered about this possibility for some 30 years, and various groups with the ability to create BECs have been racing to create acoustic black holes.

The general idea is to set up a supersonic flow of atoms within the BEC. Sound waves moving against this flow can never gain any ground. So the boundary where the flow changes from subsonic to supersonic is an event horizon. Any sound waves (or phonons) created inside the event horizon can never escape because the flow there is supersonic. That’s the black hole.

Lahav and co set up a supersonic flow by creating a deep potential well in the middle of a BEC that attracts atoms. The atoms stream into it but cannot give up their energy when they arrive (they’re already in their lowest energy state), and so they stream across the well at supersonic speed.

The result is a region within the BEC in which the atoms move at supersonic speed. This is the black hole: any phonon unlucky enough to stray into this region cannot escape.
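To make the horizon condition concrete: in a BEC the local speed of sound is c = √(g·n/m), where n is the atomic density and g the contact interaction strength, so a sonic horizon sits wherever the flow speed v first exceeds c. In the sketch below the density and flow profiles are invented purely for illustration; only the Bogoliubov sound-speed formula and the v = c horizon condition are standard.

```python
# Toy 1D picture of a sonic horizon in a BEC. Local sound speed is
# c(x) = sqrt(g * n(x) / m) (Bogoliubov); the horizon sits where the flow
# first turns supersonic, v(x) >= c(x). The density and flow profiles are
# invented for illustration only.
import math

HBAR = 1.0546e-34
M_ATOM = 1.443e-25                  # kg, 87Rb, a typical BEC atom
A_SCATT = 5.3e-9                    # m, s-wave scattering length (illustrative)
G_INT = 4 * math.pi * HBAR ** 2 * A_SCATT / M_ATOM   # contact interaction strength

def sound_speed(n):
    """Bogoliubov speed of sound for local density n (atoms per m^3)."""
    return math.sqrt(G_INT * n / M_ATOM)

def find_horizon(xs, densities, flows):
    """First position where the flow turns supersonic, or None."""
    for x, n, v in zip(xs, densities, flows):
        if v >= sound_speed(n):
            return x
    return None

# Made-up profiles: density drops and the flow accelerates toward the well.
xs = [i * 1e-6 for i in range(50)]                          # 0 to 50 microns
densities = [1e20 * (1.0 - 0.6 * x / 50e-6) for x in xs]    # atoms per m^3
flows = [0.2e-3 + 6e-3 * x / 50e-6 for x in xs]             # m/s

print(f"sound speed at x=0 : {sound_speed(densities[0]) * 1e3:.1f} mm/s")
print(f"horizon at x       : {find_horizon(xs, densities, flows)} m")
```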

One reason why sonic black holes are so highly prized is that they ought to produce Hawking radiation. Quantum mechanics predicts that pairs of “virtual” phonons with equal and opposite momentum ought to be constantly springing in and out of existence in BECs.

If one of this pair were to cross the event horizon, it would be sucked into the black hole, never to escape. The other, however, would be free to go on its way. This stream of escapees would be the famous, but as yet unobserved, Hawking radiation.

Lahav and his buddies haven’t gotten this far yet, but they’ve made an important step toward observing Hawking radiation and clearly have their eyes on this goal.

There’s no shortage of competition here, and the creation of the first sonic black hole is sure to spur the race. Expect to see somebody claim the first observation of Hawking radiation soon.

Ref: arxiv.org/abs/0906.1337: A Sonic Black Hole in a Density-Inverted Bose-Einstein Condensate