
Our universe will freeze like a supercooled beer…


Finding the Higgs? Good news. Finding its mass? Not so good.

“Fireballs of doom” from a quantum phase change would wipe out present Universe.

by  – Feb 19 2013, 8:55pm HB

A collision in the LHC’s CMS detector.

Ohio State’s Christopher Hill joked he was showing scenes of an impending i-Product launch, and it was easy to believe him: young people were setting up mats in a hallway, ready to spend the night to secure a space in line for the big reveal. Except the date was July 3 and the location was CERN—where the discovery of the Higgs boson would be announced the next day.

It’s clear the LHC worked as intended and has definitively identified a Higgs-like particle. Hill put the chance of the ATLAS detector having registered a statistical fluke at less than 10^-11, and he noted that wasn’t even considering the data generated by its partner, the CMS detector. But is it really the one-and-only Higgs and, if so, what does that mean? Hill was part of a panel that discussed those questions at the meeting of the American Association for the Advancement of Science.

As theorist Joe Lykken of Fermilab pointed out, the answers matter. If current results hold up, they indicate the Universe is currently inhabiting what’s called a false quantum vacuum. If it were ever to reach the real one, its existing structures (including us) would go away in what Lykken called “fireballs of doom.”

We’ll look at the less depressing stuff first, shall we?

Zeroing in on the Higgs

Thanks to the Standard Model, we were able to make some very specific predictions about the Higgs. These include the frequency with which it will decay via different pathways: two gamma-rays, two Z bosons (which further decay to four muons), etc. We can also predict the frequency of similar looking events that would occur if there were no Higgs. We can then scan each of the decay pathways (called channels), looking for energies where there is an excess of events, or bump. Bumps have shown up in several channels in roughly the same place in both CMS and ATLAS, which is why we know there’s a new particle.
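
To make the bump hunt concrete, here is a minimal sketch of how an excess in a single mass window of one channel might be scored against the no-Higgs expectation. The window, the expected background of 50 events, and the observed count of 85 are hypothetical numbers of mine; the real analyses use full likelihood fits across many bins and channels.

```python
from scipy.stats import poisson

def fluke_probability(observed: int, expected_background: float) -> float:
    """Chance that background alone produces at least `observed` events in the window."""
    # P(N >= observed) for a Poisson-distributed background with the given mean
    return poisson.sf(observed - 1, expected_background)

# Hypothetical counts for one mass window in one decay channel
print(fluke_probability(observed=85, expected_background=50.0))  # a tiny p-value: a clear bump
```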

But we still don’t know precisely what particle it is. The Standard Model Higgs should have a couple of properties: it should be scalar and should have a spin of zero. According to Hill, the new particle is almost certainly scalar; he showed a graph where the alternative, pseudoscalar, was nearly ruled out. Right now, spin is less clearly defined. It’s likely to be zero, but we haven’t yet ruled out a spin of two. So far, so Higgs-like.

The Higgs is the particle form of a quantum field that pervades our Universe (it’s a single quantum of the field), providing other particles with mass. In order to do that, its interactions with other particles vary—particles are heavier if they have stronger interactions with the Higgs. So, teams at CERN are sifting through the LHC data, checking for the strengths of these interactions. So far, with a few exceptions, the new particle is acting like the Higgs, although the error bars on these measurements are rather large.

As we said above, the Higgs is detected in a number of channels and each of them produces an independent estimate of its mass (along with an estimated error). As of the data Hill showed, not all of these estimates had converged on the same value, although they were all consistent within the given errors. These can also be combined mathematically for a single estimate, with each of the two detectors producing a value. So far, these overall estimates are quite close: CMS has the particle at 125.8 GeV, ATLAS at 125.2 GeV. Again, the error bars on these values overlap.
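
As a rough illustration of how such independent estimates are combined, here is a minimal inverse-variance weighting sketch. The 125.8 and 125.2 GeV central values come from the article; the ±0.5 GeV uncertainties are invented stand-ins, since the real error bars are larger and asymmetric.

```python
def combine(measurements):
    """Inverse-variance weighted mean and uncertainty of independent Gaussian estimates."""
    weights = [1.0 / sigma**2 for _, sigma in measurements]
    mean = sum(w * v for w, (v, _) in zip(weights, measurements)) / sum(weights)
    sigma = (1.0 / sum(weights)) ** 0.5
    return mean, sigma

# Central values from the article; uncertainties are illustrative only.
print(combine([(125.8, 0.5), (125.2, 0.5)]))  # -> (125.5, ~0.35)
```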

Oops, there goes the Universe

That specific mass may seem fairly trivial—if it were 130 GeV, would you care? Lykken made the argument that you probably should. But he took some time to build to that.

Lykken pointed out that, as the measurements mentioned above get more precise, we may find the Higgs isn’t decaying at precisely the rates we expect it to. This may be because we have some details of the Standard Model wrong. Or, it could be a sign the Higgs is also decaying into some particles we don’t know about—particles that are dark matter candidates would be a prime choice. The behavior of the Higgs might also provide some indication of why there’s such a large excess of matter in the Universe.

But much of Lykken’s talk focused on the mass. As we mentioned above, the Higgs field pervades the entire Universe; the vacuum of space is filled with it. And, with a value for the Higgs mass, we can start looking into the properties of the Higgs field and thus the vacuum itself. “When we do this calculation,” Lykken said, “we get a nasty surprise.”

It turns out we’re not living in a stable vacuum. Eventually, the Universe will reach a point where the contents of the vacuum are the lowest energy possible, which means it will reach the most stable state possible. The mass of the Higgs tells us we’re not there yet, but are stuck in a metastable state at a somewhat higher energy. That means the Universe will be looking for an excuse to undergo a phase transition and enter the lower state.

What would that transition look like? In Lykken’s words, again, “fireballs of doom will form spontaneously and destroy the Universe.” Since the change would alter the very fabric of the Universe, anything embedded in that fabric—galaxies, planets, us—would be trashed during the transition. When an audience member asked “Are the fireballs of doom like ice-9?” Lykken replied, “They’re even worse than that.”

Lykken offered a couple of reasons for hope. He noted the outcome of these calculations is extremely sensitive to the values involved. Simply shifting the top quark’s mass by two percent, to a value that’s still within the error bars of most measurements, would make for a far more stable Universe.

And then there’s supersymmetry. The news for supersymmetry out of the LHC has generally been negative, as various models with low-mass particles have been ruled out by the existing data (we’ll have more on that shortly). But supersymmetry actually predicts five Higgs particles. (Lykken noted this by showing a slide with five different photos of Higgs taken at various points in his career, in which he was “differing in mass and other properties, as happens to all of us.”) So, when the LHC starts up at higher energies in a couple of years, we’ll actually be looking for additional, heavier versions of the Higgs.

If those are found, then the destruction of our Universe would be permanently put on hold. “If you don’t like that fate of the Universe,” Lykken said, “root for supersymmetry.”

The best popular-science book I have come across in forty years of reading

I’ll write my review later…

A REALIDADE OCULTA – Universos paralelos e as leis profundas do cosmo
Brian Greene

Half a century ago, scientists regarded the possibility that other universes might exist beyond the one we inhabit with irony. Such a hypothesis was no more than a delusion worthy of Alice in Wonderland, and one that, in any case, could never be confirmed experimentally. The challenges posed by the theory of relativity and by quantum physics to the understanding of our own universe were already complex enough to occupy generation after generation of researchers. Yet several mutually independent studies, conducted by scientists respected in their fields (string theory, quantum electrodynamics, information theory), began to converge on the same point: the existence of parallel universes, the multiverse, is not only probable, it has become the most plausible explanation for a number of cosmological puzzles.
In A realidade oculta (The Hidden Reality), Brian Greene, one of the world’s foremost specialists in cosmology and particle physics, lays out the extraordinary development of multiverse physics over recent decades. The author of The Elegant Universe reviews the different theories of parallel universes, starting from the foundations of relativity and quantum mechanics. In accessible language, and with the help of numerous explanatory figures, Greene guides the reader through the labyrinths of the deepest reality of matter and thought.

“If extraterrestrials showed up tomorrow and asked to know what the human mind is capable of, we could do no better than to hand them a copy of this book.” – Timothy Ferris, New York Times Book Review

Determining whether we live inside the Matrix

The Measurement That Would Reveal The Universe As A Computer Simulation

If the cosmos is a numerical simulation, there ought to be clues in the spectrum of high energy cosmic rays, say theorists


THE PHYSICS ARXIV BLOG

Wednesday, October 10, 2012

One of modern physics’ most cherished ideas is quantum chromodynamics, the theory that describes the strong nuclear force, how it binds quarks and gluons into protons and neutrons, how these form nuclei that themselves interact. This is the universe at its most fundamental.

So an interesting pursuit is to simulate quantum chromodynamics on a computer to see what kind of complexity arises. The promise is that simulating physics on such a fundamental level is more or less equivalent to simulating the universe itself.

There are one or two challenges of course. The physics is mind-bogglingly complex and operates on a vanishingly small scale. So even using the world’s most powerful supercomputers, physicists have only managed to simulate tiny corners of the cosmos just a few femtometers across. (A femtometer is 10^-15 metres.)

That may not sound like much but the significant point is that the simulation is essentially indistinguishable from the real thing (at least as far as we understand it).

It’s not hard to imagine that Moore’s Law-type progress will allow physicists to simulate significantly larger regions of space. A region just a few micrometres across could encapsulate the entire workings of a human cell.

Again, the behaviour of this human cell would be indistinguishable from the real thing.

It’s this kind of thinking that forces physicists to consider the possibility that our entire cosmos could be running on a vastly powerful computer. If so, is there any way we could ever know?

Today, we get an answer of sorts from Silas Beane, at the University of Bonn in Germany, and a few pals.  They say there is a way to see evidence that we are being simulated, at least in certain scenarios.

First, some background. The problem with all simulations is that the laws of physics, which appear continuous, have to be superimposed onto a discrete three dimensional lattice which advances in steps of time.

The question that Beane and co ask is whether the lattice spacing imposes any kind of limitation on the physical processes we see in the universe. They examine, in particular, high energy processes, which probe smaller regions of space as they get more energetic.

What they find is interesting. They say that the lattice spacing imposes a fundamental limit on the energy that particles can have. That’s because nothing can exist that is smaller than the lattice itself.

So if our cosmos is merely a simulation, there ought to be a cut off in the spectrum of high energy particles.

It turns out there is exactly this kind of cut off in the energy of cosmic ray particles,  a limit known as the Greisen–Zatsepin–Kuzmin or GZK cut off.

This cut-off has been well studied and comes about because high energy particles interact with the cosmic microwave background and so lose energy as they travel  long distances.

But Beane and co calculate that the lattice spacing imposes some additional features on the spectrum. “The most striking feature…is that the angular distribution of the highest energy components would exhibit cubic symmetry in the rest frame of the lattice, deviating significantly from isotropy,” they say.

In other words, the cosmic rays would travel preferentially along the axes of the lattice, so we wouldn’t see them equally in all directions.

That’s a measurement we could do now with current technology. Finding the effect would be equivalent to being able to ‘see’ the orientation of the lattice on which our universe is simulated.

That’s cool, mind-blowing even. But the calculations by Beane and co are not without some important caveats. One problem is that the computer lattice may be constructed in an entirely different way to the one envisaged by these guys.

Another is that this effect is only measurable if the lattice cut off is the same as the GZK cut off. This occurs when the lattice spacing is about 10^-12 femtometers. If the spacing is significantly smaller than that, we’ll see nothing.
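
To see where that 10^-12 femtometer figure comes from, here is a minimal back-of-the-envelope sketch: a lattice of spacing b cannot represent energies much above ħc/b, so setting that cutoff equal to the GZK energy of roughly 5 × 10^19 eV fixes the spacing. The constants and the rounding are mine, not the paper’s.

```python
# Lattice spacing whose energy cutoff matches the GZK cutoff (order-of-magnitude estimate).
HBAR_C_EV_FM = 197.327e6   # hbar * c in eV * femtometers
E_GZK_EV = 5e19            # GZK cutoff energy, ~5e19 eV

lattice_spacing_fm = HBAR_C_EV_FM / E_GZK_EV
print(f"{lattice_spacing_fm:.1e} fm")   # ~4e-12 fm, i.e. of order 10^-12 femtometers
```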

Nevertheless, it’s surely worth looking for, if only to rule out the possibility that we’re part of a simulation of this particular kind but secretly in the hope that we’ll find good evidence of our robotic overlords once and for all.

Ref: arxiv.org/abs/1210.1847: Constraints on the Universe as a Numerical Simulation

Kepler detects 140 Earth-like planets?

Telescope finds 140 planets that could harbor life
July 22, 2010, 1:48pm, updated at 2:15pm


Kepler discovers planets when they pass in front of their star, just as it records Venus or Mercury passing in front of the Sun
Photo: Nasa/Divulgação

Scientists have announced the discovery of 140 new Earth-like planets found over the past few weeks. With the new data, scientists believe there are about 100 million planets similar to ours that could harbor life in the Milky Way alone. The information comes from the Daily Mail.

The finds were made by the Kepler space telescope, which has been hunting for new planets since it was launched in January 2009. According to astronomer Dimitar Sasselov, the planets are similar in size to Earth. The scientist described the discovery as the “fulfillment of Copernicus’s dream”, a reference to the father of modern astronomy.

New planets outside the Solar System are discovered when they pass in front of their star. The telescope does not capture a direct image; it registers the tiny dip in the star’s brightness when the planet passes in front of it. This passage causes “blinks” in the light. From the size of the dip, the time between “blinks” and the mass of the star, astronomers can work out the size of the planet.
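
As a minimal sketch of the geometry behind that statement (not the actual Kepler pipeline), the fractional dip in brightness is roughly the ratio of the planet’s disc area to the star’s, and the time between dips plus the star’s mass gives the orbital distance through Kepler’s third law. The numbers below are illustrative Earth-and-Sun values.

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
R_SUN = 6.957e8    # solar radius, m
M_SUN = 1.989e30   # solar mass, kg
R_EARTH = 6.371e6  # Earth radius, m

def planet_radius(transit_depth: float, star_radius: float) -> float:
    """Transit depth ~ (Rp/Rs)^2, so Rp = Rs * sqrt(depth)."""
    return star_radius * math.sqrt(transit_depth)

def orbital_distance(period_s: float, star_mass: float) -> float:
    """Kepler's third law: a^3 = G * M * P^2 / (4 * pi^2)."""
    return (G * star_mass * period_s**2 / (4 * math.pi**2)) ** (1 / 3)

# Illustrative numbers: an ~84 ppm dip once a year around a Sun-like star.
print(planet_radius(8.4e-5, R_SUN) / R_EARTH)              # ~1 Earth radius
print(orbital_distance(365.25 * 86400, M_SUN) / 1.496e11)  # ~1 astronomical unit
```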

Kepler will keep surveying the sky day and night, without interruption, for the next four years, according to the scientist. Sasselov says that in the last 15 years some 500 exoplanets have been discovered, but none was considered Earth-like, that is, capable of harboring life.

“Life is a chemical system that really needs a small planet, water and rocks and a great deal of chemical complexity to arise and survive. (…) There is a lot of work for us to do with this, but the statistical results are clear, and planets like our Earth are out there. (…) Our own Milky Way is rich in this kind of planet,” the astronomer said while presenting the Kepler results at the TEDGlobal conference in Oxford, in the United Kingdom.

Redação Terra

Time doesn’t stop?


O Tempo Não Pára

Cazuza

I fire at the sun
I’m strong, I’m by chance
My machine gun loaded with sorrows
I am a guy
Tired of running
In the opposite direction
With no finish-line podium or girlfriend’s kiss
I am just another guy

But if you think
That I’m defeated
Know that the dice are still rolling
Because time, time doesn’t stop

Some days yes, some days no
I go on surviving without a scratch
On the charity of those who detest me

Your swimming pool is full of rats
Your ideas don’t match the facts
Time doesn’t stop

I see the future repeat the past
I see a museum of great novelties
Time doesn’t stop
It doesn’t stop, no, it doesn’t stop

I have no date to celebrate
Sometimes my days go by two by two
Searching for a needle in a haystack

On cold nights it’s better not even to be born
On hot ones, you choose: kill or die
And that is how we become Brazilian
They call you a thief, a queer, a pothead
They turn the whole country into a whorehouse
Because that way more money is made

Your swimming pool is full of rats
Your ideas don’t match the facts
Time doesn’t stop

I see the future repeat the past
I see a museum of great novelties
Time doesn’t stop
It doesn’t stop, no, it doesn’t stop

Some days yes, some days no
I go on surviving without a scratch
On the charity of those who detest me

Your swimming pool is full of rats
Your ideas don’t match the facts
Time doesn’t stop

I see the future repeat the past
I see a museum of great novelties
Time doesn’t stop
It doesn’t stop, no, it doesn’t stop

How long will the Universe last?

Chart from Wikipedia.
Given that star formation ceases in 100 trillion years, the Universe is still a newborn baby. If those 100 trillion years are equated to the 100 years of a human lifetime, the Universe today would be about 5 days old…
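
A quick check of that scaling, as a minimal sketch (the 13.8-billion-year current age is my assumption; it is not stated in the post):

```python
universe_age_yr = 13.8e9        # assumed current age of the Universe
star_formation_end_yr = 100e12  # "star formation ceases in 100 trillion years"
human_lifetime_yr = 100

age_in_human_terms_days = universe_age_yr / star_formation_end_yr * human_lifetime_yr * 365.25
print(f"{age_in_human_terms_days:.1f} days")  # ~5 days old
```
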
Unless the Big Rip happens, of course…
Interestingly, the theory of Strong Cosmological Darwinism (the idea that technological civilizations create baby universes) predicts that the lifetime of the Universe maximizes the number of civilizations (other things being equal…). That is, we should have w = -1; see below. Empirically, it seems that w really is equal to -1. Curious…

Big Rip

From Wikipedia, the free encyclopedia


The Big Rip is a cosmological hypothesis, first published in 2003, about the ultimate fate of the universe, in which the matter of the universe, from stars and galaxies to atoms and subatomic particles, is progressively torn apart by the expansion of the universe at a certain time in the future. Theoretically, the scale factor of the universe becomes infinite at a finite time in the future.

The hypothesis relies crucially on the type of dark energy in the universe. The key value is the equation of state parameter w, the ratio between the dark energy pressure and its energy density. At w < −1, the universe will eventually be pulled apart. Such energy is called phantom energy, an extreme form of quintessence.

In a phantom-energy dominated universe, the universe expands at an ever-increasing rate. However, this implies that the size of the observable universe is continually shrinking; the distance to the edge of the observable universe which is moving away at the speed of light from any point gets ever closer. When the size of the observable universe is smaller than any particular structure, then no interaction between the farthest parts of the structure can occur, neither gravitational nor electromagnetic (nor weak or strong), and when they can no longer interact with each other in any way they will be “ripped apart”. The model implies that after a finite time there will be a final singularity, called the “Big Rip”, in which all distances diverge to infinite values.

The authors of this hypothesis, led by Robert Caldwell of Dartmouth College, calculate the time from now to the end of the universe as we know it, for this form of energy, to be

t_rip − t_0 ≈ 2 / (3 |1 + w| H_0 √(1 − Ω_m))

where w is the dark energy equation of state parameter, H0 is Hubble’s constant and Ωm is the present-day value of the density of all the matter in the universe.

In their paper they consider an example with w = -1.5, H0 = 70 km/s/Mpc and Ωm = 0.3, in which case the end of the universe is approximately 22 billion years from now. This is not considered a prediction, but a hypothetical example. The authors note that evidence indicates w is very close to -1 in our universe, which makes the |1 + w| factor the dominating term in the equation. The closer (1 + w) is to zero, the closer the denominator is to zero and the more distant (in time) is the Big Rip. If w were exactly equal to -1 then the Big Rip could not happen, regardless of the values of H0 or Ωm.
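
Plugging the example numbers quoted above into the formula gives the same answer; a minimal sketch (the unit conversions are mine):

```python
import math

# Example values from the text: w = -1.5, H0 = 70 km/s/Mpc, Omega_m = 0.3
w, H0_km_s_mpc, omega_m = -1.5, 70.0, 0.3

KM_PER_MPC = 3.0857e19
SECONDS_PER_YEAR = 3.156e7

H0_per_s = H0_km_s_mpc / KM_PER_MPC   # Hubble constant in 1/s
t_rip_s = 2.0 / (3.0 * abs(1 + w) * H0_per_s * math.sqrt(1 - omega_m))
print(f"{t_rip_s / SECONDS_PER_YEAR / 1e9:.0f} billion years")  # ~22 billion years
```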

In their scenario for w = -1.5, the galaxies would first be separated from each other. About 60 million years before the end, gravity would be too weak to hold the Milky Way and other individual galaxies together. Approximately three months before the end, the Solar System would be gravitationally unbound. In the last minutes, stars and planets would be torn apart, and an instant before the end, atoms would be destroyed.[1]


References

  1. ^ Caldwell, Robert R.; Kamionkowski, Marc; Weinberg, Nevin N. (2003). “Phantom Energy and Cosmic Doomsday”. Physical Review Letters 91: 071301. arXiv:astro-ph/0302506.

The return of Hayabusa

More information here, here and here.

Life in the Multiverse


The Drake Equation For The Multiverse

Posted: 09 Feb 2010 09:10 PM PST

The famous Drake equation estimates the number of intelligent civilisations in the Milky Way. Now a new approach asks how many might exist in the entire multiverse

In 1960, the astronomer Frank Drake devised an equation for estimating the number of intelligent civilisations in our galaxy. He did it by breaking down the problem into a hierarchy of various factors.

He suggested that the total number of intelligent civilisations in the Milky Way depends first on the rate of star formation. He culled this number by estimating the fraction of these stars with rocky planets, the fraction of those planets that can and do support life and the fraction of these that go on to support intelligent life capable of communicating with us. The result is this equation:

N = R* × fp × ne × fl × fi × fc × L, which is explained in more detail in this Wikipedia entry.

Today, Marcelo Gleiser at Dartmouth College in New Hampshire points out that cosmology has moved on since the 1960s. One of the most provocative new ideas is that the universe we see is one of many, possibly one of an infinite number. One line of thinking is that the laws of physics may be very different in these universes and that carbon-based life could only have arisen in those where conditions were fine-tuned in a particular way. This is the anthropic principle.

Consequently, says Gleiser, the Drake Equation needs updating to take the multiverse and the extra factors it introduces into account.

He begins by considering the total set of universes in the multiverse and defines the subset in which the parameters and fundamental constants are compatible with the anthropic principle. This is the subset {c-cosmo}.

He then considers the subset of these universes in which astrophysical conditions are ripe for star and galaxy formation {c-astro}. Next he looks at the subset of these in which planets form that are capable of harbouring life {c-life}. And finally he defines the subset of these in which complex life actually arises {c-complex life}.

Then the conditions for complex life to emerge in a particular universe in the multiverse must satisfy the statement at the top of this post (where the composition symbol denotes ‘together with’).

But there’s a problem: this is not an equation. To form a true Drake-like argument, Gleiser would need to assign a probability to each of these sets, allowing him to write an equation in which the assigned probabilities, multiplied together on one side, equal the fraction of universes where complex life emerges on the other.

Here he comes up against one of the great problems of modern cosmology–that without evidence to back up their veracity, many ideas in modern cosmology are little more than philosophy. So assigning a probability to the fraction of universes in the multiverse in which the fundamental constants and laws satisfy the anthropic principle is not just hard, but almost impossible to formulate at all.

Take {c-cosmo} for example. Gleiser points out a few of the obvious parameters that would need to be taken into account in deriving a probability. These are the vacuum energy density, matter-antimatter asymmetry, dark matter density, the couplings of the four fundamental forces and the masses of quarks and leptons so that hadrons and then nuclei can form after electroweak symmetry breaking. Try assigning a probability to that lot.

Neither is it much easier for {c-astro}. This needs to take into account the fact that heavy elements seem to be important for the emergence of life, and these seem to occur only in galaxies above a certain mass and in stars of a certain type and age. Estimating the probability of these conditions occurring is still beyond astronomers.

At first glance, the third set {c-life} ought to be easier to handle. This must take into account the planetary and chemical constraints on the formation of life. The presence of liquid water and various elements such as carbon, oxygen and nitrogen seem to be important as do more complex molecules. How common these conditions are, we don’t yet know.

Finally there is {c-complex life}, which includes all the planetary factors that must coincide for complex life to emerge. These may include long term orbital stability, the presence of a magnetic field to protect delicate biomolecules, plate tectonics, a large moon and so on. That’s not so easy to estimate either.

Many people have tried to put the numbers into Drake’s equation. The estimates for the number of intelligent civilisations in the Milky Way range from one (ours) to countless tens of thousands. Drake himself put the number at 10.
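
For concreteness, here is a minimal sketch of the classic single-galaxy Drake calculation. The factor values below are purely illustrative choices that happen to land near Drake’s figure of 10; they are not his actual estimates.

```python
def drake(R_star, f_p, n_e, f_l, f_i, f_c, L):
    """N = R* * fp * ne * fl * fi * fc * L (the classic Drake equation)."""
    return R_star * f_p * n_e * f_l * f_i * f_c * L

# Illustrative inputs: 1 star/year, half with planets, 2 habitable planets each,
# all develop life, 10% evolve intelligence, 10% communicate, civilisations last 1000 years.
print(drake(R_star=1, f_p=0.5, n_e=2, f_l=1, f_i=0.1, f_c=0.1, L=1000))  # -> ~10
```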

Gleiser’s take on the Drake equation for the Multiverse is an interesting approach. What it tells us, however, is that our limited understanding of the universe today does not allow us to make any reasonable estimate of the number of intelligent lifeforms in the multiverse (beyond the fact that it is more than one). And given the limits on what we can ever know about other universes, it’s likely that we’ll never be able to do much better than that.

Ref: arxiv.org/abs/1002.1651: Drake Equation For the Multiverse: From String Landscape to Complex Life

USP astrobiology laboratory

By the end of the year, the Universidade de São Paulo (USP) will host the first astrobiology laboratory in the Southern Hemisphere. The Brazilian research center is being set up in Valinhos, at the Abrahão de Moraes Observatory, which belongs to USP’s Institute of Astronomy, Geophysics and Atmospheric Sciences (IAG).
It is expected to begin operating in 2010. Budgeted at R$ 1 million, the laboratory received federal funding from the recently created Instituto Nacional de Ciência e Tecnologia do Espaço.
It will be used by the Brazilian and international scientific communities to seek answers to the three central questions of astrobiology: How did life arise and evolve on our planet? Is there life beyond Earth? What is the future of life here and on other celestial bodies?
Multidisciplinary by nature, astrobiology draws on concepts from astronomy, molecular biology, chemistry, meteorology, geophysics and geology. The studies are based on theoretical calculations, and much of the data will come from the environment-simulation chamber, the new laboratory’s main piece of equipment.
The laboratory’s planetary simulation chamber will be lined with stainless steel and can reproduce extraterrestrial conditions and environments. The equipment is designed to analyze parameters such as temperature, pressure, gas composition and radiation flux, among others. It also allows real-time monitoring of ongoing experiments.
Extraterrestrials
According to Douglas Galante, a researcher at the IAG, many of the laboratory’s experiments will be done with extremophiles, micro-organisms capable of surviving extreme environmental conditions such as the absence of sunlight, exposure to radiation or salt, and very high or very low levels of pressure, temperature, water and oxygen. “These characteristics make them good models for research into extraterrestrial organisms,” he explains.
The researcher adds that the laboratory will also investigate other topics, such as chemical reactions occurring in cometary ices and the penetration of radiation into various types of soil.
The laboratory will be installed in two stages: the first will make use of the observatory’s existing facilities; in the second, the research center’s own building will be erected. The building will occupy an area of 625 square meters, and the project includes provisions for sustainable water and electricity consumption.
When completed, the center will also house support laboratories for chemistry and biology, both with the infrastructure and computing equipment needed for theoretical and experimental research.
For science, extraterrestrial life means the capacity of any organism, even a microscopic one, to survive on meteorites, on planets such as Mars, or on other celestial bodies. These forms of life may have arisen through processes entirely independent of those on Earth, or they may share a common origin, traveling through space and reaching planets in the phenomenon known as panspermia.
Likewise, life on Earth may have come from another planet, or it may have originated from precursors formed in space. Organic molecules such as amino acids, already found in the interstellar medium and in meteors and comets, may have reached Earth and taken part in the emergence of life billions of years ago.
A place to see stars
Created in 1972 on a preserved green area of 450,000 square meters, the Abrahão de Moraes Observatory began to lose scientific potential two decades later. The reason was the growth of artificial lighting and the urbanization of Vinhedo, the town one passes through to reach the center of Valinhos, where the observatory is located. In 1995, one of the telescopes was automated and high-level research could resume.
Today the observatory has three observing instruments: Obelix, Argus and Prometeu. The upgrade turned the site into a permanent science-outreach center for groups of up to 40 primary and secondary school students.
Visits can take place during the day or at night, and all activities offered by the observatory are free of charge. Group bookings are made by phone at (19) 3886-4439.
Last year, 1,800 students visited the site. Besides visiting, interested schools can also operate the telescopes over the internet and so set up activities tied to astronomical observation.
Rogério Silveira, Agência Imprensa Oficial. Photo credit: Fernandes Dias Pereira
Report originally published on page I of the Poder Executivo section of the Diário Oficial do Estado de SP on 05/09/2009.
See and read the original report in PDF

Gravity as an emergent entropic force?

How Duality Could Resolve Dark Matter Dilemma

Posted: 23 May 2010 09:10 PM PDT

Astrophysicists need to choose between dark matter and modified gravity to explain the Universe. But a strange new duality may mean they can have both



The debate over the wave or particle-like nature of light consumed physicists for 300 years after Isaac Newton championed particles and Christiaan Huygens backed the idea of waves. The resolution, that light can be thought of as both a wave and a particle, would have astounded these giants of physics, as indeed it does us.


What shouldn’t surprise us, though, is that other seemingly intractable arguments might be similarly resolved.


But exactly this may be in store for the dark matter conundrum, which has puzzled astrophysicists for almost 80 years, according to Chiu Man Ho at Vanderbilt University in Nashville and a couple of buddies.


The problem is that galaxies rotate so fast that the matter they contain ought to fly off into space. Similarly, clusters of galaxies do not seem to contain enough mass to bind them together and so ought to fly apart. Since this manifestly doesn’t happen, some force must be holding these masses in place.


Astrophysicists have put forward two explanations. The first is that these galaxies are filled with unseen mass and this so-called dark matter provides the extra gravitational tug. The second is that gravity is stronger at these intergalactic scales and so does the job by itself, an idea called modified Newtonian dynamics or MOND.
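
As a rough illustration of why modified gravity can mimic extra unseen mass, in the deep-MOND regime the circular speed obeys v^4 = G M a0, independent of radius, which is exactly a flat rotation curve. The galaxy mass below is an illustrative choice and a0 ≈ 1.2 × 10^-10 m/s^2 is the commonly quoted MOND acceleration scale; neither number comes from the article.

```python
G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30  # solar mass, kg
a0 = 1.2e-10      # MOND acceleration scale, m/s^2 (commonly quoted value)

M_galaxy = 1e11 * M_SUN               # illustrative baryonic mass of a large galaxy
v_flat = (G * M_galaxy * a0) ** 0.25  # deep-MOND flat rotation speed: v^4 = G*M*a0
print(f"{v_flat / 1e3:.0f} km/s")     # ~200 km/s, the right ballpark for real galaxies
```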


There is no love lost between the dark matter proponents and their MONDian counterparts: both say the other is wrong and scour the Universe in search of evidence to damn their opponents. Neither side has convincingly crushed the other’s argument so far but all concerned seem to agree that when one triumphs, the other will be ground underfoot.


Perhaps there’s another possibility, however: that they’re both right.


What makes this possible is a new approach to gravity in which it is an emergent phenomenon related to entropy. We looked at this a few months ago here.


The basic idea is that parts of the Universe have different levels of entropy and this creates a force that redistributes matter in a way that maximises entropy. This force is what we call gravity.
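
A minimal sketch of how an entropic force can reproduce gravity-like behaviour, in the spirit of Verlinde’s original argument rather than the calculation in this paper: assume the entropy associated with a holographic screen changes by ΔS = 2π k_B (mc/ħ) Δx when a mass m moves a distance Δx towards it, and that the screen carries the Unruh temperature of an accelerated observer. The entropic force F = T ΔS/Δx then returns Newton’s second law:

```latex
% Entropic-force sketch (assumes Verlinde-style entropy change and the Unruh temperature)
\Delta S = 2\pi k_B \frac{mc}{\hbar}\,\Delta x, \qquad
k_B T = \frac{\hbar a}{2\pi c}

F = T\,\frac{\Delta S}{\Delta x}
  = \frac{\hbar a}{2\pi c\, k_B}\cdot 2\pi k_B \frac{mc}{\hbar}
  = m a
```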


So far, this approach has assumed a simple Universe. But cosmologists know that our Universe is not only expanding but accelerating away from us. What Chiu and co have done is derive gravity as an emergent force using the same entropic approach, but this time in a Universe that is accelerating.


The result is a form of gravity in which the parameters for acceleration and mass share a strange kind of duality: either the acceleration term can be thought of as modified, as in MOND; or the mass term can be thought of as modified, as in the dark matter theory.


In effect, Chiu and co are saying that dark matter and MOND are two sides of the same coin.

Interestingly, the effect of each type of modification seems to be scale dependent. In this theory, the MONDian interpretation works at the galactic scale while the dark matter interpretation works best at the scale of galactic clusters.


That’s actually how the observational evidence pans out too. MOND seems to better explain the real behaviour of galaxies while the dark matter approach better explains the structure of galaxy clusters.


Could it be that both are manifestations of the same thing? Only the brave or foolish would rule it out. And stranger things have happened in physics, as Newton and Huygens would surely attest.


Ref: arxiv.org/abs/1005.3537: MONDian Dark Matter

Life and death in the galactic tides

This paper also strikes me as relevant to the theory of periodic extinctions driven by galactic tides.
Friday, December 18, 2009

Galactic Tide May Have Influenced Life on Earth

The galactic tide is strong enough to influence Oort Cloud comets, which means it may also have helped shape our planet.

The Moon’s tides have been an ever-present force in Earth’s history, shaping the landscape and the lives of the creatures that inhabit it. Now there’s a tantalising hint that the galactic tide may have played a significant role in Earth’s past.

The work comes from Jozef Klacka at Comenius University in the Slovak Republic. He has calculated the strength of the galactic tide and its effect on the Solar System. His conclusion is that the tide is strong enough to significantly affect the orbital evolution of Oort Cloud comets.

That’s a fascinating result. We’ve long known that the Moon’s tides must have been crucial for the evolution of life on Earth. The constant ebb and flow of the oceans would have left sea life stranded on beaches, forcing adaptations that allowed these creatures to cope with conditions on land.

Astrobiologists also believe that comets played an important part in the development of life on Earth because the atmosphere and oceans were seeded, at least in part, by comets. By that way of thinking, the forces and processes that have shaped evolution stretch to the edge of the Solar System.

But if the galactic tide plays a role in sending these comets our way, then it looks as if we’re part of a much larger web. Could it be that Earth, and the life that has evolved here, are crucially dependent not just on our planet, our star and our local interplanetary environment, but on the Milky Way galaxy itself?

Klacka has a lot more work to do to prove that the galactic tide plays such a role. But it might just be that the field of astrobiology has become a whole lot bigger.

Ref: arxiv.org/abs/0912.3112: Galactic Tide

The size of the Multiverse depends on the size of the brain!

This reminds me of that trick question: “when you look at a starry sky, what is ‘behind’ it?” If the person doesn’t get it, you answer: the bones of your own head… and then you explain about area V1 in the brain, etc.

the physics arXiv blog 


Physicists Calculate Number of Universes in the Multiverse

Posted: 14 Oct 2009 09:10 PM PDT

If we live in a multiverse, it’s reasonable to ask how many other distinguishable universes we may share it with. Now physicists have an answer

One of the curious developments in cosmology in recent years has been the emergence of the multiverse as a mainstream idea. Instead of the Big Bang producing a single uniform universe, the latest thinking is that it produced many different universes that appear locally uniform.

One question that then arises is how many universes there are. That may sound like the sort of quantity that is inherently unknowable but Andrei Linde and Vitaly Vanchurin at Stanford University in California have worked out an answer, of sorts.

Their answer goes like this. The Big Bang was essentially a quantum process which generated quantum fluctuations in the state of the early universe. The universe then underwent a period of rapid growth called inflation during which these perturbations were “frozen”, creating different initial classical conditions in different parts of the cosmos. Since each of these regions would have a different set of laws of low energy physics, they can be thought of as different universes.

What Linde and Vanchurin have done is estimate how many different universes could have appeared as a result of this effect. Their answer is that this number must be proportional to the effect that caused the perturbations in the first place, a process called slow roll inflation, and in particular to the number of “e-foldings” of slow roll inflation.

Of course, the actual number depends critically on how you define the difference between universes.

Linde and Vanchurin have applied some reasonable rules to calculate the number of universes in the multiverse and have totted it up to at least 10^10^10^7. A “humungous” number is how they describe it, with no little understatement.

How many of these could we actually see? What’s interesting here is that the properties of the observer become an important factor because of a limit to the amount of information that can be contained within any given volume of space, a number known as the Bekenstein limit, and by the limits of the human brain. 

Linde and Vanchurin say that the total amount of information that can be absorbed by one individual during a lifetime is about 10^16 bits. So a typical human brain can have 10^10^16 configurations and so could never distinguish more than that number of different universes.

10^10^16 is a big number but it is dwarfed by the “humungous” 10^10^10^7. 
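
To see why one number dwarfs the other, compare the towers of exponents rather than the numbers themselves; a minimal sketch:

```python
from math import log10

# N1 = 10^(10^16) distinguishable universes per observer; N2 = 10^(10^(10^7)) universes in total.
log_N1 = 10**16        # log10 of N1
log_log_N2 = 10**7     # log10 of log10 of N2

# Taking log10 twice: 16 for the observer-limited count versus 10,000,000 for the multiverse count.
print(log10(log_N1), log_log_N2)  # 16.0 vs 10000000
```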

“We have found that the strongest limit on the number of different locally distinguishable geometries is determined mostly by our abilities to distinguish between different universes and to remember our results,” say Linde and Vanchurin.

So the limit does not depend on the properties of the multiverse but on the properties of the observer.

How profound is that!

Ref: arxiv.org/abs/0910.1589: How Many Universes Are In The Multiverse?

Scale invariance in the laws of physics?

Black Holes Cannot Exist in Latest Theory of Quantum Gravity

Posted: 21 Sep 2009 09:10 PM PDT

Nobel prize-winning physicist says black holes and space-time singularities cannot exist in his latest model of the universe

One of the great challenges of modern science is to unite our thinking about the universe on the largest scale with our notions of how it works on the smallest; in other words to combine relativity and quantum mechanics into a single theory.

The current best effort is a notion called string theory, an idea born of quantum thinking and in which gravity is a byproduct of complexity, a so-called emergent phenomenon.

The trouble with this process of emergence is that it pays mere lip service to our intuitive ideas about causality: that an effect must be preceded by its cause. At least, that’s how the Nobel prize winning physicist Gerard ‘t Hooft sees things.

To put this right, he has constructed a different model of reality that preserves causality and has some interesting side effects. The fundamental change in his thinking is to accept a new kind of symmetry in the universe.

A symmetry is a property of a system that is left unchanged under a certain transformation. So for example, our laws of physics are derived from the idea that they must remain constant under any change of position or direction in space. It’s a hugely powerful idea.

Now ‘t Hooft says that to preserve the idea of causality in a theory of quantum gravity we need to accept the idea of a symmetry of scale. In other words, the laws of physics are the same at every scale. He also introduces an idea called “black hole complementarity” in which an observer inside a black hole sees the universe in a different way to an observer outside a black hole.

The consequences of this thinking are profound. ‘t Hooft puts it like this:

“If we add these to our set of symmetry transformations, black holes, space-time singularities, and horizons disappear,”

In exchange, we keep the notion of causality intact.

You can argue the merits of such an exchange but the important question is whether ‘t Hooft’s new universe bears any relation to the one in which we live.

In answer we can say that the existence of black holes is well accepted. Astronomers can see their gravitational effects. And while nobody has directly observed a black hole or the Hawking radiation physicists assume they emit, few doubt that the evidence in favour will mount.

A more serious problem is the notion of scale invariance itself. Here’s a thought experiment for ‘t Hooft. Imagine you were suddenly shrunk or enlarged by some unknown factor inside a closed box: what experiment could you do to determine your new scale?

If the laws of physics were scale invariant, it would be impossible to determine your scale with an experiment.

But suppose you were to measure the position of a ball. Surely, in our universe, the accuracy of your measurement would be a good indication of your scale, since quantum effects would be easily distinguishable from Newtonian ones.

‘t Hooft seems to recognise this limitation admitting that “Newton’s constant G is not scale-invariant at all.”

But that’s a problem of his own making. In answer to the question of how to unite the physics of the very big with the physics of the very small, ‘t Hooft says there is no difference between them.

That may not be as crazy as it sounds. The differences we see could be the result of some other symmetry-breaking process, a kind of illusion. But how does this happen?

He says the answer may come from a better understanding of the way information flows through his universe. “Obviously, this leaves us with the problem of defining what exactly information is, and how it links with the equations of motion,” he says.

‘t Hooft is not the first to run up against information. When pushed to its limits, every fundamental theory of physics runs foul of our poor understanding of information.

It may be no understatement to say that the biggest breakthrough in physics must come in information theory rather than quantum mechanics or relativity.

Ref: arxiv.org/abs/0909.3426: Quantum Gravity without Space-time Singularities or Horizons