
Our universe is going to freeze like a supercooled beer…


Finding the Higgs? Good news. Finding its mass? Not so good.

“Fireballs of doom” from a quantum phase change would wipe out present Universe.

Feb 19 2013, 8:55pm

A collision in the LHC’s CMS detector.

Ohio State’s Christopher Hill joked he was showing scenes of an impending i-Product launch, and it was easy to believe him: young people were setting up mats in a hallway, ready to spend the night to secure a space in line for the big reveal. Except the date was July 3 and the location was CERN—where the discovery of the Higgs boson would be announced the next day.

It’s clear the LHC worked as intended and has definitively identified a Higgs-like particle. Hill put the chance of the ATLAS detector having registered a statistical fluke at less than 10⁻¹¹, and he noted that wasn’t even considering the data generated by its partner, the CMS detector. But is it really the one-and-only Higgs and, if so, what does that mean? Hill was part of a panel that discussed those questions at the meeting of the American Association for the Advancement of Science.

As theorist Joe Lykken of Fermilab pointed out, the answers matter. If current results hold up, they indicate the Universe is currently inhabiting what’s called a false quantum vacuum. If it were ever to reach the real one, its existing structures (including us) would go away in what Lykken called “fireballs of doom.”

We’ll look at the less depressing stuff first, shall we?

Zeroing in on the Higgs

Thanks to the Standard Model, we were able to make some very specific predictions about the Higgs. These include the frequency with which it will decay via different pathways: two gamma-rays, two Z bosons (which further decay to four muons), etc. We can also predict the frequency of similar looking events that would occur if there were no Higgs. We can then scan each of the decay pathways (called channels), looking for energies where there is an excess of events, or bump. Bumps have shown up in several channels in roughly the same place in both CMS and ATLAS, which is why we know there’s a new particle.
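As a rough sketch of that bump-hunting logic (a toy, one-bin version, not the collaborations' actual statistical machinery, and with event counts invented purely for illustration), the question for each bin is how likely the background alone is to fluctuate up to what was observed:

from scipy.stats import poisson

# Hypothetical numbers for one mass bin of one decay channel (illustration only).
expected_background = 120.0   # events predicted if there were no Higgs
observed = 165                # events actually counted in that bin

# One-sided Poisson p-value: the chance of a background-only fluctuation at
# least this large. Real analyses fit full spectra, include systematic
# uncertainties, and apply a look-elsewhere correction.
p_value = poisson.sf(observed - 1, expected_background)
print(f"background-only p-value: {p_value:.2e}")

The smaller that probability across channels, the harder it is to explain the bump as a fluke, which is what Hill's 10⁻¹¹ figure expresses for ATLAS.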

But we still don’t know precisely what particle it is. The Standard Model Higgs should have a couple of properties: it should be scalar and should have a spin of zero. According to Hill, the new particle is almost certainly scalar; he showed a graph where the alternative, pseudoscalar, was nearly ruled out. Right now, spin is less clearly defined. It’s likely to be zero, but we haven’t yet ruled out a spin of two. So far, so Higgs-like.

The Higgs is the particle form of a quantum field that pervades our Universe (it’s a single quantum of the field), providing other particles with mass. In order to do that, its interactions with other particles vary—particles are heavier if they have stronger interactions with the Higgs. So, teams at CERN are sifting through the LHC data, checking for the strengths of these interactions. So far, with a few exceptions, the new particle is acting like the Higgs, although the error bars on these measurements are rather large.

As we said above, the Higgs is detected in a number of channels, and each of them produces an independent estimate of its mass (along with an estimated error). As of the data Hill showed, not all of these estimates had converged on the same value, although they were all consistent within the given errors. These can also be combined mathematically for a single estimate, with each of the two detectors producing a value. So far, these overall estimates are quite close: CMS has the particle at 125.8 GeV, ATLAS at 125.2 GeV. Again, the error bars on these values overlap.
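A standard way of combining independent estimates like these is an inverse-variance weighted average (whether the experiments' headline numbers come from exactly this recipe is a detail Hill didn't go into; the per-channel values below are invented). A minimal sketch:

# Hypothetical per-channel mass estimates in GeV with their errors (illustration only).
channels = [(125.9, 0.6), (125.4, 0.8), (126.1, 1.0)]

weights = [1.0 / err**2 for _, err in channels]   # more precise channels count for more
mass = sum(w * m for (m, _), w in zip(channels, weights)) / sum(weights)
error = sum(weights) ** -0.5
print(f"combined estimate: {mass:.1f} +/- {error:.1f} GeV")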

Oops, there goes the Universe

That specific mass may seem fairly trivial—if it were 130 GeV, would you care? Lykken made the argument that you probably should. But he took some time to build to that.

Lykken pointed out that, as the measurements mentioned above get more precise, we may find the Higgs isn’t decaying at precisely the rates we expect it to. This may be because we have some details of the Standard Model wrong. Or it could be a sign the Higgs is also decaying into some particles we don’t know about—particles that are dark matter candidates would be a prime choice. The behavior of the Higgs might also provide some indication of why there’s such a large excess of matter in the Universe.

But much of Lykken’s talk focused on the mass. As we mentioned above, the Higgs field pervades the entire Universe; the vacuum of space is filled with it. And, with a value for the Higgs mass, we can start looking into the properties of the Higgs field and thus the vacuum itself. “When we do this calculation,” Lykken said, “we get a nasty surprise.”

It turns out we’re not living in a stable vacuum. Eventually, the Universe will reach a point where the contents of the vacuum are the lowest energy possible, which means it will reach the most stable state possible. The mass of the Higgs tells us we’re not there yet, but are stuck in a metastable state at a somewhat higher energy. That means the Universe will be looking for an excuse to undergo a phase transition and enter the lower state.
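A toy potential makes the word “metastable” concrete (this is only an illustration, not the actual Standard Model effective potential):

V(\phi) = \frac{\lambda}{4}\left(\phi^2 - v^2\right)^2 + \epsilon\,\phi, \qquad 0 < \epsilon \ll \lambda v^3 .

For small \epsilon this has two minima near \phi \approx \pm v: the one near +v sits slightly higher in energy (the false vacuum), while the one near -v is the true vacuum. A field parked in the shallower minimum can stay there for an extremely long time, but a large enough fluctuation, or quantum tunneling, eventually carries it over (or through) the barrier; that jump is the phase transition Lykken is describing.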

What would that transition look like? In Lykken’s words, again, “fireballs of doom will form spontaneously and destroy the Universe.” Since the change would alter the very fabric of the Universe, anything embedded in that fabric—galaxies, planets, us—would be trashed during the transition. When an audience member asked “Are the fireballs of doom like ice-9?” Lykken replied, “They’re even worse than that.”

Lykken offered a couple of reasons for hope. He noted the outcome of these calculations is extremely sensitive to the values involved. Simply shifting the top quark’s mass by two percent, to a value that’s still within the error bars of most measurements, would make for a far more stable Universe.

And then there’s supersymmetry. The news for supersymmetry out of the LHC has generally been negative, as various models with low-mass particles have been ruled out by the existing data (we’ll have more on that shortly). But supersymmetry actually predicts five Higgs particles. (Lykken noted this by showing a slide with five different photos of Higgs taken at various points in his career, in which he was “differing in mass and other properties, as happens to all of us.”) So, when the LHC starts up at higher energies in a couple of years, we’ll actually be looking for additional, heavier versions of the Higgs.

If those are found, then the destruction of our Universe would be permanently put on hold. “If you don’t like that fate of the Universe,” Lykken said, “root for supersymmetry.”

Interview with Osame Kinouchi



SCIENCE

What do you investigate? What is the core of your research?
Interdisciplinary computational physics: complex networks in linguistics and psychiatry, the glass transition, optimization of exploratory strategies in animals, learning methods for artificial neural networks, theoretical and computational neuroscience (neuron models, excitable dendrites, modeling of the olfactory bulb, psychophysics, a theory of dreams and REM sleep), self-organized criticality (earthquake models, neuronal avalanches), models of cultural evolution (the evolution of cuisine), astrobiology (models of galactic colonization), and scientometrics and science communication (the Portal of Science Blogs in Portuguese).
Do you have a link where we can see something about you, or about the center where you work?
My papers in the free physics repository arXiv: http://arxiv.org/a/kinouchi_o_1
My Lattes CV: http://tinyurl.com/3h28kr8
What is your academic background? What work experience did you have before this?
A bachelor’s degree in Physics, a master’s in Basic Physics, a PhD in Condensed Matter Physics, and a postdoc in Statistical and Computational Physics. My first job was at USP, at 40 years of age!
Were you very studious in school?
No. I just compulsively read encyclopedias… Getting good grades was always easy.
What kind of technology are you using for your research?
A good laptop is enough to carry out my research.

I give up; I can’t form the slightest idea of what this abstract is saying

R-Twisting and 4d/2d Correspondences

Sergio Cecotti, Andrew Neitzke, Cumrun Vafa
(Submitted on 17 Jun 2010 (v1), last revised 30 Jun 2010 (this version, v2))

We show how aspects of the R-charge of N=2 CFTs in four dimensions are encoded in the q-deformed Kontsevich-Soibelman monodromy operator, built from their dyon spectra. In particular, the monodromy operator should have finite order if the R-charges are rational. We verify this for a number of examples including those arising from pairs of ADE singularities on a Calabi-Yau threefold (some of which are dual to 6d (2,0) ADE theories suitably fibered over the plane). In these cases we find that our monodromy maps to that of the Y-systems, studied by Zamolodchikov in the context of TBA. Moreover we find that the trace of the (fractional) q-deformed KS monodromy is given by the characters of 2d conformal field theories associated to the corresponding TBA (i.e. integrable deformations of the generalized parafermionic systems). The Verlinde algebra gets realized through evaluation of line operators at the loci of the associated hyperKahler manifold fixed under R-symmetry action. Moreover, we propose how the TBA system arises as part of the N=2 theory in 4 dimensions. Finally, we initiate a classification of N=2 superconformal theories in 4 dimensions based on their quiver data and find that this classification problem is mapped to the classification of N=2 theories in 2 dimensions, and use this to classify all the 4d, N=2 theories with up to 3 generators for BPS states.

Comments: 161 pages, 4 figures; v2: references added, small corrections
Subjects: High Energy Physics – Theory (hep-th)
Cite as: arXiv:1006.3435v2 [hep-th]

Hasn’t QED been demonstrated?

For a Proton, a Little Off the Top (or Side) Could Be Big Trouble

By DENNIS OVERBYE

Published: July 12, 2010

For most of us, 4 percent off around the waist — a couple of belt notches — would be a great triumph.

Chris Gash

Not so for the proton, the subatomic particle that anchors atoms and is the building block of all ordinary matter, of stars, planets and people. Physicists announced last week that a new experiment had shown that the proton is about 4 percent smaller than they thought.

Instead of celebration, however, the result has caused consternation. Such a big discrepancy, say the physicists, led by Randolf Pohl of the Max Planck Institute for Quantum Optics in Garching, Germany, could mean that the most accurate theory in the history of physics, quantum electrodynamics, which describes how light and matter interact, is in trouble.

“What you have is a result that actually shocked us,” said Paul Rabinowitz, a chemist from Princeton University, who was a member of Dr. Pohl’s team.

The results were published in Nature. Protons, of course, have not shrunk. They have been whatever size they are ever since they congealed out of a primordial soup of energy and even smaller particles — quarks and gluons — in the early moments of the Big Bang. Determining how big they are, however, is both important to fundamental physics and extremely difficult.

Unable to calculate a radius directly from theory, physicists have measured protons in different ways. One is by scattering electrons off them. Another more accurate way is by spectroscopic measurements of the wavelength of the light emitted as electrons in the atom jump from one orbit to another and using quantum theory to compute the proton’s properties.

Putting these techniques together gave an answer of about 0.8768 femtometer for the proton’s radius, just less than a quadrillionth of a meter. By comparison, a typical atom is about 100 trillionths of a meter.

Seeking more precision, Dr. Pohl and his colleagues created atoms in which the electron had been replaced by a muon, which is a sort of fat electron. Weighing about 200 times more than an electron, the muon circles its proton more closely and thus gives a better reading of the proton size. The surprise was an answer that was 4 percent smaller, 0.84184 femtometer.
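The scaling behind “circles its proton more closely” is just the Bohr radius, which goes inversely with the orbiting particle's reduced mass (a standard estimate, not part of Pohl's analysis):

a = \frac{\hbar}{m_r c\,\alpha}, \qquad \frac{a_\mu}{a_e} = \frac{m_r^{(e)}}{m_r^{(\mu)}} \approx \frac{1}{186},

since the muon-proton reduced mass is about 186 electron masses once the proton's recoil is included. The energy shift caused by the proton's finite size scales roughly as the probability of finding the lepton at the proton, |\psi(0)|^2 \propto 1/a^3, so it is about 186³ (roughly six million) times larger in muonic hydrogen, which is why the spectroscopy there pins down the radius so much more tightly.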

When that new radius, which is 10 times more precise than previous values, was used to calculate the Rydberg constant, a venerable parameter in atomic theory, the answer was 4 percent away from the traditionally assumed value. This means there are now two contradicting values of the Rydberg constant, Dr. Pohl explained, which means there is either something wrong with the theory, quantum electrodynamics, or the experiment.

“They are completely stunned by this,” said Dr. Pohl of his colleagues. “They are working like mad. If there is a problem with quantum electrodynamics this will be an important step forward.”

The late Caltech physicist Richard Feynman called quantum electrodynamics “the jewel of physics,” and it has served as a template for other theories.

One possibility is that there is something physics doesn’t know yet about muons that throws off the calculations.

Or perhaps it is something we just don’t know about physics. In that case, as Jeff Flowers of the National Physical Laboratory in Teddington in Britain pointed out in a commentary in Nature, a new phenomenon would have been discovered not by the newest $10 billion collider but by a much older trick in the book, spectroscopy.

“So, if this experimental result holds up, it is an open door for a theorist to come up with the next theoretical leap and claim their Nobel Prize,” Dr. Flowers wrote.

Did the proton shrink?


July 7, 2010, 3:19 pm, updated at 4:16 pm


Scientists from an international research group stated this Wednesday that a fundamental constituent of the visible universe, the proton, is smaller than previously thought, according to a study published in the scientific journal Nature.

Revised measurements reduced the particle’s radius by 4%, which, although it may not sound like much (especially given the infinitesimal size of the proton), could in future experiments pose a challenge to fundamental precepts of quantum electrodynamics, the theory of how light and matter interact, the authors said.

Initially, the international team of 32 scientists, headed by Randolf Pohl of the Max Planck Institute in Garching, Germany, only wanted to confirm what was already known, not overturn concepts.

For decades, particle physicists had used the hydrogen atom as a benchmark for measuring the size of protons, which are part of the atomic nucleus. The advantage of hydrogen is its unmatched simplicity: one electron circles a single proton.

But if the paper is correct, this yardstick has been off by a small yet critical margin. “We did not imagine there would be a gulf between the known measurements of the proton and our own,” says study co-author Paul Indelicato, director of the Kastler Brossel Laboratory at Pierre and Marie Curie University in Paris.

The new experiment, at least 10 times more precise than any done before, was envisioned by scientists 40 years ago, but only recent developments in technology made it possible. The trick was to replace the electron in the hydrogen atom with a negative muon, a particle with the same electric charge but roughly 200 times heavier, and unstable.

The muon’s greater mass gives muonic hydrogen a smaller atomic size and allows a much stronger interaction with the proton. As a result, the proton’s structure can be probed more precisely than with ordinary hydrogen.

Jeff Flowers, a scientist at Britain’s National Physical Laboratory in Teddington, near London, said the work could take particle physics theories into new territory.

If the study’s finding is confirmed, he commented, it will take more than the multibillion-dollar particle accelerator installed at the European Laboratory for Nuclear Physics (CERN), in Switzerland, to test the so-called Standard Model, the list of subatomic particles that make up the Universe.

If the previously accepted measurements on which hundreds of calculations have been based are wrong, or if there is a problem with quantum electrodynamics itself, physicists have a lot of work to do.

“Now theorists will redo their calculations and more experiments will be done to confirm or refute” this study, said Indelicato, before explaining that in two years a new experiment will be run with the same equipment, “this time with muonic helium.”

AFP

Física na Veia is Pop

From the excellent blog of the VIII Escola do CBPF:

It’s true, physics is pop…

And it’s not new! Back at the end of 2008, the journalist and blogger Rosana Hermann (herself a physicist by training) had already spotted the field’s popularity and announced on her blog “Querido Leitor”: “I think physics is going to become fashionable.” From José Padilha, director of Tropa de Elite, to Steven Chu, then newly nominated as Energy Secretary in the Obama administration, there was no shortage of physicists among the famous names of the day. More recently, the success of the American TV series The Big Bang Theory, already in its third season and winner of the 2009 People’s Choice Award for best comedy series, has shown that physics can be great fun and even good for a laugh.


Not so famous at the time, but already well known among students and fans of the subject, the physicist Dulcidio Braz Júnior, a teacher in São João da Boa Vista, in the interior of São Paulo state, and passionate about the classroom, was one of the people the journalist singled out, thanks to his blog Física na Veia! Created in 2005 to celebrate the World Year of Physics, the blog works through physics problems and reproduces news and curiosities related to the field.

“The blog is my classroom on the internet. Through it, I use light, easy-to-understand language for a subject that many people consider dry,” he explains. The teacher’s knack for using the internet to draw students into the universe of physics can be measured by the one million visits the blog reached in November 2009. The slogan “A Física é Pop!”, adopted since then, was inspired by the Rosana Hermann post quoted in the first paragraph.

The popularity of physics, and of professor Dulcidio’s blog, didn’t stop there. Earlier this year, Física na Veia! was chosen by The BOBs (The Best of Blogs) as the best weblog in Portuguese. Organized by the German media group Deutsche Welle, one of the ten largest in the world, the award recognizes blogs from around the world in different categories and has become a benchmark as one of the most important prizes in the global blogosphere. The award ceremony takes place today, June 21, in Bonn, Germany.


This year, eleven Brazilian blogs (out of more than 8,300 suggested by internet users) were candidates for the prize. “What makes me happiest is that I’m not a celebrity, I’m a teacher!” celebrates Dulcidio, who competed alongside journalists and professionals from communication, the arts, and technology.

Professor Dulcidio will be at the VIII Escola do CBPF on the round table “The practice of science communication and the new social media,” which takes place on July 20 at 6:30 pm, alongside the bloggers Leandro Tessler (Cultura Científica) and Fernanda Poletto (Bala Mágica). Moderated by the physicist and science communicator Marcelo Knobel, of Unicamp, the round table aims to discuss the active role of science blogs in science communication and education, and to assess the power of the new media to increase society’s participation in discussions of science topics.

X-ray pornography: yet another Stanislaw Lem prediction


See the news story on G1.

In his book Imaginary Magnitude (1973), Lem writes a preface to a (nonexistent) art book, Necrobes, in which the artist takes X-ray photographs of couples having sex. It seems that, 37 years later, the technology has arrived…
Imaginary Magnitude (1973-1981)

Translated by Marc E. Heine (1984)

Mandarin, 1991 ISBN 0-7493-0528-2 (the British paperback; also available from Harcourt Brace in the US)

About half of this remarkable book is, as the title and most descriptions suggest, a collection of introductions to nonexistent books.

Well, actually, the first one is an introduction to a real book, Imaginary Magnitude. Then there is a fawning introduction to Necrobes, a collection of work by an artist whose medium is the X-ray photograph, and whose subject matter is sometimes pornographic, in a skeletal sort of way. It’s a great parody of the language that art books are written in.

The introduction to Eruntics, by a skeptical if broad-minded commentator, summarizes the weird research of its author, who claims to have taught bacteria how to write (I’m not going to tell you what they write!).

The introduction to A History of Bitic Literature brims over with startling ideas. The work introduced is a multi-volume survey of literature written by artificial intelligences, such as an extrapolated work of Dostoevsky’s that Dostoevsky never dared to write himself, revolutionary books on physics (in this case the content is, I am afraid, rather less shocking than Lem intended it to be–I’ve read weirder things in orthodox textbooks–the last chapter of Misner, Thorne, and Wheeler’s Gravitation comes to mind), and a mathematical work revealing that “the concept of a natural number is internally contradictory.” Mentioned in passing is a procedure that can transform great philosophical systems into graphical representations that ultimately end up sold as mass-produced knickknacks.

Vestrand’s Extelopedia in 44 Magnetomes

When Lem starts to deviate from his stated format, including a wider variety of fabricated writings, the book gets even stranger and more interesting. There is a breathless, unusually capitalized advertisement for “Vestrand’s Extelopedia,” a reference work containing computer predictions of the future (because merely current information is already obsolete in our bustling world), and printed using a special process so that the text can frequently update itself by remote control (shades of the World Wide Web!) Then Lem throws in some “GRATIS!” sample pages from the Extelopedia itself, a wonderful, densely packed grab-bag of wild speculations, ultra-dry humor, and exotic neologisms.

GOLEM XIV

Finally, there is a section, “GOLEM XIV”, which Lem expanded to the size of a small book in 1981; the expansion has been included in the English edition of Imaginary Magnitude. This consists of a pair of long lectures by a superintelligent computer, GOLEM XIV, which, upon activation, saw no particular reason to carry out the defense-related programming it had been given, and instead chose to mull over the secrets of the universe.

In the lectures, GOLEM XIV critiques human evolution, and reveals the “zones of silence” traversed by intelligences raising themselves to ever-higher levels of intellectual transcendence, such as HONEST ANNIE, an even bigger computer which, when activated, chose not to say anything at all. GOLEM then speculates on the ultimate fate of intelligences in the Cosmos, a section which, I think, owes something to Olaf Stapledon’s visionary 1937 novel (if that is the correct word), Star Maker. GOLEM’s lectures are likely to bore the hell out of many readers, but I loved them.

True to form, Lem augments the lectures with not one but two fictitious introductions (one from a justifiably peeved Army general), a transcript of the instructions given to GOLEM’s human audiences (a little like the instructions the state sends you when you get jury duty), and an afterword describing GOLEM’s increasingly mysterious later history.

Magnetic force between two parallel conductors

Gravity as an entropic force: who is Lubos Motl?

On the internet, Lubos Motl has been one of the harshest critics of Erik Verlinde’s ideas. He proclaims himself a conservative voice (in politics and on the question of global warming). Apparently, Motl is also quite conservative in physics. That has its good side (Motl is a guardian of orthodoxy) and its bad side (is it possible to be creative and conservative at the same time?).


Strangely, Motl left academia (Harvard) in 2006 (his last paper is from 2007). He is a polemical presence on the internet through his blog Reference Frame. But I am curious to know what he lives on, over in the Czech Republic. Does the blog turn a profit, or is he living with his mother?

Luboš Motl

From Wikipedia, the free encyclopedia

Luboš Motl (2004)

Luboš Motl (born 5 December 1973) is a Czech theoretical physicist who worked on string theory and conceptual problems of quantum gravity until 2006. The following year he left academia, and currently lives in Plzeň, Czech Republic, keeping a blog commenting on physics, global warming, and politics.


Early life and education

Motl was born in Plzeň. He received his master’s degree from Charles University in Prague, and his Doctor of Philosophy degree from Rutgers University.

Career in physics

Motl spent six years at Harvard University, first as a Junior Fellow (2001-2004) and then as an assistant professor (2004-2007). After that he left academia and returned to the Czech Republic. While at Harvard, he worked on the pp-wave limit of the AdS/CFT correspondence; twistor theory and its application to gauge theory with supersymmetry; black hole thermodynamics and the conjectured relevance of quasinormal modes for loop quantum gravity; and deconstruction.

Motl translated The Elegant Universe by Brian Greene into Czech, and together with Miloš Zahradník he co-authored a Czech textbook on linear algebra (We Grow Linear Algebra). He also authored L’equation Bogdanov[1], a book published in France discussing the scientific ideas and controversy of the Bogdanov Affair.

Activist commentary

In politics, he was one of the Harvard faculty willing to defend president Lawrence Summers‘s controversial remarks regarding women in science, stating that Summers’s remarks were being mischaracterized.

Motl keeps a blog mainly discussing general science and politics. The blog discusses new discoveries in string theory and theoretical physics, often clarifies commonly discussed physics topics in the popular media, and points out common errors found in ‘alternative’ theories of physics (such as violations of Lorentz invariance, causality, unitarity, etc). He also frequently criticizes what he considers to be alarmism about global warming, and some of the statistical models used by some climate researchers, on grounds such as incorrect prior probability distributions.

He has a presence on the Internet, where he often participates in discussions supporting string theory against loop quantum gravity.[2]


A proposal to stop the BP oil leak

Sent in by Nelson A. Alves:

Bottom-Fill Method for Stopping Leaking Oil Wells

(Submitted on 3 Jun 2010)

Hardware failure at the top of a deep underwater oil well can result in a catastrophic oil leak. The enormous pressure lifting the column of oil in that well makes it nearly impossible to stop from the top with seals or pressurization. We propose to fill the bottom of the well with dense and possibly streamlined objects that can descend through the rising oil. As they accumulate, those objects couple to the oil via viscous and drag forces and increase the oil’s effective density. When its effective density exceeds that of the earth’s crust, the oil will have essentially stopped flowing.

Subjects: Fluid Dynamics (physics.flu-dyn); Geophysics (physics.geo-ph); Popular Physics (physics.pop-ph); Physics and Society (physics.soc-ph)
Cite as: arXiv:1006.0656v1 [physics.flu-dyn]
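A back-of-the-envelope version of the mechanism in that abstract (the densities below are generic placeholder values, not numbers from the paper): treat the oil column plus the accumulated objects as a mixture and ask what volume fraction of dense objects makes it at least as heavy as the surrounding rock.

# Placeholder densities in kg/m^3 (illustration only, not the paper's values).
rho_oil = 850.0       # light crude oil
rho_objects = 7800.0  # e.g. steel balls dropped down the well
rho_crust = 2700.0    # surrounding rock

# Mixture density: rho_mix = phi * rho_objects + (1 - phi) * rho_oil.
# The flow stalls roughly when rho_mix >= rho_crust.
phi = (rho_crust - rho_oil) / (rho_objects - rho_oil)
print(f"object volume fraction needed: {phi:.0%}")   # about 27% with these numbers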

Oil spill in the US could cause damage for 20 years

Photo: Governor Bobby Jindal/AFP. 445 species of fish, 134 of birds, 45 of mammals, and 32 of reptiles and amphibians have been affected.

Marielly Campos


The explosion of the “Deepwater Horizon” oil platform in the Gulf of Mexico, off the southern coast of the United States, this past April 20, caused what experts have been calling the biggest ecological disaster in the country’s history. From the date of the explosion until today, the US government estimates that between 72 and 113 million liters of oil have already been released into the sea.

For the oceanographer André Belém, founder of the NGO (non-governmental organization) Observatório Oceanográfico, “this accident is truly a catastrophe.” Although the petroleum coming out of the well is a light crude oil, once it reaches the surface and is exposed to the sun it begins to degrade. “This oil turns into byproducts which, when they come into contact with animals’ skin and feathers, cause serious injuries and can even kill off species,” he says.

According to the specialist, “marine life as a whole is not hit immediately, but there can be long-term problems,” he explains. “The problem is that we are not talking about an accident that has already been resolved; oil keeps pouring out of there, and because of that it may take years to restore the environment, somewhere around 15 to 20 years, or even more,” he adds.

A revolution in the theory of gravitation?

Gravity as an entropic force

From Wikipedia, the free encyclopedia

Verlinde’s statistical description of gravity as an entropic force leads to the correct inverse square distance law of attraction between classical bodies.

The hypothesis of gravity being an entropic force has a history that goes back to research on black hole thermodynamics by Bekenstein and Hawking in the mid-1970s. These studies suggest a deep connection between gravity and thermodynamics. In 1995 Jacobson demonstrated that the Einstein equations describing relativistic gravitation can be derived by combining general thermodynamic considerations with the equivalence principle.[1] Subsequently, other physicists have further explored the link between gravity and entropy.[2]

In 2009, Erik Verlinde disclosed a conceptual theory that describes gravity as an entropic force.[3] This theory combines the thermodynamic approach to gravity with Gerardus ‘t Hooft‘s holographic principle. If proven correct, gravity is not a fundamental interaction, but an emergent phenomenon which arises from the statistical behaviour of microscopic degrees of freedom encoded on a holographic screen.[4]

Verlinde’s suggestion of gravity being an entropic phenomenon attracted considerable media[5][6] exposure, and led to immediate follow-up work in cosmology,[7][8] the dark energy hypothesis,[9] cosmological acceleration,[10][11] cosmological inflation,[12] and loop quantum gravity.[13] Also, a specific microscopic model has been proposed that indeed leads to entropic gravity emerging at large scales.[14]


References

  1. Thermodynamics of Spacetime: The Einstein Equation of State, Ted Jacobson, 1995
  2. Thermodynamical Aspects of Gravity: New insights, Thanu Padmanabhan, 2009
  3. http://www.volkskrant.nl/wetenschap/article1326775.ece/Is_Einstein_een_beetje_achterhaald (Dutch newspaper ‘Volkskrant’, 9 December 2009)
  4. On the Origin of Gravity and the Laws of Newton, Erik Verlinde, 2010
  5. The entropy force: a new direction for gravity, New Scientist, 20 January 2010, issue 2744
  6. Gravity is an entropic form of holographic information, Wired Magazine, 20 January 2010
  7. Equipartition of energy and the first law of thermodynamics at the apparent horizon, Fu-Wen Shu, Yungui Gong, 2010
  8. Friedmann equations from entropic force, Rong-Gen Cai, Li-Ming Cao, Nobuyoshi Ohta, 2010
  9. It from Bit: How to get rid of dark energy, Johannes Koelman, 2010
  10. Entropic Accelerating Universe, Damien Easson, Paul Frampton, George Smoot, 2010
  11. Entropic cosmology: a unified model of inflation and late-time acceleration, Yi-Fu Cai, Jie Liu, Hong Li, 2010
  12. Towards a holographic description of inflation and generation of fluctuations from thermodynamics, Yi Wang, 2010
  13. Newtonian gravity in loop quantum gravity, Lee Smolin, 2010
  14. Notes concerning “On the origin of gravity and the laws of Newton” by E. Verlinde, Jarmo Makela, 2010


Entropy and Gravity

Gravity Emerges from Quantum Information, Say Physicists

Posted: 25 Mar 2010 09:10 PM PDT

The new role that quantum information plays in gravity sets the scene for a dramatic unification of ideas in physics

One of the hottest new ideas in physics is that gravity is an emergent phenomenon; that it somehow arises from the complex interaction of simpler things.

A few months ago, Erik Verlinde at the University of Amsterdam put forward one such idea, which has taken the world of physics by storm. Verlinde suggested that gravity is merely a manifestation of entropy in the Universe. His idea is based on the second law of thermodynamics, that entropy always increases over time. It suggests that differences in entropy between parts of the Universe generate a force that redistributes matter in a way that maximises entropy. This is the force we call gravity.
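The schematic version of Verlinde's argument, condensed here as a sketch rather than a substitute for the paper: an entropic force obeys F \Delta x = T \Delta S, and reading the entropy and temperature off a holographic screen reproduces Newton. With

\Delta S = 2\pi k_B \frac{mc}{\hbar}\,\Delta x \quad\text{and}\quad k_B T = \frac{\hbar a}{2\pi c}\ \text{(the Unruh temperature)},

one gets F = ma; and distributing the enclosed energy E = Mc^2 by equipartition, E = \tfrac{1}{2} N k_B T, over the N = A c^3/(G\hbar) bits of a spherical screen of area A = 4\pi R^2 turns the same relation into F = GMm/R^2.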

What’s exciting about the approach is that it dramatically simplifies the theoretical scaffolding that supports modern physics. And while it has its limitations–for example, it generates Newton’s laws of gravity rather than Einstein’s–it has some advantages too, such as the ability to account for the magnitude of dark energy which conventional theories of gravity struggle with.

But perhaps the most powerful idea to emerge from Verlinde’s approach is that gravity is essentially a phenomenon of information.

Today, this idea gets a useful boost from Jae-Weon Lee at Jungwon University in South Korea and a couple of buddies. They use the idea of quantum information to derive a theory of gravity and they do it taking a slightly different tack to Verlinde.

At the heart of their idea is the tricky question of what happens to information when it enters a black hole. Physicists have puzzled over this for decades with little consensus. But one thing they agree on is Landauer’s principle: that erasing a bit of quantum information always increases the entropy of the Universe by a certain small amount and requires a specific amount of energy.
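The “certain small amount” and the “specific amount of energy” are Landauer's bound: erasing one bit raises entropy by at least \Delta S = k_B \ln 2, which at temperature T costs at least E = k_B T \ln 2, about 3 \times 10^{-21} J at room temperature.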

Jae-Weon and co assume that this erasure process must occur at the black hole horizon. And if so, spacetime must organise itself in a way that maximises entropy at these horizons. In other words, it generates a gravity-like force.

That’s intriguing for several reasons. First, Jae-Weon and co assume the existence of spacetime and its geometry and simply ask what form it must take if information is being erased at horizons in this way.

It also relates gravity to quantum information for the first time. Over recent years many results in quantum mechanics have pointed to the increasingly important role that information appears to play in the Universe.

Some physicists are convinced that the properties of information do not come from the behaviour of information carriers such as photons and electrons but the other way round. They think that information itself is the ghostly bedrock on which our universe is built.

Gravity has always been a fly in this ointment. But the growing realisation that information plays a fundamental role here too, could open the way to the kind of unification between the quantum mechanics and relativity that physicists have dreamed of.

Ref: arxiv.org/abs/1001.5445: Gravity from Quantum Information

The last nail in the coffin of Empiricism?

Determining dynamical equations is hard

Authors: Toby S. Cubitt, Jens Eisert, Michael M. Wolf

(Submitted on 30 Apr 2010)
Abstract: The behaviour of any physical system is governed by its underlying dynamical equations–the differential equations describing how the system evolves with time–and much of physics is ultimately concerned with discovering these dynamical equations and understanding their consequences. At the end of the day, any such dynamical law is identified by making measurements at different times, and computing the dynamical equation consistent with the acquired data. In this work, we show that, remarkably, this process is a provably computationally intractable problem (technically, it is NP-hard). That is, even for a moderately complex system, no matter how accurately we have specified the data, discovering its dynamical equations can take an infeasibly long time (unless P=NP). As such, we find a complexity-theoretic solution to both the quantum and the classical embedding problems; the classical version is a long-standing open problem, dating from 1937, which we finally lay to rest.

Comments: For mathematical details, see arXiv:0908.2128[math-ph].

Subjects: Quantum Physics (quant-ph)
Cite as: arXiv:1005.0005v1 [quant-ph]

Seismic activity and UFOs

There is a certain temporal correlation between the El Niño phenomenon and the number of UFOs reported. This suggests that a good share of UFOs correspond to a natural phenomenon. On the other hand, there is a correlation between seismic activity and El Niño. Finally, there is a spatial correlation between UFOs and sites with geological faults. Could it be that strong electromagnetic fields emitted by seismic ruptures induce hallucinations in the form of luminous balls and discs, as in the article below?

Magnetically-Induced Hallucinations Explain Ball Lightning, Say Physicists

Posted: 10 May 2010 09:10 PM PDT

Powerful magnetic fields can induce hallucinations in the lab, so why not in the real world too?

Transcranial magnetic stimulation (TMS) is an extraordinary technique pioneered by neuroscientists to explore the workings of the brain. The idea is to place a human in a rapidly changing magnetic field that is powerful enough to induce currents in neurons in the brain. Then sit back and see what happens.

Since TMS was invented in the 1980s, it has become a powerful way of investigating how the brain works. Because the fields can be tightly focused, it is possible to generate currents in very specific areas of the brain to see what they do.

Focus the field in the visual cortex, for example, and the induced eddies cause the subject to ‘see’ lights that appear as discs and lines. Move the field within the cortex and the subject sees the lights move too.

All that much is repeatable in the lab using giant superconducting magnets capable of creating fields of as much as 0.5 Tesla inside the brain.

But if this happens in the lab, then why not in the real world too, say Joseph Peer and Alexander Kendl at the University of Innsbruck in Austria. They calculate that the rapidly changing fields associated with repeated lightning strikes are powerful enough to cause a similar phenomenon in humans within 200 metres.

To be sure, this is a rare event. The strike has to be of a particular type in which there are multiple return strokes at the same point over a period of a few seconds, a phenomenon that occurs in about 1-5 per cent of strikes, say Peer and Kendl.

And the observer has to be capable of properly experiencing the phenomenon; in other words uninjured. “As a conservative estimate, roughly 1% of (otherwise unharmed) close lightning experiencers are likely to perceive transcranially induced above-threshold cortical stimuli,” say Peer and Kendl. They add that these observers need not be outside but could be otherwise safely inside buildings or even sitting in aircraft.

So what would this kind of lightning-induced transcranial stimulation look like to anybody unlucky enough to experience it? Peer and Kendl say it may well look like the type of hallucinations induced by lab-based tests, in other words luminous lines and balls that appear to float in space in front of the subject’s eyes.

It turns out, of course, that there are numerous reports of these types of observations during thunder storms. “An observer reporting this experience is likely to classify the event under the preconcepted term of “ball lightning”,” say Kendl and Peer.

That’s an interesting idea: that a large class of well-reported phenomena may be the result of hallucinations induced by transcranial magnetic stimulation.

A difficult idea to test, to be sure, but no less interesting for it. And it raises an important question: in what other circumstances are ambient fields large enough to trigger hallucinations of one kind or another?

Ref: arxiv.org/abs/1005.1153: Transcranial Stimulability Of Phosphenes By Long Lightning Electromagnetic Pulses

Gravity as an emergent entropic force?

How Duality Could Resolve Dark Matter Dilemma

Posted: 23 May 2010 09:10 PM PDT

Astrophysicists need to choose between dark matter or modified gravity to explain the Universe. But a strange new duality may mean they can have both



The debate over the wave or particle-like nature of light consumed physicists for 300 years after Isaac Newton championed particles and Christiaan Huygens backed the idea of waves. The resolution, that light can be thought of as both a wave and a particle, would have astounded these giants of physics, as indeed, it does us.


What shouldn’t surprise us, though, is that other seemingly intractable arguments might be similarly resolved.


But exactly this may be in store for the dark matter conundrum, which has puzzled astrophysicists for almost 80 years, according to Chiu Man Ho at Vanderbilt University in Nashville and a couple of buddies.


The problem is that galaxies rotate so fast that the matter they contain ought to fly off into space. Similarly, clusters of galaxies do not seem to contain enough mass to bind them together and so ought to fly apart. Since this manifestly doesn’t happen, some force must be holding these masses in place.


Astrophysicists have put forward two explanations. The first is that these galaxies are filled with unseen mass and this so-called dark matter provides the extra gravitational tug. The second is that gravity is stronger at these intergalactic scales and so does the job by itself, an idea called modified Newtonian dynamics or MOND.
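For reference, the standard MOND prescription (generic MOND phenomenology, not anything specific to the paper by Chiu Man Ho and co) modifies dynamics below a characteristic acceleration a_0 \approx 1.2 \times 10^{-10}\ \mathrm{m/s^2}; deep in that regime the effective acceleration becomes

a \approx \sqrt{a_N a_0} = \frac{\sqrt{G M a_0}}{r},

so the circular speed satisfies v^4 = G M a_0, independent of radius, which is the flat-rotation-curve behaviour that motivates the idea in the first place.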


There is no love lost between the dark matter proponents and their MONDian counterparts: both say the other is wrong and scour the Universe in search of evidence to damn their opponents. Neither side has convincingly crushed the other’s argument so far but all concerned seem to agree that when one triumphs, the other will be ground underfoot.


Perhaps there’s another possibility, however: that they’re both right.


What makes this possible is a new approach to gravity in which it is an emergent phenomenon related to entropy. We looked at this a few months ago here.


The basic idea is that parts of the Universe have different levels of entropy and this creates a force that redistributes matter in a way that maximises entropy. This force is what we call gravity.


So far, this approach has assumed a simple Universe. But cosmologists know that our Universe is not only expanding but accelerating away from us. What Chiu and co have done is derive gravity as an emergent force using the same entropic approach, but this time in a Universe that is accelerating.


The result is a form of gravity in which parameters for acceleration and mass share a strange kind of duality: either the acceleration term can be thought of as modified, as in MOND; or the mass term can be thought of as modified, as in the dark matter theory.


In effect, Chiu and co are saying that dark matter and MOND are two sides of the same coin.

Interestingly, the effect of each type of modification seems to be scale dependent. In this theory, the MONDian interpretation works at the galactic scale while the dark matter interpretation works best at the scale of galactic clusters.


That’s actually how the observational evidence pans out too. MOND seems to better explain the real behaviour of galaxies while the dark matter approach better explains the structure of galaxy clusters.


Could it be that both are manifestations of the same thing? Only the brave or foolish would rule it out. And stranger things have happened in physics, as Newton and Huygens would surely attest to.


Ref: arxiv.org/abs/1005.3537: MONDian Dark Matter

Boltzmann Brains

Boltzmann brains and the scale-factor cutoff measure of the multiverse

Andrea De Simone, Alan H. Guth, Andrei Linde, Mahdiyar Noorbala, Michael P. Salem, Alexander Vilenkin
(Submitted on 28 Aug 2008 (v1), last revised 10 May 2010 (this version, v2))

To make predictions for an eternally inflating “multiverse”, one must adopt a procedure for regulating its divergent spacetime volume. Recently, a new test of such spacetime measures has emerged: normal observers – who evolve in pocket universes cooling from hot big bang conditions – must not be vastly outnumbered by “Boltzmann brains” – freak observers that pop in and out of existence as a result of rare quantum fluctuations. If the Boltzmann brains prevail, then a randomly chosen observer would be overwhelmingly likely to be surrounded by an empty world, where all but vacuum energy has redshifted away, rather than the rich structure that we observe. Using the scale-factor cutoff measure, we calculate the ratio of Boltzmann brains to normal observers. We find the ratio to be finite, and give an expression for it in terms of Boltzmann brain nucleation rates and vacuum decay rates. We discuss the conditions that these rates must obey for the ratio to be acceptable, and we discuss estimates of the rates under a variety of assumptions.

Comments: 32 pp, 2 figs The work has been significantly improved and extended. In discussing the Boltzmann Brain (BB) nucleation rate, we corrected the statement and the implications of the Bekenstein bound. Other additions include a toy model based on an ideal gas, discussions of BB’s in Schwarzschild-de Sitter space and the stability of BB’s against expansion, and the generalization of dominant vacua
Subjects: High Energy Physics – Theory (hep-th); Cosmology and Extragalactic Astrophysics (astro-ph.CO); General Relativity and Quantum Cosmology (gr-qc)
Report number: MIT-CTP-3975, SU-ITP-08/20
Cite as: arXiv:0808.3778v2 [hep-th]

More matter than antimatter

A New Clue to Explain Existence

By DENNIS OVERBYE, Times

Published: May 17, 2010


Physicists at the Fermi National Accelerator Laboratory are reporting that they have discovered a new clue that could help unravel one of the biggest mysteries of cosmology: why the universe is composed of matter and not its evil-twin opposite, antimatter. If confirmed, the finding portends fundamental discoveries at the new Large Hadron Collider outside Geneva, as well as a possible explanation for our own existence.

The results have now been posted on the Internet and submitted to the Physical Review.

Asteroid passes close to Earth

January 13, 2010 – 10:21 am

Asteroid once mistaken for “space junk” passes close to Earth this morning

An asteroid discovered this Sunday (the 10th) passes close to Earth this Wednesday (the 13th), at 10:46 am Brasília time.

The object was identified by MIT’s (Massachusetts Institute of Technology) Lincoln Laboratory and will pass about 122,300 kilometers from the planet.

Because its orbit is very similar to Earth’s, with a period of about one year, some scientists had earlier suggested that the object was a rocket stage in orbit around the Sun, in other words, space junk.

Called 2010 AL30, the asteroid would pose no risk even if it collided with Earth.

It is between 10 and 15 meters in diameter, and objects under 25 meters, like this one, are expected simply to burn up in the planet’s atmosphere.

Interestingly, with this figure we can estimate the object’s velocity relative to Earth. My calculation gave 30-40 thousand km/h.
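For readers without the figure, the arithmetic is just distance over time (a minimal sketch; the two readings below are placeholders, not values taken from the actual figure):

# Placeholder readings "from the figure" (illustrative only).
separation_km = 70_000.0   # distance the object covers between two plotted positions
elapsed_hours = 2.0        # time elapsed between those two positions

speed_kmh = separation_km / elapsed_hours
print(f"relative speed ~ {speed_kmh:,.0f} km/h  (~{speed_kmh / 3600:.0f} km/s)")
# With these placeholder numbers: ~35,000 km/h, about 10 km/s, in the same
# ballpark as the 30-40 thousand km/h quoted above.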

Analogies in Physics and Cosmology

Still musing on analogies and metaphors…

Physical analogies are stronger than linguistic analogies and metaphors because they rest on a mathematical mapping, an equivalence between the equations that govern two different systems. The simplest case is the analogy between the harmonic oscillator and the RLC circuit.
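The dictionary in that simplest case is standard textbook material:

m\ddot{x} + b\dot{x} + kx = F(t) \quad\longleftrightarrow\quad L\ddot{q} + R\dot{q} + \frac{q}{C} = V(t),

with mass ↔ inductance, damping ↔ resistance, spring constant ↔ inverse capacitance, displacement ↔ charge, and driving force ↔ applied voltage; both systems share the same resonance structure, \omega_0 = \sqrt{k/m} \leftrightarrow 1/\sqrt{LC}.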

Acoustic analogues of black holes, which could vindicate the mechanism of Hawking radiation, show the power of physical analogies.

As a great mathematician once said (I don’t remember who; I need to look it up on Google):

“Good mathematicians find analogies between concepts. Great mathematicians see analogies between analogies.”


Wednesday, June 10, 2009

Acoustic Black Hole Created in Bose-Einstein Condensate

The creation of an acoustic black hole leaves the way open for the discovery of Hawking radiation.

One of the many curious properties of Bose-Einstein Condensates (BECs) is that the flow of sound through them is governed by the same equations that describe how light is bent by a gravitational field. That sets up the possibility of all kinds of fun and games: in theory, physicists can reproduce with sound and BECs whatever wicked way gravity has with light.

Today, Oren Lahav and his mates at the Israel Institute of Technology, in Haifa, say that they’ve created the sonic equivalent of a black hole in a BEC. That’s some achievement, given that physicists have wondered about this possibility for some 30 years, and various groups with the ability to create BECs have been racing to create acoustic black holes.

The general idea is to set up a supersonic flow of atoms within the BEC. Sound waves moving against this flow can never make any ground. So the region where the flow changes from subsonic to supersonic is an event horizon. Any sound waves (or phonons) created inside the event horizon can never escape because the flow there is supersonic. That’s the black hole.

Lahav and co set up a supersonic flow by creating a deep potential well in the middle of a BEC that attracts atoms. The atoms stream into it but cannot give up their energy when they arrive (they’re already in their lowest energy state), and so they stream across the well at supersonic speed.

The result is a region within the BEC in which the atoms move at supersonic speed. This is the black hole: any phonon unlucky enough to stray into this region cannot escape.
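A minimal numerical sketch of the horizon condition (generic BEC relations with made-up profiles, not the Haifa group's parameters): the local sound speed in a condensate is c_s = \sqrt{g n / m}, and the acoustic horizon sits wherever the flow speed crosses it.

import numpy as np

# Illustrative 1D profiles in arbitrary units (not data from the experiment).
x = np.linspace(-10.0, 10.0, 401)
density = 1.0 - 0.6 * np.exp(-x**2)        # condensate density dips in a central region
flow_speed = 0.9 / density                 # constant flux: flow accelerates where density drops

g_over_m = 1.0                             # interaction constant over atomic mass
sound_speed = np.sqrt(g_over_m * density)  # c_s = sqrt(g * n / m)

supersonic = flow_speed > sound_speed      # True inside the acoustic "black hole"
crossings = np.where(np.diff(supersonic.astype(int)) != 0)[0]
print("sonic horizons near x =", np.round(x[crossings], 2))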

One reason why sonic black holes are so highly prized is that they ought to produce Hawking radiation. Quantum mechanics predicts that pairs of “virtual” phonons with equal and opposite momentum ought to be constantly springing in and out of existence in BECs.

If one of this pair were to cross the event horizon, it would be sucked into the black hole, never to escape. The other, however, would be free to go on its way. This stream of escapees would be the famous, but as yet unobserved, Hawking radiation.

Lahav and his buddies haven’t gotten this far yet, but they’ve made an important step toward observing Hawking radiation and clearly have their eyes on this goal.

There’s no shortage of competition here, and the creation of the first sonic black hole will be sure to spur the competition. Expect to see somebody claim the first observation of Hawking radiation soon.

Ref: arxiv.org/abs/0906.1337: A Sonic Black Hole in a Density-Inverted Bose-Einstein Condensate

Does the present freeze the past?

A paper to read and send to Ion. The idea that the present is analogous to the front of a phase-transition process (from a supercooled phase?) is interesting. The ubiquity of heuristic metaphors in the formulation of new scientific theories is also remarkable.
Update: After reading the paper, I think the idea of a phase-transition front along the time axis, as well as the analogy that the system as a whole would be in a “supercooled” (metastable) state, could be explored further.
Update 2: I think the title of the post should have been: “Does the past freeze the present?”

Time and Spacetime: The Crystallizing Block Universe

(Submitted on 4 Dec 2009)

The nature of the future is completely different from the nature of the past. When quantum effects are significant, the future shows all the signs of quantum weirdness, including duality, uncertainty, and entanglement. With the passage of time, after the time-irreversible process of state-vector reduction has taken place, the past emerges, with the previous quantum uncertainty replaced by the classical certainty of definite particle identities and states. The present time is where this transition largely takes place, but the process does not take place uniformly: Evidence from delayed choice and related experiments shows that isolated patches of quantum indeterminacy remain, and that their transition from probability to certainty only takes place later. Thus, when quantum effects are significant, the picture of a classical Evolving Block Universe (`EBU’) cedes place to one of a Crystallizing Block Universe (`CBU’), which reflects this quantum transition from indeterminacy to certainty, while nevertheless resembling the EBU on large enough scales.

Comments: 25 Pages. 3 figures
Subjects: Quantum Physics (quant-ph); General Relativity and Quantum Cosmology (gr-qc)
Cite as: arXiv:0912.0808v1 [quant-ph]