
[Axiom-developer] Re: Axiom + High Energy Physics

From: Bob McElrath
Subject: [Axiom-developer] Re: Axiom + High Energy Physics
Date: Thu, 10 Nov 2005 16:48:12 -0800
User-agent: Mutt/1.5.11

C Y address@hidden wrote:
> --- Bob McElrath <address@hidden> wrote:
> > C Y address@hidden wrote:
> > > --- Bob McElrath <address@hidden> wrote:
> > > 
> > > > Cernlib certainly contains mathematical routines.
> > > > 
> > > > However it contains a very large number of things which are not
> > > > mathematical routines, and it is written in fortran.
> > > 
> > > I thought it was being ported to C?  Did that not happen?
> > 
> > no...though the majority of it is in root.
> Hmm.  OK.  So scratch cernlibs, and focus on root.

Root is also not something you want to absorb into axiom.

Root is almost an operating system.  It contains (among other
things): a database, an object browser, a dynamic object system for C++,
a C++ interpreter, an OpenGL graphics toolkit, and drawing and canvas
routines.

As of the most recent version, though, it has been placed under the
LGPL.  (there are debian packages!!!)  If it contains useful routines,
we should extract them and incorporate them into axiom, rather than
interfacing with root...

> Ho boy, I've got a long ways to go.  Is leading order the decay
> products from t B j H, and next to leading order the products from
> those products?

Leading order means roughly "the biggest piece".  In this context I mean
a diagram containing no closed loops (I am not worried about the decay
products of t,b,j,h -- those can be handled straightforwardly).  NLO
(next-to-leading order) is diagrams with one loop.
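Schematically (my notation, not any particular convention), the cross
section is a perturbative series, with each loop order suppressed by a
further power of the coupling:

```latex
\sigma \;=\; \sigma_{\mathrm{LO}}
      \;+\; \alpha_s\,\sigma_{\mathrm{NLO}}
      \;+\; \mathcal{O}(\alpha_s^2)
```

Here $\sigma_{\mathrm{LO}}$ sums the tree-level (no-loop) diagrams and
$\sigma_{\mathrm{NLO}}$ the one-loop corrections.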

And, for the record, I just finished calculating that cross section, and
it's tiny.  A bad example.  ;)

> > There are a lot of very smart people working on this, it's a highly
> > non-trivial problem.  The above example involves a few thousand
> > feynman diagrams.
> Um. (setf *question-mode* 'clueless-dweeb)  How does one display a few
> thousand diagrams usably?  Or are the feynman diagrams the means rather
> than the end?

The feynman diagrams are a useful representation that helps get an
intuitive feel.  Of course there is a 1:1 conversion from diagrams to
formulas.  But, there are other ways to solve the path integral.

The programs I listed (madgraph, comphep, grace) generate the diagrams
algorithmically (usually in a postscript file).  Feyncalc will generate
loop diagrams algorithmically and display them using kludgy mma
graphics.  Generating the diagrams is non-trivial but not hard.  It's
evaluating them that's a bitch.  ;)
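As a toy illustration of why the diagram count explodes (my own
back-of-the-envelope sketch, not how any of those generators actually
work): by Wick's theorem the number of complete pairwise contractions of
2n fields is the double factorial (2n-1)!!, which grows factorially
before loops even enter the picture:

```python
def double_factorial(n):
    """Product n * (n-2) * (n-4) * ... down to 1 or 2."""
    result = 1
    while n > 1:
        result *= n
        n -= 2
    return result

def wick_pairings(n_fields):
    """Number of complete pairings (Wick contractions) of n_fields fields.

    Zero for an odd number of fields, (n_fields - 1)!! otherwise.
    """
    if n_fields % 2:
        return 0
    return double_factorial(n_fields - 1)

for n in (4, 8, 12, 16):
    print(n, wick_pairings(n))   # 3, 105, 10395, 2027025
```

Most of these pairings are disconnected or vacuum pieces, so the real
diagram count is smaller, but the factorial growth is the point.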

> How much of that can be either pre-entered as a function of the
> experimental conditions or algorithmically decided?

Little.  Some approaches (like Sherpa) attempt to encode lots of
experimental information so the user doesn't have to deal with it.
Experimental collaborations generally have their own "monte carlo"
system, which really means a big pile of ugliness designed to compute
the things they most intend to measure.  If the user asks for a specific
process that has not been calculated at NLO, it usually falls back to
one of the tree-level generators (madgraph, comphep, grace).  But this
has known problems.  Cross sections are usually wrong by as much as a
factor of two, but worse, angular relations among particles can be
significantly wrong.

This, I think, is the really hard part: Identifying what can be
pre-decided and handled without the user's intervention.  Most packages
right now just return garbage results, and the user doesn't know there
was a problem.  For instance, infrared singularities cancel between
leading order and next to leading order, but can instead be handled by
placing realistic detector resolution cuts.  All programs I know will
happily integrate the singularity and return NaN or garbage.  In
principle this can be recognized and handled.
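A toy numerical illustration (entirely my own sketch; x stands in for
something like a gluon energy fraction): the integrand 1/x diverges
logarithmically at x -> 0, so a naive Monte Carlo over (0,1) returns a
meaningless number that grows with the sample size, while a
resolution-style cut x > x_min gives the finite answer -ln(x_min):

```python
import math
import random

def mc_integral(f, lo, hi, n=200000, seed=1):
    """Plain uniform-sampling Monte Carlo estimate of the integral of f on (lo, hi)."""
    rng = random.Random(seed)
    total = sum(f(lo + (hi - lo) * rng.random()) for _ in range(n))
    return (hi - lo) * total / n

singular = lambda x: 1.0 / x  # toy infrared-singular "matrix element"

# No cut: the true integral over (0,1) diverges, so this estimate is
# garbage -- it depends entirely on how close to 0 the sampler wandered.
print("no cut:  ", mc_integral(singular, 0.0, 1.0))

# With a detector-resolution-style cut x > x_min the answer is finite.
x_min = 1e-3
print("with cut:", mc_integral(singular, x_min, 1.0))  # approx -ln(1e-3)
print("exact:   ", -math.log(x_min))
```

In a real NLO calculation the divergence instead cancels against the
virtual corrections; the cut version mimics what realistic detector
resolution cuts accomplish.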

> Fun!  So there's some opportunity for real research here, from the
> sound of things.

Oh definitely.  There's opportunity for many Ph.D. theses.

> > Many people actually see this as a crisis.  When the LHC turns on we
> > will quickly be in a situation where the theory error bars are much
> > larger than the experimental error bars, because next-to-leading
> > order results have in general not been calculated for many 
> > processes.  (and in many cases, even that is not sufficient accuracy)
> Wow.  That's kind of a fun time, actually.

I'm not so sure "fun" is the right word.  But I look forward to the day
that the physics departments around the country kick themselves for
hiring so many useless string theorists.  (Europeans have generally been
more pragmatic on this point.)

> Maybe if they solve some of the difficult questions you mentioned
> earlier and happened to do it in Axiom...

That would be nice.  That's a matter of building the low level tools
(dirac matrices, polylogarithms), and advertising.
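For instance (a minimal numpy sketch of the sort of low-level check such
a tool would need; the representation and metric signature are my
choice): the Dirac matrices in the Dirac representation, verified
against the Clifford algebra {gamma^mu, gamma^nu} = 2 g^{mu nu}:

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
Z2 = np.zeros((2, 2), dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)   # Pauli matrices
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def block(a, b, c, d):
    """Assemble a 4x4 matrix from four 2x2 blocks."""
    return np.block([[a, b], [c, d]])

# Dirac representation: gamma^0 = diag(I, -I), gamma^i off-diagonal in sigma_i
gamma = [
    block(I2, Z2, Z2, -I2),   # gamma^0
    block(Z2, sx, -sx, Z2),   # gamma^1
    block(Z2, sy, -sy, Z2),   # gamma^2
    block(Z2, sz, -sz, Z2),   # gamma^3
]
g = np.diag([1.0, -1.0, -1.0, -1.0])  # metric, signature (+,-,-,-)

# Check {gamma^mu, gamma^nu} = 2 g^{mu nu} * identity
for mu in range(4):
    for nu in range(4):
        anti = gamma[mu] @ gamma[nu] + gamma[nu] @ gamma[mu]
        assert np.allclose(anti, 2 * g[mu, nu] * np.eye(4))
print("Clifford algebra relations verified")
```

This is exactly the kind of identity an Axiom domain for Dirac algebra
would need to satisfy symbolically rather than numerically.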

> Sounds like a challenge!  Maybe Maxima's tensor package would have some
> useful hints for that part of the operation - it has been heavily
> worked on of late (I noticed you mentioned it in your software page -
> last I heard it had been fixed and was working.)

Cool, I'll have to take a look again.

Bob McElrath [Univ. of California at Davis, Department of Physics]

    "In science, 'fact' can only mean 'confirmed to such a degree that it would
    be perverse to withhold provisional assent.' I suppose that apples might
    start to rise tomorrow, but the possibility does not merit equal time in
    physics classrooms." -- Stephen Jay Gould (1941 - 2002)

