prosper

Published on December 28, 2007
Author: Lucianna
Source: authorstream.com

Content

Bayesian Within the Gates: A View From Particle Physics
Harrison B. Prosper, Florida State University
SAMSI, 24 January 2006

Outline
- Measuring Zero as Precisely as Possible!
- Signal/Background Discrimination
  - 1-D Example
  - 14-D Example
- Some Open Issues
- Summary

Measuring Zero!
Diamonds may not be forever: neutron <-> anti-neutron transitions, CRISP experiment (1982-1985), Institut Laue-Langevin, Grenoble, France.
Method: fire a gas of cold neutrons onto a graphite foil and look for annihilation of the anti-neutron component.

Measuring Zero!
Count the number of signal + background events, N. Suppress the putative signal and count background events, B, independently.
Results: N = 3, B = 7.

Measuring Zero!
This is the classic 2-parameter counting experiment:
  N ~ Poisson(s + b)
  B ~ Poisson(b)
Wanted: a statement like s < u(N, B) @ 90% CL.

Measuring Zero!
In 1984, no exact solution existed in the particle physics literature! But surely it must have been solved by statisticians. Alas, from Kendall and Stuart I learnt that calculating exact confidence intervals is "a matter of very considerable difficulty".

Measuring Zero!
Exact in what way? Over the ensemble of statements of the form s ∈ [0, u), at least 90% of them should be true, whatever the true value of the signal s AND whatever the true value of the background parameter b. Blame... Neyman (1937).

Slide 8
"Keep it simple, but no simpler" (Albert Einstein)

Bayesian @ the Gate (1984)
Solution:
  p(N, B|s, b) = Poisson(s + b) Poisson(b)   (the likelihood)
  p(s, b) = uniform(s, b)                    (the prior)
Compute the posterior density:
  p(s, b|N, B) = p(N, B|s, b) p(s, b) / p(N, B)
Marginalize over b:
  p(s|N, B) = ∫ p(s, b|N, B) db
This reasoning was compelling to me then, and is much more so now!
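The marginalization above is easy to carry out numerically today. The following is a minimal sketch (my illustration, not the code used in 1984), assuming flat priors truncated to finite grids; it computes the 90% credible upper limit u for the observed counts N = 3, B = 7.

```python
# Minimal numerical sketch of the 1984 Bayesian solution. The grid sizes
# and truncation ranges are illustrative assumptions, not from the talk.
import math
import numpy as np

N_obs, B_obs = 3, 7                          # observed counts

s = np.linspace(0, 20, 2001)                 # signal grid
b = np.linspace(0, 30, 3001)                 # background grid
S, Bg = np.meshgrid(s, b, indexing="ij")

def poisson(k, mu):
    return np.exp(-mu) * mu**k / math.factorial(k)

# likelihood x flat prior: p(N,B|s,b) = Poisson(N; s+b) Poisson(B; b)
post = poisson(N_obs, S + Bg) * poisson(B_obs, Bg)

# marginalize over b, then normalize to obtain p(s|N,B)
p_s = post.sum(axis=1)
p_s /= p_s.sum()

# 90% credible upper limit: smallest u with P(s < u) >= 0.90
u = s[np.searchsorted(np.cumsum(p_s), 0.90)]
print(f"90% upper limit on s: {u:.2f}")   # approximately 3.8
```

Note how the deficit (N = 3 observed against an expected background of 7) still yields a sensible, non-empty upper limit, which was part of the appeal of the Bayesian treatment.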
Particle Physics Data
proton + anti-proton -> positron (e+) + neutrino (ν) + Jet1 + Jet2 + Jet3 + Jet4
This event "lives" in 3 + 2 + 3 x 4 = 17 dimensions.

Particle Physics Data
CDF/DZero discovery of the top quark (1995). DZero mapped the 17-D data down to 2-D. (Plot: data in red, signal in green, backgrounds in blue and magenta.)

Slide 12
But that was then, and now is now! Today we have 2 GHz laptops with 2 GB of memory! It is fun to deploy huge, sometimes unreliable, computational resources, that is, brains, to reduce the dimensionality of data. But perhaps it is now feasible to work directly in the original high-dimensional space, using hardware!

Signal/Background Discrimination
The optimal solution is to compute
  p(S|x) = p(x|S) p(S) / [p(x|S) p(S) + p(x|B) p(B)]
Every signal/background discrimination method is ultimately an algorithm to approximate this solution, or a mapping thereof. Therefore, if a method is already at the Bayes limit, no other method, however sophisticated, can do better!

Signal/Background Discrimination
Given D = (x, y), with x = {x1,..., xN} and y = {y1,..., yN}, a set of N training examples, infer a discriminant function f(x, w) with parameters w:
  p(w|x, y) = p(x, y|w) p(w) / p(x, y)
            = p(y|x, w) p(x|w) p(w) / [p(y|x) p(x)]
            = p(y|x, w) p(w) / p(y|x)    assuming p(x|w) -> p(x)

Signal/Background Discrimination
A typical likelihood for classification:
  p(y|x, w) = Π_i f(xi, w)^yi [1 - f(xi, w)]^(1 - yi)
where yi = 0 for background events and yi = 1 for signal events. If f(x, w) is flexible enough, then maximizing p(y|x, w) with respect to w yields f = p(S|x), asymptotically.

Signal/Background Discrimination
However, in a full Bayesian calculation one usually averages with respect to the posterior density:
  y(x) = ∫ f(x, w) p(w|D) dw
Questions:
1. Do suitably flexible functions f(x, w) exist?
2. Is there a feasible way to do the integral?
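The claim that maximizing p(y|x, w) drives f toward p(S|x) can be checked in a toy setting. In the sketch below (my illustration, not from the talk), signal and background are assumed 1-D Gaussians N(+1, 1) and N(-1, 1) with equal priors, for which the exact Bayes discriminant is p(S|x) = 1/(1 + e^(-2x)); fitting f(x, w) = sigmoid(w0 + w1 x) by gradient ascent on the log-likelihood should recover w close to (0, 2).

```python
# Toy check: maximizing p(y|x,w) = prod_i f(xi,w)^yi [1-f(xi,w)]^(1-yi)
# recovers the Bayes discriminant. Densities are assumed, not from the talk.
import numpy as np

rng = np.random.default_rng(0)
n = 5000
x = np.concatenate([rng.normal(+1.0, 1.0, n),    # signal events, y = 1
                    rng.normal(-1.0, 1.0, n)])   # background events, y = 0
y = np.concatenate([np.ones(n), np.zeros(n)])

# f(x, w) = sigmoid(w0 + w1*x); ascend the mean log-likelihood
X = np.column_stack([np.ones_like(x), x])
w = np.zeros(2)
for _ in range(2000):
    f = 1.0 / (1.0 + np.exp(-X @ w))
    w += 0.5 * X.T @ (y - f) / len(y)   # gradient of the mean log-likelihood

print(w)   # close to (0, 2), the weights of the exact discriminant
```

The fitted sigmoid is, for this example, exactly the family that contains p(S|x), so the maximum-likelihood fit lands on the Bayes discriminant up to sampling noise, illustrating the asymptotic claim.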
Answer 1: Hilbert's 13th Problem!
Prove that the following is impossible:
  y(x, y, z) = F( A(x), B(y), C(z) )
In 1957, Kolmogorov proved the contrary conjecture:
  y(x1,..., xn) = F( f1(x1),..., fn(xn) )
I'll call such functions F Kolmogorov functions.

Kolmogorov Functions
A neural network is an example of a Kolmogorov function, that is, a function capable of approximating arbitrary mappings f: R^N -> U. The parameters w = (u, a, v, b) are called weights.

Answer 2: Use Hybrid MCMC
Computational method: generate a Markov chain of N points {w} drawn from the posterior density p(w|D), and average over the last M points. Each point corresponds to a network.
Software: Flexible Bayesian Modeling, by Radford Neal
http://www.cs.utoronto.ca/~radford/fbm.software.html

A 1-D Example
Signal: p + pbar -> t q b
Background: p + pbar -> W b b
NN model class: (1, 15, 1)
MCMC: 500 tqb + Wbb events; use the last 20 networks in a Markov chain of 500. (Plot: the Wbb and tqb distributions in the variable x.)

A 1-D Example
(Plot: dots show p(S|x) = HS/(HS + HB), where HS and HB are 1-D histograms; the light curves are individual networks n(x, wk); the black curve is the average <n(x, w)>.)

A 14-D Example (Finding Susy!)
Transverse momentum spectra (signal: black curve). Signal/noise = 1/100,000.

A 14-D Example (Finding Susy!)
Missing transverse momentum spectrum (caused by the escape of neutrinos and Susy particles).
Variable count: 4 x (ET, η, φ) + (ET, φ) = 14.

A 14-D Example (Finding Susy!)
Signal: 250 p + pbar -> top + anti-top (MC) events
Background: 250 p + pbar -> gluino gluino (MC) events
NN model class: (14, 40, 1), a 641-D parameter space!
MCMC: use the last 100 networks in a Markov chain of 10,000, skipping every 20.

But does it Work?
Signal to noise can reach 1/1 with an acceptable signal strength.
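The averaging step, y(x) = (1/M) Σ_k f(x, w_k) over the M networks kept from the chain, can be sketched as follows. The random "chain" below is only a stand-in for genuine draws from p(w|D) (in the talk these come from Neal's Flexible Bayesian Modeling software), and the (1, H, 1) network form f(x, w) = sigmoid(v · tanh(a x + b) + c) is an assumed parameterization.

```python
# Sketch of posterior averaging over an ensemble of (1, H, 1) networks.
# The random weights stand in for the last M points of an MCMC chain.
import numpy as np

rng = np.random.default_rng(1)
H, M = 15, 20                                 # hidden units; networks kept

def f(x, w):
    """One (1, H, 1) network: sigmoid(v . tanh(a*x + b) + c)."""
    a, b, v, c = w
    h = np.tanh(np.outer(x, a) + b)           # hidden layer, shape (len(x), H)
    return 1.0 / (1.0 + np.exp(-(h @ v + c)))

chain = [(rng.normal(size=H), rng.normal(size=H),
          rng.normal(size=H), rng.normal()) for _ in range(M)]

x = np.linspace(-3.0, 3.0, 7)
y_avg = np.mean([f(x, w) for w in chain], axis=0)   # y(x) = <f(x, w)>
print(y_avg)
```

Each element of `chain` is one network; the ensemble average corresponds to the black curve on the 1-D example slide, with the individual `f(x, w)` curves as its spread.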
But does it Work?
Let d(x) = N p(x|S) + N p(x|B) be the density of the data, containing 2N events, assuming, for simplicity, p(S) = p(B). A properly trained classifier y(x) approximates
  p(S|x) = p(x|S) / [p(x|S) + p(x|B)]
Therefore, if the signal and background events are weighted with y(x), we should recover the signal density.

But does it Work?
Amazingly well!

Some Open Issues
- Why does this insane function p(w1,..., w641|x1,..., x500) behave so well? 641 parameters > 500 events!
- How should one verify that an n-D (n ~ 14) swarm of simulated background events matches the n-D swarm of observed events (in the background region)?
- How should one verify that y(x) is indeed a reasonable approximation to the Bayes discriminant, p(S|x)?

Summary
- Bayesian methods have been, and are being, used with considerable success by particle physicists. Happily, the frequentist/Bayesian Cold War is abating!
- The application of Bayesian methods to highly flexible functions, e.g., neural networks, is very promising and should be broadly applicable.
- Needed: a powerful way to compare high-dimensional swarms of points.
Agree, or not agree, that is the question!
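As a footnote to the "But does it Work?" check above: weighting the 2N-event mixture d(x) by y(x) gives d(x) p(S|x) = N p(x|S), the signal density, which can be verified numerically. The sketch below is my illustration with assumed 1-D Gaussian densities, for which p(S|x) = 1/(1 + e^(-2x)) exactly.

```python
# Numerical version of the weighting check, with assumed densities:
# signal x ~ N(+1, 1), background x ~ N(-1, 1), and p(S) = p(B).
import numpy as np

rng = np.random.default_rng(2)
N = 100_000
xs = rng.normal(+1.0, 1.0, N)                 # signal events
xb = rng.normal(-1.0, 1.0, N)                 # background events
x = np.concatenate([xs, xb])                  # the 2N-event sample, density d(x)

y = 1.0 / (1.0 + np.exp(-2.0 * x))            # exact p(S|x) for these densities

bins = np.linspace(-4.0, 4.0, 41)
h_weighted, _ = np.histogram(x, bins, weights=y)   # mixture weighted by y(x)
h_signal, _ = np.histogram(xs, bins)               # true signal histogram

# the two histograms agree bin by bin, up to Poisson fluctuations
print(h_weighted.sum(), h_signal.sum())
```

In practice y(x) would be the trained discriminant rather than the exact p(S|x), so the residual disagreement between the two histograms is itself a diagnostic of how far y(x) is from the Bayes limit.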
