Machine learning using approximate inference : variational and sequential Monte Carlo methods / Christian Andersson Naesseth.
- Andersson Naesseth, Christian, 1986- (author)
  Alternative name: Naesseth, Christian Andersson, 1986-
- Schön, Thomas, 1977- (supervisor)
  Alternative name: Schön, Thomas B., 1977-
- Lindsten, Fredrik, 1984- (supervisor)
- Murray, Iain (opponent)
- Linköpings universitet. Institutionen för systemteknik (publisher)
  Alternative name (English): Linköping University. Department of Electrical Engineering
  Alternative name: ISY
- Linköpings universitet. Tekniska fakulteten (publisher)
- Published: Linköping : Linköping University, Department of Electrical Engineering, [2018]
- Produced: 2018
- English, 1 online resource (xiv, 39 pages)
Series: Linköping studies in science and technology. Dissertations, 0345-7524 ; 1969

Read the full text (abstract and comprehensive summary from Linköping University Electronic Press)
Abstract
- Automatic decision making and pattern recognition under uncertainty are difficult tasks that are ubiquitous in our everyday life. The systems we design, and the technology we develop, require us to coherently represent and work with uncertainty in data. Probabilistic models and probabilistic inference give us a powerful framework for solving this problem. Using this framework, while enticing, results in difficult-to-compute integrals and probabilities when conditioning on the observed data. This means we need approximate inference: methods that solve the problem approximately using a systematic approach. In this thesis we develop new methods for efficient approximate inference in probabilistic models. There are generally two approaches to approximate inference: variational methods and Monte Carlo methods. In Monte Carlo methods we use a large number of random samples to approximate the integral of interest. With variational methods, on the other hand, we turn the integration problem into an optimization problem. We develop algorithms of both types and bridge the gap between them. First, we present a self-contained tutorial on the popular sequential Monte Carlo (SMC) class of methods. Next, we propose new algorithms and applications based on SMC for approximate inference in probabilistic graphical models. We derive nested sequential Monte Carlo, a new algorithm particularly well suited to inference in a large class of high-dimensional probabilistic models. Then, inspired by similar ideas, we derive interacting particle Markov chain Monte Carlo, which uses parallelization to speed up approximate inference for universal probabilistic programming languages. After that, we show how the rejection sampling process used when generating gamma-distributed random variables can be exploited to speed up variational inference.
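The plain Monte Carlo idea described in the abstract can be sketched in a few lines. This is an illustrative example only (the integrand, distribution, and sample size are arbitrary choices, not taken from the thesis):

```python
import numpy as np

# Plain Monte Carlo: approximate the integral E[f(X)] = ∫ f(x) p(x) dx
# by averaging f over random samples drawn from p. Here p is a standard
# normal and f(x) = x**2, so the true value is 1 (the variance).
rng = np.random.default_rng(0)
samples = rng.standard_normal(100_000)
estimate = np.mean(samples ** 2)
print(estimate)  # close to 1.0; the error shrinks as O(1/sqrt(n))
```

The estimator is unbiased, and its standard error decays with the square root of the number of samples regardless of dimension, which is what makes Monte Carlo attractive for the high-dimensional integrals that arise when conditioning on data.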
Finally, we bridge the gap between SMC and variational methods by developing variational sequential Monte Carlo, a new flexible family of variational approximations.
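As a sketch of the SMC idea that runs through the thesis, here is a minimal bootstrap particle filter (the simplest SMC algorithm) for a toy linear Gaussian state-space model. The model, parameter values, and function names are illustrative assumptions, not code or settings from the thesis:

```python
import numpy as np

# Toy linear Gaussian state-space model (illustrative parameters):
#   x_t = 0.9 * x_{t-1} + v_t,  v_t ~ N(0, 1)   (transition)
#   y_t = x_t + e_t,            e_t ~ N(0, 1)   (observation)

def bootstrap_pf(y, n_particles=500, phi=0.9, seed=0):
    """Bootstrap particle filter: propagate, weight, estimate, resample."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n_particles)  # initial particles from the prior
    means = []
    for y_t in y:
        # Propagate each particle through the transition model.
        x = phi * x + rng.standard_normal(n_particles)
        # Weight particles by the observation likelihood N(y_t; x_t, 1).
        logw = -0.5 * (y_t - x) ** 2
        w = np.exp(logw - logw.max())
        w /= w.sum()
        # Self-normalized estimate of the filtering mean E[x_t | y_{1:t}].
        means.append(np.sum(w * x))
        # Resample to avoid weight degeneracy.
        idx = rng.choice(n_particles, size=n_particles, p=w)
        x = x[idx]
    return np.array(means)

# Simulate data from the model and run the filter on it.
rng = np.random.default_rng(1)
T = 50
x_true = np.zeros(T)
y = np.zeros(T)
for t in range(T):
    prev = x_true[t - 1] if t > 0 else 0.0
    x_true[t] = 0.9 * prev + rng.standard_normal()
    y[t] = x_true[t] + rng.standard_normal()
est = bootstrap_pf(y)
```

Each iteration is one propagate-weight-resample step; the weighted particles form an empirical approximation of the filtering distribution, and resampling keeps the particle population concentrated where the posterior has mass.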
Subject headings
- Machine learning (sao)
- Statistical inference (sao)
- Approximations (sao)
- Monte Carlo methods (sao)
- Engineering and Technology (ssif)
- Electrical Engineering, Electronic Engineering, Information Engineering (ssif)
- Control Engineering (ssif)
- Natural Sciences (ssif)
- Computer and Information Sciences (ssif)
- Computer Sciences (ssif)
- Signal Processing (ssif)
- Monte Carlo method (LCSH)
- Machine learning (LCSH)
- Mathematical statistics (LCSH)
- Probabilities (LCSH)
Genre
- government publication (marcgt)
Classification
- 006.31 (DDC)
- Pud (kssb/8 (machine generated))
The title is held by 1 library.