Markov Decision Processes (e-book) (used book) | D J White | bookbook.eu

(5.00 Goodreads rating)

Description

Examines several fundamentals concerning the manner in which Markov decision problems may be properly formulated and the determination of solutions or their properties. Coverage includes optimality equations, algorithms and their characteristics, probability distributions, and modern developments in the Markov decision process area, namely structural policy analysis, approximation modeling, multiple objectives and Markov games. Copiously illustrated with examples.
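
The optimality equations and solution algorithms mentioned in the blurb are the technical heart of the subject. As a rough illustration only, here is a minimal value-iteration sketch in Python for an invented two-state problem; the states, actions, transition probabilities, rewards and the discount factor are assumptions made up for this example and are not taken from the book.

    # Minimal value-iteration sketch for a toy two-state MDP (hypothetical data).
    # P[s][a] is a list of (probability, next_state, reward) outcomes.
    P = {
        0: {"stay": [(1.0, 0, 0.0)],
            "go":   [(0.8, 1, 5.0), (0.2, 0, 0.0)]},
        1: {"stay": [(1.0, 1, 1.0)],
            "go":   [(0.9, 0, 0.0), (0.1, 1, 1.0)]},
    }
    gamma = 0.9                      # discount factor (assumed)
    V = {s: 0.0 for s in P}          # initial value estimates

    # Repeatedly apply the Bellman optimality operator
    #   V(s) <- max_a  sum_{s'} p(s' | s, a) * ( r(s, a, s') + gamma * V(s') )
    # until the values stop changing.
    for _ in range(10_000):
        V_new = {
            s: max(
                sum(p * (r + gamma * V[s2]) for p, s2, r in outcomes)
                for outcomes in actions.values()
            )
            for s, actions in P.items()
        }
        if max(abs(V_new[s] - V[s]) for s in P) < 1e-9:
            V = V_new
            break
        V = V_new

    # The greedy policy with respect to the converged values is optimal.
    policy = {
        s: max(actions, key=lambda a: sum(p * (r + gamma * V[s2])
                                          for p, s2, r in actions[a]))
        for s, actions in P.items()
    }
    print("V* =", V, " policy =", policy)

For a discounted problem like this, each application of the Bellman optimality operator shrinks the error by a factor of gamma, which is why value iteration converges and why it is typically the first algorithm presented for Markov decision processes.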

  • Author: D J White
  • Publisher:
  • ISBN-10: 0471936278
  • ISBN-13: 9780471936275
  • Format: 15.6 x 23.6 x 2.2 cm, hardcover
  • Language: English

Reviews

  • No reviews
0 customers have rated this item.