Statements in which the resource exists as a subject.
Predicate                 Object
rdf:type
lifeskim:mentions
pubmed:issue              3
pubmed:dateCreated        2000-03-16
pubmed:abstractText       Diagnosis of a disease and its treatment are not separate, one-shot activities. Instead, they are very often dependent and interleaved over time. This is mostly due to uncertainty about the underlying disease, uncertainty associated with the response of a patient to the treatment and varying cost of different diagnostic (investigative) and treatment procedures. The framework of partially observable Markov decision processes (POMDPs) developed and used in the operations research, control theory and artificial intelligence communities is particularly suitable for modeling such a complex decision process. In this paper, we show how the POMDP framework can be used to model and solve the problem of the management of patients with ischemic heart disease (IHD), and demonstrate the modeling advantages of the framework over standard decision formalisms.
pubmed:grant
pubmed:language           eng
pubmed:journal
pubmed:citationSubset     IM
pubmed:status             MEDLINE
pubmed:month              Mar
pubmed:issn               0933-3657
pubmed:author
pubmed:issnType           Print
pubmed:volume             18
pubmed:owner              NLM
pubmed:authorsComplete    Y
pubmed:pagination         221-44
pubmed:dateRevised        2007-11-14
pubmed:meshHeading
pubmed:year               2000
pubmed:articleTitle       Planning treatment of ischemic heart disease with partially observable Markov decision processes.
pubmed:affiliation        Computer Science Department, Box 1910, Brown University, Providence, RI 02912, USA. milos@cs.brown.edu
pubmed:publicationType    Journal Article; Research Support, U.S. Gov't, P.H.S.; Research Support, U.S. Gov't, Non-P.H.S.
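The abstract above frames patient management as a POMDP: the true disease state is never observed directly, and the belief over that state is revised as test results and treatment responses arrive. The Python sketch below illustrates the standard POMDP belief update that underlies such models. It is an illustrative example only; the function, variable names, and toy numbers are assumptions for this sketch and are not taken from the paper.

# Minimal sketch of a POMDP belief update (illustrative assumptions, not the paper's model):
# T[a][s][s2] is the transition probability P(s2 | s, a),
# O[a][s2][o] is the observation probability P(o | s2, a),
# b is the current belief, a probability distribution over hidden states.

def belief_update(b, a, o, T, O):
    """Return the new belief b' after taking action a and observing o."""
    n = len(b)
    new_b = [0.0] * n
    for s2 in range(n):
        # b'(s2) is proportional to P(o | s2, a) * sum_s P(s2 | s, a) * b(s)
        new_b[s2] = O[a][s2][o] * sum(T[a][s][s2] * b[s] for s in range(n))
    total = sum(new_b)
    if total == 0.0:
        raise ValueError("Observation has zero probability under the current belief")
    return [p / total for p in new_b]  # normalize so the belief sums to 1

if __name__ == "__main__":
    # Toy example with made-up numbers: two hidden states (healthy, ischemic),
    # one "run a test" action, and two observations (negative, positive).
    T = [[[1.0, 0.0], [0.0, 1.0]]]   # the test does not change the disease state
    O = [[[0.9, 0.1], [0.2, 0.8]]]   # the test is imperfect in both states
    b = [0.5, 0.5]                    # uniform prior over the two states
    print(belief_update(b, a=0, o=1, T=T, O=O))  # belief after a positive test

Under these toy numbers a positive test shifts the belief from (0.5, 0.5) to roughly (0.11, 0.89), which is the kind of sequential revision of uncertainty about the underlying disease that the abstract describes.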