Statements in which the resource exists.
Subject | Predicate | Object | Context
pubmed-article:10708022 | rdf:type | pubmed:Citation | lld:pubmed
pubmed-article:10708022 | lifeskim:mentions | umls-concept:C1704922 | lld:lifeskim
pubmed-article:10708022 | lifeskim:mentions | umls-concept:C0681916 | lld:lifeskim
pubmed-article:10708022 | lifeskim:mentions | umls-concept:C0750572 | lld:lifeskim
pubmed-article:10708022 | lifeskim:mentions | umls-concept:C0680844 | lld:lifeskim
pubmed-article:10708022 | lifeskim:mentions | umls-concept:C1706907 | lld:lifeskim
pubmed-article:10708022 | pubmed:issue | 3 | lld:pubmed
pubmed-article:10708022 | pubmed:dateCreated | 2000-3-23 | lld:pubmed
pubmed-article:10708022 | pubmed:abstractText | One of the essential ways in which nonlinear image restoration algorithms differ from linear, convolution-type image restoration filters is their capability to restrict the restoration result to nonnegative intensities. The iterative constrained Tikhonov-Miller (ICTM) algorithm, for example, incorporates the nonnegativity constraint by clipping all negative values to zero after each iteration. This constraint will be effective only when the restored intensities have near-zero values. Therefore the background estimation will have an influence on the effectiveness of the nonnegativity constraint of these algorithms. We investigated quantitatively the dependency of the performance of the ICTM, Carrington, and Richardson-Lucy algorithms on the estimation of the background and compared it with the performance of the linear Tikhonov-Miller restoration filter. We found that the performance depends critically on the background estimation: An underestimation of the background will make the nonnegativity constraint ineffective, which results in a performance that does not differ much from the Tikhonov-Miller filter performance. A (small) overestimation, however, degrades the performance dramatically, since it results in a clipping of object intensities. We propose a novel general method to estimate the background based on the dependency of nonlinear restoration algorithms on the background, and we demonstrate its applicability on real confocal images. | lld:pubmed
pubmed-article:10708022 | pubmed:language | eng | lld:pubmed
pubmed-article:10708022 | pubmed:journal | http://linkedlifedata.com/r... | lld:pubmed
pubmed-article:10708022 | pubmed:citationSubset | IM | lld:pubmed
pubmed-article:10708022 | pubmed:status | MEDLINE | lld:pubmed
pubmed-article:10708022 | pubmed:month | Mar | lld:pubmed
pubmed-article:10708022 | pubmed:issn | 1084-7529 | lld:pubmed
pubmed-article:10708022 | pubmed:author | pubmed-author:van KempenG... | lld:pubmed
pubmed-article:10708022 | pubmed:author | pubmed-author:van VlietL... | lld:pubmed
pubmed-article:10708022 | pubmed:issnType | Print | lld:pubmed
pubmed-article:10708022 | pubmed:volume | 17 | lld:pubmed
pubmed-article:10708022 | pubmed:owner | NLM | lld:pubmed
pubmed-article:10708022 | pubmed:authorsComplete | Y | lld:pubmed
pubmed-article:10708022 | pubmed:pagination | 425-33 | lld:pubmed
pubmed-article:10708022 | pubmed:dateRevised | 2006-11-15 | lld:pubmed
pubmed-article:10708022 | pubmed:meshHeading | pubmed-meshheading:10708022... | lld:pubmed
pubmed-article:10708022 | pubmed:meshHeading | pubmed-meshheading:10708022... | lld:pubmed
pubmed-article:10708022 | pubmed:meshHeading | pubmed-meshheading:10708022... | lld:pubmed
pubmed-article:10708022 | pubmed:year | 2000 | lld:pubmed
pubmed-article:10708022 | pubmed:articleTitle | Background estimation in nonlinear image restoration. | lld:pubmed
pubmed-article:10708022 | pubmed:affiliation | Central Analytical Sciences, Unilever Research Vlaardingen, The Netherlands. | lld:pubmed
pubmed-article:10708022 | pubmed:publicationType | Journal Article | lld:pubmed
pubmed-article:10708022 | pubmed:publicationType | Research Support, Non-U.S. Gov't | lld:pubmed
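The abstract above describes how algorithms such as ICTM enforce nonnegativity by clipping negative values to zero after each iteration, so that the quality of the background estimate decides whether the constraint ever becomes active. The sketch below illustrates that mechanism only; it is not the authors' implementation. It assumes a simple steepest-descent update of a Tikhonov-regularized least-squares functional (with an identity regularizer rather than a derivative operator), a known shift-invariant PSF supplied as an OTF, and a constant background level; the function name, step size, and regularization weight are illustrative choices.

```python
import numpy as np
from numpy.fft import fftn, ifftn

def clipped_tikhonov_restoration(g, otf, background, reg=1e-2, step=1.0, n_iter=100):
    """Illustrative iterative restoration with a nonnegativity constraint.

    g          : observed image (nonnegative intensities)
    otf        : Fourier transform of the PSF, same shape as g (assumed known)
    background : estimated constant background level -- the quantity whose
                 under-/over-estimation the paper analyzes
    """
    g0 = g.astype(float) - background   # background subtraction: an overestimate pushes
                                        # genuine object intensities below zero
    G0 = fftn(g0)
    f = np.clip(g0, 0.0, None)          # start from the clipped, background-corrected image
    for _ in range(n_iter):
        F = fftn(f)
        # Gradient of the Tikhonov-regularized least-squares functional
        #   ||H f - g0||^2 + reg * ||f||^2, evaluated in the Fourier domain
        grad = ifftn(np.conj(otf) * (otf * F - G0) + reg * F).real
        f = f - step * grad
        f = np.clip(f, 0.0, None)       # nonnegativity: clip negatives after each iteration
    return f
```

With an underestimated background, g0 stays positive almost everywhere, the clipping step rarely fires, and the result approaches the unconstrained linear solution; with even a small overestimate, parts of the object itself are clipped, which is the sharp performance drop the abstract reports.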
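The statements listed above can also be retrieved programmatically. A minimal sketch using Python's SPARQLWrapper follows; the endpoint URL and the expanded resource URI for pubmed-article:10708022 are assumptions inferred from the abbreviated prefixes in the table, not values confirmed by this page.

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Both values below are assumptions inferred from the prefixes shown above;
# adjust them to the actual Linked Life Data service and resource URI.
ENDPOINT = "http://linkedlifedata.com/sparql"
ARTICLE = "http://linkedlifedata.com/resource/pubmed/id/10708022"

sparql = SPARQLWrapper(ENDPOINT)
sparql.setReturnFormat(JSON)
sparql.setQuery(f"""
    SELECT ?p ?o ?g
    WHERE {{ GRAPH ?g {{ <{ARTICLE}> ?p ?o }} }}
""")

# Print predicate, object, and named graph (the "Context" column above) for each statement.
for row in sparql.query().convert()["results"]["bindings"]:
    print(row["p"]["value"], row["o"]["value"], row["g"]["value"])
```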