• Diffusion tensor imaging: structural adaptive smoothing.
    DOI: 10.1016/j.neuroimage.2007.10.024
    Authors: Tabelow K, Polzehl J, Spokoiny V, Voss HU
    BACKGROUND & AIMS: Diffusion Tensor Imaging (DTI) data is characterized by a high noise level. Thus, estimation errors of quantities like anisotropy indices or the main diffusion direction used for fiber tracking are relatively large and may significantly confound the accuracy of DTI in clinical or neuroscience applications. Besides pulse sequence optimization, noise reduction by smoothing the data can be pursued as a complementary approach to increase the accuracy of DTI. Here, we suggest an anisotropic structural adaptive smoothing procedure, which is based on the Propagation-Separation method and preserves the structures seen in DTI and their different sizes and shapes. It is applied to artificial phantom data and a brain scan. We show that this method significantly improves the quality of the estimate of the diffusion tensor, by means of both bias and variance reduction, and hence enables one either to reduce the number of scans or to enhance the input for subsequent analysis such as fiber tracking.
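    A minimal Python sketch of the structural adaptive smoothing idea, reduced to a scalar 2D image rather than the tensor-valued DTI case; the function name, kernels, and bandwidth schedule below are illustrative assumptions, not the paper's Propagation-Separation implementation.
```python
import numpy as np

def adaptive_smooth(img, n_iter=5, lam=2.0, sigma2=1.0):
    """Structure-preserving adaptive smoothing of a 2D scalar image (toy).

    Weights combine a spatial kernel with a statistical penalty comparing
    neighbouring estimates, so averaging stops at structure boundaries
    ("separation") while the neighbourhood grows each step ("propagation").
    """
    theta = img.astype(float).copy()   # current local estimates
    ni = np.ones_like(theta)           # accumulated sums of weights
    h = 1.0                            # spatial bandwidth, grows per iteration
    for _ in range(n_iter):
        new_theta = np.zeros_like(theta)
        new_ni = np.zeros_like(theta)
        r = int(np.ceil(h))
        for dy in range(-r, r + 1):
            for dx in range(-r, r + 1):
                d2 = (dx * dx + dy * dy) / (h * h)
                if d2 > 1.0:
                    continue
                # neighbour values (np.roll wraps at the borders; fine for a toy)
                shifted = np.roll(np.roll(theta, dy, axis=0), dx, axis=1)
                # large differences between local estimates suppress the weight
                s = ni * (theta - shifted) ** 2 / (lam * sigma2)
                w = (1.0 - d2) * np.maximum(0.0, 1.0 - s)
                new_theta += w * shifted
                new_ni += w
        theta = new_theta / new_ni
        ni = new_ni
        h *= 1.25
    return theta
```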
  • Kernel density estimators of home range: smoothing and the autocorrelation red herring.
    DOI: 10.1890/06-0930
    Authors: Fieberg J
    BACKGROUND & AIMS: Two oft-cited drawbacks of kernel density estimators (KDEs) of home range are their sensitivity to the choice of smoothing parameter(s) and their need for independent data. Several simulation studies have been conducted to compare the performance of objective, data-based methods of choosing optimal smoothing parameters in the context of home range and utilization distribution (UD) estimation. Lost in this discussion of choice of smoothing parameters is the general role of smoothing in data analysis, namely, that smoothing serves to increase precision at the cost of increased bias. A primary goal of this paper is to illustrate this bias-variance trade-off by applying KDEs to sampled locations from simulated movement paths. These simulations will also be used to explore the role of autocorrelation in estimating UDs. Autocorrelation can be reduced (1) by increasing study duration (for a fixed sample size) or (2) by decreasing the sampling rate. While the first option will often be reasonable, for a fixed study duration higher sampling rates should always result in improved estimates of space use. Further, KDEs with typical data-based methods of choosing smoothing parameters should provide competitive estimates of space use for fixed study periods unless autocorrelation substantially alters the optimal level of smoothing.
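    A short Python illustration of the bias-variance trade-off described above: kernel density estimates of a utilization distribution from locations sampled along a simulated autocorrelated path, computed for a small and a large bandwidth. The path model, sampling interval, and bandwidth values are arbitrary assumptions chosen only for illustration.
```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Simulate an autocorrelated movement path (AR(1) step increments), then
# subsample locations as if relocating an animal at a fixed rate.
n = 2000
steps = rng.normal(size=(n, 2))
for t in range(1, n):
    steps[t] = 0.8 * steps[t - 1] + steps[t]
path = np.cumsum(steps, axis=0)
locs = path[::20]                        # sampled relocations

# Small bandwidth -> low bias / high variance; large bandwidth -> the reverse.
grid = np.linspace(path.min(), path.max(), 50)
gx, gy = np.meshgrid(grid, grid)
for bw in (0.15, 0.60):
    kde = gaussian_kde(locs.T, bw_method=bw)
    ud = kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(gx.shape)
    print(f"bw={bw}: UD mass spread over {np.mean(ud > ud.max() * 0.05):.1%} of the grid")
```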
  • Smoothing of the thermal stability of DNA duplexes by using modified nucleosides and chaotropic agents.
    DOI: 10.1093/nar/27.6.1492
    Authors: Nguyen HK, Fournier O, Asseline U, Dupret D, Thuong NT
    BACKGROUND & AIMS: The effect of alkyltrimethylammonium ions on the thermostability of natural and modified DNA duplexes has been investigated. We have shown that the use of tetramethylammonium ions TMA+ along with the chemical modification of duplexes allows the fine adjustment of Tm and the possibility of obtaining several duplex systems with varied isostabilized temperatures, some of which show greater stability than those of natural DNA. This approach could be very useful for DNA sequencing by hybridization.
  • Hierarchical Bayesian spatiotemporal analysis of revascularization odds using smoothing splines.
    DOI: 10.1002/sim.3094
    Authors: Silva GL, Dean CB, Niyonsenga T, Vanasse A
    BACKGROUND & AIMS: Hierarchical Bayesian models are proposed for over-dispersed longitudinal spatially correlated binomial data. This class of models accounts for correlation among regions by using random effects and allows a flexible modelling of spatiotemporal odds by using smoothing splines. The aim is (i) to develop models which will identify temporal trends of odds and produce smoothed maps including regional effects, (ii) to specify Markov chain Monte Carlo (MCMC) inference for fitting such models, (iii) to study the sensitivity of such Bayesian binomial spline spatiotemporal analyses to prior assumptions, and (iv) to compare mechanisms for assessing goodness of fit. An analysis of regional variation for revascularization odds of patients hospitalized for acute coronary syndrome in Quebec motivates and illustrates the methods developed.
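    A simplified, non-Bayesian sketch of the smoothing-spline component only: a smoothing spline fitted to empirical log-odds over time for a single region. The quarterly counts and smoothing factor are invented for illustration and stand in for the full hierarchical MCMC model of the paper.
```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(1)

# Toy longitudinal binomial data: quarterly revascularization counts for one region.
quarters = np.arange(24, dtype=float)
n_patients = rng.integers(40, 120, size=quarters.size)
true_logit = -0.5 + 0.08 * quarters - 0.002 * quarters ** 2
y = rng.binomial(n_patients, 1.0 / (1.0 + np.exp(-true_logit)))

# Empirical log-odds (with a continuity correction), then a smoothing spline;
# the smoothing factor s plays the role of the spline penalty in the model.
logit = np.log((y + 0.5) / (n_patients - y + 0.5))
trend = UnivariateSpline(quarters, logit, s=1.0)
print(np.round(trend(quarters[:6]), 3))   # smoothed temporal trend of the log-odds
```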
  • Interactive effects of local smoothing window size and fundamental frequency on shimmer calculation.
    DOI: 10.1016/s0892-1997(05)80332-7
    Authors: Jafari M, Till JA, Law-Till CB
    BACKGROUND & AIMS: Slow amplitude modulation of human voice was approximated by a sinusoidal wave. The theoretical effects of smoothing window size, F0, and modulation frequency on window amplitude average as well as calculated shimmer were mathematically derived. Subsequently, the theoretical predictions were tested using idealized and real voice signals from normal speakers. The theoretical and experimental results suggest that shimmer (when calculated using a smoothing window) is a function of window duration and modulation frequency. Window duration when defined as a constant number of pitch periods varies from speaker to speaker depending on their F0. It may not be desirable to use local smoothing windows with a constant number of cycles for shimmer computation, especially if voices with known low-frequency amplitude modulations but notably different fundamental frequencies are compared.
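    A small Python sketch of the interaction described above: shimmer computed after a local smoothing window of a fixed number of pitch periods, applied to the same low-frequency amplitude modulation at two different F0 values, so the window covers a different duration for each speaker. The shimmer formula and parameter values are illustrative assumptions, not the paper's exact derivation.
```python
import numpy as np

def shimmer_after_window(amps, n_periods):
    """Percent shimmer computed on amplitudes averaged over a local smoothing
    window of n_periods pitch periods (illustrative definition)."""
    kernel = np.ones(n_periods) / n_periods
    smoothed = np.convolve(amps, kernel, mode="valid")
    return 100.0 * np.mean(np.abs(np.diff(smoothed))) / np.mean(smoothed)

f_mod = 2.0                                  # Hz, slow amplitude modulation
for f0 in (100.0, 200.0):                    # two speakers with different F0
    t = np.arange(200) / f0                  # one amplitude sample per pitch period
    amps = 1.0 + 0.1 * np.sin(2 * np.pi * f_mod * t)
    # the same 5-period window lasts 50 ms at F0 = 100 Hz but only 25 ms at 200 Hz
    print(f0, round(shimmer_after_window(amps, n_periods=5), 3))
```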
  • Implementation of an isocenter shift technique on a 3D treatment planning system for smoothing MLC field edges.
    DOI: 10.1118/1.1485061
    Authors: Xue J, Zhang P, Wu J, Wang Z, Sibata C
    BACKGROUND & AIMS: Stepped leaf edges are the major limitation of conforming to the prescribed treatment contour defined by the conventional multileaf collimator (MLC), which produces a scalloped dose pattern. The commercial HD-270 MLC (HDI) technique provides a software solution of the conventional MLC to achieve smoothed edge and optimal penumbra of the MLC shaped field. We implemented the HDI functionality on a 3D treatment planning system and compared the dosimetric effects of the HDI delivery in simulation with those in experiment for a number of the MLC fields. The fields from the contour of varied shapes with different sizes of the leaf stepping were tested for the HDI delivery. There is a good agreement of the dose distribution between the calculation as implemented in the planning system and the measurement performed on the treatment machine. It has been shown that the HDI delivery significantly smooths the stepped field edge with the reduced isodose undulation and effective penumbra. A problem may be present when the HDI is applied for the treatment of the circular contour of smaller diameter, and the conformity of the MLC shaping may not be achievable satisfactorily with the existing system. The optimization of leaf configuration is suggested to improve the conformity of the HDI technique. The HDI planning then can be used to assist in the decision making of applying the HDI treatment delivery.
  • Spatial smoothing of autocorrelations to control the degrees of freedom in fMRI analysis.
    DOI: 10.1016/j.neuroimage.2005.02.007
    Authors: Worsley KJ
    BACKGROUND & AIMS: In the statistical analysis of fMRI data, the parameter of primary interest is the effect of a contrast; of secondary interest is its standard error, and of tertiary interest is the standard error of this standard error, or equivalently, the degrees of freedom (df). In a ReML (Restricted Maximum Likelihood) analysis, we show how spatial smoothing of temporal autocorrelations increases the effective df (but not the smoothness of primary or secondary parameter estimates), so that the amount of smoothing can be chosen in advance to achieve a target df, typically 100. This has already been done at the second level of a hierarchical analysis by smoothing the ratio of random to fixed effects variances (Worsley, K.J., Liao, C., Aston, J.A.D., Petre, V., Duncan, G.H., Morales, F., Evans, A.C., 2002. A general statistical analysis for fMRI data. NeuroImage, 15:1-15); we now show how to do it at the first level, by smoothing autocorrelation parameters. The proposed method is extremely fast and it does not require any image processing. It can be used in conjunction with other regularization methods (Gautama, T., Van Hulle, M.M., in press. Optimal spatial regularisation of autocorrelation estimates in fMRI analysis. NeuroImage.) to avoid unnecessary smoothing beyond 100 df. Our results on a typical 6-min, TR = 3, 1.5-T fMRI data set show that 8.5-mm smoothing is needed to achieve 100 df, and this results in roughly a doubling of detected activations.
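    A minimal sketch of the smoothing step itself: a Gaussian kernel applied to a voxelwise AR(1) autocorrelation map, with the paper's roughly 8.5 mm FWHM converted to voxel units. The voxel size and the toy map are assumptions; choosing the FWHM so that the effective df reaches a target such as 100 follows the paper's logic but is not reproduced here.
```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(2)
# Toy slice of voxelwise AR(1) temporal autocorrelation estimates.
rho = np.clip(0.3 + 0.1 * rng.normal(size=(64, 64)), -0.9, 0.9)

fwhm_mm, voxel_mm = 8.5, 3.0                        # voxel size is an assumption
sigma_vox = fwhm_mm / (voxel_mm * np.sqrt(8.0 * np.log(2.0)))
rho_smoothed = gaussian_filter(rho, sigma=sigma_vox)
print(rho.std(), rho_smoothed.std())                # variability of rho shrinks
```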
  • Population dynamics of a recombinant bacterium during continuous cultures: application of data filtering and smoothing.
    DOI: 10.1002/bit.260390406
    Authors: Nancib N, Mosrati R, Boudrant J
    BACKGROUND & AIMS: A numerical method to process experimental data concerning plasmid stability of a recombinant bacterium during continuous cultures with nonselective media is proposed here. This method differs from previous ones in that it uses the derivative form of the state equation of the Imanaka-Aiba model for recombinant cultures. The methodology proposed here allows one to estimate values for the two model parameters without forcing them to be constant. Until now, this could not be done using classical analytical techniques because these parameters have been considered invariable because of the integration used in the evaluation of the model. These parameters are (1) the difference in the specific growth rates between plasmid-carrying cells and plasmid-free cells (deltamu), and (2) the probability of plasmid loss by plasmid-containing cells (rho(r) mu(+)). The derivative technique used here is completed by mathematical treatments involving data filtering and smoothing. The values of the two parameters are in agreement with those already published. The current technique does not impose preconditions and permits us to further study related phenomena.
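    The derivative form of the model requires differentiating noisy plasmid-stability data, which is where filtering and smoothing enter. The sketch below uses a Savitzky-Golay filter as one reasonable choice; the toy data, filter type, and window settings are assumptions, not the authors' exact numerical treatment.
```python
import numpy as np
from scipy.signal import savgol_filter

# Toy measurements of the plasmid-carrying fraction during a continuous culture.
t = np.linspace(0.0, 50.0, 101)                     # hours
frac = 1.0 / (1.0 + 0.02 * np.exp(0.15 * t))        # assumed loss of stability
noisy = frac + np.random.default_rng(3).normal(0.0, 0.01, t.size)

# Smooth the series and estimate its time derivative in one pass; a raw
# finite-difference derivative would be dominated by measurement noise.
dt = t[1] - t[0]
smoothed = savgol_filter(noisy, window_length=15, polyorder=3)
d_frac_dt = savgol_filter(noisy, window_length=15, polyorder=3, deriv=1, delta=dt)
print(np.round(smoothed[:3], 4), np.round(d_frac_dt[:3], 5))
```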
  • Evolution of Lycopodiaceae (Lycopsida): estimating divergence times from rbcL gene sequences by use of nonparametric rate smoothing.
    DOI: 10.1006/mpev.2001.0936
    Authors: Wikström N, Kenrick P
    BACKGROUND & AIMS: By use of nonparametric rate smoothing and nucleotide sequences of the rbcL gene, divergence times in Lycopodiaceae are estimated. The results show that much extant species diversity in Lycopodiaceae stems from relatively recent cladogenic events. These results corroborate previous ideas based on paleobotanical and biogeographical data. Previous molecular phylogenetic analyses recognized a split into neotropical and paleotropical clades in Huperzia, which contains 85-90% of all living species. Connecting this biogeographical pattern with continent movements, the diversification of this epiphytic group was suggested to coincide with that of angiosperms in the mid to Late Cretaceous. Results presented here are consistent with this idea, and the diversification of the two clades is resolved as Late Cretaceous (78 and 95 Myr). In the related genera Lycopodium and Lycopodiella, the patterns are somewhat different. Here species diversity is scattered among different subgeneric groups. Most of the high-diversity subgeneric groups seem to have diversified very recently (Late Tertiary), whereas the cladogenic events leading to these groups are much older (Early to Late Cretaceous). Our analysis shows that, although much living species diversity stems from relatively recent cladogenesis, the origins of the family (Early Carboniferous) and generic crown groups (Early Permian to Early Jurassic) are much more ancient events.
  • Convergence of EM image reconstruction algorithms with Gibbs smoothing.
    DOI: 10.1109/42.61759
    Authors: Lange K
    BACKGROUND & AIMS: P.J. Green has defined an OSL (one-step late) algorithm that retains the E-step of the EM algorithm (for image reconstruction in emission tomography) but provides an approximate solution to the M-step. Further modifications of the OSL algorithm guarantee convergence to the unique maximum of the log posterior function. Convergence is proved under a specific set of sufficient conditions. Several of these conditions concern the potential function of the Gibbs prior, and a number of candidate potential functions are identified. Generalization of the OSL algorithm to transmission tomography is also considered.
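    A toy 1D sketch of Green's one-step-late update with a quadratic Gibbs penalty: the M-step denominator adds the prior's gradient evaluated at the current ("late") estimate. The system matrix, penalty weight, and boundary handling are invented for illustration, and the convergence safeguards discussed in the paper are not included.
```python
import numpy as np

def osl_em(A, y, beta=0.1, n_iter=50):
    """One-step-late EM for Poisson emission data with a quadratic Gibbs prior
    on neighbouring pixels (1D toy problem, circular boundary)."""
    n_pix = A.shape[1]
    lam = np.full(n_pix, y.sum() / A.sum())          # flat starting image
    sens = A.sum(axis=0)                             # sum_i a_ij
    for _ in range(n_iter):
        proj = A @ lam                               # expected projection counts
        back = A.T @ (y / np.maximum(proj, 1e-12))   # EM back-projection term
        # gradient of U(lam) = 0.5 * sum over neighbour pairs (lam_j - lam_k)^2,
        # evaluated at the current ("late") estimate
        grad_U = 2.0 * lam - np.roll(lam, 1) - np.roll(lam, -1)
        lam = lam * back / np.maximum(sens + beta * grad_U, 1e-12)
    return lam

rng = np.random.default_rng(4)
A = rng.uniform(0.0, 1.0, size=(40, 20))             # toy system matrix
true_img = np.concatenate([np.full(10, 2.0), np.full(10, 6.0)])
y = rng.poisson(A @ true_img).astype(float)
print(np.round(osl_em(A, y), 2))
```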
  • Spatial resolution, signal-to-noise ratio, and smoothing in multi-subject functional MRI studies.
    DOI: 10.1016/j.neuroimage.2005.10.022
    Authors: Scouten A, Papademetris X, Constable RT
    BACKGROUND & AIMS: Functional MRI is aimed at localizing cortical activity to understand the role of specific cortical regions, providing insight into the neurophysiological underpinnings of brain function. Scientists developing fMRI methodology seek to improve detection of subtle activations and to spatially localize these activations more precisely. Except for applications in the clinical environment, such as functional mapping in patients prior to neurosurgical intervention, most basic neuroscience studies involve group level random-effects analyses. Prior to grouping data, the data from each individual are typically smoothed. A wide range of motivations for smoothing have been given, including matching the spatial scale of hemodynamic responses, normalizing the error distribution (by the Central Limit Theorem) to improve the validity of inferences based on parametric tests, and, in the context of inter-subject averaging, projecting the data down to a scale at which homologies in functional anatomy are expressed across subjects. This work demonstrates that, for single-subject studies, if smoothing is to be employed, the data should be acquired at lower resolutions to maximize SNR. The benefits of a low-resolution acquisition are limited by partial volume effects and by the weak impact of resolution-dependent noise on the overall group level statistics. Given that inter-subject noise dominates across a range of tasks, improvements in within-subject noise, through changes in acquisition strategy or even moving to higher field strength, may do little to improve group statistics. Such improvements however may greatly impact single-subject studies such as those used in neurosurgical planning.
  • Nonlinear smoothing for reduction of systematic and random errors in diffusion tensor imaging.
    DOI: 10.1002/1522-2586(200006)11:6<702::aid-jmri18>3.0.
    Authors: Parker GJ, Schnabel JA, Symms MR, Werring DJ, Barker GJ
    BACKGROUND & AIMS: Calculation and sorting of the eigenvectors of diffusion using diffusion tensor imaging has previously been shown to be sensitive to noise levels in the acquired data. This sensitivity manifests as random and systematic errors in the diffusion eigenvalues and derived parameters such as indices of anisotropy. An optimized application of nonlinear smoothing techniques to diffusion data prior to calculation of the diffusion tensor is shown to reduce both random and systematic errors, while causing little blurring of anatomical structures. Conversely, filtering applied to calculated images of fractional anisotropy is shown to fail in reducing systematic errors and in recovering anatomical detail. Using both real and simulated brain data sets, it is demonstrated that this approach has the potential to allow acquisition of data that would otherwise be too noisy to be of use.
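    One widely used nonlinear, edge-preserving smoothing scheme, Perona-Malik diffusion, sketched on a single 2D diffusion-weighted image. The paper's optimized filter and its parameters may differ; kappa, the step size, and the iteration count below are assumptions.
```python
import numpy as np

def perona_malik(img, n_iter=20, kappa=0.1, dt=0.2):
    """Edge-preserving (Perona-Malik) smoothing of one diffusion-weighted image:
    the conductance shrinks where gradients are large, so noise is reduced with
    little blurring of anatomical boundaries (toy version, wrap-around borders)."""
    u = img.astype(float).copy()
    cond = lambda g: np.exp(-(g / kappa) ** 2)       # edge-stopping function
    for _ in range(n_iter):
        dn = np.roll(u, -1, axis=0) - u              # differences to 4 neighbours
        ds = np.roll(u, 1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        u = u + dt * (cond(dn) * dn + cond(ds) * ds + cond(de) * de + cond(dw) * dw)
    return u
```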
  • Use of exponential smoothing for nosocomial infection surveillance.
    DOI: 10.1093/oxfordjournals.aje.a008794
    Authors: Ngo L, Tager IB, Hadley D
    BACKGROUND & AIMS: Detection of outbreaks of infection or increases in bacterial resistance to antimicrobial agents is an essential component of hospital infection control surveillance. The authors applied the method of exponential smoothing to microbiology data from 1987-1992 to investigate a suspected outbreak of gentamicin resistance among Pseudomonas aeruginosa bacteria at the Department of Veterans Affairs Medical Center, San Francisco, California, in 1991-1992. The years 1987-1990 were used to develop the baseline for the forecast model. Application of the model indicated that two observed prominent peaks in the annual cumulative incidence of gentamicin-resistant P. aeruginosa were within the upper bounds of their respective 95% confidence intervals as estimated by the forecast model--i.e., that no epidemic was in progress. This prediction was supported by investigations by the hospital's infection control team which indicated that the apparent increases were due to readmission of patients previously known to harbor these organisms. In contrast, application of a typically employed method that ignores the time series data structure indicated that there were 6 months in which incidence rates exceeded the upper bounds of their respective 95% confidence intervals, thereby erroneously suggesting that an epidemic was in progress. Recursive algorithms and some simplifying assumptions that do not affect the validity of inferences make the application of this method practical for nosocomial infection control programs.
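    A minimal sketch of exponential smoothing used as a surveillance forecast: the baseline series determines the smoothed level, and a new month would be flagged only if its count exceeded an approximate upper prediction bound. The counts, smoothing constant, and interval construction are illustrative assumptions, not the authors' fitted model.
```python
import numpy as np

def exp_smooth(counts, alpha=0.3):
    """Simple exponential smoothing: one-step-ahead forecasts for counts[1:]
    plus the forecast for the next, not-yet-observed month."""
    level = float(counts[0])
    one_step = []
    for y in counts[1:]:
        one_step.append(level)                  # forecast made before observing y
        level = alpha * y + (1.0 - alpha) * level
    return np.array(one_step), level

baseline = np.array([3, 4, 2, 5, 3, 4, 6, 3, 2, 4, 5, 3, 4, 2, 3, 5], dtype=float)
fc, next_fc = exp_smooth(baseline)
resid_sd = (baseline[1:] - fc).std(ddof=1)
print(f"next-month forecast {next_fc:.1f}; "
      f"flag only if the observed count exceeds ~{next_fc + 1.96 * resid_sd:.1f}")
```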
  • Nonparametric regression sinogram smoothing using a roughness-penalized Poisson likelihood objective function.
    DOI: 10.1109/42.876303
    Authors: La Rivière PJ, Pan X
    BACKGROUND & AIMS: We develop and investigate an approach to tomographic image reconstruction in which nonparametric regression using a roughness-penalized Poisson likelihood objective function is used to smooth each projection independently prior to reconstruction by unapodized filtered backprojection (FBP). As an added generalization, the roughness penalty is expressed in terms of a monotonic transform, known as the link function, of the projections. The approach is compared to shift-invariant projection filtering through the use of a Hanning window as well as to a related nonparametric regression approach that makes use of an objective function based on weighted least squares (WLS) rather than the Poisson likelihood. The approach is found to lead to improvements in resolution-noise tradeoffs over the Hanning filter as well as over the WLS approach. We also investigate the resolution and noise effects of three different link functions: the identity, square root, and logarithm links. The choice of link function is found to influence the resolution uniformity and isotropy properties of the reconstructed images. In particular, in the case of an idealized imaging system with intrinsically uniform and isotropic resolution, the choice of a square root link function yields the desirable outcome of essentially uniform and isotropic resolution in reconstructed images, with noise performance still superior to that of the Hanning filter as well as that of the WLS approach.
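    A compact sketch of roughness-penalized Poisson likelihood smoothing of a single sinogram row, with the roughness penalty applied on a link-transformed scale (the square root link is shown). The penalty form, optimizer, and toy data are assumptions meant only to illustrate the objective described above.
```python
import numpy as np
from scipy.optimize import minimize

def smooth_projection(y, beta=5.0, link=np.sqrt, inv_link=np.square):
    """Smooth one sinogram row by maximizing the Poisson log-likelihood minus a
    roughness penalty on the link-transformed means g(lambda) (sketch)."""
    y = np.asarray(y, dtype=float)

    def neg_objective(x):                       # x = g(lambda), optimized freely
        lam = np.maximum(inv_link(x), 1e-8)
        loglik = np.sum(y * np.log(lam) - lam)
        roughness = np.sum(np.diff(x) ** 2)
        return -(loglik - beta * roughness)

    x0 = link(np.maximum(y, 1.0))
    res = minimize(neg_objective, x0, method="L-BFGS-B")
    return inv_link(res.x)

rng = np.random.default_rng(5)
true_means = 50.0 * np.exp(-0.5 * ((np.arange(64) - 32) / 10.0) ** 2) + 5.0
row = rng.poisson(true_means)
print(np.round(smooth_projection(row)[:8], 1))
```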
  • Estimation and prediction for cancer screening models using deconvolution and smoothing.
    DOI: 10.1111/j.0006-341x.2001.00389.x
    Authors: Pinsky PF
    BACKGROUND & AIMS: The model that specifies that cancer incidence, I, is the convolution of the preclinical incidence, g, and the density of time in the preclinical phase, f, has frequently been utilized to model data from cancer screening trials and to estimate such quantities as sojourn time, lead time, and sensitivity. When this model is fit to the above data, the parameters of f as well as the parameter(s) governing screening sensitivity must be estimated. Previously, g was either assumed to be equal to clinical incidence or assumed to be a constant or exponential function that also had to be estimated. Here we assume that the underlying incidence, I, in the study population (in the absence of screening) is known. With I known, g then becomes a function of f, which can be solved for using (numerical) deconvolution, thus eliminating the need to estimate g or make assumptions about it. Since numerical deconvolution procedures may be highly unstable, however, we incorporate a smoothing procedure that produces a realistic g function while still closely reproducing the original incidence function I upon convolution with f. We have also added the concept of competing mortality to the convolution model. This, along with the realistic preclinical incidence function described above, results in more accurate estimates of sojourn time and lead time and allows for estimation of quantities related to overdiagnosis, which we define here.
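    A toy numerical version of the convolution model and its smoothed deconvolution: incidence I is the discrete convolution of the preclinical incidence g with the sojourn-time density f, and g is recovered with a second-difference roughness penalty to stabilize the inversion. The densities, noise level, and penalty weight are invented for illustration.
```python
import numpy as np

T = 40
f = np.exp(-np.arange(T) / 4.0)
f /= f.sum()                                        # assumed sojourn-time density
g_true = 0.002 + 0.0001 * np.arange(T)              # "true" preclinical incidence
F = np.array([[f[t - s] if 0 <= t - s < T else 0.0  # convolution matrix: I = F @ g
               for s in range(T)] for t in range(T)])
I_obs = F @ g_true + np.random.default_rng(6).normal(0.0, 1e-4, T)

# Direct inversion can amplify noise; a second-difference roughness penalty on g
# gives the smoothed deconvolution.
D2 = np.diff(np.eye(T), n=2, axis=0)                # second-difference operator
alpha = 1e-3
g_naive = np.linalg.solve(F, I_obs)
g_smooth = np.linalg.solve(F.T @ F + alpha * D2.T @ D2, F.T @ I_obs)
print(np.abs(g_naive - g_true).max(), np.abs(g_smooth - g_true).max())
```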
