SCI Publications
2011
H.C. Hazlett, M. Poe, G. Gerig, M. Styner, C. Chappell, R.G. Smith, C. Vachet, J. Piven.
Early Brain Overgrowth in Autism Associated with an Increase in Cortical Surface Area Before Age 2, In Archives of General Psychiatry, Vol. 68, No. 5, pp. 467--476. 2011.
DOI: 10.1001/archgenpsychiatry.2011.39
C.R. Henak, B.J. Ellis, M.D. Harris, A.E. Anderson, C.L. Peters, J.A. Weiss.
Role of the acetabular labrum in load support across the hip joint, In Journal of Biomechanics, Vol. 44, No. 12, pp. 2201-2206. 2011.
L. Hogrebe, A. Paiva, E. Jurrus, C. Christensen, M. Bridge, J.R. Korenberg, T. Tasdizen.
Trace Driven Registration of Neuron Confocal Microscopy Stacks, In IEEE International Symposium on Biomedical Imaging (ISBI), pp. 1345--1348. 2011.
DOI: 10.1109/ISBI.2011.5872649
A. Irimia, M.C. Chambers, J.R. Alger, M. Filippou, M.W. Prastawa, Bo Wang, D. Hovda, G. Gerig, A.W. Toga, R. Kikinis, P.M. Vespa, J.D. Van Horn.
Comparison of acute and chronic traumatic brain injury using semi-automatic multimodal segmentation of MR volumes, In Journal of Neurotrauma, Vol. 28, No. 11, pp. 2287--2306. November, 2011.
DOI: 10.1089/neu.2011.1920
PubMed ID: 21787171
Although neuroimaging is essential for prompt and proper management of traumatic brain injury (TBI), there is a regrettable and acute lack of robust methods for the visualization and assessment of TBI pathophysiology, especially for the purpose of improving clinical outcome metrics. Until now, the application of automatic segmentation algorithms to TBI in a clinical setting has remained an elusive goal because existing methods have, for the most part, been insufficiently robust to faithfully capture TBI-related changes in brain anatomy. This article introduces and illustrates the combined use of multimodal TBI segmentation and time point comparison using 3D Slicer, a widely-used software environment whose TBI data processing solutions are openly available. For three representative TBI cases, semi-automatic tissue classification and 3D model generation are performed to enable intra-patient time point comparison of TBI using multimodal volumetrics and clinical atrophy measures. Identification and quantitative assessment of extra- and intra-cortical bleeding, lesions, edema, and diffuse axonal injury are demonstrated. The proposed tools allow cross-correlation of multimodal metrics from structural imaging (e.g., structural volume, atrophy measurements) with clinical outcome variables and other potential factors predictive of recovery. In addition, the workflows described are suitable for TBI clinical practice and patient monitoring, particularly for assessing damage extent and for the measurement of neuroanatomical change over time. With knowledge of general location, extent, and degree of change, such metrics can be associated with clinical measures and subsequently used to suggest viable treatment options.
Keywords: NA-MIC
B.M. Isaacson, J.G. Stinstra, R.D. Bloebaum, COL P.F. Pasquina, R.S. MacLeod.
Establishing Multiscale Models for Simulating Whole Limb Estimates of Electric Fields for Osseointegrated Implants, In IEEE Transactions on Biomedical Engineering, Vol. 58, No. 10, pp. 2991--2994. 2011.
DOI: 10.1109/TBME.2011.2160722
PubMed ID: 21712151
PubMed Central ID: PMC3179554
S.A. Isaacson, R.M. Kirby.
Numerical Solution of Linear Volterra Integral Equations of the Second Kind with Sharp Gradients, In Journal of Computational and Applied Mathematics, Vol. 235, No. 14, pp. 4283--4301. 2011.
Collocation methods are a well-developed approach for the numerical solution of smooth and weakly singular Volterra integral equations. In this paper, we extend these methods through the use of partitioned quadrature based on the qualocation framework, to allow the efficient numerical solution of linear, scalar Volterra integral equations of the second kind with smooth kernels containing sharp gradients. In this case, the standard collocation methods may lose computational efficiency despite the smoothness of the kernel. We illustrate how the qualocation framework can allow one to focus computational effort where necessary through improved quadrature approximations, while keeping the solution approximation fixed. The computational performance improvement introduced by our new method is examined through several test examples. The final example we consider is the original problem that motivated this work: the problem of calculating the probability density associated with a continuous-time random walk in three dimensions that may be killed at a fixed lattice site. To demonstrate how separating the solution approximation from quadrature approximation may improve computational performance, we also compare our new method to several existing Gregory, Sinc, and global spectral methods, where quadrature approximation and solution approximation are coupled.
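For context only (this is a standard statement of the problem class, not text from the paper), a linear Volterra integral equation of the second kind reads

\[
  u(t) = g(t) + \int_{0}^{t} K(t,s)\, u(s)\, ds, \qquad 0 \le t \le T,
\]

where g and the kernel K are given. A collocation method enforces this relation for a piecewise-polynomial approximation u_h at a set of collocation points, with the integral evaluated by quadrature; the qualocation idea referred to above is to refine that quadrature locally where K varies sharply while keeping the solution approximation itself unchanged.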
T. Ize, C.D. Hansen.
RTSAH Traversal Order for Occlusion Rays, In Computer Graphics Forum, Vol. 30, No. 2, Wiley-Blackwell, pp. 297--305. April, 2011.
DOI: 10.1111/j.1467-8659.2011.01861.x
T. Ize, C. Brownlee, C.D. Hansen.
Real-Time Ray Tracer for Visualizing Massive Models on a Cluster, In Proceedings of the 2011 Eurographics Symposium on Parallel Graphics and Visualization, pp. 61--69. 2011.
S. Jadhav, H. Bhatia, P.-T. Bremer, J.A. Levine, L.G. Nonato, V. Pascucci.
Consistent Approximation of Local Flow Behavior for 2D Vector Fields, In Mathematics and Visualization, Springer, pp. 141--159. Nov, 2011.
DOI: 10.1007/978-3-642-23175-9_10
Typically, vector fields are stored as a set of sample vectors at discrete locations. Vector values at unsampled points are defined by interpolating some subset of the known sample values. In this work, we consider two-dimensional domains represented as triangular meshes with samples at all vertices, and vector values on the interior of each triangle are computed by piecewise linear interpolation.
Many of the commonly used techniques for studying properties of the vector field require integration techniques that are prone to inconsistent results. Analysis based on such inconsistent results may lead to incorrect conclusions about the data. For example, vector field visualization techniques integrate the paths of massless particles (streamlines) in the flow or advect a texture using line integral convolution (LIC). Techniques like computation of the topological skeleton of a vector field, require integrating separatrices, which are streamlines that asymptotically bound regions where the flow behaves differently. Since these integrations may lead to compound numerical errors, the computed streamlines may intersect, violating some of their fundamental properties such as being pairwise disjoint. Detecting these computational artifacts to allow further analysis to proceed normally remains a significant challenge.
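To make the interpolation model concrete, the following is a minimal sketch of piecewise linear (barycentric) interpolation of a 2D vector field inside one triangle; it is illustrative only, not code from the paper, and the function names are invented for this example.

import numpy as np

def barycentric_coords(p, a, b, c):
    # Solve w_a*a + w_b*b + w_c*c = p with w_a + w_b + w_c = 1
    # by solving a 2x2 linear system for (w_a, w_b).
    T = np.array([[a[0] - c[0], b[0] - c[0]],
                  [a[1] - c[1], b[1] - c[1]]])
    w_ab = np.linalg.solve(T, np.asarray(p, float) - np.asarray(c, float))
    return np.array([w_ab[0], w_ab[1], 1.0 - w_ab.sum()])

def interpolate_vector(p, verts, vecs):
    # Piecewise linear interpolation: the vector at p is the barycentric
    # combination of the three vertex sample vectors.
    w = barycentric_coords(p, *verts)
    return w @ np.asarray(vecs, float)

# Example: unit triangle with sample vectors at its vertices.
verts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
vecs = [(1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
print(interpolate_vector((0.25, 0.25), verts, vecs))

Streamline and LIC integrators repeatedly evaluate such per-triangle interpolants, which is where the numerical inconsistencies discussed above can accumulate.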
J. Jakeman, R. Archibald, D. Xiu.
Characterization of Discontinuities in High-dimensional Stochastic Problems on Adaptive Sparse Grids, In Journal of Computational Physics, Vol. 230, No. 10, pp. 3977--3997. 2011.
DOI: 10.1016/j.jcp.2011.02.022
Keywords: Adaptive sparse grids, Stochastic partial differential equations, Multivariate discontinuity detection, Generalized polynomial chaos method, High-dimensional approximation
F. Jiao, Y. Gur, C.R. Johnson, S. Joshi.
Detection of crossing white matter fibers with high-order tensors and rank-k decompositions, In Proceedings of the International Conference on Information Processing in Medical Imaging (IPMI 2011), Lecture Notes in Computer Science (LNCS), Vol. 6801, pp. 538--549. 2011.
DOI: 10.1007/978-3-642-22092-0_44
PubMed Central ID: PMC3327305
P.K. Jimack, R.M. Kirby.
Towards the Development of an h-p-Refinement Strategy Based Upon Error Estimate Sensitivity, In Computers and Fluids, Vol. 46, No. 1, pp. 277--281. 2011.
The use of (a posteriori) error estimates is a fundamental tool in the application of adaptive numerical methods across a range of fluid flow problems. Such estimates are incomplete, however, in that they do not necessarily indicate where to refine in order to achieve the most impact on the error, nor what type of refinement (for example h-refinement or p-refinement) will be best. This paper extends preliminary work of the authors (Comm Comp Phys, 2010;7:631–8), which uses adjoint-based sensitivity estimates in order to address these questions, to include application with p-refinement to arbitrary order and the use of practical a posteriori estimates. Results are presented which demonstrate that the proposed approach can guide both the h-refinement and the p-refinement processes, to yield improvements in the adaptive strategy compared to the use of more orthodox criteria.
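For background (a generic dual-weighted-residual statement, not the specific estimator developed in the paper), adjoint-based sensitivity estimates typically take the form

\[
  J(u) - J(u_h) \approx \sum_{K} \int_{K} r(u_h)\, z \; dx,
\]

where r(u_h) is the local residual of the discrete solution, z is the adjoint (dual) solution associated with the output functional J, and the element-wise terms indicate where h- or p-refinement is most likely to reduce the error in J.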
E. Jurrus, S. Watanabe, R. Guily, A.R.C. Paiva, M.H. Ellisman, E.M. Jorgensen, T. Tasdizen.
Semi-automated Neuron Boundary Detection and Slice Traversal Algorithm for Segmentation of Neurons from Electron Microscopy Images, In Microscopic Image Analysis with Applications in Biology (MIAAB) Workshop, 2011.
Y. Keller, Y. Gur.
A Diffusion Approach to Network Localization, In IEEE Transactions on Signal Processing, Vol. 59, No. 6, pp. 2642--2654. 2011.
DOI: 10.1109/TSP.2011.2122261
D. Keyes, V. Taylor, T. Hey, S. Feldman, G. Allen, P. Colella, P. Cummings, F. Darema, J. Dongarra, T. Dunning, M. Ellisman, I. Foster, W. Gropp, C.R. Johnson, C. Kamath, R. Madduri, M. Mascagni, S.G. Parker, P. Raghavan, A. Trefethen, S. Valcourt, A. Patra, F. Choudhury, C. Cooper, P. McCartney, M. Parashar, T. Russell, B. Schneider, J. Schopf, N. Sharp.
Advisory Committee for CyberInfrastructure Task Force on Software for Science and Engineering, Note: NSF Report, 2011.
The Software for Science and Engineering (SSE) Task Force commenced in June 2009 with a charge that consisted of the following three elements:
1. Identify specific needs and opportunities across the spectrum of scientific software infrastructure. Characterize the specific needs and analyze technical gaps and opportunities for NSF to meet those needs through individual and systemic approaches.
2. Design responsive approaches. Develop initiatives and programs led (or co-led) by NSF to grow, develop, and sustain the software infrastructure needed to support NSF’s mission of transformative research and innovation leading to scientific leadership and technological competitiveness.
3. Address issues of institutional barriers. Anticipate, analyze, and address both institutional and exogenous barriers to NSF’s promotion of such an infrastructure.
The SSE Task Force members participated in bi-weekly telecons to address the given charge. The telecons often included additional distinguished members of the scientific community beyond the task force membership engaged in software issues, as well as personnel from federal agencies outside of NSF who manage software programs. It was quickly acknowledged that a number of reports, both loosely and tightly related to SSE, already existed and should be leveraged. By September 2009, the task force had formed three subcommittees focused on the following topics: (1) compute-intensive science, (2) data-intensive science, and (3) software evolution.
S.H. Kim, V. Fonov, J. Piven, J. Gilmore, C. Vachet, G. Gerig, D.L. Collins, M. Styner.
Spatial Intensity Prior Correction for Tissue Segmentation in the Developing Human Brain, In Proceedings of IEEE ISBI 2011, pp. 2049--2052. 2011.
DOI: 10.1109/ISBI.2011.5872815
R.M. Kirby, B. Cockburn, S.J. Sherwin.
To CG or to HDG: A Comparative Study, In Journal of Scientific Computing, Note: published online, 2011.
DOI: 10.1007/s10915-011-9501-7
Hybridization through the border of the elements (hybrid unknowns) combined with a Schur complement procedure (often called static condensation in the context of continuous Galerkin linear elasticity computations) has in various forms been advocated in the mathematical and engineering literature as a means of accomplishing domain decomposition, of obtaining increased accuracy and convergence results, and of algorithm optimization. Recent work on the hybridization of mixed methods, and in particular of the discontinuous Galerkin (DG) method, holds the promise of capitalizing on the three aforementioned properties; in particular, of generating a numerical scheme that is discontinuous in both the primary and flux variables, is locally conservative, and is computationally competitive with traditional continuous Galerkin (CG) approaches. In this paper we present both implementation and optimization strategies for the Hybridizable Discontinuous Galerkin (HDG) method applied to two dimensional elliptic operators. We implement our HDG approach within a spectral/hp element framework so that comparisons can be done between HDG and the traditional CG approach.
We demonstrate that the HDG approach generates a global trace space system for the unknown that, although larger in rank than the traditional static condensation system in CG, has significantly smaller bandwidth at moderate polynomial orders. We show that if one ignores set-up costs, above approximately fourth-degree polynomial expansions on triangles and quadrilaterals the HDG method can be made to be as efficient as the CG approach, making it competitive for time-dependent problems even before taking into consideration other properties of DG schemes such as their superconvergence properties and their ability to handle hp-adaptivity.
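As general background on the static condensation referred to above (a standard construction rather than a result of this paper), splitting the discrete unknowns into element-interior (i) and boundary/trace (b) degrees of freedom gives

\[
  \begin{pmatrix} A_{ii} & A_{ib} \\ A_{bi} & A_{bb} \end{pmatrix}
  \begin{pmatrix} u_i \\ u_b \end{pmatrix}
  =
  \begin{pmatrix} f_i \\ f_b \end{pmatrix}
  \;\Longrightarrow\;
  \bigl(A_{bb} - A_{bi} A_{ii}^{-1} A_{ib}\bigr)\, u_b = f_b - A_{bi} A_{ii}^{-1} f_i,
\]

so only the smaller trace (Schur complement) system is solved globally, after which interior unknowns are recovered element by element; the bandwidth comparison above refers to this condensed system.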
R.C. Knickmeyer, C. Kang, S. Woolson, K.J. Smith, R.M. Hamer, W. Lin, G. Gerig, M. Styner, J.H. Gilmore.
Twin-Singleton Differences in Neonatal Brain Structure, In Twin Research and Human Genetics, Vol. 14, No. 3, pp. 268--276. 2011.
ISSN: 1832-4274
DOI: 10.1375/twin.14.3.268
A. Knoll, S. Thelen, I. Wald, C.D. Hansen, H. Hagen, M.E. Papka.
Full-Resolution Interactive CPU Volume Rendering with Coherent BVH Traversal, In Proceedings of IEEE Pacific Visualization 2011, pp. 3--10. 2011.
B.H. Kopell, J. Halverson, C.R. Butson, M. Dickinson, J. Bobholz, H. Harsch, C. Rainey, D. Kondziolka, R. Howland, E. Eskandar, K.C. Evans, D.D. Dougherty.
Epidural cortical stimulation of the left dorsolateral prefrontal cortex for refractory major depressive disorder, In Neurosurgery, Vol. 69, No. 5, pp. 1015--1029. November, 2011.
ISSN: 1524-4040
DOI: 10.1227/NEU.0b013e318229cfcd