Lipschitzness, monotonicity, and cocoercivity are prominent operator properties that govern an algorithm's behavior, but how they relate to one another and what they intuitively say about an operator's geometry is often glossed over or obscured in the literature. To develop intuition, this note explores what they mean and how they connect.
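For concreteness, the three properties for an operator $T$ on a Hilbert space can be stated as:

```latex
% T : H -> H, with x, y arbitrary points and L, beta > 0.
\|Tx - Ty\| \le L \|x - y\|                               % L-Lipschitz
\langle Tx - Ty,\, x - y \rangle \ge 0                    % monotone
\langle Tx - Ty,\, x - y \rangle \ge \beta \|Tx - Ty\|^2  % beta-cocoercive
```

By the Cauchy-Schwarz inequality, a $\beta$-cocoercive operator is both monotone and $(1/\beta)$-Lipschitz, which is one of the connections the note unpacks.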
Chebyshev polynomials of the first kind are a popular choice of basis polynomials for function approximation. Here, this note collects some properties and theorems for approximating univariate functions, with an example.
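As a sketch of the kind of approximation the note works through (the target function and degree here are illustrative choices, not taken from the note), NumPy's polynomial module can fit a Chebyshev series at Chebyshev nodes:

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Fit cos on [-1, 1] at Chebyshev points of the first kind:
# sampling at these nodes avoids the Runge phenomenon of equispaced grids.
deg = 10
nodes = C.chebpts1(deg + 1)            # roots of T_{deg+1}
coeffs = C.chebfit(nodes, np.cos(nodes), deg)

# Measure the approximation error on a dense grid.
x = np.linspace(-1, 1, 1000)
err = np.max(np.abs(C.chebval(x, coeffs) - np.cos(x)))
print(err)  # tiny: Chebyshev coefficients of smooth functions decay rapidly
```

For analytic functions like `cos`, the coefficients decay faster than geometrically, so even a modest degree gives near machine-precision accuracy.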
Preconditioning transforms an optimization problem to {expedite, improve, stabilize} convergence, typically by multiplying the gradient in the update rule by the inverse of the Hessian (or an approximation of it), or by improving the condition number of a matrix whose spectrum governs convergence.
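A minimal sketch of the idea on a toy quadratic (the matrix and step size are illustrative assumptions): plain gradient descent crawls on an ill-conditioned problem, while multiplying the gradient by the inverse Hessian rescales it to condition number 1 and, on a quadratic, converges in one step.

```python
import numpy as np

# Ill-conditioned quadratic f(x) = 0.5 x^T A x - b^T x (condition number 100).
A = np.diag([1.0, 100.0])
b = np.array([1.0, 1.0])
x_star = np.linalg.solve(A, b)          # exact minimizer

def grad(x):
    return A @ x - b

# Plain gradient descent: step size limited by the largest eigenvalue.
x = np.zeros(2)
for _ in range(100):
    x = x - (1.0 / 100.0) * grad(x)
plain_err = np.linalg.norm(x - x_star)

# Preconditioned update: P = A^{-1} is the inverse Hessian here, so the
# rescaled problem is perfectly conditioned and one step suffices.
P = np.linalg.inv(A)
y = np.zeros(2) - P @ grad(np.zeros(2))
precond_err = np.linalg.norm(y - x_star)
print(plain_err, precond_err)
```

Even after 100 iterations, plain gradient descent has only shrunk the error along the small-eigenvalue direction by a factor of 0.99 per step, while the preconditioned step is exact.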
An exploration of compressed sensing for fMRI time series with three different algorithms. Typically, compressed sensing reconstructs a single MRI volume, but an fMRI scan comprises many volumes; sensing along the time domain could reduce the number of volumes required. Of the three algorithms, BSBL-BO performed best, with its error curve elbowing at around 30% subsampling.
After a brief description of diffusion tensor imaging and the information it provides, I discuss an intuitive seed-based line-propagation algorithm for computing a tractography map of a neuroimage. The required open-source software packages are
3D Slicer,
ITK for C++, and
ITK-SNAP.
StoneAnno is my first published first-authorship paper, presented at SPIE 2022. With the long-term goal of fully automated robotic endoscopic surgery, we built a dataset of endoscopic kidney stone removal videos and investigated U-Net, U-Net++, and DenseNet for the segmentation task. We found a U-Net++ model that consistently achieves a Dice score above 0.9 with low loss and produces realistic, convincing segmentations. Moving forward, as part of my master's thesis, I am implementing our model on hardware for deployment in ORs; I also helped Dr. Kavoussi submit an R21 grant in October 2021.
I fell in love with dimensionality reduction when I was learning statistical ML. Since I also study neuroscience, I wanted to practice the art at the intersection of my interests. I compared the 3D projections of a 53-dimensional neurophysiology dataset produced by PCA and a shallow autoencoder.
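A minimal sketch of the PCA half of that comparison (with synthetic stand-in data, since the real 53-dimensional recordings are not reproduced here), implemented directly via the SVD:

```python
import numpy as np

# Synthetic stand-in for a 53-dimensional dataset (hypothetical data).
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 53)) @ rng.standard_normal((53, 53))

# PCA via SVD: center the data, decompose, keep the top 3 principal axes.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:3].T                         # 3D projection for plotting
var_ratio = (s[:3] ** 2).sum() / (s ** 2).sum()
print(Z.shape, var_ratio)                 # fraction of variance captured in 3D
```

The autoencoder replaces the linear map `Vt[:3]` with a learned nonlinear encoder, which is exactly where the two projections can start to differ.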