
AIMI Research Meeting: Leveraging Artificial Intelligence Through the Lens of A Biomedical Engineer - Saba Rahimi

July 15, 2021 - 3:00pm to 4:00pm

Stanford community & AIMI affiliates only



  • Successful application of deep learning in medical image analysis requires unprecedented amounts of labeled training data. Unlike conventional 2D applications, radiological images can be three-dimensional (e.g., CT, MRI), with each volume comprising many 2D slices. The problem is exacerbated when expert annotations are required for effective pixel-wise labeling, which incurs exorbitant labeling effort and cost. Active learning is an established research domain that aims to reduce labeling workload by prioritizing a subset of informative unlabeled examples to annotate. Our contribution is a cost-effective approach for 3D U-Net models that uses Monte Carlo sampling to analyze pixel-wise uncertainty. Experiments on the AAPM 2017 lung CT segmentation challenge dataset show that our proposed framework can achieve promising segmentation results by using only 42% of the training data.
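The selection step described above can be sketched as follows. This is a minimal illustration, not the talk's implementation: `forward_pass` is a hypothetical stand-in for a U-Net inference call with dropout left active at test time, and the uncertainty measure (per-pixel predictive entropy of the mean probability, averaged per volume) is one common choice for Monte Carlo uncertainty-based ranking.

```python
import math
import random

def mc_dropout_uncertainty(forward_pass, n_samples=10):
    """Per-pixel predictive entropy from repeated stochastic forward passes.

    forward_pass() is a hypothetical stand-in for one U-Net inference with
    dropout active; it returns a flat list of foreground probabilities,
    one per pixel of a volume.
    """
    samples = [forward_pass() for _ in range(n_samples)]
    n_pixels = len(samples[0])
    # Mean predicted probability per pixel across the stochastic samples.
    mean_p = [sum(s[i] for s in samples) / n_samples for i in range(n_pixels)]

    def binary_entropy(p, eps=1e-12):
        p = min(max(p, eps), 1 - eps)
        return -(p * math.log(p) + (1 - p) * math.log(1 - p))

    # Higher entropy = the model is less certain about that pixel.
    return [binary_entropy(p) for p in mean_p]

def rank_volumes(per_volume_uncertainty):
    """Order unlabeled volumes by mean pixel uncertainty, most uncertain first."""
    scores = {v: sum(u) / len(u) for v, u in per_volume_uncertainty.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Toy usage: a confidently segmented volume versus an ambiguous one.
random.seed(0)
confident = lambda: [0.99] * 100                         # near-certain pixels
ambiguous = lambda: [random.random() for _ in range(100)]  # unstable predictions
ranking = rank_volumes({
    "vol_a": mc_dropout_uncertainty(confident),
    "vol_b": mc_dropout_uncertainty(ambiguous),
})
```

An active-learning loop would send the top-ranked volumes (here, `vol_b`) to the annotator first, which is how such a framework can reach strong results with a fraction of the labeled data.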

  • Existing systems for applying transcranial focused ultrasound (FUS) in small animals produce large focal volumes relative to the size of cerebral structures available for interrogation. The use of high ultrasonic frequencies can improve targeting specificity; however, the aberrations induced by rodent calvaria at megahertz frequencies severely distort the acoustic fields produced by single-element focused transducers. Here, we present the design, fabrication, and characterization of a high-frequency phased array system for transcranial FUS delivery in small animals. A transducer array was constructed by micromachining a spherically curved PZT-5H bowl into 64 independent elements of equal surface area. Using phase corrections, the array is capable of generating a trans-rat skull peak negative focal pressure of up to ~2.0 MPa, which is sufficient for microbubble-mediated blood-brain barrier permeabilization at this frequency.
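The geometric part of steering a phased array can be sketched with a few lines of code. This is a simplified illustration under stated assumptions (homogeneous medium at an assumed soft-tissue sound speed), not the talk's correction method: each element is delayed so that all wavefronts arrive at the focus simultaneously, and per-element skull-aberration corrections, as described in the abstract, would be added on top of these purely geometric delays.

```python
import math

SOUND_SPEED = 1540.0  # m/s; assumed soft-tissue average, ignores the skull

def focusing_delays(element_positions, focus, c=SOUND_SPEED):
    """Per-element firing delays (s) so all wavefronts reach the focus together.

    element_positions and focus are (x, y, z) tuples in metres. The element
    farthest from the focus fires first (delay 0); nearer elements wait.
    """
    dists = [math.dist(p, focus) for p in element_positions]
    d_max = max(dists)
    return [(d_max - d) / c for d in dists]

def delays_to_phases(delays, freq_hz):
    """Convert firing delays to continuous-wave driving phases (radians)."""
    return [(2 * math.pi * freq_hz * t) % (2 * math.pi) for t in delays]

# Toy usage: three elements focusing 30 mm away along the z-axis.
elements = [(0.0, 0.0, 0.0), (0.01, 0.0, 0.0), (-0.01, 0.0, 0.0)]
focus = (0.0, 0.0, 0.03)
delays = focusing_delays(elements, focus)
phases = delays_to_phases(delays, freq_hz=3e6)
```

By symmetry the two off-axis elements get identical delays, while the central (closest) element is delayed the most; measured phase corrections for the calvaria would perturb these values element by element.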

  • Profiling the usage of electrical devices within a smart home can be used as a method for determining an occupant's activities of daily living, which can support independent living of older adults. A nonintrusive load monitoring (NILM) system monitors the electrical consumption at a single electrical source, and the operating schedules of individual devices are determined by disaggregating the composite electrical consumption waveforms. NILM systems detect the status of electrical devices by analyzing load signatures, the unique electrical behaviour of an individual device when it is in operation. Promising results demonstrate that real power and reactive power are useful steady-state features for identifying electrical devices, and that different devices can also be distinguished by their switch-on transient waveforms. Experimental results showed that the proposed approach can achieve high device recognition accuracy for electrical devices operating both individually and simultaneously.
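The steady-state idea above can be illustrated with a tiny sketch: treat each device as a point in (real power P, reactive power Q) space and match an observed step change to the nearest known signature. The device names and (P, Q) values below are illustrative assumptions, not measurements from the talk, and a real NILM system would also use the transient features the abstract mentions.

```python
import math

# Hypothetical steady-state load signatures: (real power in W, reactive power in VAR).
# A purely resistive kettle draws almost no reactive power; a fridge's
# induction motor draws a noticeable amount.
SIGNATURES = {
    "kettle":   (1800.0, 5.0),
    "fridge":   (120.0, 110.0),
    "led_lamp": (9.0, 2.0),
}

def identify_device(p_watts, q_vars, signatures=SIGNATURES):
    """Match an observed (P, Q) step change to the nearest known signature."""
    return min(signatures,
               key=lambda name: math.dist((p_watts, q_vars), signatures[name]))

def disaggregate(step_events):
    """Label a sequence of observed steady-state (P, Q) step changes."""
    return [identify_device(p, q) for p, q in step_events]
```

Because the aggregate waveform changes by one device's signature each time a device switches, classifying the step changes recovers per-device operating schedules from a single metering point.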


Saba Rahimi is a PhD student in Biomedical Engineering at the University of Toronto. In August, she will be defending her thesis, “On the Development of High-Frequency Phased Array Systems for Transcranial Ultrasound Delivery in Small Animals”. She has been involved in healthcare projects throughout her academic life, from designing a patch for painless insulin delivery to monitoring the usage of electrical devices in smart homes to predict the behaviour of older adults. In summer 2019, she completed a four-month artificial intelligence fellowship at Insight Data Science, where she developed a software tool in collaboration with radiologists at Sunnybrook Hospital. During the summer of 2020, she completed a research internship in artificial intelligence (AI) at Microsoft in Sunnyvale, working on applications of AI in radiology. Her current area of research interest is computer vision in medical imaging.
