Sai Anuroop Kesanapalli

Graduate Student

University of Southern California

Biography

I am a second-year master’s student in Computer Science at the University of Southern California. I have been awarded the prestigious J N Tata Endowment Scholarship for the Higher Education of Indians.

I currently work as a Course Producer for CSCI 567: Machine Learning, offered by the Department of Computer Science at USC in Spring 2024, and for CSCI 699: Theory of Machine Learning, a doctoral-level course offered in Fall 2023, both taught by Prof. Vatsal Sharan. During Summer 2023, I worked as a Machine Learning Software Intern at DeGirum Corp., Santa Clara, where I developed an ONNX OCR pipeline with pre- and post-processing modules compatible with edge hardware, and worked on a NumPy-only implementation of the forward pass of some vision-based PyTorch operators. During Spring and Summer 2023, I was a Research Assistant in the Department of Computer Science at USC, advised by Prof. Vatsal Sharan, where I contributed to an open-source project on tensor decomposition methods and worked on a faster C++ implementation of a random-forest-based anomaly detection algorithm.

Previously, I was a Project Associate - I at the DREAM:Lab, Indian Institute of Science (IISc), advised by Prof. Yogesh Simmhan, where I worked on performance characterization of Nvidia Jetson edge accelerators on deep learning workloads, a review of systems research into training deep learning models on edge hardware, and a Federated Learning project.

Prior to that, I graduated with a Bachelor of Technology in Computer Science and Engineering from the Indian Institute of Technology (IIT) Dharwad. For my B.Tech. Project, I worked on Federated Algorithms with Bayesian and Exponential Weighted Average approaches as a member of the LIaN research group, advised by Prof. B. N. Bharath.

Download my résumé.

Education

University of Southern California
Master of Science in Computer Science
Aug 2022 – Present · Los Angeles, CA, USA
Indian Institute of Technology (IIT) Dharwad
Bachelor of Technology in Computer Science and Engineering
Jul 2017 – Jun 2021 · Dharwad, KA, India

Experience

Work & Research

University of Southern California
Course Producer
Aug 2023 – Present · Los Angeles, CA, USA
DeGirum Corp.
Machine Learning Software Intern
May 2023 – Aug 2023 · Santa Clara, CA, USA
University of Southern California
Research Assistant
Mar 2023 – Aug 2023 · Los Angeles, CA, USA
DREAM:Lab, Indian Institute of Science (IISc)
Project Associate - I
Aug 2021 – Jul 2022 · Bangalore, KA, India
LIaN Lab, Indian Institute of Technology (IIT) Dharwad
Undergraduate Researcher
Aug 2020 – Jun 2021 · Dharwad, KA, India

Awards & Achievements

Gift Award and Travel Grant
J N Tata Endowment Scholarship
AP Grade for exceptional performance
All India Rank 8682
Telangana State Rank 1
Certificate of Merit
Meritorious Student, School Topper, and Star of Stars

Publications

(2022). Characterizing the Performance of Accelerated Jetson Edge Devices for Training Deep Learning Models. POMACS, Vol. 6, No. 3, December 2022.


(2022). Don’t Miss the Train: A Case for Systems Research into Training on the Edge. IPDPSW 2022.


(2021). Characterizing the Performance of Deep Learning Workloads on Accelerated Edge Computing Devices. HiPC (SRS) 2021.


Projects

Forward-Forward: Is it time to bid adieu to BackProp?
The goals of this project are to:
* Implement and study the Forward-Forward (FF) algorithm proposed by Geoffrey Hinton, and compare it with the traditional back-propagation (BackProp) framework
* Study the architectural differences between FF and BackProp, and explore new architectures
* Analyze the system performance of FF and BackProp

This work was done as the course project for CSCI 566: Deep Learning and its Applications, Spring 2023, USC.
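
For context, below is a minimal sketch of a single Forward-Forward layer in PyTorch, trained with the local goodness objective from Hinton's paper. The layer sizes, threshold, and optimizer settings are illustrative assumptions; this is not the project's actual implementation.

```python
# Minimal sketch of one Forward-Forward layer (illustrative, not the
# project's code). Each layer is trained locally: "goodness" (sum of
# squared activations) should exceed a threshold on positive data and
# stay below it on negative data; no gradients flow between layers.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FFLayer(nn.Module):
    def __init__(self, in_dim, out_dim, threshold=2.0, lr=0.03):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)
        self.threshold = threshold
        self.opt = torch.optim.Adam(self.parameters(), lr=lr)

    def forward(self, x):
        # Length-normalize the input so only the direction of the
        # previous layer's activity vector is passed on.
        x = x / (x.norm(dim=1, keepdim=True) + 1e-8)
        return F.relu(self.linear(x))

    def train_step(self, x_pos, x_neg):
        g_pos = self.forward(x_pos).pow(2).sum(dim=1)  # goodness on positive data
        g_neg = self.forward(x_neg).pow(2).sum(dim=1)  # goodness on negative data
        # Softplus loss pushes g_pos above and g_neg below the threshold.
        loss = F.softplus(torch.cat([self.threshold - g_pos,
                                     g_neg - self.threshold])).mean()
        self.opt.zero_grad()
        loss.backward()
        self.opt.step()
        # Detached outputs become the next layer's inputs, so each layer
        # trains on its own objective with no end-to-end backprop.
        return self.forward(x_pos).detach(), self.forward(x_neg).detach()
```

Layers are stacked by feeding each layer's detached outputs to the next one; in the image-classification setting, positive and negative examples are typically constructed by overlaying correct and incorrect labels onto the input.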
Rethinking classic learning theory in deep neural networks
These are notes on the presentation given by Hikaru Ibayashi on the papers ‘Understanding deep learning requires rethinking generalization’ (Zhang et al., ICLR 2017) and ‘Uniform convergence may be unable to explain generalization in deep learning’ (Nagarajan et al., NeurIPS 2019). The presentation and notes were prepared as part of the coursework for CSCI 699: Computational Perspectives on the Frontiers of Machine Learning, Spring 2023, USC.
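
For context, the classical uniform-convergence guarantee that these papers re-examine has roughly the following schematic form (a standard textbook statement with constants and the exact complexity measure omitted, not taken from the notes themselves): with probability at least 1 − δ over an i.i.d. sample S of size m,

```latex
% Schematic uniform-convergence generalization bound: the gap between the
% population risk L_D(h) and the empirical risk L_S(h) is controlled
% uniformly over the hypothesis class H by a complexity measure of H.
\[
  \sup_{h \in \mathcal{H}}
    \left| L_{\mathcal{D}}(h) - \hat{L}_{S}(h) \right|
    \;\le\; c \sqrt{\frac{\mathrm{comp}(\mathcal{H}) + \log(1/\delta)}{m}}
\]
```

Zhang et al. observe that over-parameterized networks can perfectly fit random labels, making such bounds vacuous for standard complexity measures, while Nagarajan et al. argue that even data-dependent versions of uniform-convergence bounds can fail to explain generalization.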
Store Sales - Time Series Forecasting
In this project, we present machine learning techniques to predict store sales in the Store Sales - Time Series Forecasting competition on Kaggle. We were asked to forecast store sales for 2017-08-16 to 2017-08-31 for Corporación Favorita, a large Ecuador-based grocery retailer. The provided data consisted of past sales for 33 families of items across 54 stores over the period 2013-01-01 to 2017-08-15. Additionally, we were given information on past oil prices, holidays, and store locations to potentially aid our forecast. We began by establishing a baseline score with a basic linear regression model. With this score in mind, we experimented with decision-tree-based models. After unimpressive results, we further explored the provided datasets to identify trends, seasonal patterns, and features relevant to the forecast. Finally, we trained various models on our enriched feature set and settled on the one that gave us our best score: a version of an ExtraTreesRegressor.
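
As a rough illustration of that final modelling step (the file name, feature set, and hyperparameters below are assumptions for the sketch, not the exact competition entry), the pattern in scikit-learn looks like this:

```python
# Sketch of the final model: an ExtraTreesRegressor fit on engineered
# calendar and lag features. File name, features, and hyperparameters
# are illustrative assumptions, not the exact competition configuration.
import pandas as pd
from sklearn.ensemble import ExtraTreesRegressor

train = pd.read_csv("train.csv", parse_dates=["date"])  # Kaggle training data
train = train.sort_values(["store_nbr", "family", "date"])

# Simple calendar features plus a 16-day lag of sales per store/family,
# so every lagged value is already known when forecasting the test window.
train["dayofweek"] = train["date"].dt.dayofweek
train["month"] = train["date"].dt.month
train["lag_16"] = (
    train.groupby(["store_nbr", "family"])["sales"].shift(16).fillna(0.0)
)

features = ["store_nbr", "dayofweek", "month", "onpromotion", "lag_16"]
X = pd.get_dummies(train[features], columns=["store_nbr"])  # one-hot stores
y = train["sales"]

model = ExtraTreesRegressor(n_estimators=200, n_jobs=-1, random_state=0)
model.fit(X, y)
```

Tree ensembles such as ExtraTreesRegressor handle mixed categorical and numeric features well, but because trees cannot extrapolate a trend, lag and calendar features like the ones above carry most of the forecasting signal.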

This project was done as part of the coursework for CSCI 567: Machine Learning, Fall 2022, USC.

Miscellaneous

Student Offices Held

Mentor
Student Mentor
Student Coordinator
Event Coordinator