
Category Archives: Research



Talk at University of Tokyo

On August 18, I visited Professor Shigeru Chiba at the Core Software Group of the Dept. of Creative Informatics, Graduate School of Information Science and Technology, at The University of Tokyo. I gave a talk about our preliminary research on automated refactoring of Deep Learning software.


Visiting Tokyo Tech

From August 10 to 24, 2022, I visited the Programming Research Group at the Department of Mathematical and Computing Science of the Tokyo Institute of Technology. I gave a seminar talk and discussed current research with the group members. A JSPS BRIDGE fellowship supported this visit, which was initially planned two years ago. The trip was postponed due to COVID-19 (three times, in fact), but I was happy to finally have the opportunity to visit Professor Masuhara and his lab.


Highlights of “Challenges in Migrating Imperative Deep Learning Programs to Graph Execution: An Empirical Study”

In this blog post, we summarize, using code examples, our recent empirical study on challenges in migrating imperative Deep Learning programs to graph execution.


NYU GSTEM students visit during the summer of 2022

Medha Belwadi and Pranavi Gollanapalli will join our research group this summer through the NYU GSTEM program. NYU GSTEM is a summer program that allows high school juniors to participate in research laboratories. The NYU Courant Institute of Mathematical Sciences offers the program, which helps promote STEM to traditionally underrepresented groups, particularly women and minorities. Medha and Pranavi will be working on our recently funded NSF project on imperative Deep Learning system programming and evolution as part of the project’s broader impacts.


Fully-funded Ph.D. student position(s) in analysis and transformations of Deep Learning programs in New York City

I am currently seeking (potentially multiple, fully funded) Ph.D. students interested in programming languages and software engineering research for an NSF-funded project on analysis and transformations for (imperative) Deep Learning (DL) programs. The project, based in the heart of New York City, focuses on enhancing the robustness, improving the run-time performance, and facilitating the long-lived evolution of DL systems, in particular, large, industrial DL systems. For more information on the project, please see the project announcement.

Potential research topics explored during the project may include (static/dynamic) program analysis and transformation (e.g., automated refactoring) and empirical software engineering. Successful candidates will be expected to work on projects that generally yield open-source developer tool research prototypes, plug-ins to popular IDEs, build systems, or static analyzers. Applicants may find additional information on the PI’s web page. They should also apply to the City University of New York (CUNY) Graduate Center (GC) Ph.D. program in Computer Science (deadline January 15) following a discussion with the PI.

Please see below for additional details on applying. Again, the Ph.D. program deadline is January 15.


Received three-year NSF research grant on imperative Deep Learning program robustness and evolution as PI

I am pleased to announce that, as principal investigator (PI) and along with co-PI Anita Raja, I have received a three-year standard research grant from the National Science Foundation (NSF) Software & Hardware Foundations (SHF) program for a project entitled “Practical Analyses and Safe Transformations for Imperative Deep Learning Programs.” The total grant amount is $600K.

The project will facilitate the robustness and automated evolution and maintenance of large, industrial Deep Learning (DL) software systems that use imperative style programming. More information may be found on NSF’s website; stay tuned for more details and funded research opportunities!

Slides for ICSE ’22 tool demo on rejuvenating feature logging statement levels now available

Slides for our ICSE ’22 formal tool demonstration on rejuvenating feature logging statement levels via Git histories and Degree of Interest (DOI) are now available. The live demo will take place tomorrow at 11:45 am EST.

Slides from GMU talk about challenges in executing imperative Deep Learning programs as graphs

Slides from my talk at George Mason University (GMU) on “Challenges in Migrating Imperative Deep Learning Programs to Graph Execution: An Empirical Study” are now available.

“Migrating Imperative Deep Learning Programs to Graph Execution” guest lecture on YouTube

Thanks to Stevens Institute of Technology for posting my guest lecture on imperative Deep Learning program execution to YouTube!

Talk at Stevens Institute of Technology, March 2022

Paper on hybridization challenges in imperative Deep Learning programs accepted at MSR ’22

Our paper entitled, “Challenges in migrating imperative Deep Learning programs to graph execution: An empirical study,” has been accepted to the main technical research track at the IEEE/ACM SIGSOFT 2022 International Conference on Mining Software Repositories (MSR)! Out of 138 papers, 45 were accepted, amounting to a 32.6% acceptance rate. The conference will take place later this year in Pittsburgh and is co-located with ICSE 2022.

Special congratulations to Tatiana for publishing her first full conference paper as first author in the second year of her Ph.D. studies! Also, congrats to Mehdi and Anita, and thank you all for your hard work!