MCECS Undergraduate Research and Mentoring Program is open for applications!

The Undergraduate Research and Mentoring Program (URMP) is open for applications, due March 1st! URMP gives students hands-on experience working in a lab with graduate students and faculty, and it funds summer undergraduate research for students majoring in any Maseeh College undergraduate discipline. Students receive a $6,000 stipend to work full time on a 10-week project with faculty mentors from June through August. Interested students are encouraged to reach out to potential mentors to learn more about their research and the summer research experience.

See the URMP website for additional details on the program: https://www.pdx.edu/engineering/urmp

NSF REU and altREU sites are now accepting applications

The 2026 summer NSF REU and altREU sites are now accepting applications!

Both sites are fully virtual. For a direct comparison of the two programs, see https://teuscher-lab.com/altreu/program-details

We look forward to receiving your application!

NEW PAPER: Winning the Lottery by Preserving Network Training Dynamics with Concrete Ticket Search

T. Arora and C. Teuscher, Winning the Lottery by Preserving Network Training Dynamics with Concrete Ticket Search, 2025, under review. https://arxiv.org/abs/2512.07142

Abstract:

The Lottery Ticket Hypothesis asserts the existence of highly sparse, trainable subnetworks (‘winning tickets’) within dense, randomly initialized neural networks. However, state-of-the-art methods of drawing these tickets, like Lottery Ticket Rewinding (LTR), are computationally prohibitive, while more efficient saliency-based Pruning-at-Initialization (PaI) techniques suffer from a significant accuracy-sparsity trade-off and fail basic sanity checks. In this work, we argue that PaI’s reliance on first-order saliency metrics, which ignore inter-weight dependencies, contributes substantially to this performance gap, especially in the sparse regime. To address this, we introduce Concrete Ticket Search (CTS), an algorithm that frames subnetwork discovery as a holistic combinatorial optimization problem. By leveraging a Concrete relaxation of the discrete search space and a novel gradient balancing scheme (GRADBALANCE) to control sparsity, CTS efficiently identifies high-performing subnetworks near initialization without requiring sensitive hyperparameter tuning. Motivated by recent works on lottery ticket training dynamics, we further propose a knowledge distillation-inspired family of pruning objectives, finding that minimizing the reverse Kullback-Leibler divergence between sparse and dense network outputs (CTSKL) is particularly effective. Experiments on varying image classification tasks show that CTS produces subnetworks that robustly pass sanity checks and achieve accuracy comparable to or exceeding LTR, while requiring only a small fraction of the computation. For example, on ResNet-20 on CIFAR10, CTSKL produces subnetworks of 99.3% sparsity with a top-1 accuracy of 74.0% in just 7.9 minutes, while LTR produces subnetworks of the same sparsity with an accuracy of 68.3% in 95.2 minutes. However, while CTS outperforms saliency-based methods in the sparsity-accuracy tradeoff across all sparsities, such advantages over LTR emerge most clearly only in the highly sparse regime.
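
For readers curious how the two main ingredients of the abstract fit together, here is a minimal, illustrative PyTorch sketch (not the paper's implementation): a binary Concrete relaxation of pruning masks, and a reverse Kullback-Leibler objective between the outputs of the masked (sparse) and dense networks. The names ConcreteMask, reverse_kl_pruning_loss, and the temperature value are assumptions for illustration only; see the paper for the actual algorithm, including the gradient balancing scheme.

    # Illustrative sketch only; names and hyperparameters are assumptions, not the authors' code.
    import torch
    import torch.nn.functional as F

    class ConcreteMask(torch.nn.Module):
        """Learnable per-weight mask sampled from a binary Concrete (relaxed Bernoulli) distribution."""
        def __init__(self, shape, temperature=0.5):
            super().__init__()
            self.logits = torch.nn.Parameter(torch.zeros(shape))  # mask logits, one per weight
            self.temperature = temperature

        def forward(self):
            # Relaxed binary sample in (0, 1); differentiable with respect to self.logits.
            u = torch.rand_like(self.logits).clamp(1e-6, 1 - 1e-6)
            noise = torch.log(u) - torch.log(1 - u)                # logistic noise
            return torch.sigmoid((self.logits + noise) / self.temperature)

    def reverse_kl_pruning_loss(sparse_logits, dense_logits):
        """KL(p_sparse || p_dense): the reverse direction relative to standard distillation."""
        log_p_sparse = F.log_softmax(sparse_logits, dim=-1)
        log_p_dense = F.log_softmax(dense_logits, dim=-1)
        p_sparse = log_p_sparse.exp()
        return (p_sparse * (log_p_sparse - log_p_dense)).sum(dim=-1).mean()

    # Usage sketch for a single linear layer: mask the weights, match the dense outputs.
    weight = torch.randn(64, 32)                 # dense weights (kept fixed as the "teacher")
    mask = ConcreteMask(weight.shape)
    masked_weight = mask() * weight              # differentiable "pruned" weights
    x = torch.randn(8, 32)
    sparse_out = x @ masked_weight.t()
    dense_out = x @ weight.t()
    loss = reverse_kl_pruning_loss(sparse_out, dense_out)
    loss.backward()                              # gradients flow to mask.logits, not the weights

The sketch shows why the search is differentiable: the Concrete sample replaces a hard 0/1 mask with a smooth one during optimization, so the mask logits can be trained with ordinary gradient descent and later thresholded to a binary subnetwork at the target sparsity.
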

NEW SEMINAR: Journal Debate Club

Sharpen your critical thinking and scientific communication skills through structured debates on current research literature. Students will analyze primary scientific papers, construct evidence-based arguments, and engage in formal debates defending or critiquing published findings, methodologies, and interpretations. Rotating through roles as debaters, moderators, and critical evaluators, participants will learn to identify experimental weaknesses, evaluate competing hypotheses, and articulate scientific arguments persuasively. This interactive format develops skills in literature analysis, oral presentation, and constructive scientific discourse.

More info here…