Daniel Lengyel: An Uphill Battle: Exploring Data Optimality Conditions in Gradient Estimation

  • Published: 30 January 2025
  • Date: 30 October 2024
    Speaker: Daniel Lengyel
    Title: An Uphill Battle: Exploring Data Optimality Conditions in Gradient Estimation
    Abstract: In this talk, Daniel will present part of his PhD work on gradient estimation methods, specifically the characterization of optimal locations for function evaluations (referred to as the sample set) for noisy black-box functions. This work is motivated by the observation that, while gradient estimation plays a central role in many applications, its theoretical understanding remains limited. The first part of the talk focuses on theoretical results and insights from work on the fully-determined and centered simplex gradients, which generalize forward and central finite differences, respectively; a sketch of these estimators appears below. The second part connects these insights to potential future research directions. One direction involves interacting-particle and meta-heuristic optimization methods, which often perform well in practice but lack theoretical guarantees, since their update steps, which often implicitly rely on gradient estimation, are challenging to analyze. Another direction concerns the curation of datasets, such as those for piecewise-linear function estimators, which are commonly employed in deep learning architectures.
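
    The talk itself comes with no code, but the simplex gradients mentioned in the abstract have a standard definition: stack the displacements from a base point x0 into a matrix S and solve the linear system S^T g = δf, where δf collects the function-value differences over the sample set. The minimal sketch below follows that standard construction only; the function names and the specific test function are illustrative and not taken from the talk. With S = h·I it reduces to forward and central finite differences, as the abstract notes.

    ```python
    import numpy as np

    def simplex_gradient(f, x0, S):
        """Fully-determined simplex gradient of f at x0.

        S is an (n, n) matrix whose columns are the displacements from
        x0 to the other sample points; choosing S = h * np.eye(n)
        recovers forward finite differences with step h.
        """
        f0 = f(x0)
        df = np.array([f(x0 + S[:, i]) - f0 for i in range(S.shape[1])])
        return np.linalg.solve(S.T, df)  # solve S^T g = df for g

    def centered_simplex_gradient(f, x0, S):
        """Centered simplex gradient, which also uses the reflected
        sample set -S; choosing S = h * np.eye(n) recovers central
        finite differences with step h.
        """
        df = np.array([(f(x0 + S[:, i]) - f(x0 - S[:, i])) / 2.0
                       for i in range(S.shape[1])])
        return np.linalg.solve(S.T, df)

    # Illustrative test function; the true gradient at x0 is (2, 3).
    f = lambda x: x[0] ** 2 + 3.0 * x[1]
    x0 = np.array([1.0, 2.0])
    S = 1e-4 * np.eye(2)
    print(simplex_gradient(f, x0, S))           # ~ [2. 3.]
    print(centered_simplex_gradient(f, x0, S))  # ~ [2. 3.]
    ```

    For noisy evaluations, which are the setting of the talk, the choice of S (its scale and geometry) controls the trade-off between truncation error and noise amplification; that trade-off is what the data optimality conditions in the title address.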
