Ph.D. Defense: Jin Lu

June 14, 2019 @ 1:00 pm - 2:00 pm UTC-5

Title: Towards Provable and Scalable Machine Learning

Ph.D. Candidate: Jin Lu

Major Advisor: Dr. Jinbo Bi

Associate Advisors: Dr. Alexander Russell, Dr. Sanguthevar Rajasekaran

Day/Time: Friday, June 14, 2019, 1:00 PM - 2:00 PM

Location: Homer Babbidge Library (HBL), Class of 1947 Conference Room

Abstract:

In the past decade, machine learning has developed substantially and has demonstrated great success in domains such as web search, computer vision, and natural language processing. Despite this practical success, many applications involve solving NP-hard problems with heuristics, and it is challenging to analyze whether a heuristic scheme carries any theoretical guarantee. In this dissertation, we show that when certain structure is present in the sample data, the related problem can be solved with provable guarantees. We propose to exploit granular data structures, e.g., sample clusters or features describing an aspect of the samples, to design new statistical models and algorithms for two learning problems.

The first learning problem deals with the commonly encountered missing-data issue by formulating it as a matrix completion problem. When side features describing the data entities are available, we propose a new convex formulation that constructs a bilinear model to infer the missing values from the side features. We prove that this approach exactly recovers, or epsilon-recovers, the missing values at a much lower sampling rate than classic matrix completion methods require, depending on whether the side features are corrupted. A novel linearized alternating direction method of multipliers (ADMM) is developed to solve the proposed convex formulation efficiently; a toy sketch of the bilinear model appears below.

For the second learning problem, we build a new generative adversarial network (GAN) that, when the data contain underlying clusters, generates data following a distribution much closer to the true data distribution than the standard GAN does. The proposed model consists of multiple smaller GANs as components, each corresponding to a data cluster identified automatically during the construction of the GAN. This approach can recover the true distribution for every cluster if an appropriate Kolmogorov regularization is used. If the GAN complexity is regularized by smoothness with a parameter epsilon, we prove that the model can approximate the true data distribution within an epsilon tolerance. We use the adaptive moment estimation (Adam) algorithm to optimize this model at scale. Together, the two approaches bring new insights and suggest new methods for provable and scalable machine learning.
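
As an illustration of the first approach, here is a minimal sketch of matrix completion with side features: a bilinear model X A Y^T is fit to the observed entries with a nuclear-norm penalty on A. A plain proximal-gradient solver stands in for the linearized ADMM developed in the dissertation, and every name, size, and parameter below is an illustrative assumption, not the dissertation's actual implementation.

import numpy as np

def svt(Z, tau):
    """Singular value thresholding: the proximal operator of tau * ||.||_*."""
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def inductive_mc(M, mask, X, Y, lam=0.01, iters=2000):
    """Minimize 0.5 * ||mask * (X A Y^T - M)||_F^2 + lam * ||A||_* over A."""
    # Step size = 1 / Lipschitz constant of the smooth term's gradient.
    step = 1.0 / (np.linalg.norm(X, 2) ** 2 * np.linalg.norm(Y, 2) ** 2)
    A = np.zeros((X.shape[1], Y.shape[1]))
    for _ in range(iters):
        R = mask * (X @ A @ Y.T - M)                   # residual on observed entries
        A = svt(A - step * (X.T @ R @ Y), step * lam)  # gradient step, then prox
    return A

# Toy usage: a rank-2 bilinear ground truth, 30% of entries observed.
rng = np.random.default_rng(0)
X, Y = rng.standard_normal((50, 8)), rng.standard_normal((40, 8))
A_true = rng.standard_normal((8, 2)) @ rng.standard_normal((2, 8))
M = X @ A_true @ Y.T
mask = rng.random(M.shape) < 0.3
A_hat = inductive_mc(M * mask, mask, X, Y)
print("relative error:", np.linalg.norm(X @ A_hat @ Y.T - M) / np.linalg.norm(M))

And a toy sketch of the second approach, a mixture of smaller GANs: plain k-means stands in for the automatic cluster identification described above, each component is a standard GAN trained with Adam, and samples are drawn from a component chosen in proportion to its cluster size. Again, all module sizes and hyperparameters are assumptions for illustration.

import torch
import torch.nn as nn

def kmeans(X, k, iters=50):
    """Naive k-means; returns a cluster label for each row of X."""
    C = X[torch.randperm(len(X))[:k]].clone()
    for _ in range(iters):
        labels = torch.cdist(X, C).argmin(dim=1)
        for j in range(k):
            pts = X[labels == j]
            if len(pts):
                C[j] = pts.mean(dim=0)
    return labels

def make_gan(zdim=2, hidden=32):
    G = nn.Sequential(nn.Linear(zdim, hidden), nn.ReLU(), nn.Linear(hidden, 2))
    D = nn.Sequential(nn.Linear(2, hidden), nn.ReLU(), nn.Linear(hidden, 1))
    return G, D

def train_gan(X, steps=1000, zdim=2, batch=64):
    """Standard GAN training for one cluster, optimized with Adam."""
    G, D = make_gan(zdim)
    optG = torch.optim.Adam(G.parameters(), lr=1e-3)
    optD = torch.optim.Adam(D.parameters(), lr=1e-3)
    bce = nn.BCEWithLogitsLoss()
    ones, zeros = torch.ones(batch, 1), torch.zeros(batch, 1)
    for _ in range(steps):
        real = X[torch.randint(len(X), (batch,))]
        fake = G(torch.randn(batch, zdim))
        # Discriminator step: push real toward 1, fake toward 0.
        lossD = bce(D(real), ones) + bce(D(fake.detach()), zeros)
        optD.zero_grad(); lossD.backward(); optD.step()
        # Generator step: try to fool the discriminator.
        lossG = bce(D(G(torch.randn(batch, zdim))), ones)
        optG.zero_grad(); lossG.backward(); optG.step()
    return G

# Toy data: two well-separated Gaussian clusters in the plane.
X = torch.cat([torch.randn(500, 2) + 4.0, torch.randn(500, 2) - 4.0])
k = 2
labels = kmeans(X, k)
gans = [train_gan(X[labels == j]) for j in range(k)]
weights = torch.tensor([(labels == j).float().mean() for j in range(k)])

# Sample from the mixture: pick a component in proportion to its cluster
# size, then sample that component's generator.
counts = torch.bincount(torch.multinomial(weights, 200, replacement=True), minlength=k)
samples = torch.cat([gans[j](torch.randn(int(counts[j]), 2)) for j in range(k)])
print(samples.shape)  # -> torch.Size([200, 2])

In this two-cluster toy setting each component generator only ever sees one mode, which is the intuition behind the per-cluster recovery guarantee stated in the abstract.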

Venue

HBL Class of 1947 Conference Room
UConn Library, 369 Fairfield Way, Unit 1005
Storrs, CT 06269 United States
Phone: (860) 486-2518