The UTC Graduate School is pleased to announce that Li Dai will present doctoral research titled A UNIFIED ALGEBRAIC FRAMEWORK EXTENDING FROM A 6-SET DISCRETE PROBABILITY ALGEBRA AND ITS APPLICATION IN DEEP LEARNING on 06/20/2024 from 10:00 a.m. to 12:00 p.m. via Zoom Meeting: https://tennessee.zoom.us/j/4411765043?omn=83591429647. Everyone is invited to attend.
Computer Science
Chair: Dr. Joseph Kizza
Co-Chair:
Abstract:
This thesis introduces a novel Unified Algebraic Framework, including an expandable Python functions package, built upon an extensible 6-Set Discrete Probability Algebra. The motivation behind this research is to provide a unified, general, and extensible quantitative analysis tool for delving into neuron-level deep neural network structure, with the aims of improving the transparency of how the black box works and making advances in detailed applications. Our approach extends a 6-Set Discrete Probability Algebra into a more systematic quantitative framework that incorporates analysis of the discrete probability distributions of neurons in deep neural network structures. Our methodology leverages existing models and visualizations of the framework in application to understand quantitatively how this algebra works, and then implements neuron-level applications in classical scenarios. The key contributions of this research include a mathematical 6-Set Discrete Probability Algebra that offers a more robust and reasoned foundation for how neurons play their role in deep learning networks, and quantitative analysis of the probability distributions of neurons that provides ample evidence and knowledge to reduce the intuition-driven, trial-and-error research pattern in the selection and design of neural networks. The thesis also provides an off-the-shelf, expandable quantitative research tool that can be applied in the current domain and customized to extend to various fields. The thesis further demonstrates how the framework defines and measures dissimilarity between neurons to improve diversity in ensemble learning, similarity to achieve neuron-level knowledge transfer, minimum-distance perturbation to optimize the network structure through pruning, and entropy based on differences between neurons to support interpretability and explainability. This research provides a new approach that combines more sophisticated algebraic methods in artificial intelligence (AI) with practical frameworks and tools that can be applied directly in deep learning applications to enhance effectiveness and efficiency. It will be helpful for researchers who are interested in this domain.