2022 Chicago Workshop on Coding and Learning


Date:
December 2nd, 2022.

Venue:
Zoom. Please register, preferably using a university or professional email, at this link.
After registering, you will receive a confirmation email containing information about joining the meeting.

Schedule (all times are Central Time):
08:45-09:00: Erdem Koyuncu, University of Illinois Chicago. Opening remarks.
09:00-09:30: Salim El Rouayheb, Rutgers University. How to turn privacy on and off.
09:30-10:00: Emrah Akyol, SUNY Binghamton. Price of transparency in strategic classification.
10:00-10:30: Tudor Dumitras and Yigitcan Kaya, University of Maryland, College Park. Wonders and dangers of input-adaptive neural network inference.
10:30-11:00: Osvaldo Simeone, King's College London. Reliable AI for communications via conformal prediction.
11:00-11:30: Daniela Tuninetti, University of Illinois Chicago. Deep learning aided codes for the two user fading broadcast channel with feedback.
11:30-12:00: Deniz Gunduz, Imperial College London. All you need is feedback: Communication with block attention feedback codes.
12:00-12:30: Hulya Seferoglu, University of Illinois Chicago. Coded privacy-preserving computation at edge networks.
12:30: Grab some lunch while the talks continue...
12:30-13:00: Joerg Kliewer, New Jersey Institute of Technology. How to teach an old dog some new tricks: Decoding of LDPC codes via reinforcement learning.
13:00-13:30: Brad McDanel, Franklin & Marshall College. Dynamic neural networks: An overview and current trends.
13:30-14:00: Aaron Wagner, Cornell University. Optimal neural network compressors and the manifold hypothesis.
14:00-14:30: Randall Berry, Northwestern University. Observational learning with unreliable observations.
14:30-15:00: Tsachy Weissman, Stanford University. On compression of, for, and with neural networks.
15:00-15:30: Natasha Devroye, University of Illinois Chicago. Towards interpreting deep-learned error-correcting codes.
15:30-16:00: Erdem Koyuncu, University of Illinois Chicago. Class means based early exit mechanisms in neural networks.

Organizer:
Erdem Koyuncu, University of Illinois Chicago

Workshop Description:
Source and channel coding have historically been the two fundamental tenets of information theory, studying the ultimate performance limits of data compression and error correction, respectively. Many recent studies have applied deep learning techniques to design or interpret new codes, and conversely, coding- and information-theoretic ideas have led to significant advances in various areas of machine learning. This workshop will bring together expert researchers who work at the intersection of coding and learning to present their latest contributions to the field and to suggest future research directions. Specific topics of interest include, but are not limited to:

- Interpretability/explainability in source and channel coding
- Deep learning aided coding schemes
- Coding for private and secure multi-agent learning
- Neural network compression, pruning, quantization
- Neural network capacity, approximation

This workshop is organized as an event of the IEEE Information Theory Society Chicago Chapter.