Nick Kolkin is a sixth-year PhD student working with Professor Greg Shakhnarovich on optimization-based style transfer. If you would like to make your own style transfer images, or learn more about this research, a demo and further explanation of their work can be found here.
Where are you from?
When did you initially become interested in computer science?
I wasn’t really sure what I was interested in when I first went to college. I took a bunch of random classes that weren’t offered in my high school, and computer science was the one that I liked the best.
What are your main research interests?
I’m interested in generative models of images — models that you can show one or many images, and that will then generate new images that are similar but novel. I find working with code that produces images much more tangible and interesting, and that keeps me motivated.
For a long time I’ve also been interested in measuring the distance between different types of data. In undergrad I helped with one project measuring distance between sentences, and another measuring distance between certain types of matrices, but I knew I found visual data more appealing, so in graduate school I wanted to shift gears to computer vision.
Style transfer was a good fit for both of these interests. It raises practical questions, for example: how do you write a program that generates an aesthetically pleasing image? It also raises more abstract questions, like how to quantify style, and how to measure the distance between two different styles. I think that understanding and quantifying how artists can render content in so many unique styles, while keeping it recognizable to other people, lies at the intersection of many fascinating questions in graphics and computer vision.
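To make the idea of "measuring the distance between two styles" concrete, here is a minimal illustrative sketch in NumPy of one common baseline from the style transfer literature: comparing Gram matrices of feature activations (the style representation popularized by Gatys et al.). This is only an assumed, simplified example for illustration — the function names (`gram_matrix`, `style_distance`) and the random "feature maps" are placeholders, and this is not necessarily the loss used in the optimization-based method discussed in this interview.

```python
import numpy as np

def gram_matrix(features):
    # features: (channels, pixels) array of feature activations.
    # The Gram matrix captures which channels co-activate, which
    # correlates with texture/style rather than spatial content.
    c, n = features.shape
    return features @ features.T / n  # (channels, channels)

def style_distance(feat_a, feat_b):
    # A simple style distance: Frobenius norm between Gram matrices.
    return np.linalg.norm(gram_matrix(feat_a) - gram_matrix(feat_b))

# Toy stand-ins for feature maps extracted from two images.
rng = np.random.default_rng(0)
a = rng.standard_normal((8, 64))
b = rng.standard_normal((8, 64))

print(style_distance(a, a))      # prints 0.0 (identical "styles")
print(style_distance(a, b) > 0)  # prints True
```

In a real pipeline the feature maps would come from a pretrained convolutional network rather than a random generator, and more recent optimization-based approaches replace the Gram comparison with richer distances between feature distributions.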
What are your favorite classes that you’ve taken at TTIC?
Professor Nati Srebro’s convex optimization course was very enjoyable, as was Professor Avrim Blum’s learning theory course.
Why did you choose TTIC versus other schools for your PhD?
One reason was that I wanted to have time to focus on research, and the teaching requirements here are very light. The other major reason was that machine learning is the main focus of most of the students and faculty.
What do you like most about being a part of the TTIC community?
You have a bunch of people interested in all these different aspects of machine learning. It’s a relatively small place, so you could have ended up in a situation where everybody works on things that are too similar, but in reality almost the entire span of machine learning research is represented. One of the big advantages of that is the diversity of topics to discuss — you can learn something new from everyone.
What are your plans after graduation?
I’ll be joining Adobe Research next fall.