Deep Learning in NLP: A Teaser Trailer

Here we would like to feature the Saturday workshop, given by Jens Johannsmeier. You can look forward to a two-hour, interactive preview of deep learning. But rather than have me tell you about it, Jens had some words to say himself:

The whole thing will be very high-level and conceptual in nature. If you have previous experience in deep learning, you might not get much out of it. Also, don’t expect to walk out of the workshop able to train your own models; in fact, no coding is planned at all so far. Some prior knowledge of machine learning in general would be useful, and you shouldn’t be afraid of a little bit of maths.
The idea is to give a brief intro to “classical” machine learning (feature engineering + linear models) and show how deep learning improves on some of its aspects, as well as to introduce some more specialized model types (recurrent/convolutional networks). We will look at toy examples as well as speech recognition and sentiment analysis as NLP applications. Procedure-wise, it will be a mix of “lecture” and little exercises.

Jens is quite adamant that the workshop has a specific audience. If you already know this stuff, or it isn’t for you, why not enjoy the summer morning in Potsdam? I tend to agree. He has more to say:

The overall goal is not to take first steps in deep learning or anything like that (that’s why there’s no coding), but to give people an idea of whether deep learning is worth checking out for them — for their research or for fun. It could also be interesting if you’ve seen the hype surrounding it in recent times and just want it demystified a bit.

I still have about a week to prepare, so if you’re interested in the workshop and would really like to hear about a specific topic, feel free to message me, and I might be able to work it in — but since we only have two hours, I can’t guarantee anything.

There you go. Sounds like a workshop to me.
