Deep learning has exploded as the go-to solution for computationally demanding problems such as image recognition, text classification, and generalized task learning in robotics.
Annex aims to bring practical, extensible, productive, and performant deep learning to Elixir. In the Annex framework, deep neural networks are composed as sequences of modules/structs that implement the Annex.Layer behavior and simply transform data; underlying details (parallelization, native code, algorithms, etc.) are left to implementers.
In this talk, we will discuss the current state of the Annex framework, including its philosophy, capabilities, extensions, and intentions. We will see how to extend Annex by implementing its most basic unit of transformation, the Annex.Layer behavior, and use structs and modules that implement that behavior to compose a deep neural network capable of performing state-of-the-art machine learning classification tasks. Finally, we will look at how Annex's performance can be tuned by parallelizing work through Elixir processes and native code integration.
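To make the layer-composition idea concrete, here is a minimal sketch of how a behavior of data-transforming layers might look. The names (`feedforward/2`, `Scale`, `Sequence`) are illustrative assumptions for this sketch, not Annex's actual API:

```elixir
defmodule MyLayer do
  # Hypothetical stand-in for the Annex.Layer idea: implementers are
  # structs whose modules know how to transform data.
  @callback feedforward(layer :: struct(), input :: [number()]) ::
              {struct(), [number()]}
end

defmodule Scale do
  # A trivial layer that multiplies every input by a fixed factor.
  @behaviour MyLayer
  defstruct factor: 1.0

  @impl MyLayer
  def feedforward(%Scale{factor: f} = layer, input) do
    {layer, Enum.map(input, &(&1 * f))}
  end
end

defmodule Sequence do
  # Compose layers by threading the data through each one in turn,
  # collecting the (possibly updated) layers as we go.
  def feedforward(layers, input) do
    Enum.map_reduce(layers, input, fn layer, data ->
      layer.__struct__.feedforward(layer, data)
    end)
  end
end

{_layers, out} =
  Sequence.feedforward([%Scale{factor: 2.0}, %Scale{factor: 3.0}], [1.0, 2.0])

IO.inspect(out)
# => [6.0, 12.0]
```

Because each layer is just a struct plus a transform function, swapping in a parallel or native-code implementation only requires a new module that satisfies the same behavior.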
If you are interested in machine learning, behaviors, NIFs, or Elixir, don't miss this talk.
Jason Goldberger is a native of Phoenix, Arizona, and an eight-year veteran of the U.S. Army. He has been writing Elixir for five years and is an engineer at DockYard.