An introduction to the internals of the Model Asset eXchange

June 17, 2019 devadvin

This post focuses on how the Model Asset eXchange (MAX) works internally. For an introduction to MAX, see our introduction article.

Architectural design

The Model Asset eXchange (MAX) uses an extensible, distributed architecture built on container technology and
cloud infrastructure. The following figure illustrates the architecture:

MAX architecture

MAX is hosted on cloud infrastructure, such as IBM Cloud, and communicates with web applications through standardized
RESTful APIs. At its core is an abstraction layer called the MAX framework. The MAX framework wraps deep
learning models implemented in different deep learning frameworks and exposes them through a uniform programming interface,
so developers can use the models without diving into the underlying deep learning frameworks. Each wrapped model runs in its own isolated Docker container, which improves security and makes the architecture easy to distribute and extend. Additionally, MAX is built exclusively on open source technologies, which promotes an open and collaborative culture.

Software components

Now, let’s look at MAX’s various software components.

MAX framework

The MAX framework is a Python library that wraps deep learning models behind a uniform programming interface. Wrapping a model
only requires implementing functions that process its input and output, as sketched below. This simplicity is what keeps the MAX
framework agnostic to the underlying deep learning frameworks.
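To make this concrete, here is a minimal sketch of the wrapper pattern; the class and method names below are illustrative, not the exact ones the MAX framework uses.

# Illustrative sketch of the wrapper pattern: a base class fixes the
# predict() flow, and each model only fills in its own input handling,
# inference call, and output conversion.
from abc import ABC, abstractmethod

class ModelWrapper(ABC):
    """Uniform interface that the rest of MAX talks to."""

    def predict(self, raw_input):
        x = self._pre_process(raw_input)   # model-specific input handling
        y = self._predict(x)               # framework-specific inference
        return self._post_process(y)       # convert to the standard output form

    @abstractmethod
    def _pre_process(self, raw_input):
        ...

    @abstractmethod
    def _predict(self, x):
        ...

    @abstractmethod
    def _post_process(self, y):
        ...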

Deep learning models

MAX can accommodate deep learning models written in different deep learning frameworks. The MAX framework communicates with each model through a standardized Python programming interface, so using a model in MAX only requires adapting it to that interface, that is, wrapping it. Once a model is wrapped, it is available throughout the entire MAX system and needs no further adaptation. The interface is object-oriented: wrapping a model amounts to inheriting specific classes and implementing a few predefined methods that convert the model's input and output into data structures the MAX framework accepts.
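Continuing the sketch above, a hypothetical wrapped sentiment model might look like the following; the stand-in scoring function takes the place of a real TensorFlow or PyTorch model, and all names are illustrative.

# Hypothetical wrapped model, reusing the ModelWrapper base class sketched
# above. In a real MAX model, __init__ would load the trained model assets
# and _predict would call into the deep learning framework.
class SentimentWrapper(ModelWrapper):

    def __init__(self):
        # Stand-in scorer; a real wrapper would load weights from disk here.
        self._model = lambda tokens: 0.9 if "good" in tokens else 0.1

    def _pre_process(self, raw_input):
        # Turn the raw request payload (a string) into model input.
        return raw_input.lower().split()

    def _predict(self, x):
        # Run inference with the underlying framework.
        return self._model(x)

    def _post_process(self, y):
        # Convert framework output into the standardized, JSON-friendly form.
        return {"positive": y, "negative": 1.0 - y}

wrapper = SentimentWrapper()
print(wrapper.predict("The cast was good"))   # e.g. {'positive': 0.9, 'negative': ~0.1}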

The wrapped deep learning models, together with their MAX framework interfaces, are hosted in Docker containers. A
container is an isolated environment that packages the software of interest along with its runtime. This isolation
promotes extensibility, distributability, and security.

RESTful APIs: Between applications and the MAX framework

MAX exposes a standardized, framework-agnostic programming interface as RESTful APIs, which gives developers access
to deep learning models without requiring them to dive into the various deep learning frameworks.

For each deep learning model, MAX's output is JSON that follows a standardized specification. This standardization
lets developers quickly adapt their applications by replacing the underlying deep learning model with very little, and often
zero, modification to the code that interacts with the model. This is in sharp contrast with common practice today:
because programming interfaces are not standardized, replacing an underlying deep learning model usually forces developers
to modify their code drastically, and they frequently find themselves mired in figuring out the correct usage of
abstrusely defined APIs. MAX also integrates Swagger so that a graphical user interface (GUI) is automatically
available for every wrapped deep learning model. An example response is shown below (excerpted from our text sentiment classifier).

{
  "status": "ok",
  "predictions":[
    [{"positive": 0.9977352619171143, "negative": 0.002264695707708597}],
    [{"positive": 0.001138084102421999, "negative": 0.9988619089126587}]
  ]
}
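
From a client's point of view, consuming a wrapped model is a plain HTTP call. The sketch below assumes the text sentiment classifier is running locally on port 5000 and accepts a JSON body with a "text" list; the model's Swagger page documents the actual endpoint and payload schema.

# Hedged client sketch: host, port, endpoint path, and the "text" payload
# field are assumptions here; check the model's Swagger UI for the real schema.
import requests

url = "http://localhost:5000/model/predict"
payload = {"text": ["I loved this movie", "The plot made no sense"]}

response = requests.post(url, json=payload)
response.raise_for_status()

result = response.json()   # standardized JSON, as shown above
if result["status"] == "ok":
    for text, prediction in zip(payload["text"], result["predictions"]):
        print(text, "->", prediction)

Because the response shape is the same across models, swapping in a different MAX model typically means changing only the URL and the request payload fields.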

For more information about MAX, see the Model Asset eXchange website.
