Over the past five years, edge computing has attracted tremendous attention from industry and academia for its promise to reduce latency, save bandwidth, improve availability, and protect data privacy. At the same time, we have witnessed the proliferation of AI algorithms and models, which have accelerated the successful deployment of intelligence, mainly in cloud services. Together, these two trends have created a new horizon: Edge Intelligence (EI). Realizing EI will require sustained attention from both the computer systems research community and the AI community.
We define EI as the capability of an edge to execute artificial intelligence algorithms. EI capability is characterized by four attributes: accuracy, latency, energy, and memory footprint.
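The four EI attributes can be thought of as a per-model profile that is checked against user requirements. The sketch below is illustrative only; the class and field names are assumptions, not part of OpenEI's documented interface.

```python
from dataclasses import dataclass

# Hypothetical sketch: the four EI attributes as one record, plus a check
# against user-supplied requirements. Names and units are illustrative.
@dataclass
class EIAttributes:
    accuracy: float     # top-1 accuracy, in [0, 1]
    latency_ms: float   # inference latency, milliseconds
    energy_mj: float    # energy per inference, millijoules
    memory_mb: float    # peak memory footprint, megabytes

    def satisfies(self, min_accuracy: float, max_latency_ms: float,
                  max_energy_mj: float, max_memory_mb: float) -> bool:
        """True if this profile meets all four requirements at once."""
        return (self.accuracy >= min_accuracy
                and self.latency_ms <= max_latency_ms
                and self.energy_mj <= max_energy_mj
                and self.memory_mb <= max_memory_mb)

# Example profile for some model running on a small edge board.
profile = EIAttributes(accuracy=0.71, latency_ms=45.0,
                       energy_mj=120.0, memory_mb=14.0)
print(profile.satisfies(0.70, 100.0, 500.0, 64.0))  # True
```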

EI Challenges


Compared with a cloud data center, the edge is usually resource-constrained, which makes it a poor fit for executing AI algorithms represented by deep neural networks (DNNs), since DNNs require a large footprint in both storage and computing power.


Different edges have diverse computing power, while AI algorithms have different computing and storage requirements. Choosing a matched algorithm for a specific edge configuration is a big challenge.


Data in a cloud data center is easy to batch process and manage because it is concentrated in one place. In contrast, the temporal-spatial diversity of edge data creates obstacles to data sharing and collaboration.

OpenEI: An Open Framework for Edge Intelligence

OpenEI is a lightweight software platform that equips the edge with intelligent processing and data sharing capabilities. The goal of OpenEI is that any hardware, from a Raspberry Pi to a powerful cluster, becomes an intelligent edge after deploying OpenEI. Meanwhile, the EI attributes (accuracy, latency, energy, and memory footprint) will improve by an order of magnitude compared with current AI algorithms running on general deep learning packages.


libei provides RESTful APIs that enable the edge to communicate and work with the cloud, other edges, and IoT devices, improving collaboration.
Model selector includes multiple optimized AI models and a selection engine designed to find the most suitable model for a specific edge platform based on users' requirements.
ML Package is a lightweight deep learning package, optimized to run AI algorithms on edge platforms while keeping resource consumption low.
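As a rough illustration of how an edge might package a task request for a peer edge or the cloud over a RESTful interface like libei's, consider the sketch below. The endpoint path, port, and payload fields are assumptions made up for illustration, not the documented OpenEI API.

```python
import json
from urllib.parse import urlencode

# Hypothetical sketch: build the URL and JSON body for a POST announcing
# an inference task to another node. Nothing here is sent over the network;
# the function only constructs the request.
def build_task_request(host: str, task: str, requirements: dict) -> tuple:
    """Return (url, body) for a task-announcement POST."""
    url = f"http://{host}/api/v1/tasks?{urlencode({'task': task})}"
    body = json.dumps({"task": task, "requirements": requirements})
    return url, body

url, body = build_task_request(
    "edge-node-2:8080", "image_classification",
    {"max_latency_ms": 100, "min_accuracy": 0.7})
print(url)
```

In a real deployment the returned URL and body would be handed to an HTTP client; keeping construction separate from transport makes the request logic easy to test offline.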


Ease of use

Drawing on the idea of plug and play, OpenEI is deploy and play: it is easy to install, and its API makes it easy for users to develop third-party applications.

Optimal selection

The biggest problem is not a lack of algorithms, but how to choose a matched algorithm for a specific edge configuration. The model selector is designed to meet this requirement.
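A selection engine of this kind can be sketched as a filter-then-rank step: discard models that exceed the edge's resource budget, then pick the most accurate survivor. The model entries, numbers, and field names below are illustrative assumptions, not OpenEI's actual catalog or API.

```python
# Hypothetical model catalog; figures are placeholders for illustration.
MODELS = [
    {"name": "mobilenet_v2", "accuracy": 0.72, "latency_ms": 40,  "memory_mb": 14},
    {"name": "resnet50",     "accuracy": 0.76, "latency_ms": 120, "memory_mb": 98},
    {"name": "squeezenet",   "accuracy": 0.58, "latency_ms": 25,  "memory_mb": 5},
]

def select_model(max_latency_ms: float, max_memory_mb: float,
                 min_accuracy: float = 0.0):
    """Pick the most accurate model that fits the given resource budget."""
    candidates = [m for m in MODELS
                  if m["latency_ms"] <= max_latency_ms
                  and m["memory_mb"] <= max_memory_mb
                  and m["accuracy"] >= min_accuracy]
    # Among models that fit, prefer the most accurate; None if nothing fits.
    return max(candidates, key=lambda m: m["accuracy"], default=None)

best = select_model(max_latency_ms=60, max_memory_mb=32)
print(best["name"] if best else "no model fits")  # mobilenet_v2
```

The same structure extends naturally to the other EI attributes (energy, for example) by adding them to both the catalog entries and the filter.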


Cross-platform collaboration

To collaborate with the cloud and other heterogeneous edges, OpenEI is designed as cross-platform software. libei provides RESTful APIs for the edge to communicate and work with others.

Optimization for the edge

Being lightweight is OpenEI's core feature and a significant difference from other data analytics platforms: to run heavyweight AI algorithms on the edge, the platform itself must consume few resources.


02/27/2020 [Online] OpenEI project goes online, preparing to open-source!
07/09/2019 [Talk] We presented the OpenEI project at the 39th IEEE International Conference on Distributed Computing Systems (ICDCS 2019).


Xingzhou Zhang, Yifan Wang, Sidi Lu, Liangkai Liu, Lanyu Xu, Weisong Shi, OpenEI: An Open Framework for Edge Intelligence, in Proceedings of the 39th IEEE International Conference on Distributed Computing Systems (ICDCS), Vision/Blue Sky Track, July 7-10, 2019, Dallas, USA.


@inproceedings{Zhang2019OpenEI,
    author={X. {Zhang} and Y. {Wang} and S. {Lu} and L. {Liu} and L. {Xu} and W. {Shi}},
    booktitle={2019 IEEE 39th International Conference on Distributed Computing Systems (ICDCS)},
    title={OpenEI: An Open Framework for Edge Intelligence},
    year={2019},
}

Open Source

The OpenEI project is under open-source development (visit GitHub). If you want to join us, please don't hesitate to contact us.


Weisong Shi, Professor, Wayne State University. (weisong AT wayne dot edu)


Xingzhou Zhang, Ph.D. Candidate, Institute of Computing Technology, CAS. (zhangxingzhou AT ict dot ac dot cn)

Contact Us

Address: 5057 Woodward Ave, Detroit, MI 48202
Email: contact AT thecarlab dot org