Microsoft open-sources the inference engine in its Windows ML platform

by anirudhgarg on 12/4/2018, 6:58 PM with 1 comment

by whitten on 12/4/2018, 11:26 PM

Apparently, the Open Neural Network Exchange (ONNX) runtime is an API that lets you run models locally instead of on a remote machine.
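For what that looks like in practice, here's a minimal sketch of local inference with the onnxruntime Python package; the model file name, input shape, and tensor contents are placeholders for illustration, not anything from the announcement:

```python
# Minimal sketch: run an ONNX model locally with ONNX Runtime.
# "model.onnx" and the input shape below are placeholder assumptions.
import numpy as np
import onnxruntime as ort

# Load a trained model that was exported to the ONNX format.
session = ort.InferenceSession("model.onnx")

# Inspect the input the model expects (name, shape, element type).
input_meta = session.get_inputs()[0]
print(input_meta.name, input_meta.shape, input_meta.type)

# Run inference on the local machine; no remote service is involved.
x = np.random.rand(1, 3, 224, 224).astype(np.float32)  # e.g. one image-sized tensor
outputs = session.run(None, {input_meta.name: x})
print(outputs[0].shape)
```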

I didn't see any details about the inference engine itself, so I assume this is a neural-net AI application programming interface rather than a symbolic AI inference engine.