TensorFlow Lite
Low-latency inference of on-device ML models
Android · Developer Tools · Artificial Intelligence
Featured on November 15th, 2017
TensorFlow Lite is TensorFlow’s lightweight solution for mobile and embedded devices. TensorFlow has always run on many platforms, but as adoption of ML models has grown exponentially over the last few years, so has the need to deploy them on mobile and embedded hardware. TensorFlow Lite enables low-latency inference of on-device machine learning models.