tflite 1.0.1

tflite #

A Flutter plugin for accessing the TensorFlow Lite API. Supports image classification and object detection on both iOS and Android.

Breaking changes since 1.0.0: #

  1. Updated to TensorFlow Lite API v1.12.0.
  2. No longer accepts the parameters inputSize and numChannels; they are now retrieved from the input tensor.
  3. numThreads has moved to Tflite.loadModel.
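
For example, code that passed numThreads to the individual run methods under 1.0.0 now passes it once to loadModel (a minimal sketch; the asset paths are placeholders):

// Since 1.0.1, numThreads is a loadModel parameter, not a per-call one.
String res = await Tflite.loadModel(
  model: "assets/mobilenet_v1_1.0_224.tflite",
  labels: "assets/labels.txt",
  numThreads: 2, // previously passed to runModelOnImage and friends
);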

Installation #

Add tflite as a dependency in your pubspec.yaml file.

Android #

In android/app/build.gradle, add the following setting to the android block.

    aaptOptions {
        noCompress 'tflite'
    }

iOS #

If you get an error like "'vector' file not found", open ios/Runner.xcworkspace in Xcode, click Runner > Targets > Runner > Build Settings, search for Compile Sources As, and change the value to Objective-C++.

Usage #

  1. Create an assets folder and place your label file and model file in it. In pubspec.yaml add:
  assets:
   - assets/labels.txt
   - assets/mobilenet_v1_1.0_224.tflite
  2. Import the library:
import 'package:tflite/tflite.dart';
  3. Load the model and labels:
String res = await Tflite.loadModel(
  model: "assets/mobilenet_v1_1.0_224.tflite",
  labels: "assets/labels.txt",
  numThreads: 1 // defaults to 1
);
  4. See Image Classification and Object Detection below.

  5. Release resources:

await Tflite.close();
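
Putting these steps together, a minimal sketch of the whole flow (the asset names match the snippet above; classifyOnce and imagePath are illustrative, and the path would come from the camera, a picker, or elsewhere):

import 'package:tflite/tflite.dart';

Future<void> classifyOnce(String imagePath) async {
  // Load the model and labels once, e.g. in initState.
  await Tflite.loadModel(
    model: "assets/mobilenet_v1_1.0_224.tflite",
    labels: "assets/labels.txt",
    numThreads: 1,
  );

  // Classify an image file on disk (see Image Classification below).
  var recognitions = await Tflite.runModelOnImage(
    path: imagePath,
    numResults: 3,
    threshold: 0.1,
  );

  // Each entry carries index, label and confidence.
  for (var res in recognitions) {
    print("${res["label"]} (${res["confidence"]})");
  }

  // Release native resources when the model is no longer needed.
  await Tflite.close();
}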

Image Classification #

  • Output format:
{
  index: 0,
  label: "person",
  confidence: 0.629
}
  • Run on image:
var recognitions = await Tflite.runModelOnImage(
  path: filepath,   // required
  imageMean: 0.0,   // defaults to 117.0
  imageStd: 255.0,  // defaults to 1.0
  numResults: 2,    // defaults to 5
  threshold: 0.2    // defaults to 0.1
);
  • Run on binary:
var recognitions = await Tflite.runModelOnBinary(
  binary: imageToByteListFloat32(image, 224, 127.5, 127.5),// required
  numResults: 6,    // defaults to 5
  threshold: 0.05,  // defaults to 0.1
);

// Helpers for converting an image (from the image package) into the byte
// list expected by runModelOnBinary. Requires:
//   import 'dart:typed_data';
//   import 'package:image/image.dart' as img;
Uint8List imageToByteListFloat32(
    img.Image image, int inputSize, double mean, double std) {
  var convertedBytes = Float32List(1 * inputSize * inputSize * 3);
  var buffer = Float32List.view(convertedBytes.buffer);
  int pixelIndex = 0;
  for (var i = 0; i < inputSize; i++) {
    for (var j = 0; j < inputSize; j++) {
      var pixel = image.getPixel(j, i);
      buffer[pixelIndex++] = (img.getRed(pixel) - mean) / std;
      buffer[pixelIndex++] = (img.getGreen(pixel) - mean) / std;
      buffer[pixelIndex++] = (img.getBlue(pixel) - mean) / std;
    }
  }
  return convertedBytes.buffer.asUint8List();
}

Uint8List imageToByteListUint8(img.Image image, int inputSize) {
  var convertedBytes = Uint8List(1 * inputSize * inputSize * 3);
  var buffer = Uint8List.view(convertedBytes.buffer);
  int pixelIndex = 0;
  for (var i = 0; i < inputSize; i++) {
    for (var j = 0; j < inputSize; j++) {
      var pixel = image.getPixel(j, i);
      buffer[pixelIndex++] = img.getRed(pixel);
      buffer[pixelIndex++] = img.getGreen(pixel);
      buffer[pixelIndex++] = img.getBlue(pixel);
    }
  }
  return convertedBytes.buffer.asUint8List();
}
  • Run on image stream (video frame):

Works with camera plugin 4.0.0. Video format: (iOS) kCVPixelFormatType_32BGRA, (Android) YUV_420_888.

var recognitions = await Tflite.runModelOnFrame(
  bytesList: img.planes.map((plane) {return plane.bytes;}).toList(),// required
  imageHeight: img.height,
  imageWidth: img.width,
  imageMean: 127.5,   // defaults to 127.5
  imageStd: 127.5,    // defaults to 127.5
  rotation: 90,       // defaults to 90, Android only
  numResults: 2,      // defaults to 5
  threshold: 0.1,     // defaults to 0.1
);
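
As a usage sketch for the binary variant and the helpers above, assuming the image package for decoding and resizing (copyResize is called with named width and height parameters here, as in recent versions of that package; older versions take them positionally) and reusing the imageToByteListFloat32 helper defined above:

import 'dart:io';
import 'package:image/image.dart' as img;
import 'package:tflite/tflite.dart';

Future<dynamic> classifyFile(String path) async {
  // Decode and resize to MobileNet's 224x224 input.
  img.Image oriImage = img.decodeImage(File(path).readAsBytesSync());
  img.Image resizedImage = img.copyResize(oriImage, width: 224, height: 224);

  // Convert to a float32 byte list with 127.5 mean/std normalization,
  // matching the runModelOnBinary snippet above.
  return await Tflite.runModelOnBinary(
    binary: imageToByteListFloat32(resizedImage, 224, 127.5, 127.5),
    numResults: 5,
    threshold: 0.1,
  );
}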

Object Detection #

  • Output format:

x, y, w, h are normalized to [0, 1]. Scale x and w by the width of the image, and y and h by its height.

{
  detectedClass: "hot dog",
  confidenceInClass: 0.123,
  rect: {
    x: 0.15,
    y: 0.33,
    w: 0.80,
    h: 0.27
  }
}
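
Because the values are normalized, scale them by the size at which the image is shown. A sketch that turns recognitions into Positioned boxes for a Flutter Stack (renderBoxes, screenW and screenH are illustrative names, not part of the plugin API):

import 'package:flutter/material.dart';

List<Widget> renderBoxes(List recognitions, double screenW, double screenH) {
  return recognitions.map<Widget>((re) {
    // Scale the normalized rect to the displayed image size.
    final double x = re["rect"]["x"] * screenW;
    final double y = re["rect"]["y"] * screenH;
    final double w = re["rect"]["w"] * screenW;
    final double h = re["rect"]["h"] * screenH;
    return Positioned(
      left: x,
      top: y,
      width: w,
      height: h,
      child: Container(
        decoration: BoxDecoration(
          border: Border.all(color: Colors.red, width: 2),
        ),
        child: Text(
          "${re["detectedClass"]} "
          "${(re["confidenceInClass"] * 100).toStringAsFixed(0)}%",
          style: TextStyle(color: Colors.red, fontSize: 12),
        ),
      ),
    );
  }).toList();
}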

SSD MobileNet: #

  • Run on image:
var recognitions = await Tflite.detectObjectOnImage(
  path: filepath,       // required
  model: "SSDMobileNet",
  imageMean: 127.5,     
  imageStd: 127.5,      
  threshold: 0.4,       // defaults to 0.1
  numResultsPerClass: 2,// defaults to 5
);
  • Run on binary:
var recognitions = await Tflite.detectObjectOnBinary(
  binary: imageToByteListUint8(resizedImage, 300), // required
  model: "SSDMobileNet",  
  threshold: 0.4,                                  // defaults to 0.1
  numResultsPerClass: 2,                           // defaults to 5
);
  • Run on image stream (video frame):

Works with camera plugin 4.0.0. Video format: (iOS) kCVPixelFormatType_32BGRA, (Android) YUV_420_888.

var recognitions = await Tflite.detectObjectOnFrame(
  bytesList: img.planes.map((plane) {return plane.bytes;}).toList(),// required
  model: "SSDMobileNet",  
  imageHeight: img.height,
  imageWidth: img.width,
  imageMean: 127.5,   // defaults to 127.5
  imageStd: 127.5,    // defaults to 127.5
  rotation: 90,       // defaults to 90, Android only
  numResults: 2,      // defaults to 5
  threshold: 0.1,     // defaults to 0.1
);
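
For the frame variant, a sketch of wiring the camera plugin's startImageStream into detectObjectOnFrame; the isDetecting flag, which drops frames while a detection is still running, is an illustrative pattern rather than part of either API, and controller setup is omitted:

import 'package:camera/camera.dart';
import 'package:tflite/tflite.dart';

bool isDetecting = false;

void startDetection(CameraController controller) {
  controller.startImageStream((CameraImage image) async {
    if (isDetecting) return; // drop frames while a detection is in flight
    isDetecting = true;
    try {
      var recognitions = await Tflite.detectObjectOnFrame(
        bytesList: image.planes.map((plane) => plane.bytes).toList(),
        model: "SSDMobileNet",
        imageHeight: image.height,
        imageWidth: image.width,
        imageMean: 127.5,
        imageStd: 127.5,
        threshold: 0.4,
      );
      print(recognitions);
    } finally {
      isDetecting = false;
    }
  });
}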

Tiny YOLOv2: #

  • Run on image:
var recognitions = await Tflite.detectObjectOnImage(
  path: filepath,       // required
  model: "YOLO",      
  imageMean: 0.0,       
  imageStd: 255.0,      
  threshold: 0.3,       // defaults to 0.1
  numResultsPerClass: 2,// defaults to 5
  anchors: anchors,// defaults to [0.57273,0.677385,1.87446,2.06253,3.33843,5.47434,7.88282,3.52778,9.77052,9.16828]
  blockSize: 32,        // defaults to 32
  numBoxesPerBlock: 5   // defaults to 5
);
  • Run on binary:
var recognitions = await Tflite.detectObjectOnBinary(
  binary: imageToByteListFloat32(resizedImage, 416, 0.0, 255.0), // required
  model: "YOLO",  
  threshold: 0.3,       // defaults to 0.1
  numResultsPerClass: 2,// defaults to 5
  anchors: anchors,     // defaults to [0.57273,0.677385,1.87446,2.06253,3.33843,5.47434,7.88282,3.52778,9.77052,9.16828]
  blockSize: 32,        // defaults to 32
  numBoxesPerBlock: 5   // defaults to 5
);
  • Run on image stream (video frame):

Works with camera plugin 4.0.0. Video format: (iOS) kCVPixelFormatType_32BGRA, (Android) YUV_420_888.

var recognitions = await Tflite.detectObjectOnFrame(
  bytesList: img.planes.map((plane) {return plane.bytes;}).toList(),// required
  model: "YOLO",  
  imageHeight: img.height,
  imageWidth: img.width,
  imageMean: 0,         // defaults to 127.5
  imageStd: 255.0,      // defaults to 127.5
  numResults: 2,        // defaults to 5
  threshold: 0.1,       // defaults to 0.1
  numResultsPerClass: 2,// defaults to 5
  anchors: anchors,     // defaults to [0.57273,0.677385,1.87446,2.06253,3.33843,5.47434,7.88282,3.52778,9.77052,9.16828]
  blockSize: 32,        // defaults to 32
  numBoxesPerBlock: 5   // defaults to 5
);
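
The same decode, resize, and convert flow works for the binary variant above, with YOLO's 416x416 input and 0/255 normalization (again assuming the image package and the imageToByteListFloat32 helper from the Image Classification section):

import 'dart:io';
import 'package:image/image.dart' as img;
import 'package:tflite/tflite.dart';

Future<dynamic> detectWithYolo(String path) async {
  // Tiny YOLOv2 expects a 416x416 input.
  img.Image oriImage = img.decodeImage(File(path).readAsBytesSync());
  img.Image resizedImage = img.copyResize(oriImage, width: 416, height: 416);

  return await Tflite.detectObjectOnBinary(
    binary: imageToByteListFloat32(resizedImage, 416, 0.0, 255.0),
    model: "YOLO",
    threshold: 0.3,
    numResultsPerClass: 2,
  );
}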

1.0.1 #

  • Add detectObjectOnBinary
  • Add runModelOnFrame
  • Add detectObjectOnFrame

1.0.0 #

  • Support object detection with SSD MobileNet and Tiny YOLOv2.
  • Updated to TensorFlow Lite API v1.12.0.
  • No longer accepts the parameters inputSize and numChannels; they are now retrieved from the input tensor.
  • numThreads has moved to Tflite.loadModel.

0.0.5 #

  • Support byte list: runModelOnBinary

0.0.4 #

  • Support Swift based project

0.0.3 #

  • Pass error messages through the channel on Android.
  • Use a non-hard-coded label size on iOS.

0.0.2 #

  • Fixed link.

0.0.1 #

  • Initial release.

example/README.md

tflite_example #

Uses the tflite plugin to run a model on images. The image is captured by the camera or selected from the gallery (with the help of the image_picker plugin).

Prerequisites #

Create an assets folder. Download the following files from https://github.com/shaqian/flutter_tflite/tree/master/example/assets and place them in the assets folder.

  • mobilenet_v1_1.0_224.tflite
  • mobilenet_v1_1.0_224.txt
  • ssd_mobilenet.tflite
  • ssd_mobilenet.txt
  • yolov2_tiny.tflite
  • yolov2_tiny.txt
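
The example app can then switch between the three models by loading the matching asset pair; a sketch of such a helper (loadModelByName and the case labels are illustrative, the asset file names are the ones listed above):

import 'package:tflite/tflite.dart';

Future<String> loadModelByName(String name) async {
  switch (name) {
    case "SSD MobileNet":
      return await Tflite.loadModel(
        model: "assets/ssd_mobilenet.tflite",
        labels: "assets/ssd_mobilenet.txt",
      );
    case "Tiny YOLOv2":
      return await Tflite.loadModel(
        model: "assets/yolov2_tiny.tflite",
        labels: "assets/yolov2_tiny.txt",
      );
    default: // MobileNet classification
      return await Tflite.loadModel(
        model: "assets/mobilenet_v1_1.0_224.tflite",
        labels: "assets/mobilenet_v1_1.0_224.txt",
      );
  }
}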

Install #

flutter packages get

Run #

flutter run

Caveat #


Use this package as a library

1. Depend on it

Add this to your package's pubspec.yaml file:


dependencies:
  tflite: ^1.0.1

2. Install it

You can install packages from the command line:

with Flutter:


$ flutter packages get

Alternatively, your editor might support flutter packages get. Check the docs for your editor to learn more.

3. Import it

Now in your Dart code, you can use:


import 'package:tflite/tflite.dart';
  
Version  Uploaded
1.0.1    Feb 17, 2019
1.0.0    Feb 7, 2019
0.0.5    Jan 7, 2019
0.0.4    Nov 19, 2018
0.0.3    Nov 14, 2018
0.0.2    Sep 23, 2018
0.0.1    Sep 23, 2018
Popularity: 78 (describes how popular the package is relative to other packages)
Health: 99 (code health derived from static analysis)
Maintenance: 100 (reflects how tidy and up-to-date the package is)
Overall: 89 (weighted score of the above)

We analyzed this package on Feb 17, 2019, and provided a score, details, and suggestions below. Analysis was completed using:

  • Dart: 2.1.0
  • pana: 0.12.13
  • Flutter: 1.2.0

Platforms

Detected platforms: Flutter

References Flutter, and has no conflicting libraries.

Health issues and suggestions

Document public APIs. (-1 points)

12 out of 12 API elements have no dartdoc comment. Providing good documentation for libraries, classes, functions, and other API elements improves code readability and helps developers find and use your API.

Dependencies

Package       Constraint               Resolved  Available
Direct dependencies
Dart SDK      >=2.0.0-dev.68.0 <3.0.0
flutter                                0.0.0
Transitive dependencies
collection                             1.14.11
meta                                   1.1.6     1.1.7
sky_engine                             0.0.99
typed_data                             1.1.6
vector_math                            2.0.8