A Flutter plugin to use Firebase ML Kit.
:star: Only your stars motivate me! :star:
The Flutter team now provides the firebase_ml_vision package for Firebase ML Kit. Please consider using firebase_ml_vision instead.
| Feature | Android | iOS |
|---------|---------|-----|
| Recognize text (on device) | ✅ | ✅ |
| Detect faces (on device) | ✅ | ✅ |
| Scan barcodes (on device) | ✅ | ✅ |
| Label images (on device) | ✅ | ✅ |
To use this plugin, add `mlkit` as a dependency in your `pubspec.yaml` file.
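For example (the version constraint below is illustrative — check pub.dev for the latest release):

```yaml
dependencies:
  mlkit: ^0.9.0
```

Then run `flutter packages get` to fetch the package.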
Check out the example directory for a sample app using Firebase ML Kit.
To integrate the plugin into the Android part of your app, follow these steps:
- Using the Firebase Console, add an Android app to your project: follow the assistant, download the generated `google-services.json` file and place it inside `android/app`. Next, modify the `android/build.gradle` file and the `android/app/build.gradle` file to add the Google services plugin as described by the Firebase assistant.
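Concretely, the Firebase assistant asks for edits along these lines (the plugin version shown is illustrative — use whatever version the assistant specifies):

```gradle
// android/build.gradle
buildscript {
    dependencies {
        // Google services Gradle plugin (version is illustrative)
        classpath 'com.google.gms:google-services:4.3.3'
    }
}

// android/app/build.gradle — add at the bottom of the file
apply plugin: 'com.google.gms.google-services'
```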
To integrate the plugin into the iOS part of your app, follow these steps:

- Using the Firebase Console, add an iOS app to your project: follow the assistant and download the generated `GoogleService-Info.plist` file. Next, open `ios/Runner.xcworkspace` with Xcode, and within Xcode place the file inside `ios/Runner`. Don't follow the steps named "Add Firebase SDK" and "Add initialization code" in the Firebase assistant.
- Remove the `use_frameworks!` line from `ios/Podfile` (workaround for flutter/flutter#9694).
From your Dart code, you need to import the plugin and instantiate it:
```dart
import 'package:mlkit/mlkit.dart';

FirebaseVisionTextDetector detector = FirebaseVisionTextDetector.instance;

// Detect from a file/image by path
var textsFromPath = await detector.detectFromPath(_file?.path);

// Detect from the binary data of a file/image
var textsFromBinary = await detector.detectFromBinary(_file?.readAsBytesSync());
```
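Putting the pieces together, a minimal sketch of recognizing and printing text from an image file might look like this (the `VisionText` type and its `text` field follow the plugin's example app — treat them as assumptions and check the example directory for the exact API):

```dart
import 'dart:io';
import 'package:mlkit/mlkit.dart';

Future<void> printRecognizedText(File file) async {
  FirebaseVisionTextDetector detector = FirebaseVisionTextDetector.instance;

  // Run on-device text recognition on the image at the given path
  List<VisionText> texts = await detector.detectFromPath(file.path);

  // Each detected element carries the recognized string (assumption
  // based on the plugin's example app)
  for (VisionText item in texts) {
    print(item.text);
  }
}
```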