Flutter Vision — Flutter + Firebase ML Vision + Firebase Cloud Firestore

It uploads the file, awaits the StorageUploadTask, and grabs the download URL:

Future<String> _uploadFile(String filename) async {
  final File file = File(imagePath);
  final StorageReference ref =
      FirebaseStorage.instance.ref().child('$filename.jpg');
  final StorageUploadTask uploadTask =
      ref.putFile(file, StorageMetadata(contentLanguage: 'en'));
  final downloadURL = await (await uploadTask.onComplete).ref.getDownloadURL();
  return downloadURL.toString();
}

Update our _addItem function to save the URL:

Future<void> _addItem(String downloadURL, List<String> labels) async {
  await Firestore.instance.collection('items').add(<String, dynamic>{
    'downloadURL': downloadURL,
    'labels': labels,
  });
}

We need a unique filename for each upload, so we will use the handy uuid package, following its example, and wire everything together in detectLabels:

Future<void> detectLabels() async {
  final FirebaseVisionImage visionImage =
      FirebaseVisionImage.fromFilePath(imagePath);
  final LabelDetector labelDetector = FirebaseVision.instance.labelDetector();
  final List<Label> labels = await labelDetector.detectInImage(visionImage);

  final List<String> labelTexts = <String>[];
  for (Label label in labels) {
    final String text = label.label;
    labelTexts.add(text);
  }

  final String uuid = Uuid().v1();
  final String downloadURL = await _uploadFile(uuid);

  _addItem(downloadURL, labelTexts);
}

Press F5 again to try it out. At this point our image and data are stored in Firebase.

Next we need to display the list of images with labels to the user. We are going to restructure our app into two screens: one responsible for displaying the list of images and labels, and another that takes a picture and kicks off the ML Vision process.

Let's create a new screen, ItemsListScreen, to display a list of cards. I'm going to adapt the examples here and here to fit our needs:

class ItemsListScreen extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(
        title: const Text('My Items'),
      ),
      body: ItemsList(firestore: Firestore.instance),
      floatingActionButton: FloatingActionButton(
        onPressed: () {
          Navigator.push(
            context,
            MaterialPageRoute(builder: (context) => CameraScreen()),
          );
        },
        child: const Icon(Icons.add),
      ),
    );
  }
}

…and an ItemsList widget (also available in a gist here):

class ItemsList extends StatelessWidget {
  ItemsList({this.firestore});

  final Firestore firestore;

  @override
  Widget build(BuildContext context) {
    return StreamBuilder<QuerySnapshot>(
      stream: firestore.collection('items').snapshots(),
      builder: (BuildContext context, AsyncSnapshot<QuerySnapshot> snapshot) {
        if (!snapshot.hasData) return const Text('Loading…');
        final int itemsCount = snapshot.data.documents.length;
        return ListView.builder(
          itemCount: itemsCount,
          itemBuilder: (_, int index) {
            final DocumentSnapshot document = snapshot.data.documents[index];
            return SafeArea(
              top: false,
              bottom: false,
              child: Container(
                padding: const EdgeInsets.all(8.0),
                height: 310.0,
                child: Card(
                  shape: RoundedRectangleBorder(
                    borderRadius: BorderRadius.only(
                      topLeft: Radius.circular(16.0),
                      topRight: Radius.circular(16.0),
                      bottomLeft: Radius.circular(16.0),
                      bottomRight: Radius.circular(16.0),
                    ),
                  ),
                  child: Column(
                    crossAxisAlignment: CrossAxisAlignment.start,
                    children: <Widget>[
                      // photo and title
                      SizedBox(
                        height: 184.0,
                        child: Stack(
                          children: <Widget>[
                            Positioned.fill(
                              child: Image.network(document['downloadURL']),
                            ),
                          ],
                        ),
                      ),
                      Expanded(
                        child: Padding(
                          padding: const EdgeInsets.fromLTRB(16.0, 16.0, 16.0, 0.0),
                          child: DefaultTextStyle(
                            softWrap: true,
                            style: Theme.of(context).textTheme.subhead,
                            child: Column(
                              crossAxisAlignment: CrossAxisAlignment.start,
                              children: <Widget>[
                                Text(document['labels'].join(', ')),
                              ],
                            ),
                          ),
                        ),
                      ),
                    ],
                  ),
                ),
              ),
            );
          },
        );
      },
    );
  }
}

Let's change the home screen to the ItemsListScreen:

class FlutterVisionApp extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      home: ItemsListScreen(),
    );
  }
}

We should also take the user back to the list after they take a picture, so they can see the results, by adding the following to the detectLabels function:

Navigator.push(
  context,
  MaterialPageRoute(builder: (context) => ItemsListScreen()),
);

I also decided to do some cleanup by renaming FlutterVisionHome to CameraScreen, since we now have more than one screen.

Run the app one last time to see the final results! Firebase ML Vision is crazy fast. Paired with Flutter, we were able to get a snappy app that runs on both iOS and Android in hours, not days. Incredible. If you were like me and hesitant to try Flutter because you didn't know Dart, give it a go today!

The source code is available here. Feel free to open an issue there if you run into trouble, or hit me up in the comments. Thanks for listening!
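One optional tweak that is not part of the original walkthrough: the query above is not ordered by when a photo was added, so if you want the newest items at the top of the list you could store a timestamp with each document and order the stream by it. A minimal sketch, assuming a hypothetical createdAt field; the field name and both function names below are illustrative, not part of the app above:

// Sketch only: a variant of _addItem that also stores a server timestamp,
// plus a query ordered by that field so the newest items come back first.
// 'createdAt', _addItemWithTimestamp, and orderedItems are all assumptions.
import 'package:cloud_firestore/cloud_firestore.dart';

Future<void> _addItemWithTimestamp(String downloadURL, List<String> labels) async {
  await Firestore.instance.collection('items').add(<String, dynamic>{
    'downloadURL': downloadURL,
    'labels': labels,
    'createdAt': FieldValue.serverTimestamp(),
  });
}

Stream<QuerySnapshot> orderedItems(Firestore firestore) {
  return firestore
      .collection('items')
      .orderBy('createdAt', descending: true)
      .snapshots();
}

If you go this route, ItemsList would use orderedItems(firestore) as its stream instead of firestore.collection('items').snapshots().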
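A closing aside that goes beyond what we built above: the Label objects returned by the detector also carry a confidence score, so you could drop uncertain labels before saving them. A minimal sketch, assuming the same firebase_ml_vision setup; the helper name and the 0.7 threshold are arbitrary choices, not values used earlier in this post:

// Sketch only: keep just the labels the detector is reasonably confident about.
// The 0.7 threshold is an arbitrary assumption; tune it to taste.
import 'package:firebase_ml_vision/firebase_ml_vision.dart';

List<String> confidentLabelTexts(List<Label> labels, {double threshold = 0.7}) {
  return labels
      .where((Label label) => label.confidence >= threshold)
      .map((Label label) => label.label)
      .toList();
}

Inside detectLabels, the for loop could then be replaced with a single call to confidentLabelTexts(labels).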
