An In-Depth Dive Into Streaming Data Across Platform Channels on Flutter

In this tutorial, you’ll learn how to use Platform Channels to stream data into your Flutter app. By Wilberforce Uwadiegwu.


Tearing Down the EventChannel

To clean up the EventChannel, you need to nullify eventSink and stop observing network state changes.

Inside onCancel, paste this code above the return statement:

reachability.stopNotifier()
NotificationCenter.default.removeObserver(self, name: .reachabilityChanged, object: reachability)
eventSink = nil

Run the project on iOS. You’ll see something like this:

Show network connectivity changes on iOS

Congratulations! You completed the first part of this tutorial!

Next up, you’ll learn how to stream an image from native code to Dart.

Streaming Arbitrary Bytes

The solution above works when you want an infinite stream of events. But what if you just want to stream a sequence of bytes and terminate? For example, maybe you want to stream an image, video or file.

In this section, you’ll solve this problem by streaming an image from native Android and iOS to Flutter. Additionally, a progress indicator will visualize the stream’s progress.

You’ll start by listening for images in the form of raw bytes.

Receiving, Concatenating and Consuming Bytes on Flutter

Since you’re streaming an arbitrary list of bytes, you need some way to know when you’ve reached the end of that list. To do that, you’ll use a special delimiter value.

In constants.dart, add the following constant to Constants:

/// Event that denotes an end of the stream
static const eof = -0xFFFFFF;

You’ll send this value as the final event to signal the end of your image. Note that, like the other constants in Constants, it can be any value as long as it’s consistent across Flutter, Android and iOS.

Next, inside main.dart, find build() of _MyHomePage. Replace the Expanded widget with const Expanded(child: ImageStreamWidget()). Then import 'image_stream_widget.dart'.

body should now look like this:

body: Column(
  children: [
    const NetworkStreamWidget(),
    const Expanded(child: ImageStreamWidget()),
  ],
),

To receive and process the image, you’ll break the stream into three kinds of events:

  1. File size: The first event.
  2. The actual image bytes: Received in chunks. Successive bytes are concatenated.
  3. End of stream: The last event.

In _ImageStreamWidgetState, below startImageStream(), add the following function.

void onReceiveImageByte(dynamic event) {
  // Check if this is the first event. The first event is the file size
  if (imageSize == null && event is int && event != Constants.eof) {
    setState(() => imageSize = event);
    return;
  }

  // Check if this is the end-of-file event.
  // End-of-file event denotes the end of the stream
  if (event == Constants.eof) {
    imageSubscription?.cancel();
    setState(() => streamComplete = true);
    return;
  }

  // Receive and concatenate the image bytes
  final byteArray = (event as List<dynamic>).cast<int>();
  setState(() => imageBytes.addAll(byteArray));
}

This function processes all three kinds of events described above.

Make sure to import 'constants.dart' as well.
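
The function above reads and writes a few pieces of widget state. For reference, here’s a minimal sketch of how _ImageStreamWidgetState might declare that state; the starter project already defines fields like these, so treat the exact names and types as assumptions rather than required code:

// Sketch of the state onReceiveImageByte() relies on; the starter
// project's declarations may differ. Needs import 'dart:async' for
// StreamSubscription.
StreamSubscription<dynamic>? imageSubscription; // Active EventChannel subscription
int? imageSize; // First event: total image size in bytes
final imageBytes = <int>[]; // Image chunks, concatenated as they arrive
bool streamComplete = false; // Set to true when the eof event arrives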

Next, you need a channel to handle the events. Declare the image stream EventChannel in _ImageStreamWidgetState anywhere above build() like this:

final eventChannel = const EventChannel('platform_channel_events/image');

Import 'package:flutter/services.dart' into the same file.

Note that this channel’s name is different than what you used for streaming connectivity because this is a new channel that serves a different purpose.

Now, you need to receive the events from the channel. Paste this statement inside startImageStream():

imageSubscription = eventChannel
    .receiveBroadcastStream({'quality': 0.9, 'chunkSize': 100})
    .listen(onReceiveImageByte);

In the statement above, you:

  1. Listen to the event stream and call onReceiveImageByte() whenever new data arrives.
  2. Pass quality and chunkSize to the native ends. quality is the compression quality of the image you want to receive, and chunkSize is the number of chunks to split the image bytes into. The sketch after this list shows how the UI can use these events to drive a progress indicator.
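
Because the first event carries the total byte count and imageBytes grows with every chunk, the UI can derive a progress value from the two. Here’s a minimal, hypothetical snippet showing the idea; the widget layout in the starter project may differ, so treat this as a sketch rather than the project’s actual build code:

// Hypothetical progress UI built from the fields sketched earlier.
// Needs import 'dart:typed_data' for Uint8List.
Widget buildImageOrProgress() {
  // Before the first event arrives, there is nothing to measure yet.
  if (imageSize == null) return const SizedBox.shrink();

  // After the eof event, render the image from the concatenated bytes.
  if (streamComplete) return Image.memory(Uint8List.fromList(imageBytes));

  // Otherwise, show how much of the image has been received so far.
  return LinearProgressIndicator(value: imageBytes.length / imageSize!);
}

Because onReceiveImageByte() calls setState() for every event, a widget like this rebuilds as each chunk arrives and the indicator advances.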

Run the project. You’ll see something like this:

Screenshot after adding ImageStreamWidget to _MyHomePage

Tap Stream Image. At this point, you won’t see any UI changes, and you’ll see a MissingPluginException error in the log. Don’t worry, that’s expected: you haven’t implemented the native side of this channel yet.

Next up, you’ll send the image from Android.

Streaming Bytes on Android

In Constants.kt, inside Constants add:

const val eof = -0xFFFFFF

Go back to MainActivity.kt. Below networkEventChannel, declare the image event channel name like this:

private val imageEventChannel = "platform_channel_events/image"

Next, in configureFlutterEngine(), right below the call to super, add this statement:

EventChannel(flutterEngine.dartExecutor.binaryMessenger, imageEventChannel)
    .setStreamHandler(ImageStreamHandler(this))

Now, open ImageStreamHandler.kt and declare the event sink above onListen() like so:

private var eventSink: EventChannel.EventSink? = null

This should all look pretty similar, since it’s exactly the same setup you used for connectivity changes!

Next, paste this function in ImageStreamHandler.kt below onCancel():

private fun dispatchImageEvents(quality: Double, chunkSize: Int) {
    GlobalScope.launch(Dispatchers.IO) {
        if (activity == null) return@launch

        // Decode the drawable
        val bitmap = BitmapFactory.decodeResource(activity!!.resources, R.drawable.cute_cat_unsplash)

        // Compress the drawable using the quality passed from Flutter
        val stream = ByteArrayOutputStream()
        bitmap.compress(Bitmap.CompressFormat.JPEG, (quality * 100).toInt(), stream)

        // Convert the compressed image stream to byte array
        val byteArray = stream.toByteArray()

        // Dispatch the first event (which is the size of the array/image)
        withContext(Dispatchers.Main) {
            eventSink?.success(byteArray.size)
        }

        // chunkSize is the number of chunks, so compute each chunk's size and split
        val partSize = byteArray.size / chunkSize
        val chunks = byteArray.toList().chunked(partSize)

        // Loop through the chunks and dispatch each chunk to Flutter
        chunks.forEach {
            // Mimic buffering with a 50 ms delay
            delay(50)
            withContext(Dispatchers.Main) {
                eventSink?.success(it)
            }
        }
        withContext(Dispatchers.Main) {
            // Finally, dispatch an event to indicate the end of the image stream
            eventSink?.success(Constants.eof)
        }
    }
}

And add the following import statements:

import kotlinx.coroutines.*
import java.io.ByteArrayOutputStream
import android.graphics.Bitmap
import android.graphics.BitmapFactory

That’s a lot of code! You don’t need to study every line. The gist: on a background coroutine, it decodes and compresses an image bundled with the project into a byte array, sends the array’s size as the first event, dispatches the bytes in chunks with a short delay between each, and finally sends the end-of-stream event. Every event is delivered back on the main thread via withContext(Dispatchers.Main), because platform channel messages must be sent from the main thread.

Next, you’ll assign eventSink and get the parameters you passed from Flutter. Then, you’ll call dispatchImageEvents() with these parameters.

Write these statements inside onListen():

eventSink = events
val quality = (arguments as Map<*, *>)["quality"] as Double
val chunkSize = arguments["chunkSize"] as Int
dispatchImageEvents(quality, chunkSize)

Then, clean up by nullifying eventSink and activity in onCancel():

eventSink = null
activity = null
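
On the Dart side, onCancel() runs when the stream subscription is cancelled, like the imageSubscription?.cancel() call in onReceiveImageByte(). If you also want the stream torn down when the widget is removed mid-stream, a dispose() override along these lines would do it. This is a sketch of a common pattern, not code from the starter project:

// Hypothetical: cancelling the subscription here also triggers
// onCancel() in the Android and iOS stream handlers.
@override
void dispose() {
  imageSubscription?.cancel();
  super.dispose();
}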

Run the app on Android and tap Stream Image. You’ll see something like this:

Loading a cat image

This beautiful cat photo is by Cédric VT on Unsplash

Now, you’ll replicate the same steps on iOS, this time in Swift.