How to implement gestures across several widgets in Flutter

Flutter makes creating custom UI experiences easy. It really does. I mean it.

One example is complex gestures. How about starting a drag gesture to trigger something, and then moving the finger to control something else, and finally dropping it to confirm?

In this code tutorial, we will set up a screen showing a picture and an edit button. The user can drag the edit button onto the image.

When the edit button is on the image, finger still on screen, a red overlay is added. The finger position controls the opacity of the overlay. We move the finger up for more, down for less. When the user is happy, they can drop the edit icon (ie lift their finger).

If the user wants to cancel, they can drag the edit icon out of the image area and drop it there.

In the code tutorial, a snackbar is then shown to confirm whether the edit was completed or cancelled. We do not actually edit the image; that's beyond the scope of this tutorial.

The full source code is available on GitHub.

General approach

For the drag gesture, we use Draggable and DragTarget.

To get the general position of the pointer, we can use a Listener widget.

We need to bear in mind that a Listener can only receive events for a pointer if it exists at the time the “pointer down” event is fired (refer to the Gestures documentation).

We will use a stream to track the edit state of the image.

Note: I have called it “BLoC” as it makes a great introduction to the topic, but we could call it “Presenter” or “Controller”. It is basically a class that the views use to change the edit state and to redraw themselves.

Setting up the app

To follow the code tutorial, create a new app as follows.

If you’re unsure how to set up a Flutter app, check out the official Getting started with Flutter tutorial.

Firstly, we add rxdart to pubspec.yaml and run flutter packages get.

Secondly, we create a Material app in main.dart. It launches MyHomePage, which displays the PhotoView and EditControlsView widgets.
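Here is a minimal sketch of main.dart. The widget and file names match the tutorial, but the exact layout (a Column with the photo above the edit controls, under an AppBar) is my assumption:

```dart
// main.dart -- a minimal sketch of the app entry point.
import 'package:flutter/material.dart';

import 'edit_controls_view.dart';
import 'photo_view.dart';

void main() => runApp(MyApp());

class MyApp extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      title: 'Gestures across widgets',
      home: MyHomePage(),
    );
  }
}

class MyHomePage extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(title: Text('Gestures across widgets')),
      body: Column(
        children: [
          // The image takes the remaining space; the edit controls sit at the bottom.
          Expanded(child: PhotoView()),
          EditControlsView(),
        ],
      ),
    );
  }
}
```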

Thirdly, we create PhotoView and EditControlsView, in photo_view.dart and edit_controls_view.dart respectively.
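As a sketch, assuming a FlutterLogo stands in for the picture and an Icon(Icons.edit) for the edit button:

```dart
// photo_view.dart -- the image we will edit (a FlutterLogo as a placeholder).
import 'package:flutter/material.dart';

class PhotoView extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return Center(
      child: FlutterLogo(size: 200.0),
    );
  }
}
```

```dart
// edit_controls_view.dart -- a black edit icon on a grey background.
import 'package:flutter/material.dart';

class EditControlsView extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return Container(
      color: Colors.grey.shade300,
      padding: EdgeInsets.all(16.0),
      width: double.infinity,
      child: Icon(Icons.edit, size: 48.0),
    );
  }
}
```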

At this point, we have an image of the Flutter logo in the middle of the screen, and a black edit icon with a grey background at the bottom. Let’s add some interactivity!

Dragging the edit icon onto the image

In this section, we are going to add drag and drop of the edit icon onto the image.

When the icon is dragged, we will show it where the finger is. This is what Flutter calls “feedback”.

And instead of the black icon on a grey background at the bottom of the screen, we will show a grey icon on a white background. This is what Flutter calls “childWhenDragging”.

Let’s amend EditControlsView to make it draggable.
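A sketch of the draggable version; the String payload ('edit') and the sizes are my own choices, not from the original:

```dart
// edit_controls_view.dart -- the edit icon is now draggable.
import 'package:flutter/material.dart';

class EditControlsView extends StatelessWidget {
  // Helper to build the icon in its resting and "while dragging" looks.
  Widget _icon(Color background, Color iconColor) {
    return Container(
      color: background,
      padding: EdgeInsets.all(16.0),
      width: double.infinity,
      child: Icon(Icons.edit, size: 48.0, color: iconColor),
    );
  }

  @override
  Widget build(BuildContext context) {
    return Draggable<String>(
      // Arbitrary payload so the DragTarget can recognise this drag.
      data: 'edit',
      // "feedback": shown under the finger while dragging.
      feedback: Icon(Icons.edit, size: 48.0, color: Colors.black),
      // "childWhenDragging": shown in place of the original icon while
      // dragging -- a grey icon on a white background.
      childWhenDragging: _icon(Colors.white, Colors.grey),
      // The resting state: a black icon on a grey background.
      child: _icon(Colors.grey.shade300, Colors.black),
    );
  }
}
```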

And let’s accept the drag in PhotoView.
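And a sketch of the DragTarget side, with the print statements described in the next paragraph:

```dart
// photo_view.dart -- the image is now a DragTarget for the edit icon.
import 'package:flutter/material.dart';

class PhotoView extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return Center(
      child: DragTarget<String>(
        // Called when a draggable enters the image area.
        onWillAccept: (data) {
          print('edit icon entered the image area');
          return data == 'edit';
        },
        // Called when the draggable is dropped on the image.
        onAccept: (data) {
          print('edit icon dropped on the image');
        },
        builder: (context, candidateData, rejectedData) {
          return FlutterLogo(size: 200.0);
        },
      ),
    );
  }
}
```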

At this point, the edit icon is draggable but not much else happens, except for a print statement when it enters the image zone and another when it is dropped on the image. Let’s add the edit view!

Showing the edit view

Now that we can drag the edit icon onto the image, we can trigger the edit mode. So we need to think about tracking the edit state of the image.

We’ll do this using a singleton, accessible from all views. We create a new file image_edit_state_bloc.dart.

As a reminder, when the edit button is on the image, finger still on screen, a red overlay is added. The finger position controls the opacity of the overlay. We move the finger up for more, down for less.

We set up a model to encapsulate the edit data. There are two variables to track: the edit state (ie in progress, cancelled, or completed) and the edit value (ie the vertical position of the finger on the screen).

We add EditStateData and EditState to the same file.
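A sketch of the model; the enum values and the double value field reflect my reading of the tutorial:

```dart
// image_edit_state_bloc.dart -- the edit state model.

/// Whether an edit is in progress, cancelled or completed.
enum EditState { inProgress, cancelled, completed }

/// Immutable snapshot of an edit: its state and its value (derived from the
/// vertical position of the finger on the screen).
class EditStateData {
  final EditState state;
  final double value;

  EditStateData(this.state, this.value);
}
```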

We can now set up a stream of EditStateData, and methods for state changes, in ImageEditStateBloc.

We use a BehaviorSubject stream (part of the rxdart package): it’s a broadcast stream that also emits its most recent item to new listeners. This particular functionality is used in finishEdit().
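A sketch of the BLoC, assuming a classic singleton; startEdit(), cancelEdit() and finishEdit() are named in the tutorial, while updateEdit() is a helper I added for the pointer updates:

```dart
// image_edit_state_bloc.dart (same file, continued) -- the singleton BLoC.
import 'package:rxdart/rxdart.dart';

class ImageEditStateBloc {
  // A single instance shared by all views.
  static final ImageEditStateBloc _instance = ImageEditStateBloc._internal();
  factory ImageEditStateBloc() => _instance;
  ImageEditStateBloc._internal();

  // BehaviorSubject replays its latest item to new listeners, and exposes it
  // synchronously through `value`.
  final BehaviorSubject<EditStateData> _subject =
      BehaviorSubject<EditStateData>();

  Stream<EditStateData> get stream => _subject.stream;

  // The initial value is arbitrary; it is replaced as soon as the finger moves.
  void startEdit() => _subject.add(EditStateData(EditState.inProgress, 0.0));

  void updateEdit(double value) =>
      _subject.add(EditStateData(EditState.inProgress, value));

  void cancelEdit() => _subject.add(EditStateData(EditState.cancelled, 0.0));

  // Keep the latest edit value when completing, courtesy of the BehaviorSubject.
  void finishEdit() =>
      _subject.add(EditStateData(EditState.completed, _subject.value.value));
}
```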

We can now call startEdit() and finishEdit() from the PhotoView. Additionally, we listen to the stream and add an EditView when edit is in progress.
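A sketch of the updated PhotoView; layering the overlay with a Stack and a StreamBuilder is my assumption:

```dart
// photo_view.dart -- start and finish the edit from the DragTarget, and show
// the EditView on top of the image while the edit is in progress.
import 'package:flutter/material.dart';

import 'edit_view.dart';
import 'image_edit_state_bloc.dart';

class PhotoView extends StatelessWidget {
  final ImageEditStateBloc _bloc = ImageEditStateBloc();

  @override
  Widget build(BuildContext context) {
    return Center(
      child: DragTarget<String>(
        onWillAccept: (data) {
          // Entering the image area starts the edit.
          _bloc.startEdit();
          return data == 'edit';
        },
        onAccept: (data) {
          // Dropping the icon on the image completes the edit.
          _bloc.finishEdit();
        },
        builder: (context, candidateData, rejectedData) {
          return StreamBuilder<EditStateData>(
            stream: _bloc.stream,
            builder: (context, snapshot) {
              final editing = snapshot.data?.state == EditState.inProgress;
              return Stack(
                children: [
                  FlutterLogo(size: 200.0),
                  // The red overlay, only while the edit is in progress.
                  if (editing) Positioned.fill(child: EditView()),
                ],
              );
            },
          );
        },
      ),
    );
  }
}
```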

And we can call cancelEdit() from the Draggable, in EditControlsView.
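A sketch using Draggable’s onDraggableCanceled callback, which fires when the icon is dropped outside any DragTarget:

```dart
// edit_controls_view.dart -- the Draggable now cancels the edit when it is
// dropped outside the image area.
import 'package:flutter/material.dart';

import 'image_edit_state_bloc.dart';

class EditControlsView extends StatelessWidget {
  Widget _icon(Color background, Color iconColor) {
    return Container(
      color: background,
      padding: EdgeInsets.all(16.0),
      width: double.infinity,
      child: Icon(Icons.edit, size: 48.0, color: iconColor),
    );
  }

  @override
  Widget build(BuildContext context) {
    return Draggable<String>(
      data: 'edit',
      feedback: Icon(Icons.edit, size: 48.0, color: Colors.black),
      childWhenDragging: _icon(Colors.white, Colors.grey),
      // Called when the drag ends without being accepted by a DragTarget,
      // ie when the icon is dropped outside the image.
      onDraggableCanceled: (velocity, offset) =>
          ImageEditStateBloc().cancelEdit(),
      child: _icon(Colors.grey.shade300, Colors.black),
    );
  }
}
```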

Finally, we create EditView, in edit_view.dart. We use the screen height to calculate an opacity value that is between 0.0 and 1.0.
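A sketch of EditView; mapping the finger’s dy to opacity as 1 - dy / screenHeight is my interpretation of “up for more, down for less”:

```dart
// edit_view.dart -- a red overlay whose opacity follows the edit value.
import 'package:flutter/material.dart';

import 'image_edit_state_bloc.dart';

class EditView extends StatelessWidget {
  final ImageEditStateBloc _bloc = ImageEditStateBloc();

  @override
  Widget build(BuildContext context) {
    final double screenHeight = MediaQuery.of(context).size.height;
    return StreamBuilder<EditStateData>(
      stream: _bloc.stream,
      builder: (context, snapshot) {
        // The edit value is the vertical finger position: the higher the
        // finger (smaller dy), the more opaque the overlay.
        final double dy = snapshot.data?.value ?? screenHeight;
        final double opacity =
            (1.0 - dy / screenHeight).clamp(0.0, 1.0).toDouble();
        return Container(color: Colors.red.withOpacity(opacity));
      },
    );
  }
}
```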

At this point, well… nothing has changed visually. This is because we’re not actually tracking the finger on the screen after the drag movement starts.

Tracking the finger after the drag movement

One of the limitations of the Listener widget is that it only gets pointer events if it was present in the widget hierarchy when the pointer down event was fired. This means we need to add it to a widget that is always present in the hierarchy, and add some logic so it only acts on the events when we want it to.

We are going to add it to MyHomePage, and use the stream to make sure we only do something with it when the edit state is in progress.
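A sketch of the stateful MyHomePage; caching the latest EditState from the stream in a field is my approach, the original may differ:

```dart
// main.dart -- MyHomePage is now stateful, wraps its body in a Listener, and
// forwards the vertical finger position to the BLoC while an edit is in progress.
import 'package:flutter/material.dart';

import 'edit_controls_view.dart';
import 'image_edit_state_bloc.dart';
import 'photo_view.dart';

class MyHomePage extends StatefulWidget {
  @override
  State<MyHomePage> createState() => _MyHomePageState();
}

class _MyHomePageState extends State<MyHomePage> {
  final ImageEditStateBloc _bloc = ImageEditStateBloc();
  EditState? _lastState;

  @override
  void initState() {
    super.initState();
    // Keep track of the latest edit state so the Listener knows when to act.
    _bloc.stream.listen((data) => _lastState = data.state);
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(title: Text('Gestures across widgets')),
      // The Listener is always in the hierarchy, so it receives the pointer
      // events of the whole drag; we only use them while an edit is in progress.
      body: Listener(
        onPointerMove: (event) {
          if (_lastState == EditState.inProgress) {
            _bloc.updateEdit(event.position.dy);
          }
        },
        child: Column(
          children: [
            Expanded(child: PhotoView()),
            EditControlsView(),
          ],
        ),
      ),
    );
  }
}
```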

Now, we can see the red overlay increasing its intensity as we move the finger upwards. All we need to do is show the snackbars to confirm when the edit is completed or cancelled. We do this in MyHomePage.
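A sketch of that logic, as an excerpt of the _MyHomePageState shown above; it uses ScaffoldMessenger (the current Flutter API) rather than whatever the original snackbar tutorial used, and the message strings are made up:

```dart
// main.dart (excerpt from _MyHomePageState) -- extend the stream subscription
// to show a snackbar when the edit is completed or cancelled.
@override
void initState() {
  super.initState();
  _bloc.stream.listen((data) {
    _lastState = data.state;
    if (data.state == EditState.completed) {
      _showSnackBar('Edit completed');
    } else if (data.state == EditState.cancelled) {
      _showSnackBar('Edit cancelled');
    }
  });
}

void _showSnackBar(String message) {
  // MaterialApp provides a ScaffoldMessenger above MyHomePage.
  ScaffoldMessenger.of(context)
      .showSnackBar(SnackBar(content: Text(message)));
}
```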

Note: If you need a refresher on displaying snackbars, check out my tutorial How to show a snackbar in Flutter.

Voila!

What next?

In this code tutorial (full code on GitHub), we have used Draggable, DragTarget, and Listener. Two other very important classes for gestures are GestureDetector and InkWell. Good starting points for those are How to implement a GestureDetector in Flutter and Flutter Deep Dive: Gestures.

Author: Natalie Masse Hooper

Mobile app developer with 12 years' experience. Started with the Android SDK in 2010 and switched to Flutter in 2017. Past apps range from start-ups to large tech (Google) and non-tech (British Airways) companies, in various sectors (transport, commercial operations, e-commerce).
