Handling touchscreen or mouse events: Simple Touch Handler
With the Mosaic class Core::SimpleTouchHandler you can react to touchscreen or mouse device events. This so-called Simple Touch Handler object can be used within your GUI component to add functionality that is executed when the user touches the screen with a finger, or clicks with the mouse, inside the boundary area of the Touch Handler.
The Simple Touch Handler can be configured to react to particular touch or mouse interactions. For example, the Handler can respond to double taps and ignore any other interaction. When an event arrives, the Handler sends a signal to one of the slot methods associated with it, where your particular implementation is executed.
The Simple Touch Handler is the right choice if you want to process raw touch events or even simple taps. Processing raw events is essential if you intend to implement a special (e.g. multi-touch) gesture detector. Please note that the Mosaic framework already provides a Wipe Touch Handler, Rotate Touch Handler and Slide Touch Handler, which implement the most important gesture processors.
The following sections are intended to provide you with an introduction and useful tips on how to work with the Simple Touch Handler object and process touchscreen and mouse events. For the complete reference please see the documentation of the Core::SimpleTouchHandler class.
Add new Simple Touch Handler object
To add a new Simple Touch Handler object at the design time of a GUI component, do the following:
★First ensure that the Templates window is visible.
★In the Templates window switch to the folder Event Handlers.
★In the folder locate the template Simple Touch Handler.
★Drag & Drop the template into the canvas area of the Composer window:
★Optionally, give the newly added Handler a name.
IMPORTANT
Although it is an event handler object and not a real view, the Touch Handler appears within the canvas area as a slightly red tinted polygon. This effect exists just for your convenience and permits you to recognize the Touch Handler while assembling the GUI component. In the Prototyper as well as on the target device the Handler itself is never visible.
Inspect the Simple Touch Handler object
As long as the Touch Handler object is selected you can inspect and modify its properties conveniently in the Inspector window as demonstrated with the property Enabled in the screenshot below:
This is worth mentioning because all of the following sections describe diverse features of the Touch Handler object by explicitly referring to its corresponding properties. If you are not familiar with the concept of a property or the usage of the Inspector window, please first read the preceding chapter Compositing component appearance.
Arrange the Simple Touch Handler object
Once added, you can freely move the Touch Handler, or you can resize it by simply grabbing one of its corners. Being a quad (a polygon with four corners), the position of each corner can be modified individually. You can also control the corners by directly modifying the corresponding properties Point1, Point2, Point3 and Point4. If you want the Touch Handler to be arranged behind other views you can reorder it explicitly.
Although the Touch Handlers aren't visible in the resulting application, their areas determine where the application should in fact react to touch interactions. Only when the user touches inside the Handler's boundary area does the Handler react to the interaction. Moreover, if two Touch Handlers overlap, the touch events are by default processed only by the topmost Handler, suppressing the Handler in the background. Thus the right arrangement of the Touch Handlers within your application is an important aspect of its correct function.
The area of a Simple Touch Handler doesn't necessarily have to be rectangular. You can deform the Touch Handler as required in your application case, regardless of whether the Touch Handler assumes a convex or concave shape. In any case the Handler will correctly detect the touch events occurring within its area (the red tinted area in the following figure):
Implement Touch Handler slot methods
Every time the Touch Handler receives a touchscreen related event, the Handler sends a signal to one of the slot methods connected with it. Within the slot method your particular implementation can react to and process the event. The slot methods are connected to the Handler by simply storing them in the Handler's properties provided for this purpose. The following table describes them:
Property | Description
---|---
OnPress | The slot method associated to this property receives a signal just at the beginning of an interaction, in other words immediately after the user has touched the screen.
OnRelease | The slot method associated to this property receives a signal just at the end of an interaction, in other words immediately after the user has released the touchscreen again.
OnHold | The slot method associated to this property receives signals periodically as long as the user continues touching the screen. The period is about 50 milliseconds (~ 20 times per second).
OnDrag | The slot method associated to this property receives signals every time the user drags the finger. This is true even if the finger has left the boundary area of the Touch Handler. As long as the user has not finalized the interaction, the OnDrag slot method will be signaled with every finger movement.
OnLeave | The slot method associated to this property receives a signal when, while dragging, the finger leaves the boundary area of the Touch Handler. This can also be the case just at the end of the interaction. When the user releases the screen with the finger lying inside the boundary area of the Touch Handler, the OnLeave slot method is signaled before OnRelease.
OnEnter | The slot method associated to this property receives a signal when, while dragging, the finger enters again the boundary area of the Touch Handler. This can also be the case just at the beginning of the interaction. When the user touches the screen, the OnEnter slot method is signaled immediately after OnPress.
The following sequence diagram demonstrates a typical order in which the slot methods receive signals while the user interacts with the Touch Handler. Please note that every interaction starts with a signal sent to the OnPress slot method and ends with a signal sent to the OnRelease slot method. If the user begins or finalizes the interaction with the finger lying inside the boundary area of the Touch Handler, an initial OnEnter and/or a final OnLeave signal is sent. Otherwise, these methods are signaled in pairs whenever the finger enters and leaves the Handler's boundary area. In turn, the OnHold slot method is signaled periodically every 50 milliseconds as long as the user touches the screen:
Providing slot methods for all properties is not obligatory. If your application case requires, for example, only the release touch events to be processed, then you leave all properties initialized with null except the property OnRelease. You can initialize the properties with already existing slot methods, or you add a new one and implement it as desired. The following steps describe how to do this (a minimal sketch follows the steps):
★First add a new slot method to your GUI component.
★To react to the press event: assign the slot method to the property OnPress of the Handler object.
★To react to the release event: assign the slot method to the property OnRelease of the Handler object.
★To react to the drag event: assign the slot method to the property OnDrag of the Handler object.
★To react to the hold event: assign the slot method to the property OnHold of the Handler object.
★To react to the enter event: assign the slot method to the property OnEnter of the Handler object.
★To react to the leave event: assign the slot method to the property OnLeave of the Handler object.
★Open the slot method for editing.
★In the Code Editor implement your desired operation to execute when the event occurs.
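For illustration, the following is a minimal sketch of such a slot method, assuming it is assigned to both the OnPress and OnRelease properties; the object names TouchHandler and TextView are assumptions of this example:

// Sketch of a slot method assigned to both the 'OnPress' and 'OnRelease'
// properties. The object names 'TouchHandler' and 'TextView' are assumed.
// While the user touches the screen, tint the text red ...
if ( TouchHandler.Down )
  TextView.Color = #FF0000FF;
// ... otherwise restore the original color.
else
  TextView.Color = #FFFFFFFF;

// During prototyping, a trace statement helps to observe when and in which
// order the slot methods are signaled.
trace "Touch event, Down =", TouchHandler.Down;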
The Simple Touch Handler object manages several variables reflecting its current state. You can evaluate these variables whenever your GUI component implementation requires the corresponding information. Usually, however, you do this from the slot methods described above. The following table provides an overview of the available variables:
Variable | Description
---|---
Down | The variable is true if the Handler has recently received the press event without the corresponding release event. In other words, the user still holds the finger on the screen.
CurrentPos | The variable stores the latest position of the finger expressed in coordinates relative to the top-left corner of the GUI component containing the Touch Handler. In other words, when the user drags the finger the variable is updated accordingly.
HittingPos | The variable stores the position of the finger just at the beginning of the interaction. The position is expressed in coordinates relative to the top-left corner of the GUI component containing the Touch Handler. By calculating the difference between CurrentPos and HittingPos you get the absolute movement of the finger since the interaction began.
Offset | The variable stores the movement of the finger relative to the preceding OnDrag event. This is useful if you need to track the finger movements incrementally.
Inside | The variable is true if the finger currently lies inside the boundary area of the Touch Handler, or the user has finalized the interaction with the finger lying inside the area.
StrikeCount | The variable stores the number of consecutive taps the user has performed at the beginning of the interaction. For example, if the user double taps inside the Touch Handler area, the variable contains the value 2.
Finger | The variable stores the number of the finger triggering the current event. The fingers are counted starting with 0 (zero) for the first finger. In a multi-touch environment this variable allows you to distinguish between several fingers. If your application is controlled by a mouse device, the variable Finger identifies the mouse button causing the event. By default, the left button has the number 0 (zero), the right button 1 and the middle button 2. See below Support for mouse input devices.
HoldPeriod | The variable stores the time elapsed since the interaction began. In other words, it stores how long the user has been holding the finger pressed on the screen. The variable is expressed in milliseconds.
Time | The variable stores a time stamp of when the event was generated by the touchscreen device. Knowing the precise time is essential to calculate the movement velocity from the finger displacement.
AutoDeflected | The variable is true if the current interaction has been taken over by another Touch Handler. See below Combine several Touch Handlers together.
Which variables your implementation does in fact evaluate depends on your particular application case. The variable Down is useful if you have implemented a single slot method for both the OnPress and OnRelease events. In such a case you can easily distinguish whether the method is called because the user has pressed or released the screen. Similarly, by evaluating the variable Inside within the OnDrag or OnHold slot method you can determine whether the finger still lies inside the area of the Touch Handler or not. The following Chora code demonstrates a few examples of how the various variables are used:
if ( !TouchHandler.Down )
{
  // The user has finalized the interaction.
}

if ( !TouchHandler.Down && TouchHandler.Inside )
{
  // The user has finalized the interaction while the
  // finger was inside the Touch Handler boundary area.
}

if ( TouchHandler.Down && TouchHandler.Inside )
{
  // The user still touches the screen and the finger lies
  // inside the Touch Handler boundary area.
}

if ( TouchHandler.Down && ( TouchHandler.HoldPeriod > 1000 ))
{
  // The user touches the screen for at least 1 second.
}

if ( TouchHandler.StrikeCount == 2 )
{
  // The user has double tapped the Touch Handler
}

// Within the OnDrag slot method let an Image view follow every finger
// movement.
ImageView.Bounds = ImageView.Bounds + TouchHandler.Offset;
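Additionally, as described above for the variable HittingPos, within an OnDrag slot method you can compute the total movement of the finger since the interaction began (a sketch; TouchHandler is an assumed name):

// Sketch: the total finger movement since the beginning of the interaction
// results from the difference between the current and the hitting position.
var point totalOffset = TouchHandler.CurrentPos - TouchHandler.HittingPos;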
The following example demonstrates the usage of the Touch Handler to implement a simple paint GUI component. The user can touch the component and stroke line segments in this manner. Additional Touch Handlers implement simple buttons the user can tap to select the desired color for the drawing operation:
Please note, the example may use features that are available only as of version 8.10.
Configure the filter condition
Every time the user touches the screen, the Mosaic framework searches at the affected screen position for a Touch Handler which is enabled and configured to accept the new interaction. The Simple Touch Handler provides various properties to determine the condition under which the Handler should be taken into account.
One condition determines the minimum and maximum number of consecutive taps the user may perform at the same spot. This condition is controlled by the properties MinStrikeCount and MaxStrikeCount. By default, both properties are initialized with the value 1 (one), resulting in the Touch Handler being ready to accept only single taps. To configure the Handler to react to a double tap, initialize both properties with the value 2. Accordingly, if you want the Handler to react to any number of taps, initialize MinStrikeCount with 1 and MaxStrikeCount with e.g. 100, which should be sufficient even for a very hyperactive user.
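Usually you configure these properties in the Inspector window. Purely as a sketch, the equivalent double-tap configuration could also be expressed in Chora code, assuming the Handler is named TouchHandler and the properties are adjusted at runtime:

// Sketch: restrict the Handler to react to double taps only. 'TouchHandler'
// is the assumed name of the Simple Touch Handler object.
TouchHandler.MinStrikeCount = 2;
TouchHandler.MaxStrikeCount = 2;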
Two taps are considered consecutive if the delay between them is shorter than the value stored in the property CursorSequelDelay and the displacement between the two taps is less than CursorDragLimit. These properties can be adapted when you edit the application component. By default, CursorSequelDelay is configured with 250 milliseconds and CursorDragLimit with 8 pixels.
If the device supports multi-touch, you can restrict a Handler to respond only to interactions associated with a particular finger number. This setting is controlled by the property LimitToFinger. The fingers are counted starting with 0 (zero) for the first finger touching the screen up to 9 for the tenth finger when the user uses all fingers simultaneously. By default, this property is initialized with -1, which instructs the Handler to accept any interaction regardless of the number of fingers the user currently touches the screen with.
Initializing LimitToFinger with the value 0 (zero) restricts the Handler to accepting only the interaction associated with the first finger. In other words, the Handler will ignore any further interaction if the user already touches the screen with at least one finger. If this property is initialized with the value 4, the Handler will be activated only when the user touches the screen with the fifth finger.
The numbering of fingers is an aspect you have to take care of in the main software where the GUI application and your particular touchscreen driver are integrated. Precisely, when you feed touchscreen events into the application by calling the method DriveMultiTouchHitting, you should pass in its aFinger argument the number of the associated finger. This requires, however, that the touchscreen driver in your device provides information to uniquely identify all fingers currently involved in touchscreen interactions.
On the other hand, if your application case requires the handler to react only when the user has touched it simultaneously with two or more fingers, you can specify the expected number of fingers in the property NoOfFingers. For example, if you initialize NoOfFingers with the value 2, the handler will process touch events only when the user touches it with two fingers. If the user touches the handler with only one finger, or the delay between the first and the second touch event is too long, the handler will ignore the interaction. In such a case other handlers lying behind it can then take over and process the interaction.
When configuring a touch handler to react to two or more fingers, note that once all fingers are placed within the handler's area, the handler processes the events generated by the last finger only. The movement of the other fingers is simply ignored. However, when the user releases one of the fingers, the entire interaction ends.
Please note, the two properties NoOfFingers and LimitToFinger can't be combined. You can either configure the handler to react to events generated by a finger with a particular number, or to react when the user performs a gesture with two or more fingers regardless of which fingers are involved.
Touch events and the grab cycle
When the user touches the screen, the corresponding press event is dispatched to the Touch Handler enclosing the affected screen position. With this, the so-called grab cycle begins. The grab cycle means that the Touch Handler becomes the direct receiver of all related subsequent touchscreen events. In other words, the Touch Handler temporarily grabs the touchscreen. However, if your device supports multi-touch, the grab cycle is limited to events associated with the finger which originally initiated the touch interaction. The grab cycle doesn't affect the events triggered by other fingers.
With the grab cycle the Mosaic framework ensures that the Touch Handler which has received a press event also processes the corresponding release event. Disabling the Touch Handler, hiding the superior component or removing its parent GUI component from the screen while the Handler is involved in the grab cycle does not interrupt the event delivery. Similarly, when the user drags the finger outside the boundary area of the Touch Handler, the grab cycle persists and the Touch Handler will continue receiving all events associated with the finger.
The grab cycle ends automatically when the user releases the corresponding finger. With the end of the grab cycle the involved Handler will receive the final release event. The grab cycle technique is thus convenient if it is essential in the implementation of your GUI component to always correctly finalize the touchscreen interaction.
Under normal circumstances you don't need to think about the mechanisms behind the grab cycle. You should, however, understand that the Simple Touch Handler, after sending a signal to the slot method stored in its OnPress property, will always send a signal to the slot method stored in its OnRelease property as soon as the user finalizes the interaction.
Combine several Touch Handlers together
According to the above described concept of the grab cycle, the touchscreen events are delivered to the Handler which has originally responded to the interaction. This relation persists until the user lifts the finger from the touchscreen surface again and so finalizes the interaction. In other words, events generated by a finger are processed by one Touch Handler only.
This is worth mentioning because arranging several Touch Handlers one above the other with the intention of processing a different gesture with every Handler will by default not work. Only the topmost Handler will react and process the events. This Touch Handler, however, after having detected that it is not responsible for the gesture, can reject the active interaction, causing another Touch Handler to take it over. In this manner, several Touch Handlers can seamlessly cooperate.
With the property RetargetCondition you can select one or several basic gestures the affected Touch Handler should not process. As soon as the Touch Handler detects that the user tries to perform one of the specified gestures, the next available Handler lying in the background and willing to process the gesture will automatically take over the interaction. The following table provides an overview of the basic gestures you can activate in the RetargetCondition property and the additional properties to configure the gestures in more detail:
Gesture | Description
---|---
WipeDown | Vertical wipe down gesture. By default the gesture is detected after the user has dragged the finger for at least 8 pixels. With the property RetargetOffset the minimal distance can be adjusted as desired.
WipeUp | Vertical wipe up gesture. By default the gesture is detected after the user has dragged the finger for at least 8 pixels. With the property RetargetOffset the minimal distance can be adjusted as desired.
WipeLeft | Horizontal wipe left gesture. By default the gesture is detected after the user has dragged the finger for at least 8 pixels. With the property RetargetOffset the minimal distance can be adjusted as desired.
WipeRight | Horizontal wipe right gesture. By default the gesture is detected after the user has dragged the finger for at least 8 pixels. With the property RetargetOffset the minimal distance can be adjusted as desired.
LongPress | This gesture is detected when the user holds the finger pressed for a period longer than the value specified in the property RetargetDelay, by default 1000 milliseconds.
ForeignPress | With this condition, the Handler will resign if a new touch event arrives and this event is handled by a foreign Touch Handler lying in front of or behind the Handler itself. This is useful when the application combines Handlers configured to react to gestures with two or more fingers (see also Simple Touch Handler and multi-touch screen).
Let's assume you want to implement a GUI component to display some text pages within a Text view, which the user can scroll vertically. With wipe left and wipe right gestures the user should be able to browse between the various text pages. Moreover, a double tap within the page area should automatically scroll the text back to its beginning.
This example can be implemented by combining three Touch Handlers: the Slide Touch Handler to scroll the text vertically, the Wipe Touch Handler to detect the wipe left and wipe right gestures, and the regular Simple Touch Handler to detect the double taps. The following are the steps to configure these Handlers (a Chora sketch of the same configuration follows the steps):
★Add a new Text view to your component.
★Add a new Slide Touch Handler to your component.
★Arrange the Slide Touch Handler so it covers the area of the Text view.
★Add a new Wipe Touch Handler to your component.
★Arrange the Wipe Touch Handler so it covers the area of the Text view.
★Add a new Simple Touch Handler to your component.
★Arrange the Simple Touch Handler so it covers the area of the Text view.
★For the Simple Touch Handler configure the property RetargetCondition with the value Core::RetargetReason[WipeDown, WipeLeft, WipeRight, WipeUp].
★For the Wipe Touch Handler configure the property RetargetCondition with the value Core::RetargetReason[WipeDown, WipeUp].
★For the Slide Touch Handler configure the property SlideHorz with the value false.
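If you prefer to express this configuration in Chora code instead of the Inspector, it could look as in the following sketch; the object names SimpleTouchHandler, WipeTouchHandler and SlideTouchHandler are assumptions of this example:

// Sketch: the same configuration expressed in Chora code. The Handler
// names are assumed for this example.
SimpleTouchHandler.RetargetCondition = Core::RetargetReason[ WipeDown, WipeLeft, WipeRight, WipeUp ];
WipeTouchHandler.RetargetCondition = Core::RetargetReason[ WipeDown, WipeUp ];
SlideTouchHandler.SlideHorz = false;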
The following figure demonstrates the arrangement and the configuration of the three Touch Handlers:
When the user touches the GUI component, the events are primarily processed by the Simple Touch Handler. When the user then drags the finger to the left, the Simple Touch Handler detects the wipe gesture and retargets the interaction to the Wipe Touch Handler. In turn, if the user drags the finger up, the Slide Touch Handler will take over the interaction, since the Wipe Touch Handler is configured to also reject all wipe up and down gestures.
The following example demonstrates this application case:
Please note, the example may use features that are available only as of version 8.10.
This so-called deflection mechanism is not limited to Touch Handlers within the same GUI component. When a Touch Handler decides to reject the actual interaction, all Touch Handlers existing at the moment in the application and lying at the current touch position are queried in order to find the best one to take over the interaction. The old and the new Touch Handler can thus belong to different GUI components.
Please note that rejecting the active interaction finalizes the corresponding grab cycle for the old Touch Handler and initiates a new one for the new Touch Handler. Accordingly, the old Handler receives the final release event as if the user had lifted the finger from the screen, and the new Handler receives the initial press event as if the user had just touched it. The two Handlers act independently.
If you process the release event for a Handler which may reject the interaction, you can evaluate its variable AutoDeflected. This variable is set true if the event was generated because the interaction has been taken over by another handler. Depending on your application case, you may then decide not to process the event. For example, a push button should be activated only if the user really lifts the finger. In turn, when the interaction is taken over by another Handler, the push button should restore its original visual aspect but it should not fire. The following code could be the implementation of the OnRelease slot method within a push button component:
// Request the push button widget to restore its normal (not pressed)
// visual aspect.
InvalidateViewState();

// The event is generated because the interaction has been taken over by
// another Handler. Ignore the event.
if ( TouchHandler.AutoDeflected )
  return;

/* Process the event. For example, execute some code associated with the
   push button, etc. */
Handle multi-touch events (e.g. pinch, zoom and rotate gesture)
By default, the Simple Touch Handler responds to the first touch interaction matching the specified condition. Afterwards, touching inside the Handler's area with a further finger is ignored. If there is another Handler lying in the background, it then has the chance to respond and process the event. You can imagine that the Simple Touch Handler is restricted to processing, at any given time, events associated with only one finger.
Accordingly, the simplest way to handle multi-touch interactions is to manage several Touch Handlers within the application. For example, in an application containing many push buttons, the user can press the buttons simultaneously, similar to the keys of a keyboard. Every button with its embedded Touch Handler will then individually process the touch events associated with the corresponding finger. From the user's point of view, the buttons can be controlled independently of each other.
Knowing this, you can arrange several Simple Touch Handlers one above the other. When the user touches inside this area with one finger, the topmost Handler will respond. When the user then touches with a second finger, the topmost Handler (being busy with the first finger) will ignore the event and the Handler lying in the background responds. This continues as long as there are Touch Handlers available. The slot methods associated with the Handlers permit you to process the events individually for every finger.
For example, using this approach you can implement the two finger pinch-zoom gesture to conveniently scale and rotate an image displayed within the Warp Image view:
★Add a Simple Touch Handler to your component.
★Arrange the Handler so that it occupies the area where you want to perform the pinch gesture.
★Duplicate the Simple Touch Handler.
★Name the two Handlers HandlerA and HandlerB.
★As described in the section above, add a new slot method and connect it to the OnDrag properties of both Handlers.
★Implement the slot method with the following Chora code:
// Handle only events if the user does hold two fingers pressed on the touch
// screen. The first finger is associated with HandlerA. The second finger
// with HandlerB.
if ( !HandlerA.Down || !HandlerB.Down )
  return;

// The distance as vectors between the both fingers just at the beginning of
// the gesture and the current distance after the user has moved the fingers.
var point start = HandlerA.HittingPos - HandlerB.HittingPos;
var point cur   = HandlerA.CurrentPos - HandlerB.CurrentPos;

// Calculate the length of the corresponding vectors.
var float startLen = math_length( start );
var float curLen   = math_length( cur );
var float factor   = 1.0;

// Calculate the angle of the vector between the fingers at the beginning of
// the interaction and for the current moment. Note, the Y axis is mirrored
// in the coordinate system.
var float startAngle = math_atan2( -start.y, start.x );
var float curAngle   = math_atan2( -cur.y, cur.x );

// From the both distances calculate a scale factor.
if ( startLen > 0.0 )
  factor = curLen / startLen;

// Where should the image appear within the GUI component?
var point destPos = ...

// Scale and rotate the Warp Image view accordingly.
WarpImageView.RotateAndScale( destPos, curAngle - startAngle, factor, factor );
The following example demonstrates this implementation (If your development computer does not have any touch-capable display, you can simulate multi-touch events by using the mouse as described in the section Touch screen inputs):
Please note, the example may use features that are available only as of version 11.00.
An alternative approach is to explicitly enable the Simple Touch Handler to process multi-touch events. When you set the property EnableMultiTouch to the value true, the above described restriction to one finger no longer applies. Now the Simple Touch Handler responds to every touch interaction as long as it matches the specified condition. The user can touch inside the Handler's area with multiple fingers. It is no longer necessary to use several Touch Handlers arranged one above the other to handle multi-touch gestures.
With the multi-touch functionality enabled, the Touch Handler manages an individual grab cycle for every finger. Accordingly, the slot methods associated with the Handler will receive signals every time the user moves one of the fingers, releases a finger or touches with a new one inside the Handler's area. Therefore, in order to distinguish the events triggered by different fingers, your implementation of the slot methods should evaluate the variable Finger:
if ( TouchHandler.Finger == 0 )
{
  // The event is generated by the finger #1
}
else if ( TouchHandler.Finger == 1 )
{
  // The event is generated by the finger #2
}
The following example demonstrates again the implementation of the pinch gesture, now however with a single Simple Touch Handler configured to process multi-touch events. (If your development computer does not have any touch-capable display, you can simulate multi-touch events by using the mouse as described in the section Touch screen inputs):
Please note, the example may use features that are available only as of version 11.00.
TIP
If you want the handler to react to taps with two or more fingers, you can specify the expected number of fingers in the property NoOfFingers. Please see the section Configure the filter condition. Please note, when doing this the handler will process touch events generated by several fingers; configuring the above described property EnableMultiTouch is in this case not necessary.
Support for mouse input devices
Applications developed with Embedded Wizard are not restricted to being controlled by tapping on a touchscreen only. With the Simple Touch Handler you can also process events triggered by a regular mouse input device. Accordingly, when the user clicks with the mouse inside the boundary area of a Touch Handler, the Handler will receive the corresponding press, drag, hold, ... release events. Thus you can use Embedded Wizard to develop desktop applications able to run under Microsoft Windows, Apple macOS or diverse Unix distributions.
There are no essential differences between handling mouse and touchscreen events, except for the multi-touch functionality, which is not available when using the mouse device. Even mouse devices with more than one button can be handled. By default, the left mouse button is associated with the first finger (the finger with the number 0 (zero)). The right mouse button corresponds to the second finger and the middle button to the third finger. Knowing this, you can easily configure the Touch Handler to respond to the mouse events of interest.
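For example, within a slot method you could distinguish the mouse buttons by evaluating the variable Finger, analogous to the multi-touch case shown above (a sketch; TouchHandler is an assumed name):

// Sketch: identify the mouse button (or finger) causing the event.
if ( TouchHandler.Finger == 0 )
{
  // Left mouse button (or first finger)
}
else if ( TouchHandler.Finger == 1 )
{
  // Right mouse button (or second finger)
}
else if ( TouchHandler.Finger == 2 )
{
  // Middle mouse button (or third finger)
}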
In principle, in the main software where the GUI application and the particular mouse driver or OS API are integrated, you are free to map the received mouse events to the corresponding invocations of Embedded Wizard methods. Precisely, you feed the events into the application by calling the methods DriveCursorHitting and DriveCursorMovement.
Simple Touch Handler and invisible components
Please note, the function of a Touch Handler is not affected by the visibility status of the GUI component the handler is used inside. Embedded Wizard differentiates explicitly between the statuses ready to be shown on the screen and able to handle user inputs. As long as the handler lies within a potentially visible screen area, hiding the component will not prevent the touch handler from reacting to user inputs. Thus, if you want a GUI component to be hidden and unable to handle user inputs, you have to both hide and disable the component.
Generally, whether a GUI component is visible or not is determined by its property Visible. If the component is embedded inside another component, the resulting visibility status also depends on the property Visible of every superior component. In other words, a component is visible only if its own property Visible and that of every superior component are true. If one of these properties is false, the component is considered hidden.
In turn, the ability to react to user inputs is controlled by the component's property Enabled. Again, if the component is embedded inside another component, the resulting enabled status also depends on the property Enabled of every superior component. In other words, a component is able to handle user inputs only if its own property Enabled and that of every superior component are true. If one of these properties is false, the component is considered disabled.
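Accordingly, to hide a component and simultaneously suppress its input handling you set both properties, as in the following sketch (SomeComponent is an assumed name of an embedded component):

// Sketch: hide an embedded GUI component and prevent it (including all
// Touch Handlers inside it) from handling user inputs.
SomeComponent.Visible = false;
SomeComponent.Enabled = false;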
IMPORTANT
A hidden component will still receive and handle all user inputs as if it were visible. If this is not desired, you have to set both the properties Visible and Enabled of the affected component to the value false.
Disable a Simple Touch Handler
By initializing the property Enabled with the value false you can explicitly prevent a Touch Handler from reacting to touchscreen inputs. In this manner, other Handlers lying behind it are exposed and can respond to the events.
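For example, the property can be modified at runtime from any Chora code (a sketch; TouchHandler is an assumed name):

// Sketch: suppress the Touch Handler temporarily ...
TouchHandler.Enabled = false;

// ... and later make it react to touch events again.
TouchHandler.Enabled = true;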
Please note that modifying Enabled while the Handler is currently involved in a grab cycle has no immediate effect. In other words, as long as the user does not finalize the previously initiated touchscreen interaction, the Handler will continue receiving events even if it has been disabled in the meantime. After all, the grab cycle guarantees that a Handler which has received the press event will also receive the corresponding release event.
TIP
In the GUI application an individual Touch Handler can handle events only when the Handler and all of its superior Owner components are enabled, the Handler itself lies within the visible area of the superior component, and the Handler is not covered by any other Touch Handler.
Please note, Touch Handlers continue working even if you hide the superior components. See: Simple Touch Handler and invisible components.