Can someone PLEASE explain touch index handling? (I'm trying to do a dual analog stick scene)

:information_source: Attention Topic was automatically imported from the old Question2Answer platform.
:bust_in_silhouette: Asked By Daniel G. Machado Jr

TL;DR: I don’t understand touch indexes and can’t find anyone who does. I want to handle two or more simultaneous touch events but simply can’t figure out how.

I’m quite new to the Godot engine, but I’m a programmer, so I can more or less find my way around the documentation and understand things from there.

So I’m trying to make a dual analog stick scene. I made a stick scene that handles events through _gui_input() (which seems to receive events in local space, which is good in this case).

I have an analog stick scene that I want to handle its input events independently (encapsulation, basically).

So in this scene I create a local “index” variable which is supposed to filter the events so that the stick only handles events with its own index, unless the local index is set to null, in which case it accepts any event. It works wonderfully alone…

Code is something like this:

func _gui_input(event):
    if (event is InputEventScreenTouch) or (event is InputEventScreenDrag):
        if index == null or index == event.index:
            index = event.index
            # ...handle the event
    else:
        index = null  # on release, the joystick index is null
        # ...some other stuff

What actually happens, though, is that both sticks always respond only to the last touch index (even with an if statement that should block that), as if the two events with different indexes were the same instance of the last event.

But there’s absolutely ZERO explanation of how to use indexes. ZERO.

I’m obviously doing something VERY VERY wrong here, but I can’t figure out what, because I simply can’t find any more detailed documentation.

Sorry for being so ranty about it, but I’ve already spent two days straight dealing with this.

See my answer below with code. But to add to that: the trick is to listen for events and then place your analog stick graphic so it follows the touch when the touch lands in the right region, not the other way around. This is how a lot of games handle it anyway; it’s often called a “floating stick”.

billyb | 2018-06-13 15:10
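A minimal sketch of that “floating stick” idea, assuming a CanvasLayer with a Sprite child named StickBase used as the stick graphic (the node names and the left-half-of-screen check are assumptions for illustration, not taken from the comment above):

extends CanvasLayer

onready var stick_base = $StickBase  # the stick graphic (a Sprite)
var touch_id = -1  # index of the touch currently owning this stick

func _input(ev):
	if ev is InputEventScreenTouch and ev.pressed and touch_id == -1:
		# claim touches that start in the left half of the screen
		if ev.position.x < get_viewport().get_visible_rect().size.x * 0.5:
			touch_id = ev.index
			stick_base.position = ev.position  # the stick appears under the finger
	elif ev is InputEventScreenTouch and not ev.pressed and ev.index == touch_id:
		touch_id = -1  # our touch was released; the stick is free again

Drag events whose index matches touch_id would then drive the stick’s direction and magnitude.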

:bust_in_silhouette: Reply From: Daniel G. Machado Jr

I don’t consider it a clean solution, but I think it’s a good start. From this point on I’ll make something better and then post it here again.

So, basically, what I did was give up on handling the indexes. Instead I used a single view with two analog sticks and filtered the events based on their distance to the center of each stick. As you can see in the code, I used a radius of 200 pixels. I also used the TouchScreenButton.is_pressed() method to make sure that the event is related to that stick.

It works fine! Although I can’t know where or when this solution could break. But so far so good.


Here I just temporarily added a shape to the TouchScreenButton to see the reach of the stick touch.
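A rough sketch of the approach described above might look like this; the 200 px radius and the use of TouchScreenButton.is_pressed() come from the description, while the node names, the assumption that each stick node’s position marks its center, and everything else are illustrative guesses:

extends CanvasLayer

const STICK_RADIUS = 200.0  # reach of each stick, in pixels

onready var left_stick = $LeftStick    # TouchScreenButton
onready var right_stick = $RightStick  # TouchScreenButton

func _input(event):
	if event is InputEventScreenTouch or event is InputEventScreenDrag:
		_route_to_stick(left_stick, event)
		_route_to_stick(right_stick, event)

func _route_to_stick(stick, event):
	# filter by distance to the stick's center and double-check with the
	# TouchScreenButton's own pressed state instead of tracking indexes
	if event.position.distance_to(stick.position) <= STICK_RADIUS and stick.is_pressed():
		pass  # ...compute this stick's direction and magnitude here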

I still haven’t implemented stick direction and magnitude, but that’s the easiest part, so I haven’t bothered with it yet.

Now based on this I’ll try to implement an independent stick scene, for the sake of encapsulation.

I hope this helps someone. I certainly struggled with this issue.

Images are broken :frowning:

aggregate1166877 | 2019-12-19 02:53

:bust_in_silhouette: Reply From: billyb

Just got this working beautifully on Android, so here’s the key:

Set up a CanvasLayer for your GUI.
For plain buttons, use TouchScreenButton nodes.

For other touches, you have to put a script on the main GUI CanvasLayer node and catch and track the IDs yourself. There is no visible object in the scene that is associated with the events. Events are sent to _input(event); ALL events: mouse, touch, everything.

An initial touch event will look like:
InputEventScreenTouch.index = 0
InputEventScreenTouch.pressed = true
InputEventScreenTouch.position = Vector2( etc. )

Second touches increment to index = 1, index = 2, etc.

A touch motion is:
InputEventScreenDrag.index = 0
InputEventScreenDrag.position … etc…

A release is:
InputEventScreenTouch.index = 0
InputEventScreenTouch.pressed = false
InputEventScreenTouch.position = Vector2( etc. )

MAKE SURE you are not running your logic on any mouse events, because those are also received by _input(ev).
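A minimal way to do that, assuming the emulated mouse events are not needed for anything else, is to bail out of _input() early on any mouse event (the code below takes a different route and simply never reacts to mouse event types):

func _input(ev):
	# ignore real or emulated mouse events so the touch handling below
	# never runs twice for the same finger
	if ev is InputEventMouse:
		return
	# ...touch handling continues here

Turning off the emulate_mouse_from_touch project setting should also prevent touches from generating mouse events in the first place.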

Here’s my code for Godot 3; email me at wbeck@gmail.com if you need more help. This is the script on the GUI CanvasLayer node:

extends CanvasLayer

"""
this script is attached to my GUI canvas layer, with the following nodes structure:

Game
  PlayerObjects
  Camera
  ...
  GUI
    TouchScreenButton
    ...
"""

# a ref back up to my game script, which itself refs playership, camera, etc.
var Game
var lc = 0

#button flags
var touchID = -1
var touchStart = Vector2()
var touchBoundaryX
var touchBoundaryY

func _ready():
	# Called every time the node is added to the scene.
	# Initialization here
	Game = get_parent()
	
	#get the viewport size, and find the halfway points
	var scrSize = get_viewport().get_visible_rect()
	
	#we will later check touch events for the lower left-hand corner,
	#using these boundaries
	touchBoundaryX = scrSize.size.x * .5
	touchBoundaryY = scrSize.size.y * .5


### all gui layer events are passed to _input before anything else
### NOTE these events are any and all touch events on the entire screen
### because this script is attached to the gui layer
func _input(ev):
	processOther()
	
	#TOUCH TILLER HERE
#	print(ev.as_text())

	#new events: we aren't tracking an ID, the event is pressed and touchscreen
	if(touchID == -1 && ev is InputEventScreenTouch && ev.pressed):	
		#only interested in events in our region, lower l.h. corner
		if(ev.position.x > touchBoundaryX || ev.position.y < touchBoundaryY): return
		# OK, we have a new press event
		touchID = ev.index
		#we add a vector point straight down, so our drags will all "tilt" away,
		# and avoid wrapping around and flipping the angle
		touchStart = ev.position + Vector2(0, 300)
		
		#my player had a flag to track input methods
		Game.PlayerScript.pointerIs = "touch"
		
		#do something to my player's ship
		Game.PlayerScript.grabTiller()
	
	#event matches our id and is a drag
	elif(ev is InputEventScreenDrag && ev.index == touchID):
		
		#track the change in position, with a sensitivity factor
		var md = (ev.position - touchStart) * .5
		
		#... and then pass an angle, first rotated (since 0 faces East, but I want it north) and converted from radians to degs
		Game.PlayerScript.setTiller((md.angle() + Game.CC90) * Game.RAD_TO_DEG)

	#event matches our id and is a release
	elif(ev is InputEventScreenTouch && (ev.index == touchID) && (ev.pressed == false)):
		touchID = -1
		Game.PlayerScript.releaseTiller()

Thank you for your answer, Billy. I’ll take a deeper look at it ASAP. I figured that out after a few days of trying hehehe. But then I realized that the issue was really with the _gui_input() function; it’s different somehow, and it isn’t only a matter of the order of listening. I solved my problem by simply using _input() instead and only the touch location; I didn’t even need to use the index (it’s probably useful when the event intersects with the region of the other analog stick). I’ll post my solution whenever possible and also test yours. Thanks again!

Daniel G. Machado Jr | 2018-06-13 15:26