Are there settings that need to be changed for touch behavior on Android to work as expected?

:information_source: Attention Topic was automatically imported from the old Question2Answer platform.
:bust_in_silhouette: Asked By igbow

When I try my game on the phone (an S8) through the Godot editor, the touch input works as expected, but when I export the game and install it on the same phone, it doesn’t: sometimes it registers the touch input and other times it just ignores it.
Are there some special settings that need to be changed before the export, or is my code wrong? (I am a complete beginner.)

  • It is GLES3, and I only added the package’s unique name and name before the export.
  • Under Project Settings > Input Devices > Pointing, both options are checked.

I register the touch like this:

func _input(event):
    if event is InputEventScreenTouch:
        if event.is_pressed():
            touch_hold = true
            state = 1
        elif not event.is_pressed():
            touch_hold = false

touch_hold is a Boolean variable, and I then check the state of touch_hold every frame through _process().

func _process(delta):
    if touch_hold:
        x = get_viewport().get_mouse_position().x
        y = get_viewport().get_mouse_position().y
    else:
        die()

    if state == 0:
        centerx = get_viewport().get_mouse_position().x
        centery = get_viewport().get_mouse_position().y

The intent of this code is to know when the player touches his phone so the game starts (state = 1) and to place an object at the touch coordinates (centerx, centery).
As long as he is still touching his phone the game goes on; if not, it is game over and state = 0.

:bust_in_silhouette: Reply From: wombatstampede

Your code is a bit odd:
#1 You want to get a touch position but query the mouse position.
#2 And you query the mouse (!) position not at the time of touch but later.
#3 You ignore the index property of the event. This may lead to “misunderstandings”.

Android devices can actually have mouse devices attached, but I doubt that this is your intent. Touch events may also trigger mouse events if the touch->mouse emulation is enabled in the project settings (and vice versa), but that is unnecessary here.
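Those emulation options are the ones under Project Settings > Input Devices > Pointing. In project.godot they would look roughly like this (Godot 3.x; the values shown are only illustrative):

[input_devices]

pointing/emulate_touch_from_mouse=false
pointing/emulate_mouse_from_touch=true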

If you don’t want to completely process the event in the _input handler, then you should at least save event.position in a Vector2 variable for later use. In _input, this position is the viewport position of the touch (not the local position inside the control where _input is triggered).

If you ignore event.index, then the touch_hold variable may have a wrong value at times when the user touches with multiple fingers: press with index=1 for finger 1, then a 2nd press with index=2 for another finger; when finger 1 is now released, touch_hold would be set to false even though finger 2 is still pressed. But this may not be a problem for your app.
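If that ever becomes a problem, one possible approach (an untested sketch, not something your app necessarily needs) is to remember the index of the first finger and only react to that one:

var tracked_index = -1
var touch_hold = false
var touch_pos = Vector2()

func _input(event):
  if event is InputEventScreenTouch:
    if event.pressed and tracked_index == -1:
      tracked_index = event.index   # remember which finger started the touch
      touch_hold = true
      touch_pos = event.position
    elif not event.pressed and event.index == tracked_index:
      tracked_index = -1            # only that same finger can end the hold
      touch_hold = false
  elif event is InputEventScreenDrag and event.index == tracked_index:
    touch_pos = event.position      # keep following only the tracked finger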

With my code I want to know whether the player is touching his phone at any time and to be able to get his finger position for some calculations, even while he is dragging his finger.
As for the index, I only need the first touch, thus I am ignoring event.index.

I used get_mouse_position() in _process because I don’t know how to find the touch position; a similar function that gets the touch position (a get_touch_position(), say) doesn’t seem to exist, to my knowledge.

> If you don’t want to completely process the event in the _input handler, then you should at least save event.position in a Vector2 variable for later use. In _input, this position is the viewport position of the touch (not the local position inside the control where _input is triggered).

It is not a question of preference but mainly of what worked to a degree; when I try to save event.position in _input(), for some reason I get nothing.

> Android devices can actually have mouse devices attached, but I doubt that this is your intent. Touch events may also trigger mouse events if the touch->mouse emulation is enabled in the project settings (and vice versa), but that is unnecessary here.

When you say Android devices can have mouse devices, are you talking about physical devices? Is this meant as mouse->touch?
What I understood is that you can query the mouse position in your code and it will be equivalent to the touch position.

igbow | 2019-06-17 18:05

If you drag across the screen, the InputEventScreenDrag event will be triggered (multiple times). It also has a position attribute to query the drag position, but no pressed attribute.

Saving the touch position should work somewhat like this (untested):

var touch_pos = Vector2()
var touch_hold = false

func _input(event):
  if event is InputEventScreenTouch:
    if event.pressed:
      touch_hold = true
      touch_pos = event.position    # viewport position of the touch
      state = 1
    else:
      touch_hold = false
  elif event is InputEventScreenDrag:
    touch_pos = event.position      # keep following the finger while dragging

func _process(delta):
  if touch_hold:
    x = touch_pos.x
    y = touch_pos.y
  #and so on...

If this still does not work then you could put some print statements in the touch pressed/!pressed handling. Perhaps this will show what is happening while debugging.
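For example, something like this (untested) would log every touch and drag event with its index and position:

func _input(event):
  if event is InputEventScreenTouch:
    print("touch index=", event.index, " pressed=", event.pressed, " pos=", event.position)
  elif event is InputEventScreenDrag:
    print("drag index=", event.index, " pos=", event.position)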

wombatstampede | 2019-06-18 06:23

Thank you so much!!!
touch_pos = event.position was the key; I didn’t know we could do this (again, I’m a beginner, but I probably need to re-read the documentation with a different mindset). I thought event.position worked only with the mouse, hence my odd code.
Mouse and input coordinates

igbow | 2019-06-18 23:05