Hmm, sorry, I've never tested such a thing.
If it keeps tracking movement regardless, it may simply not be compatible.
If you're able to track the motion and clicks with regular input events, then you can cast your own rays.
The camera can give you an unprojected ray normal, based on the viewport coordinates.
You can put a RayCast node on the camera, aim it with something like this, and then test the ray intersection in _fixed_process:
my_ray.set_cast_to(my_camera.project_ray_normal(event.pos) * ray_length)
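Putting that together, a minimal sketch of the RayCast approach might look like this (Godot 2 API; the node paths, ray_length, and the assumption that the RayCast is enabled are mine, adjust to your scene):

```gdscript
extends Spatial

var ray_length = 100

func _ready():
    set_process_input(true)
    set_fixed_process(true)

func _input(event):
    if event.type == InputEvent.MOUSE_BUTTON and event.pressed:
        var camera = get_node("Camera")
        var my_ray = camera.get_node("RayCast")
        # Aim the ray from the camera through the clicked screen position
        my_ray.set_cast_to(camera.project_ray_normal(event.pos) * ray_length)

func _fixed_process(delta):
    var my_ray = get_node("Camera/RayCast")
    if my_ray.is_colliding():
        print(my_ray.get_collider())
```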
Or you can store the coordinates, and during _fixed_process grab the world's direct space state and cast the ray yourself:
var state = get_world().get_direct_space_state()
var intersections = state.intersect_ray(from_vector, to_vector)
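As a rough sketch of that second approach (again Godot 2 API; storing the click in _input and querying in _fixed_process, since that's where the space state is safe to use — the node path and ray_length are assumptions):

```gdscript
extends Spatial

var ray_length = 100
var clicked_pos = null

func _ready():
    set_process_input(true)
    set_fixed_process(true)

func _input(event):
    if event.type == InputEvent.MOUSE_BUTTON and event.pressed:
        clicked_pos = event.pos

func _fixed_process(delta):
    if clicked_pos == null:
        return
    var camera = get_node("Camera")
    # Build the world-space ray from the stored click position
    var from_vector = camera.project_ray_origin(clicked_pos)
    var to_vector = from_vector + camera.project_ray_normal(clicked_pos) * ray_length
    var state = get_world().get_direct_space_state()
    var intersections = state.intersect_ray(from_vector, to_vector)
    # intersect_ray returns an empty dictionary when nothing was hit
    if not intersections.empty():
        print(intersections["collider"])
    clicked_pos = null
```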