This component defines nodes to interact with a pointing device (a mouse).
TouchSensor allows you to catch click events on 3D objects.
Drag sensors allow the user to edit a transformation of 3D objects:
PlaneSensor allows moving objects,
SphereSensor allows rotating objects,
CylinderSensor allows rotating objects around a constrained axis.
For demos and tests of these features,
see the sensors_pointing_device subdirectory inside our VRML/X3D demo models.
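As a minimal sketch of how such sensors are used (classic VRML encoding; node names like DragSensor and MovedBox are just illustrative), a drag sensor placed as a sibling of some geometry reacts to dragging on that geometry, and its output event can be routed back to a Transform:

```
Group {
  children [
    # The sensor affects its sibling geometry.
    DEF DragSensor PlaneSensor { }
    DEF MovedBox Transform {
      children Shape {
        appearance Appearance { material Material { } }
        geometry Box { }
      }
    }
  ]
}
# Route the drag output to the Transform, so the box follows the mouse.
ROUTE DragSensor.translation_changed TO MovedBox.set_translation
```

SphereSensor and CylinderSensor are used analogously, routing their rotation_changed event to a Transform's set_rotation.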
Full support, including:
hitTexCoord_changed event. You should apply some texture on your shape,
otherwise texture coordinates will not be generated (and this event
will always generate a zero vector).
Note that this is only a single 2D texture coordinate. If you use volumetric 3D textures (from a DDS file), the additional texture coordinate components will be ignored. If you use multi-texturing, texture units above the first will be ignored.
hitNormal_changed event. Generates nice smooth normals when
the shape is smooth (e.g. creaseAngle > 0).
Note: normals output by
hitNormal_changed are in
the shape's local coordinate system.
The spec doesn't say in which coordinate system they should be;
please report if you have any idea what is expected or
what other browsers do.
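A sketch of a TouchSensor set up so that these events carry useful data (classic VRML encoding; the texture URL and the Script routing are illustrative assumptions, not part of any real scene):

```
Group {
  children [
    DEF Touch TouchSensor { }
    Shape {
      appearance Appearance {
        # A texture is needed for hitTexCoord_changed to output
        # meaningful (non-zero) texture coordinates.
        texture ImageTexture { url "texture.png" }
      }
      geometry Sphere { }
    }
  ]
}
# The hit events are typically routed to a Script node (with matching
# input fields declared) for processing, e.g.:
# ROUTE Touch.hitNormal_changed TO MyScript.set_normal
# ROUTE Touch.hitTexCoord_changed TO MyScript.set_texCoord
```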
When axisRotation with a non-zero rotation is used,
trackPoint_changed is generated in local sensor coordinates
(with transformation and axisRotation applied).
axisRotation is still useful; it is not
a shortcut for wrapping the sensor in a Transform with a rotation.
Reason: wrapping the sensor in a Transform
would also change its siblings. So
axisRotation is useful
under our interpretation.
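To illustrate the point above (classic VRML encoding; names and the rotation value are illustrative), axisRotation rotates only the sensor's drag plane, leaving the sibling geometry untransformed:

```
Group {
  children [
    # Drag in a plane rotated 90 degrees around the X axis;
    # the sibling box itself is NOT rotated by this field.
    DEF RotatedDrag PlaneSensor {
      axisRotation 1 0 0 1.5708
    }
    DEF Dragged Transform {
      children Shape { geometry Box { } }
    }
  ]
}
ROUTE RotatedDrag.translation_changed TO Dragged.set_translation
```

Wrapping the PlaneSensor in a rotated Transform instead would rotate the box too, since the sensor must stay a sibling of the geometry it reacts to.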
Regarding axisRotation: the notes above about
PlaneSensor.axisRotation also apply here.