Touchscreen

If you have ever tried to set up a touchscreen device in Linux, you may have noticed that it either works out of the box (aside from some calibration) or is very tedious, especially when it is not supported by the kernel.

Introduction

This article assumes that your touchscreen device is supported by the kernel (e.g. by the usbtouchscreen module). That means there exists a /dev/input/event* node for your device. Check out

$ less /proc/bus/input/devices

to see if your device is listed, or try

# cat /dev/input/event? # replace ? with the event numbers

for each of your event nodes while touching the display. If you find the corresponding node, it is likely that you will be able to get the device working.
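
If you have many event nodes, a small shell loop saves some typing. This is only a rough sketch (it assumes you run it as root and that the coreutils timeout utility and hexdump are available); the node that prints data while you touch the display belongs to your touchscreen:

#!/bin/sh
# Probe every event node for five seconds; touch the display during each probe.
# The node that produces hexdump output is the touchscreen.
for ev in /dev/input/event*; do
    printf 'Testing %s - touch the display now\n' "$ev"
    timeout 5 cat "$ev" | hexdump | head -n 2
done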

Available X11 drivers

There are a lot of touchscreen input drivers for X11 out there. The most common ones are in the extra repository:

  • xf86-input-evdev (likely the default driver if you plug in your touchscreen and it "just works")
  • xf86-input-libinput ; see also libinput
  • xf86-input-elographics

Less common drivers, not contained in the repository, are:

  • xf86-input-magictouch
  • xf86-input-mutouch
  • xf86-input-plpevtch
  • xf86-input-palmax

Proprietary drivers exist for some devices (e.g.: xf86-input-egalax AUR ), but it is recommended to try the open source drivers first.

Depending on your touchscreen device, choose an appropriate driver. Again, evdev is likely to be the default if your touchscreen "just works".

Two-finger scrolling

Two-finger scrolling has to be implemented on the application side. For Firefox, see Firefox/Tweaks#Enable touchscreen gestures.

There is a hack that emulates this scrolling behavior for every application in #Touchegg, but the X server still handles it as text selection (at least with Plasma).

evdev drivers

Calibration

Install xinput_calibrator AUR. Then run xinput_calibrator and follow the instructions.
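
The calibrator prints a configuration snippet when it is done. To keep the calibration across restarts, you can ask it explicitly for an xorg.conf.d style snippet; a minimal sketch:

$ xinput_calibrator --output-type xorg.conf.d

Copy the printed InputClass section into a file such as /etc/X11/xorg.conf.d/99-calibration.conf (that file name is just a common convention).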

Using a touchscreen in a multi-head setup

To use multiple displays (some of which are touchscreens), you need to tell Xorg the mapping between the touch surface and the screen. This can be achieved with xinput as follows.

Take for example a setup with a Wacom tablet and an external monitor; xrandr shows both displays:

Screen 0: minimum 320 x 200, current 2944 x 1080, maximum 8192 x 8192
LVDS1 connected 1024x768+0+0 (normal left inverted right x axis y axis) 0mm x 0mm
   1024x768       60.0*+
   800x600        60.3     56.2
   640x480        59.9
VGA1 connected 1920x1080+1024+0 (normal left inverted right x axis y axis) 477mm x 268mm
   1920x1080      60.0*+
   1600x1200      60.0
   1680x1050      60.0
   1680x945       60.0

You can see that there are two displays here: LVDS1 and VGA1. LVDS1 is the display internal to the tablet, and VGA1 is the external monitor. We wish to map our stylus input to LVDS1, so we have to find the ID of the stylus input; running xinput list shows:

⎡ Virtual core pointer                        id=2    [master pointer  (3)]
⎜   ↳ Virtual core XTEST pointer              id=4    [slave pointer (2)]
⎜   ↳ QUANTA OpticalTouchScreen               id=9    [slave pointer (2)]
⎜   ↳ TPPS/2 IBM TrackPoint                   id=11   [slave pointer (2)]
⎜   ↳ Serial Wacom Tablet WACf004 stylus      id=13   [slave pointer (2)]
⎜   ↳ Serial Wacom Tablet WACf004 eraser      id=14   [slave pointer (2)]
⎣ Virtual core keyboard                       id=3    [master keyboard (2)]
    ↳ Virtual core XTEST keyboard             id=5    [slave keyboard (3)]
    ↳ Power Button                            id=6    [slave keyboard (3)]
    ↳ Video Bus                               id=7    [slave keyboard (3)]
    ↳ Sleep Button                            id=8    [slave keyboard (3)]
    ↳ AT Translated Set 2 keyboard            id=10   [slave keyboard (3)]
    ↳ ThinkPad Extra Buttons                  id=12   [slave keyboard (3)]

We can see that the tablet provides two inputs, the stylus and the eraser. We now simply need to map these inputs to the LVDS1 output like so:

$ xinput --map-to-output 'Serial Wacom Tablet WACf004 stylus' LVDS1
$ xinput --map-to-output 'Serial Wacom Tablet WACf004 eraser' LVDS1

You can automate this by putting these commands in your ~/.xinitrc or similar. The mapping will be lost if the touchscreen is disconnected and re-connected, for example, when switching monitors via a KVM. In that case it is better to use a udev rule. The Calibrating Touchscreen page has an example udev rule for the case when a transformation matrix has been calculated manually and needs to be applied automatically.

Using xrandr-watch-git to automate map-to-output

There are xrandr events that can be captured from a script. Install xrandr-watch-git AUR and create an executable script ~/.xrandr-changed that performs the map-to-output, for example:

#!/bin/sh
xinput --map-to-output "Wacom HID 4861 Finger touch" "eDP1"

and then start, test, and enable the systemd/User service xrandr-watcher.service.
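
For example, assuming the unit is shipped under the name used above, you can enable and start it for your user with:

$ systemctl --user enable --now xrandr-watcher.service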

Wayland/Weston

Wayland does not currently have a known method to restrict touch input to a specific display in any environment other than sway (or other wlroots-based compositors). There are tools such as weston-touch-calibrator, but GNOME on Wayland runs such tools through Xwayland, leaving the calibrator unable to locate any touchscreen.

Xwayland also masks the xinput list and funnels all devices down to generic entries such as "xwayland-pointer", "xwayland-relative-pointer", "xwayland-touch-pointer", and so on. The Wayland counterpart of xinput is libinput, but it does not provide all of the same functionality. The currently known method to use touchscreens in a multi-head setup is to force GNOME or KDE to use X11; libinput currently assumes that the touchscreen(s) cover all available monitors.

Touchegg

Touchegg is a multitouch gesture program, compatible only with X, that runs in the background as your user, recognizes gestures, and translates them into more conventional events such as mouse wheel movements, so that you can, for example, use two fingers to scroll. However, it also interferes with applications or window managers that already do their own gesture recognition.

If you have both a touchpad and a touchscreen, and the touchpad driver (such as synaptics or libinput) has been configured not to recognize gestures itself but to pass the multi-touch events through, then Touchegg will recognize gestures on both: this cannot be configured. In fact it does a better job of recognizing gestures than either the synaptics or libinput touchpad drivers; but on the touchscreen it is generally better for applications to respond to touch in their own ways. Some Qt and GTK applications do that, but they cannot if Touchegg is "eating" the touch events. So Touchegg is most useful when you mainly run legacy applications that do not make their own use of touch events.

Two-finger scrolling has been disabled in the Touchegg 2.0 rewrite. To enable it, install xdotool and see this closed issue.
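
The idea is to have Touchegg run xdotool, which synthesizes mouse wheel events (buttons 4 and 5 are wheel up and wheel down in X). A minimal sketch of the commands a two-finger swipe gesture can be bound to (the repeat count is arbitrary):

$ xdotool click --repeat 3 4    # scroll up three notches
$ xdotool click --repeat 3 5    # scroll down three notches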

Calibrating Touchscreen

To use multiple displays (some of which are touchscreens), you need to tell Xorg the mapping between the touch surface and the screen. This can be done using xinput to set the touchscreen’s coordinate transformation matrix.

This is a guide to doing that the old-fashioned way, for cases when xrandr does not know about your separate screens because they have been merged into one (e.g. when using TwinView). Everyone else, please go to Touchscreen to do it the easy way.

You will need to run the xinput command every time you attach the monitor or log in. Of course, you can add the command to your session autostart. You can also use Udev to automate this.

Using nVidia’s TwinView

Get to know your system

Your screen

Using TwinView, X will see all your screens as one big screen. You can get the total width and height by executing

$ xrandr | grep \* # xrandr marks the resolution currently in use with "*"

You should see a line marked with "*", showing the current resolution of the combined screen. In the example used below, the total width is 3600 and the total height is 1230.
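
Alternatively, assuming xdpyinfo (from the xorg-xdpyinfo package) is installed, the total size of the X screen can be read directly:

$ xdpyinfo | grep dimensions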

Your touch device

Your next job is to get your device's name. Execute

$ xinput list

and find your device in the output. Look for an entry marked [slave pointer (2)]; it usually carries your device's name. For example, the line may look like this:

⎜ ↳ Acer T230H id=24 [slave pointer (2)]

Tip: If your device provides both a stylus and a touchscreen, or several touch devices, pay attention to the name when determining which device to use.

Then run

$ xinput list-props "Device Name"

and make sure there is a property called

Coordinate Transformation Matrix

(If not, you have probably selected the wrong device; try another one.)
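
On an uncalibrated device this property usually holds the identity matrix. A sketch of the relevant output line (the property number in parentheses differs from device to device):

Coordinate Transformation Matrix (143):  1.000000, 0.000000, 0.000000, 0.000000, 1.000000, 0.000000, 0.000000, 0.000000, 1.000000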

Touch area

You need to confine your touch input to a rectangle that is smaller than the total screen: the area actually covered by the touch display. This means you have to know four values:

  • Height of the touch area
  • Width of the touch area
  • Horizontal offset (x offset): the number of pixels between the left edge of your total screen and the left edge of your touch area
  • Vertical offset (y offset): the number of pixels between the top edge of your total screen and the top edge of your touch area

Calculate the Coordinate Transformation Matrix

Now, calculate these values as accurately as possible:

  • c0 = touch_area_width / total_width
  • c2 = touch_area_height / total_height
  • c1 = touch_area_x_offset / total_width
  • c3 = touch_area_y_offset / total_height

Then build the transformation matrix:

[ c0  0  c1 ]
[ 0  c2  c3 ]
[ 0   0   1 ]

which is represented as a row-by-row array:

c0 0 c1 0 c2 c3 0 0 1

Apply the Matrix

Run

$ xinput set-prop "Device Name" --type=float "Coordinate Transformation Matrix" c0 0 c1 0 c2 c3 0 0 1

to calibrate your touchscreen device, for example:

$ xinput set-prop "Acer T230H" --type=float "Coordinate Transformation Matrix" 0.533333333 0 0 0 0.87804878 0.12195122 0 0 1

Now it should work properly.
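
As a worked example, assume the touch area of the Acer T230H above is a 1920x1080 region sitting at a horizontal offset of 0 and a vertical offset of 150 pixels inside the 3600x1230 virtual screen (these particular sizes are only an illustration). The coefficients in the command above then come out as:

c0 = 1920 / 3600 = 0.533333333
c1 =    0 / 3600 = 0
c2 = 1080 / 1230 = 0.87804878
c3 =  150 / 1230 = 0.12195122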

Do it automatically via a udev rule

Create a file something like /etc/udev/rules.d/99-acer-touch.rules with contents like this:

/etc/udev/rules.d/99-acer-touch.rules
ENV{ID_VENDOR_ID}=="2149", ENV{ID_MODEL_ID}=="2703", ENV{WL_OUTPUT}="DVI1", ENV{LIBINPUT_CALIBRATION_MATRIX}="1 0 0 0 1 0"

Substitute your own touchscreen’s vendor ID, model ID, the xrandr output name, and the calibration matrix that you calculated above. This is based on the assumption that you are using the libinput driver for your touchscreen.
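
After creating the rule you can make udev pick it up without rebooting; a quick sketch (run as root):

# udevadm control --reload-rules
# udevadm trigger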

Wayland

Using libinput you can calibrate your touchscreen on Wayland compositors. See the libinput documentation.

If you have weston installed, you can use the weston-calibrator utility to get the transformation matrix. You can then apply it using a udev rule.

Troubleshooting

If, after following these instructions, multiple clicks occur in different places when you touch the screen, you will need to build the xorg-server package using the ABS, applying this patch before you build. (The patch fails to apply to current Xorg sources, but the bug is still present on at least one system.)

Using libinput

The libinput package provides a few utilities to debug input events:

  • The libinput debug-events command prints the events emitted by all devices, including the touchscreen. You can use the --verbose option to get more information; see also the sketch after this list.
  • libinput debug-gui provides a graphical debug environment. This can be useful to verify visually that the transformation matrix has the correct values.
  • libinput list-devices lists all input devices. This can be useful to identify the name and the attributes of an input device. Using this command you can also verify that the transformation matrix was applied correctly.
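
A minimal sketch of watching a single device (the event node path is only an example; substitute your own, and run it as root):

# libinput debug-events --device /dev/input/event5 --verbose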

For more information see the troubleshooting section of the libinput page.
