Proving the G1 screen can handle multi-touch

Nov 22nd Update: I posted a video of a proof-of-concept implementation of multitouch on the G1 in a post today. The contents of this post are still valid – but for some people, seeing the proof in an implementation video is easier than reading through log files.

So there has been a lot of speculation about whether the G1 screen can handle multitouch or not. Most of the people claiming it can have posted links to some crazy videos showing it doing some kind of thing inside of Android – but that approach doesn’t hold much weight with me. If the driver doesn’t support multitouch, then obviously nothing in the OS can make use of it…

I’d long since dismissed the possibility of the G1 having multitouch until ionstorm in #android posted some links confirming that the G1 uses a Synaptics touchscreen and that Synaptics likes to brag about the multi-touch capability of their screens.

Then unix_infidel pointed out that there was some code in the Synaptics driver that was commented out… which was true. (The file is located at drivers/input/touchscreen/synaptics_i2c_rmi.c in the msm kernel source – you can see the git info for the msm kernel online.)

By uncommenting a bunch of lines in the Synaptics touchscreen driver, recompiling my kernel, and replacing my boot.img, I was able to enable debug logging of the touch input – logging that tracks 2 fingers.

The following code is what prints out the debug info shown in the log snippets linked below:

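/*
 * Debug printk from the commented-out two-finger path in
 * drivers/input/touchscreen/synaptics_i2c_rmi.c: x/y are the reported
 * positions, z appears to be pressure, w is the reported width, F flags
 * whether that finger is down, and the "2nd:" group is the second finger
 * (dx/dy are delta values).
 */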
printk("x %4d, y %4d, z %3d, w %2d, F %d, 2nd: x %4d, y %4d, z %3d, w %2d, F %d, dx %4d, dy %4d\n",
x, y, z, w, finger,
x2, y2, z2, w2, finger2,
dx, dy);

Here’s some of the debugging output:
  • One finger swirling around
  • One finger held constant, the other finger swirled around
  • 2 fingers flicking left, then right
  • 2 fingers rotating counter-clockwise (kind of… that motion is rather hard to do)
  • 2 fingers (separate hands) – one finger moving up, while the other moves down
  • 2 fingers – one held constant and varying pressure – another finger tapping at various points on the screen and also varying pressure
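If you want to sanity-check those captures yourself rather than eyeball them, here is a minimal parsing sketch (my own tooling, not part of the kernel change – the file handling and the strstr anchor are assumptions) that pulls the values back out of a saved dmesg/logcat capture using the same field layout as the printk above, and prints the separation between the two fingers whenever both finger flags are set:

/* log_parse.c – rough sketch for checking the two-finger debug lines above.
 * Build with: gcc -o log_parse log_parse.c -lm   (the -lm is for hypot) */
#include <math.h>
#include <stdio.h>
#include <string.h>

int main(int argc, char **argv)
{
    FILE *f;
    char line[512];

    if (argc < 2) {
        fprintf(stderr, "usage: %s <saved-dmesg-or-logcat-file>\n", argv[0]);
        return 1;
    }
    f = fopen(argv[1], "r");
    if (!f) {
        perror("fopen");
        return 1;
    }

    while (fgets(line, sizeof(line), f)) {
        int x, y, z, w, finger, x2, y2, z2, w2, finger2, dx, dy;
        /* Skip any timestamp/priority prefix and anchor on the first "x " field. */
        char *p = strstr(line, "x ");

        if (!p)
            continue;
        if (sscanf(p, "x %d, y %d, z %d, w %d, F %d, "
                      "2nd: x %d, y %d, z %d, w %d, F %d, dx %d, dy %d",
                   &x, &y, &z, &w, &finger,
                   &x2, &y2, &z2, &w2, &finger2, &dx, &dy) != 12)
            continue;
        /* Only interesting when both finger-down flags are set. */
        if (finger && finger2)
            printf("finger1=(%d,%d) finger2=(%d,%d) separation=%.1f\n",
                   x, y, x2, y2, hypot((double)(x - x2), (double)(y - y2)));
    }
    fclose(f);
    return 0;
}

A steadily shrinking or growing separation across consecutive lines is exactly the kind of signal a pinch-zoom gesture recognizer would key off of.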

So… Here is the conclusion:
The G1 screen can definitely track 2 fingers. Why Android does not make use of this is left as an exercise for the reader. (Patents are the most obvious answer, but I’m sure more interesting conspiracy theories can be concocted.)

UPDATE: In #android, morrildl – who works for Google but probably doesn’t officially speak in any capacity for Google – brought up a few interesting points with regard to this:

  • HTC has specified that the G1 will have a single-touch screen. (This is significant: because the spec only calls for single touch, HTC could in the future source touchscreens that are not multi-touch capable – so just because a certain run of G1s might have a multitouch-capable screen, they have the liberty to swap out parts [and there may already be G1s out in the field that don’t have a multi-touch capable screen].)
  • The other issue is with how the driver reports the width of the touch. It appears that the “w” element is the same for both fingers (although this might just be a quirk of the driver code that was commented out – it does seem to be based on pressure: putting fingers on opposite corners and pressing lightly still shows a 1 for “w”, while placing 2 fingers close together and pressing hard shows a 15 for “w” – so I’m not entirely convinced of this). A rough sketch of how these values get handed to the input layer follows this list.
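For context on that second point, here is a rough sketch (my own illustration, not code from the shipping driver) of how a touchscreen driver of this era hands a sample to the Linux input layer – the “w” above is presumably what ends up in ABS_TOOL_WIDTH. The second-finger axes (ABS_HAT0X/ABS_HAT0Y) and the helper’s name are purely hypothetical, since the stock driver only reports the first finger:

#include <linux/input.h>

/* Hypothetical helper showing the shape of the reporting path – not the
 * actual synaptics_i2c_rmi.c code. */
static void report_sample(struct input_dev *dev,
                          int x, int y, int z, int w, int finger,
                          int x2, int y2, int finger2)
{
    input_report_abs(dev, ABS_X, x);
    input_report_abs(dev, ABS_Y, y);
    input_report_abs(dev, ABS_PRESSURE, z);
    input_report_abs(dev, ABS_TOOL_WIDTH, w);   /* the "w" discussed above */
    input_report_key(dev, BTN_TOUCH, finger);

    /* Purely illustrative: one way to expose the second finger would be to
     * park it on otherwise-unused axes so userspace could read it without
     * inventing a new protocol. The stock driver does not do this. */
    input_report_abs(dev, ABS_HAT0X, x2);
    input_report_abs(dev, ABS_HAT0Y, y2);
    input_report_key(dev, BTN_2, finger2);

    input_sync(dev);
}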

It sounds like the road to multitouch on the G1 could now be a little more complicated. I’m not sure if HTC could ever revise the specs of the G1 without changing the model name… so even if the hardware is identical they might have to release a G1m or something? (Who knows.)