How to detect tapping (touch input) globally instead of mouse clicking?


RegisterTouchWindow is no longer necessary for Windows 8 apps.

As far as I know, there are several ways to achieve this, each with its own limitations.

  1. This article here works on both Windows 7 and 8, but it requires the vendor ID and product ID of the touchscreen, so there is a possibility that your application won't work properly with some touch devices.

  2. Use RegisterPointerInputTarget on Windows 8. From my debugging, the Windows 8 touch mechanism has a unique characteristic: between a touch down and a touch up, all touch events are sent to the window that received the first touch event, even if that window is later minimized, covered by another window, or given the WS_EX_TRANSPARENT style. The touch events between one press and release can only be handed to another window if the first one is destroyed. Using this API, all touch events are sent to the registered window; other windows cannot receive touch events until the registered window calls UnregisterPointerInputTarget, or the touch input is injected back into the system with InjectTouchInput. "Input injected by the registered input target will not be intercepted." Please note that UIAccess privilege is required to use this API. A sample can be found in here.

  3. Windows hooks. For desktop apps on Windows 7/8, touch events can easily be hooked by using SetWindowsHookEx with WH_CALLWNDPROC or WH_GETMESSAGE. For Metro apps on Windows 8, only the first pointer event can be detected in the window's message loop. Although a pointer event can be generated by either a click or a tap, GetPointerType will tell you whether it is a touch pointer or a mouse pointer. Samples for using hooks can be found at here.
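For option 3, installing the hook itself might look like the following sketch. Note that a system-wide WH_GETMESSAGE hook procedure must live in a DLL so Windows can map it into each target process; the function and variable names here (g_hHook, InstallHook, GetMsgProc) are illustrative, not part of any library.

```cpp
// Sketch: a global WH_GETMESSAGE hook that observes pointer/touch
// messages delivered to other windows. The hook procedure runs in the
// context of the process receiving the message, so it should do minimal
// work (e.g. post a notification back to your own window).
#include <windows.h>

// Guard for older SDK headers that predate the pointer messages.
#ifndef WM_POINTERDOWN
#define WM_POINTERDOWN 0x0246
#endif

static HHOOK g_hHook = nullptr;

LRESULT CALLBACK GetMsgProc(int code, WPARAM wParam, LPARAM lParam)
{
    if (code == HC_ACTION)
    {
        MSG* msg = reinterpret_cast<MSG*>(lParam);
        if (msg->message == WM_TOUCH || msg->message == WM_POINTERDOWN)
        {
            // A touch reached some window; signal your application here,
            // e.g. PostMessage to a window you own.
        }
    }
    return CallNextHookEx(g_hHook, code, wParam, lParam);
}

bool InstallHook(HINSTANCE hHookDll)  // HINSTANCE of the hook DLL
{
    // dwThreadId = 0 installs the hook for all threads on this desktop.
    g_hHook = SetWindowsHookExW(WH_GETMESSAGE, GetMsgProc, hHookDll, 0);
    return g_hHook != nullptr;
}

void RemoveHook()
{
    if (g_hHook) { UnhookWindowsHookEx(g_hHook); g_hHook = nullptr; }
}
```

As phyatt notes in the comments below, the hook procedure runs in the hooked process, so debug output should go to a file rather than your own console.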

A code snippet for handling pointer events:

switch (Msg)
{
...
case WM_POINTERENTER:
case WM_NCPOINTERDOWN:
case WM_NCPOINTERUP:
case WM_NCPOINTERUPDATE:
case WM_POINTERACTIVATE:
case WM_POINTERCAPTURECHANGED:
case WM_POINTERDOWN:
case WM_POINTERLEAVE:
case WM_POINTERUP:
case WM_POINTERUPDATE:
    {
        UINT32 pointerId = GET_POINTERID_WPARAM(wParam);
        POINTER_INPUT_TYPE pointerType;

        if (GetPointerType(pointerId, &pointerType))
        {
            if (pointerType == PT_TOUCH)
            {
                ...
            }
        }
    }
    break;
...
}
Author: blez

Updated on June 05, 2022

Comments

  • blez
    blez almost 2 years

I want to make an app that shows itself when the user touches the screen; it shouldn't react to mouse clicks. I looked into the touch handlers in Windows 7/8, but I saw that every touch window must be registered with RegisterTouchWindow.

    TL;DR

    Is there a way to get the touch points position outside my window (globally)?

  • blez
    blez over 11 years
    Thanks for helping. Do you have any examples of the above?
  • Luke
    Luke over 11 years
    I added a link and a code snippet for using Windows hook. I am not sure if these are the examples you need.
  • phyatt
    phyatt over 9 years
FYI for the Windows hook solution: on Windows 7, both the WH_CALLWNDPROC and WH_GETMESSAGE hooks run in the other process, so standard error and standard output don't show up as expected; instead, create a file and log to it. And neither hook seems to get WM_INPUT or WM_TOUCH on Windows 7. Right now I have raw input working as a simple binary signal, "touch is in use" or not, which works properly whether or not the hardware mouse is active. Thanks for the post.
  • tofutim
    tofutim about 8 years
@blez which one did you end up going with?
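The raw-input approach phyatt mentions can be sketched roughly as follows. This registers for HID touch-digitizer input (usage page 0x0D, usage 0x04, the HID "Digitizer / Touch Screen" pair) with RIDEV_INPUTSINK so WM_INPUT arrives even while the window is in the background; the function name is illustrative.

```cpp
// Sketch: register for raw touch-digitizer input globally, as a way to
// detect *that* a touch occurred (not necessarily its exact position).
#include <windows.h>

bool RegisterForTouchRawInput(HWND hwnd)
{
    RAWINPUTDEVICE rid = {};
    rid.usUsagePage = 0x0D;            // HID usage page: Digitizer
    rid.usUsage     = 0x04;            // HID usage: Touch Screen
    rid.dwFlags     = RIDEV_INPUTSINK; // deliver input even when unfocused
    rid.hwndTarget  = hwnd;            // window that will receive WM_INPUT
    return RegisterRawInputDevices(&rid, 1, sizeof(rid)) != FALSE;
}
```

The registered window then receives WM_INPUT messages whenever the digitizer is active, which matches phyatt's yes/no "touch is in use" behavior on Windows 7.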