
Gesture challenge

Started by Biterider, December 29, 2021, 09:47:38 PM



I want to implement a zoom gesture that acts on a window, but with independent zoom factors in the x and y directions.  :icon_idea:

The idea is to analyze the two-finger pinch gesture, take the line connecting the two contact points, and decompose it into delta x and delta y to calculate the zoom factor in each direction.

Has anyone done something like that?



The only way I have ever zoomed a window is with MoveWindow(). Primitive, but fast and smooth if the steps are not too large.    :biggrin:  :skrewy:


I don't have a touch screen, but these messages could be helpful:

You are certainly aware of RegisterTouchWindow & friends. Sounds like a nice project :thumbsup:


If you can track the two finger positions, the rest should be easy. You would probably need to write both a left-hand and a right-hand resizer, but they are not all that hard to do.    :biggrin:  :skrewy:


You could consider trying the Windows StretchBlt function (BitBlt cannot scale). I've never tried using it to stretch in only one direction, but it could work.
Whenever you assume something, you risk being wrong half the time.


Thanks for your replies.  :biggrin:

AFAIK, there are two ways to get touch information: WM_TOUCH/WM_POINTER or WM_GESTURE, and they are mutually exclusive.
WM_TOUCH is the raw approach, while WM_GESTURE delivers a more elaborate result that can be used right away if you want the standard gestures.

In this case, WM_GESTURE reports a zoom gesture (GID_ZOOM), but it does not distinguish the direction: the gesture carries only the distance between the two points, not their individual positions.
On the other hand, if we want to use WM_TOUCH, we have to write a full gesture-recognition algorithm ourselves, which is a huge undertaking.

Zooming the window contents is another topic that is not relevant right now.