The MASM Forum
General => The Workshop => Topic started by: Biterider on December 29, 2021, 09:47:38 PM
-
Hi
I want to implement a zoom gesture that acts on a window, but with separate zoom factors in the x and y directions. :icon_idea:
The idea is to analyze the 2-finger pinch gesture, take the line connecting the 2 contact points, break it down into its delta x and delta y components, and calculate a zoom factor for each direction.
Has anyone done something like that?
Biterider
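A minimal C sketch of the per-axis calculation described above (the ZPOINT type, the CalcAxisZoom name and the MIN_DELTA threshold are illustrative assumptions, not code from the thread):

#include <math.h>
#include <windows.h>

typedef struct { double x; double y; } ZPOINT;   /* one touch contact */

/* Per-axis zoom factors from the previous and current positions of two contacts.
   Returns FALSE when a previous delta is too small to divide by safely. */
BOOL CalcAxisZoom(ZPOINT prev1, ZPOINT prev2,    /* contacts at the last update */
                  ZPOINT cur1,  ZPOINT cur2,     /* contacts now */
                  double *zoomX, double *zoomY)
{
    double dxPrev = fabs(prev2.x - prev1.x);
    double dyPrev = fabs(prev2.y - prev1.y);
    const double MIN_DELTA = 4.0;                /* ignore near-degenerate axes */

    if (dxPrev < MIN_DELTA || dyPrev < MIN_DELTA)
        return FALSE;

    *zoomX = fabs(cur2.x - cur1.x) / dxPrev;     /* horizontal zoom factor */
    *zoomY = fabs(cur2.y - cur1.y) / dyPrev;     /* vertical zoom factor */
    return TRUE;
}

Multiplying the window's current width and height by zoomX and zoomY each time new contact positions arrive would give the incremental resize.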
-
The only way I have ever zoomed a window is with MoveWindow(). Primitive, but fast and smooth if the steps are not too large.
-
I don't have a touch screen, but these messages could be helpful:
WM_GESTURE
WM_TOUCH
WM_POINTER
You are certainly aware of RegisterTouchWindow (https://docs.microsoft.com/en-us/windows/win32/api/winuser/nf-winuser-registertouchwindow) & friends (https://docs.microsoft.com/en-us/windows/win32/wintouch/guide-multi-touch-input). Sounds like a nice project :thumbsup:
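A minimal C sketch of what the registration and raw-point readout could look like; RegisterTouchWindow, GetTouchInputInfo and CloseTouchInputHandle are the documented Win32 calls, while EnableTouch, ReadTouchPoints and MAX_CONTACTS are just illustrative names:

#include <windows.h>   /* touch APIs require _WIN32_WINNT >= 0x0601 (Windows 7) */

#define MAX_CONTACTS 10

/* Opt the window into raw WM_TOUCH messages; call once after creation. */
void EnableTouch(HWND hwnd)
{
    RegisterTouchWindow(hwnd, 0);
}

/* WM_TOUCH helper: copies the current contact points into pts (client
   coordinates) and returns how many were read. */
UINT ReadTouchPoints(HWND hwnd, WPARAM wParam, LPARAM lParam, POINT pts[MAX_CONTACTS])
{
    TOUCHINPUT inputs[MAX_CONTACTS];
    UINT count = LOWORD(wParam);                 /* number of contacts in this message */
    if (count > MAX_CONTACTS) count = MAX_CONTACTS;

    if (!GetTouchInputInfo((HTOUCHINPUT)lParam, count, inputs, sizeof(TOUCHINPUT)))
        return 0;

    for (UINT i = 0; i < count; i++)
    {
        /* TOUCHINPUT coordinates are hundredths of a pixel, screen-relative */
        pts[i].x = inputs[i].x / 100;
        pts[i].y = inputs[i].y / 100;
        ScreenToClient(hwnd, &pts[i]);
    }
    CloseTouchInputHandle((HTOUCHINPUT)lParam);
    return count;
}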
-
If you can track the two finger positions, the rest should be easy. You would probably need to write both a left-hand and a right-hand resizer, but they are not all that hard to do.
-
You could consider trying the Windows StretchBlt function. I've never tried using it to stretch in only one direction, but it could work.
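For what it's worth, a rough C sketch of stretching by different factors in x and y with StretchBlt; the PaintZoomed helper and its parameters are assumptions, only the GDI calls themselves are documented API:

#include <windows.h>

/* Stretch a source DC (e.g. a memory DC holding the window content) into the
   destination DC with independent x and y factors. */
void PaintZoomed(HDC hdcDst, HDC hdcSrc, int srcW, int srcH,
                 double zoomX, double zoomY)
{
    int dstW = (int)(srcW * zoomX + 0.5);
    int dstH = (int)(srcH * zoomY + 0.5);

    SetStretchBltMode(hdcDst, HALFTONE);         /* better quality than the default */
    SetBrushOrgEx(hdcDst, 0, 0, NULL);           /* required after setting HALFTONE */
    StretchBlt(hdcDst, 0, 0, dstW, dstH,
               hdcSrc, 0, 0, srcW, srcH, SRCCOPY);
}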
-
Hi
Thanks for your replies. :biggrin:
AFAIK, there are 2 ways to get touch information: WM_TOUCH/WM_POINTER or WM_GESTURE, and they are mutually exclusive.
WM_TOUCH is the raw approach, while WM_GESTURE delivers a more elaborate result that can be used right away if the standard gestures are enough.
In this case, WM_GESTURE reports a zoom gesture, but it does not distinguish the direction.
On the other hand, if we want to use WM_TOUCH, we have to write a full gesture recognition algorithm, which is a huge undertaking.
Zooming the window contents is another topic that is not relevant right now.
Biterider
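To illustrate the limitation mentioned above, a minimal C sketch of a WM_GESTURE handler: for GID_ZOOM the GESTUREINFO structure only carries the distance between the two fingers (ullArguments) and their midpoint (ptsLocation), so there is nothing to split into x and y. The OnGesture name is illustrative; the API calls are the documented ones.

#include <windows.h>

/* Call from the window procedure for WM_GESTURE. */
LRESULT OnGesture(HWND hwnd, WPARAM wParam, LPARAM lParam)
{
    GESTUREINFO gi = { 0 };
    gi.cbSize = sizeof(GESTUREINFO);

    if (GetGestureInfo((HGESTUREINFO)lParam, &gi) && gi.dwID == GID_ZOOM)
    {
        ULONGLONG distance = gi.ullArguments;    /* finger distance: a single scalar */
        POINT center = { gi.ptsLocation.x, gi.ptsLocation.y };
        /* Compare 'distance' with the value from the previous GID_ZOOM message to
           get one overall zoom factor around 'center'; no per-axis data exists here. */
        (void)distance; (void)center;
        CloseGestureInfoHandle((HGESTUREINFO)lParam);
        return 0;
    }
    return DefWindowProc(hwnd, WM_GESTURE, wParam, lParam);
}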