Last release: Feb 2017
Maintained by Jeff Verkoeyen.
Lines of code: 290
This library consists of the following plans:
The Draggable, Pinchable, and Rotatable plans allow a user to move, scale, and rotate a view, respectively. Each listens for deltas emitted by its gesture recognizer and adds them to the target.
If a view can be dragged, then it can often be pinched and rotated too. To make this easy, we provide the DirectlyManipulable plan. It is equivalent to individually adding Draggable, Pinchable, and Rotatable to the same target.
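The equivalence described above can be sketched as follows. Here `runtime` and `view` are stand-ins, and the Pinchable and Rotatable initializers are assumed to mirror the Draggable examples shown later in this document:

```swift
// Adding the all-in-one plan...
runtime.addPlan(DirectlyManipulable(), to: view)

// ...is described as equivalent to adding each trait individually:
runtime.addPlan(Draggable(), to: view)
runtime.addPlan(Pinchable(), to: view)
runtime.addPlan(Rotatable(), to: view)
```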
The Draggable, Pinchable, Rotatable, and DirectlyManipulable plans represent traits that describe the behavior of a target view. When any of these traits is added to a view, the view's
isUserInteractionEnabled is enabled. If the plan's associated gesture recognizer is not yet attached to a view, it will be added to the target view.
The ChangeAnchorPoint plan changes a view's
view.layer.anchorPoint while maintaining the same
view.frame. This plan is emitted by
DirectlyManipulable when a gesture recognizer begins.
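The document doesn't show how the frame is preserved, but the standard UIKit technique for moving `layer.anchorPoint` without visually shifting the view looks like this (a generic sketch assuming an identity transform, not this library's actual implementation):

```swift
import UIKit

// layer.position is defined relative to the anchor point, so changing the
// anchor alone would shift the view on screen. Recomputing position from
// the view's bounds keeps the frame where it was.
func setAnchorPoint(_ anchor: CGPoint, of view: UIView) {
    let old = view.layer.anchorPoint
    view.layer.position = CGPoint(
        x: view.layer.position.x + (anchor.x - old.x) * view.bounds.width,
        y: view.layer.position.y + (anchor.y - old.y) * view.bounds.height
    )
    view.layer.anchorPoint = anchor
}
```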
Import the framework:
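The import statement itself appears to have been lost from this page. Assuming the module name matches the workspace named below, it would be:

```swift
import MaterialMotionDirectManipulation
```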
You will now have access to all of the APIs.
Check out a local copy of the repo to access the Catalog application by running the following commands:
```
git clone https://github.com/material-motion/direct-manipulation-swift.git
cd direct-manipulation-swift
pod install
open MaterialMotionDirectManipulation.xcworkspace
```
Adding a DirectlyManipulable plan, in Objective-C:

```objc
[runtime addPlan:[MDMDirectlyManipulable new] to:<#Object#>];
```

and in Swift:

```swift
runtime.addPlan(DirectlyManipulable(), to: <#Object#>)
```
Adding a Draggable plan with its default gesture recognizer, in Objective-C:

```objc
[runtime addPlan:[MDMDraggable new] to:<#Object#>];
```

and in Swift:

```swift
runtime.addPlan(Draggable(), to: <#Object#>)
```
Adding a Draggable plan with a custom pan gesture recognizer, in Objective-C:

```objc
MDMDraggable *draggable = [[MDMDraggable alloc] initWithGestureRecognizer:panGestureRecognizer];
[runtime addPlan:draggable to:<#Object#>];
```

and in Swift:

```swift
runtime.addPlan(Draggable(withGestureRecognizer: panGestureRecognizer), to: <#Object#>)
```
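Putting the pieces together, here is a minimal sketch of wiring a view up for dragging. The `MotionRuntime` class name, the view setup, and the gesture-recognizer wiring are assumptions about the surrounding Material Motion runtime, not taken from this document:

```swift
import UIKit
import MaterialMotionRuntime
import MaterialMotionDirectManipulation

// Hypothetical setup, e.g. inside a view controller's viewDidLoad.
let runtime = MotionRuntime()
let square = UIView(frame: CGRect(x: 0, y: 0, width: 100, height: 100))

// Provide our own recognizer so other code can observe the same gesture.
let pan = UIPanGestureRecognizer()
square.addGestureRecognizer(pan)

runtime.addPlan(Draggable(withGestureRecognizer: pan), to: square)
```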
We welcome contributions!
Check out our upcoming milestones.
Licensed under the Apache 2.0 license. See LICENSE for details.