Prototyped a little app that allows you to take frames from
@figma
and 'pull them into space'.
The frames stay linked to the canvas and update on any changes
Made a little hourglass prototype in SwiftUI this weekend ⏳
I started with the idea of using a physical flip interaction to start/reset a timer and iterated from that
Tried to recreate the subtle glass material of the Vision Pro interface in SwiftUI.
It adapts the reflections to the room's light source.
It’s difficult to capture on video but looks really pleasing in person. It’s very subtle but adds some nice texture and depth to the surface.
Published a new
@figma
plugin "Beautiful Shadows" that explores a different way of creating shadows.
Instead of adjusting offsets and blur values, you simply drag a physical light source around (parameterized by azimuth, distance, etc.)
Check it out ☀️
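The light→shadow mapping can be sketched roughly like this (illustrative TypeScript, not the plugin's actual code – the formulas and constants are my guesses):

```typescript
// Hypothetical mapping from a draggable light (azimuth, elevation,
// distance) to classic shadow parameters. The shadow falls opposite
// the light; a lower or farther light gives a longer, softer shadow.
function shadowFromLight(azimuthDeg: number, elevationDeg: number, distance: number) {
  const az = (azimuthDeg * Math.PI) / 180;
  const el = (elevationDeg * Math.PI) / 180;
  const offsetX = -Math.cos(az) * distance;
  const offsetY = -Math.sin(az) * distance;
  const blur = 2 * distance * (1 - Math.sin(el)); // light directly overhead → crisp shadow
  return { offsetX, offsetY, blur };
}
```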
Just published Copy & Rotate, a
@figmadesign
plugin which lets you create rotated copies of elements – with a preview right in the canvas 🙌 Copies are componentized, which allows you to make changes to your design anytime.
Prototyped a subtle Zoom Blur and Motion Blur SwiftUI shader this weekend.
It's a little detail that changes the feel of the dragged element entirely (and is fun to play around with)
One little detail is the grid animation while detecting a frame. I wanted it to feel like the frame is being assembled/constructed.
It’s pretty simple and almost entirely driven by a spring delay value (SwiftUI).
(Commented gist: )
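The stagger boils down to one delay value per cell (sketched here in TypeScript; the prototype is SwiftUI and the numbers are made up):

```typescript
// Each grid cell starts its spring after a delay proportional to its
// distance from the top-left corner, so the grid appears to assemble
// diagonally. delayPerStep is the single value that drives the feel.
function cellDelay(row: number, col: number, delayPerStep = 0.03): number {
  return Math.hypot(row, col) * delayPerStep;
}
```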
Played around with dynamic backgrounds too. They’re not so different from time-based day/night backgrounds, but the gradual changes make it pretty interesting.
I’ve made a little AI copilot for Origami Studio.
Think of it as a tiny code editor that sits on top of Origami's canvas and lets you generate patches using GPT-4 – all within the same surface.
It’s a native macOS app and you can get it here:
It's just a little idea I had while toying around with the visionOS SDK.
How it works: it's a Figma plugin that talks to an iOS app via WebSockets. All Figma frames are loaded as AR reference images which makes them detectable by ARKit (works surprisingly well) 1/2
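The plugin↔app protocol isn't shown in the thread; here's a minimal sketch of what the message framing could look like (field names are assumptions):

```typescript
// Hypothetical metadata message sent from the Figma plugin to the iOS
// app ahead of each frame's image bytes; the app would use it to size
// the AR reference image and track future updates by node id.
type FrameMeta = { id: string; name: string; width: number; height: number };

function encodeFrameMessage(meta: FrameMeta): string {
  return JSON.stringify({ type: "frame", ...meta });
}

function decodeFrameMessage(raw: string): FrameMeta | null {
  const msg = JSON.parse(raw);
  if (msg.type !== "frame") return null;
  return { id: msg.id, name: msg.name, width: msg.width, height: msg.height };
}
```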
When designing the app icon I learned about ‘crease patterns’ – structural representations of intricate origami folds.
The app icon shows the crease pattern for an Origami Crane – a nod to Origami Studio’s original app icon.
Excited to publish a Figma tool *for* Figma developers.
Together with
@diagram
we're publishing Debugger, a Figma plugin for inspecting and debugging Figma layers. Support for all node types, watching and editing values, filtering and some more ✨
It’s an amalgam of SpriteKit particle layers, gradients and blend modes. Together they create a nice materiality.
I liked the idea of using a turbulence burst to change between the material's colors
Here are some designs/particles that didn't make the cut. Either because I couldn't get them to flow nicely or they didn't look that good on the device.
...once detected, a high-res version is loaded, mapped onto a plane and stored as a reference for future updates. It's a lot of hot glue and cobbled-together code, but it feels pretty magic to just pull frames from the screen like that :-) 2/2
@ollybromham
At the time, I just had a hunch about how it should move and feel. I pulled some of the variables into sliders and built a little 'design tool' that allowed me to fidget around with the values and feel it out (this is what makes SwiftUI such a neat prototyping tool!) :-)
The blur strength responds to the drag/scale velocity. I'm using
@jmtrivedi
's fantastic Wave package which makes it easy to work with the velocity + animate the shader values.
The code is up here:
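The velocity→blur mapping could be as simple as this (TypeScript sketch; the actual prototype smooths the value with the Wave package, which isn't reproduced here, and the constants are invented):

```typescript
// Map drag/scale velocity to a blur radius: proportional, but capped
// so a fast flick doesn't blow out the shader's sample radius.
function blurRadius(velocity: number, scale = 0.01, maxRadius = 20): number {
  return Math.min(Math.abs(velocity) * scale, maxRadius);
}
```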
The demos are not actually reading the ambient sensor though (couldn’t get SensorKit permissions to work). Instead the luminosity is calculated using the device’s camera exposure. I grabbed the snippet from SO user AnuradhaH ().
The prototype's implementation is pretty naive: I'm sampling the camera view, and the brightest pixels act as the light source. I wish there was an easy way to access the iPhone's native AmbientLightSensor via SensorKit and toy around with that :-)
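The naive sampling can be sketched like this (TypeScript for illustration; the prototype samples the live camera view instead of an array):

```typescript
// Scan a grayscale luminance buffer and return the coordinates of the
// brightest pixel – that position acts as the light source.
function brightestPixel(luma: number[], width: number): { x: number; y: number } {
  let bestIdx = 0;
  for (let i = 1; i < luma.length; i++) {
    if (luma[i] > luma[bestIdx]) bestIdx = i;
  }
  return { x: bestIdx % width, y: Math.floor(bestIdx / width) };
}
```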
@ollybromham
Realized that the thread didn't post, so here's an addendum:
The smoke is created with SpriteKit (Emitter and a Turbulence Field). I found that quickly oscillating the strength creates a dispersion effect, which goes nicely together with the device's accelerometer (the…
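The oscillation trick, stripped down to a pure function of time (TypeScript sketch; the real thing drives a SpriteKit turbulence field's strength, and the frequency/amplitude here are guesses):

```typescript
// Base field strength modulated by a fast sine – the rapid strength
// changes are what make the smoke disperse instead of drift uniformly.
function turbulenceStrength(t: number, base = 1, amplitude = 0.5, hz = 8): number {
  return base + amplitude * Math.sin(2 * Math.PI * hz * t);
}
```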
The animation progressively stretches and squeezes the layer at the same time. One trick was to delay each value a little bit to give each value room to breathe and do its thing...
The code is up here:
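The per-value delay can be modeled as a time-shifted progress curve (TypeScript sketch of the idea; the delay amount is a guess):

```typescript
// Remap 0→1 progress so a value only starts moving after `delay` but
// still finishes at 1 – staggering stretch and squeeze this way gives
// each value room to breathe.
function delayedProgress(progress: number, delay: number): number {
  const p = (progress - delay) / (1 - delay);
  return Math.min(Math.max(p, 0), 1);
}
```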
The effect is a blur of linear gradients, shapes and SpriteKit particles that are mapped to the device's motion sensor :-)
(I really like how it ended up resembling one of the iPhone's signature wallpapers.)
Having a lot of fun with these little SwiftUI explos recently. I’m learning a lot by just taking apart and remixing
@philipcdavis
’ and
@jsngr
’s prototypes — totally worth checking out Philip’s prototyping kit and Jordan’s repos
The plugin spawned from a small exploration of "intuition-based" inputs, where I added an interactive light source directly to the Figma canvas.
Ex. of that intuition: creating more diffuse shadows → simply scale the light source up
@ollybromham
The stick itself is a spline that gets more flexion the longer it gets (like a flimsy piece of wood). It's a v subtle detail that gives the interface more materiality and makes it feel less like a slider – even if you're not consciously aware of it
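The length→flexion rule might look something like this (TypeScript sketch; the quadratic relationship and stiffness constant are my assumptions, not the prototype's actual code):

```typescript
// A flimsy stick bends disproportionately more as it gets longer, so
// let flexion grow quadratically with length rather than linearly.
function flexion(length: number, stiffness = 100): number {
  return (length * length) / stiffness;
}
```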
@ollybromham
I liked the idea of having a flick 'matchstick' gesture to start the timer. It punctuates the moment and contrasts the otherwise calm and ephemeral feel.
Relying on drag velocity alone to detect a flick wasn't enough – I found that flick gestures fired fewer touch events, so I…
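A guess at how the combined heuristic could look (TypeScript sketch; the tweet cuts off before the actual fix, and the thresholds are invented):

```typescript
// A flick is fast *and* short: require a high velocity together with
// few touch samples, since slow deliberate drags produce many more
// touch events over their lifetime.
function isFlick(velocity: number, touchEventCount: number, minVelocity = 800, maxEvents = 6): boolean {
  return Math.abs(velocity) >= minVelocity && touchEventCount <= maxEvents;
}
```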
Last but not least, I'm yet again delighted by
@yuanqinglim
's create-figma-plugin toolkit (). For every headache there is a utility function and the new docs are pretty sweet. Allowed me to go from idea → publish in just a few afternoons.
This was also the first time I used
@yuanqinglim
's create-figma-plugin toolkit and wow – what a comprehensive and thought-out library. Allowed me to have the first prototype up and running in just an hour. A staple for building Figma plugins from now on😌
The idea was then cultivated by
@JoshWComeau
's amazing read and resource about 'Designing Beautiful Shadows in CSS' (). The credits for the smooth shadows, tint and some other fun interactive bits basically go to him 🙌
Make sure to check out
@jmtrivedi
's post as well
As always, thank you so much Janum for open-sourcing these prototypes; I'm learning so much from them 🥂
Curious how the Dynamic Island app morph animation works?
I built a similar mesh transform animation in SwiftUI, and open-sourced it!
It’s simple, interruptible, runs at 120hz, and doesn’t use any private APIs:
PS. little accidental find:
You get a directional Motion Blur for free if you blur the layer *before* applying the distortion shader. The shader compresses the blur horizontally but stretches/emphasizes the blur vertically
@NickADobos
Kinda! I'm using a timer to drive the animations and the sand levels show the remaining and elapsed time. For the sake of the demo I sped up the timer and displayed arbitrary minutes :-)
It's just a prototype, but if anyone cares to jump into the code:
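The timer→sand mapping described above is straightforward (TypeScript sketch of the idea; the actual prototype is SwiftUI):

```typescript
// Split normalized timer progress across the two bulbs: the top
// empties as the bottom fills, and both clamp once the timer is done.
function sandLevels(elapsed: number, total: number): { top: number; bottom: number } {
  const p = Math.min(Math.max(elapsed / total, 0), 1);
  return { top: 1 - p, bottom: p };
}
```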
@sdw
always been a fan of
@nandocosta_art
's material explorations for Microsoft Office (e.g. )
Would love to see some of these textures/materials make their way to visionOS' spatial surfaces
@tarngerine
@parkerhendo
@diagram
yea we had to abandon it for reasons, but I still believe in the idea behind it. One pattern I love is that the code is stored on the Artboard itself, so to iterate on a prototype you'd just option-drag/duplicate the frame – it's like visual versioning, felt pretty great & natural
In my day-to-day prototyping I use it mostly to create little utility patches and ‘jigs’ for all sorts of things (transforms, validate strings, randomize this and that etc.)
Just describing the desired inputs and outputs of a patch helps a lot – it makes you break down the logic…
@jmtrivedi
Super fun to think about how that'd play out for different apps!
(Was just thinking about Calendar with an 'Add event' action and/or 'Next Up' view)
@warpling
yeah, would be fun to build some web prototypes with it :-) Yesterday I learnt that there used to be a DeviceLight API many years ago ().
Never made it past the experimental feature flag though...
@stevenschafer
@duyluongdesign
Ran into the same issue w/ SensorKit. I'm not sure if the entitlements are only needed for publishing or for general sensor reading. I'll poke this a little bit more this week and will keep you posted Steven :-)