Ukaton
@ConcreteSciFi

3,293 Followers · 917 Following · 339 Media · 781 Statuses

Making smart insoles. Thoughts by Zack Qattan

Camarillo, CA
Joined November 2018
@ConcreteSciFi
Ukaton
5 months
Updated our Safari Web Extension, so now you can get AirPods Pro motion data in the browser. Works on Safari for iOS and Mac. Works with AirPods Pro, AirPods (3rd gen), AirPods Max, and Beats Fit Pro.
@ConcreteSciFi
Ukaton
5 months
Since Web Bluetooth isn’t on Safari, we made a Safari Web Extension that uses our Swift SDK to connect to our smart insoles and motion modules, interfacing with our Web SDK. Works on Safari for iOS, Mac, and (hopefully) the Apple Vision Pro.
@ConcreteSciFi
Ukaton
7 months
Made an Unreal Engine Plugin for our smart insoles and motion modules. Works over Wi-Fi (desktop and mobile) and Bluetooth (iOS and Android only).
@ConcreteSciFi
Ukaton
10 months
Using our motion modules to rotate objects in @Blender
@ConcreteSciFi
Ukaton
4 years
Positional Tracking on the AirPods Pro! By applying the AirPods Pro’s Motion Data to @navisens ’ motion mapping web SDK, we’re able to obtain both head tracking and positioning, upgrading the AirPods Pro from 3DoF to 6DoF!
@ConcreteSciFi
Ukaton
4 years
AirPods Pro Motion Tracking on the Web! @aframevr Here’s @umarqattan playing around with an iOS app we made that dispatches the AirPods Pro’s Motion Data into any webpage as a window event! Now we can prototype stuff with JavaScript and various web technologies!
@ConcreteSciFi
Ukaton
1 year
Improved my text-to-phoneme-to-speech editor, including direct phoneme manipulation, whispering, pig Latin, phoneme substitutions (accents, dialects, and impediments), and gibberish generation using a Markov chain
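The Markov-chain gibberish generation mentioned above boils down to learning phoneme-to-phoneme transitions and sampling a walk. A minimal sketch of the idea, using made-up ARPAbet-style phoneme sequences — not the editor's actual code:

```python
import random

def build_chain(sequences):
    """Count phoneme-to-phoneme transitions across training sequences."""
    chain = {}
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            chain.setdefault(a, []).append(b)
    return chain

def gibberish(chain, start, length, rng=random):
    """Walk the chain to produce a plausible-sounding phoneme string."""
    out = [start]
    for _ in range(length - 1):
        successors = chain.get(out[-1])
        if not successors:
            break
        out.append(rng.choice(successors))
    return out

# Toy training data: phoneme sequences for a few words
words = [["HH", "AH", "L", "OW"], ["W", "ER", "L", "D"], ["HH", "EH", "L", "P"]]
chain = build_chain(words)
print(gibberish(chain, "HH", 5, rng=random.Random(0)))
```

Feeding the resulting phoneme string into a phoneme-driven synthesizer like Pink Trombone is what turns the walk into audible gibberish.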
@ConcreteSciFi
Ukaton
1 year
Text-to-phoneme-to-speech #speechsynthesis Converting words to phonemes, which control @pishtaq ’s Pink Trombone. Added controls to set timing and pitch. I can even export the utterance and import it into my Pink Trombone timeline editor
@ConcreteSciFi
Ukaton
1 year
Made a @unity BLE integration for our smart insoles and motion modules
@ConcreteSciFi
Ukaton
1 year
Using my voice to take pictures in AR, name them, and locate their original source! #WebXR @aframevr I cast the Quest Pro to my MacBook to take pictures and run speech recognition, since neither is available on the Oculus Browser.
@ConcreteSciFi
Ukaton
1 year
AR Violin on the Quest Pro 🎻 #WebXR @aframevr Instead of using AR to assist me playing a real violin, I used @Polycam3D to 3D scan the violin & bow, and used a violin #Tonejs instrument. Each finger maps to each string, and the more I curl them the higher the note.
@ConcreteSciFi
Ukaton
1 year
AR Violin Assistance on the Quest Pro! #WebXR @aframevr Used pitch detection to determine whether the right note is hit, then highlighted which string/position to play next. Sorry for sounding like crap - never played the violin before.
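A pitch-detection assist like this reduces to snapping a detected frequency to the nearest equal-tempered note and comparing it with the expected one. The note math below is standard; the function names are mine, not the demo's:

```python
import math

A4 = 440.0
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def nearest_note(freq_hz):
    """Map a detected frequency to the nearest equal-tempered note name."""
    semitones = round(12 * math.log2(freq_hz / A4))
    index = (semitones + 9) % 12           # A4 sits 9 semitones above C4
    octave = 4 + (semitones + 9) // 12
    return f"{NOTES[index]}{octave}"

def is_correct(freq_hz, target):
    """True when the detected pitch lands on the expected note."""
    return nearest_note(freq_hz) == target

# Open violin strings are G3, D4, A4, E5
print(nearest_note(196.0))       # G3
print(is_correct(441.5, "A4"))   # True (close enough to snap to A4)
```

In the actual demo the frequency would come from a browser-side pitch detector running on the microphone stream, and a correct hit would advance the highlighted string/position.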
@ConcreteSciFi
Ukaton
2 years
Animating a @readyplayerme avatar in @aframevr via full body tracking using our motion modules and smart insoles. No cameras or apps needed - just a simple webpage.
@ConcreteSciFi
Ukaton
3 years
Full body tracking and positioning by combining our motion modules’ orientation for articulation and our smart insoles’ pressure for anchoring. #vr Works in the browser via WebSockets, even on smartphones! @aframevr No cameras required, so you can use your phone while tracking.
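One way to read "pressure for anchoring": the foot bearing clearly more pressure is treated as planted, and the skeleton root is translated so that foot stays world-fixed while orientation data articulates the limbs. A minimal sketch of that idea in 2D, with hypothetical names — not the actual Ukaton codebase:

```python
def anchored_foot(left_pressure, right_pressure, threshold=0.1):
    """Pick which foot is planted: the one bearing clearly more pressure."""
    if left_pressure - right_pressure > threshold:
        return "left"
    if right_pressure - left_pressure > threshold:
        return "right"
    return "both"  # weight roughly even, e.g. standing still

def update_root(root_xy, foot_xy_prev, foot_xy_now):
    """Translate the skeleton root so the anchored foot stays world-fixed.

    If forward kinematics moved the planted foot by (dx, dy) this frame,
    shift the root by the opposite amount to cancel the drift.
    """
    dx = foot_xy_now[0] - foot_xy_prev[0]
    dy = foot_xy_now[1] - foot_xy_prev[1]
    return (root_xy[0] - dx, root_xy[1] - dy)

print(anchored_foot(0.8, 0.1))                              # left
print(update_root((0.0, 0.0), (1.0, 0.0), (1.5, 0.0)))      # (-0.5, 0.0)
```

This is why no cameras are needed: orientation alone gives pose, and pressure supplies the ground-contact constraint that turns pose into position.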
@ConcreteSciFi
Ukaton
8 months
AR Minimap on the Quest Pro! #WebXR @aframevr Navigating an area using a miniature @Polycam3D scan of the place, with a real-time marker of your location. Great for finding friends and other things.
@ConcreteSciFi
Ukaton
8 months
AR mirrors on the Quest Pro! @aframevr #WebXR Playing with virtual mirrors that reflect a @Polycam3D 3D scan of the house, allowing me to try out different mirror arrangements
@ConcreteSciFi
Ukaton
10 months
Controlling a room full of Philips Hue lights in AR on the Quest Pro! #WebXR @aframevr Turn on lights by walking near them, looking at them, with an AR flashlight, or with an AR torch.
@ConcreteSciFi
Ukaton
1 year
Controlling a Philips Hue light bulb with just my eyes using the Quest Pro’s eye tracking
@ConcreteSciFi
Ukaton
1 year
Record & Playback full body tracking in AR on the Quest Pro using our motion modules #WebXR @aframevr
@ConcreteSciFi
Ukaton
2 years
Controlling Philips Hue lights with the Quest Pro via Passthrough, all through a webpage @aframevr @tweethue
@ConcreteSciFi
Ukaton
2 years
Animating a @readyplayerme avatar in AR using the Quest Pro, @aframevr, and our motion modules for full body tracking. Just a simple webpage - no cameras or apps needed.
@ConcreteSciFi
Ukaton
5 months
Our smart insoles and motion modules work on Apple TV! Great for gaming, gyms, rehabilitation, and other scenarios where you want a large display in a (mostly) hands-free environment
@ConcreteSciFi
Ukaton
4 years
Real-Time Multi-User Positional Tracking! #AR Using @croquetio 's live collaboration platform, I can receive other users' orientation & position obtained via the AirPods Pro API and iPhone 11's Nearby Interaction Framework! All on the web too! @aframevr
@ConcreteSciFi
Ukaton
5 months
Our smart insoles also work natively on Mac 💻 Our Swift SDK works on macOS, allowing developers to easily make SwiftUI apps that interface with our smart insoles and motion modules via WiFi or Bluetooth. Receive motion/pressure data, and even trigger vibrations.
@ConcreteSciFi
Ukaton
1 year
Using my phonetic text-to-speech to lip sync a @readyplayerme avatar via @aframevr As I update the phoneme timing, it updates the timing of both the synthesized speech and animated mouth. I can also use my kNN-based @ml5js phoneme classifier to control the mouth.
@ConcreteSciFi
Ukaton
1 year
Pronunciation Assistance: Using Pink Trombone to pronounce a word, and Meyda.js + @ml5js to recognize a user’s phonemes to guide them through the pronunciation.
@ConcreteSciFi
Ukaton
1 year
Using my voice to scroll, play games, and draw. The car game is from @bruno_simon’s portfolio page, and the drawing app is @steveruizok’s @tldraw.
@ConcreteSciFi
Ukaton
1 year
Using my voice to move around in VR @aframevr. Using Meyda.js and @ml5js to map utterances to commands (turn left/right, move forward/backward, look up/down).
@ConcreteSciFi
Ukaton
1 year
Infinite AR staircase @aframevr #WebXR
@ConcreteSciFi
Ukaton
3 years
Testing the motion sensor on our new smart insoles (inside a @WearAtoms shoe), using a @PolycamAI scanned model in the browser via @aframevr
@ConcreteSciFi
Ukaton
10 months
Made a Python SDK for our smart insoles and motion modules, allowing developers to access sensor data via BLE or WiFi.
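The tweet doesn't show the SDK's wire format, so purely as an illustration of the kind of work such an SDK does, here is how a BLE notification carrying 16 little-endian uint16 pressure values might be decoded — a hypothetical packet layout, not Ukaton's actual protocol:

```python
import struct

def decode_pressure_packet(payload: bytes, num_sensors: int = 16):
    """Decode a hypothetical insole packet: one little-endian uint16 per
    pressure sensor, normalized to the range 0.0-1.0."""
    raw = struct.unpack(f"<{num_sensors}H", payload[: 2 * num_sensors])
    return [v / 65535 for v in raw]

# Build a fake packet with increasing readings and decode it
packet = struct.pack("<16H", *range(0, 16 * 100, 100))
values = decode_pressure_packet(packet)
print(values[1])  # 100 / 65535 ≈ 0.0015
```

Whether the bytes arrive over a BLE characteristic notification or a UDP/WebSocket message over WiFi, the decoding step looks the same.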
@ConcreteSciFi
Ukaton
2 years
Playing soccer in VR using our motion modules for leg tracking. All in the @oculus browser via @aframevr , with #cannonjs for physics
@ConcreteSciFi
Ukaton
2 years
Using our smart insoles and motion modules for @CybershoesVR -like locomotion in VR on the @oculus browser via @aframevr
@ConcreteSciFi
Ukaton
1 year
New enclosure for our smart insoles, featuring a modular clip design for different applications (straps, screw-on, shoe clip, hooks, etc.), as well as a template clip for anyone to make their own custom clips. Also includes a larger vibration motor.
@ConcreteSciFi
Ukaton
1 year
Using the Quest Pro’s persistent anchors to align a @Polycam3D scan of our apartment with the real world, even when refreshing the page @aframevr #WebXR
@ConcreteSciFi
Ukaton
1 year
AR Spotlight search on the Quest Pro, using Object Detection and Plane Detection to recognize and remember where objects are! @aframevr #WebXR Used Teachable Machine to create the model, cast the video to my laptop to run the model, and streamed results back via WebSockets.
@ConcreteSciFi
Ukaton
1 year
Collaborative AR using our full body tracking system, a @readyplayerme avatar, @CroquetIO , and @aframevr ! #WebXR The top left is a Quest Pro, top right is a Quest 2 using body tracking (we couldn’t record passthrough so we used a @Polycam3D scan), and the bottom is a MacBook
@ConcreteSciFi
Ukaton
1 year
Viewing ourselves in the third person in AR, using our motion modules, a @readyplayerme avatar, a @Polycam3D scan of the room, and @aframevr on the Quest Pro #WebXR
@ConcreteSciFi
Ukaton
3 years
Full body tracking on the Quest 2 in the Oculus web browser, using our motion modules and smart insoles via WebSockets @aframevr. Special thanks to @PlumCantaloupe for his mirror A-Frame component:
@ConcreteSciFi
Ukaton
5 months
Made a Swift SDK for our smart insoles and motion modules. Also updated the firmware so it advertises its WiFi information over Bluetooth, making it easier to connect over WiFi (no need to manually type in the device’s IP address). Works on iOS, macOS, watchOS, tvOS, and visionOS.
@ConcreteSciFi
Ukaton
9 months
AR Object Detection on the Quest Pro using YOLOv8 #WebXR @aframevr
@ConcreteSciFi
Ukaton
5 months
Our smart insoles and motion modules now work on Apple Watch! Our Swift SDK also works on watchOS, enabling developers to make SwiftUI apps for the Apple Watch that can connect to our hardware via Bluetooth and receive motion/pressure data.
@ConcreteSciFi
Ukaton
9 months
AR Magnifying Glass on the Quest Pro #WebXR @aframevr By periodically taking screenshots of the camera feed at various positions, I can “zoom in” by retrieving earlier screenshots from when I was up close
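The "zoom in by retrieving earlier screenshots" trick reduces to a nearest-neighbor lookup: store each capture with the position it was taken from, then serve the image captured closest to the point being magnified. A minimal sketch with hypothetical data, not the demo's actual code:

```python
import math

def nearest_snapshot(snapshots, target_xyz):
    """Given stored (position, image) pairs, return the image captured
    closest to the target point - i.e. the most 'zoomed-in' view of it."""
    if not snapshots:
        return None
    return min(snapshots, key=lambda s: math.dist(s[0], target_xyz))[1]

# Two captures of the same object: one from afar, one up close
shots = [((0, 0, 0), "far.png"), ((0.9, 0, 0), "close.png")]
print(nearest_snapshot(shots, (1, 0, 0)))  # close.png
```

A real implementation would also filter by viewing direction so it only returns captures that actually faced the target, but the distance lookup is the core of the effect.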
@ConcreteSciFi
Ukaton
3 years
Using our #smartshoe insoles to experiment with movement in VR
@ConcreteSciFi
Ukaton
3 years
Developing for our #smartshoe insoles is super easy for web developers, thanks to Web Bluetooth, @aframevr , and @glitch ! #javascript #wearables
@ConcreteSciFi
Ukaton
1 year
Revisiting our @readyplayerme Quest Pro body tracking demo using our latest BNO085-based motion modules #WebXR @aframevr Not much of a visual difference, but it’s much easier to connect to due to the new PCB layout.
@ConcreteSciFi
Ukaton
2 years
Experimenting with streaming webcam and smartphone cameras to the Quest Pro via Passthrough in the browser @aframevr @livekitted
@ConcreteSciFi
Ukaton
2 years
Figure Drawing and Rotoscope Animation via Full Body Tracking! Using our smart insoles and motion modules, we can pose a 3D character via @aframevr, choose a camera angle, and trace it. We can also record mocap data and trace the character frame-by-frame for rotoscoped animation.
@ConcreteSciFi
Ukaton
3 years
Tiny Tennis for Two (my entry to the @TensorFlow Microcontroller Challenge). Play alone or online, using the @arduino Nano 33 BLE Sense for swinging. Built with @aframevr and @croquetio for serverless multiplayer and physics, with assets from @Sketchfab.
@ConcreteSciFi
Ukaton
3 years
Used @croquetio + @aframevr to create a collaborative VR environment where users can add and modify entities in real-time via the A-Frame Inspector, like webstrates. Also modified the cannon.js physics engine to replicate identical simulations among users.
@ConcreteSciFi
Ukaton
4 years
Use multiple devices as external cameras/microphones! Send #WebRTC streams to one page for mixing audio/video, then rebroadcast the composite stream to a web extension for video chats like Google Meet or Zoom! Inspired by @thespite 's Virtual Webcam:
@ConcreteSciFi
Ukaton
3 years
Collaborative #AR ! By combining @croquetio ’s collaboration platform with @the8thwall , users can share #WebAR experiences! With image targets we can determine the other user’s position and orientation #poweredby8thWall
@ConcreteSciFi
Ukaton
3 years
Made a scan of our latest insole design (inside a @WearAtoms shoe) via @PolycamAI for future demos.
@ConcreteSciFi
Ukaton
5 years
"Hello, World!" with phonetic typography! #VoiceFirst Used @pishtaq 's Pink Trombone speech synthesizer to make a timeline editor! #voiceUI Animate the vocal tract using articulation keyframes, with phoneme morph targets and temporal kerning! Imagine writing dialogue like this
@ConcreteSciFi
Ukaton
4 years
Multi-Cam Streaming in the Browser! Using @Croquet and @feross 's simple-peer, I'm able to receive multiple #WebRTC streams and broadcast a single output, whether it's from a smartphone, tablet, or even another computer's webcam/screen! It's like a web-based @CinaMakerApp !
@ConcreteSciFi
Ukaton
2 years
Redesigned my Pink Trombone Timeline Editor, adding the ability to manipulate speech, pitch, and the timeline using a MIDI keyboard via WebMIDI! (credit to @pishtaq for creating the original Pink Trombone speech synthesizer)
@ConcreteSciFi
Ukaton
4 years
Voice + Mocap = Vocap! Using @pishtaq 's Pink Trombone, @ml5js 's pitch detector, and @Google 's #TeachableMachine , I can use my voice to control the speech synthesizer using #MachineLearning ! I also updated the timeline editor since my previous post:
@ConcreteSciFi
Ukaton
2 years
Playing a virtual floor piano in VR using our smart insoles and motion modules All in the @oculus browser via @aframevr and Web Audio (don’t mind my little brother @jimmy_qattan playing along in the background)
@ConcreteSciFi
Ukaton
2 years
Learning to dance in VR, using our smart insoles and motion modules for leg tracking. @aframevr Also used foot pressure to play spatialized footstep sound effects
@ConcreteSciFi
Ukaton
4 years
Use your smartphone as an external webcam! Using @Croquet 's SDK and @feross 's simple-peer, I was able to easily make an entirely client-side video chat application that allows you to use your smartphone's camera instead of your webcam! #WebRTC
@ConcreteSciFi
Ukaton
4 years
Using our #smartshoe insoles to control a plane in @aframevr
@ConcreteSciFi
Ukaton
4 years
#AR #SpatialAudio Close Call! @aframevr With @croquetio 's realtime multiuser platform, the iPhone 11's #U1 chip and Speech Recognition, we can have a spatial in-person voice call with subtitles! The first step in the Augmented Conversation! #VoiceFirst
@ConcreteSciFi
Ukaton
4 years
Close Calls! #VoiceFirst Using @croquetio 's multiuser platform & the iPhone 11's positioning system, we can make voice calls when nearby! #AR Useful for loud areas, muffled voices due to masks, noise cancelling headphones, or if we're listening to music
@ConcreteSciFi
Ukaton
6 months
@devtom7 @GoveeOfficial did something similar with the Philips Hue lights, but yours has a much better interface
@ConcreteSciFi
Ukaton
2 years
Introducing Repsetter, your new favorite workout diary. We made it for ourselves and wanted to share it with you. Built with @nextjs, @supabase, and @tailwindui.
@ConcreteSciFi
Ukaton
11 months
Experimenting with object manipulation using the Quest Pro’s eye tracking and the @tapwithus Tap Strap. Based on an earlier version using the Snap Spectacles.
@ConcreteSciFi
Ukaton
2 years
Ported my @tapwithus web sdk to Snap’s Lens Studio, so now I can move objects in #AR with @SnapAR ’s @Spectacles
@ConcreteSciFi
Ukaton
4 years
Made a web extension that uses @Croquet to find any @modelviewer elements in the page and relays their orbit to everyone else with the extension
@ConcreteSciFi
Ukaton
4 years
Indoor Positioning in the browser! #WebXR Using the @navisens web demo, I'm able to get my position using just motion sensor data from my smartphone, then stream it to @aframevr using @Croquet ! A great companion to GPS-based location:
@ConcreteSciFi
Ukaton
4 years
Use your smartphone as a wireless compass and gps! Using @Croquet , I'm able to stream location and orientation data from my smartphone to my laptop, as well as head-tracking data from my #BoseFrames ! #BoseAR Also using @Mapbox for map stuff. And this is all in the browser!
@ConcreteSciFi
Ukaton
3 years
Made a basic Web API to connect to @tapwithus 's Tap Strap in the browser via Web Bluetooth, allowing you to access tap data, sensor data (accelerometer & gyroscope), mouse data, air gestures, and haptics. #WebBluetooth older demo:
@ConcreteSciFi
Ukaton
4 years
Tree + Keyboard = TreeBoard! Using @tapwithus 's Tap Strap to experiment with a new way of typing "top-down" as opposed to typing "left-to-right". Explore grammars and traverse trees!
@ConcreteSciFi
Ukaton
2 years
Streaming pressure data from our smart insoles to @SnapAR ’s #AR @Spectacles
@ConcreteSciFi
Ukaton
2 years
Streaming motion data from our smart insoles to @SnapAR ’s @Spectacles #AR glasses
@ConcreteSciFi
Ukaton
4 years
Real-time #SpatialAudio Navigation! Using @croquetio 's Live Collaboration Platform, @navisens ' motion mapping, and @Bose #AR head tracking, I can get a bird's eye view of a remote user I can then guide them via spatial voice calls & messages using @feross 's simple-peer! #webRTC
@ConcreteSciFi
Ukaton
2 years
Using our smart insoles to retrofit a simple desk cycle to bike around in VR! @aframevr Works on smartphones, tablets, desktop, and on the Quest 2! We also used our motion module as a steering wheel (or you can tilt your smartphone)
@ConcreteSciFi
Ukaton
2 years
#AR skating in the park with our smart insoles and @SnapAR ’s @Spectacles !
@ConcreteSciFi
Ukaton
3 years
Made a @croquetio adapter for @HaydenLee37 's Networked @aframevr , using @feross 's simple-peer for #webRTC -based spatial voice chat. No server setup required
@ConcreteSciFi
Ukaton
4 years
Hide & Seek with iOS 14's Ultra-Wideband Positioning! Using @croquetio 's multi-user platform and the iPhone 11's Nearby Interaction, we can get real-time distance from other players! Can't wait for Apple to release more U1 chip devices #AirPods #AirTags
@ConcreteSciFi
Ukaton
3 years
Here’s @umarqattan testing out our #smartshoe insoles! #wearables
@ConcreteSciFi
Ukaton
11 months
Real-time nutrition information in AR. Using the Quest Pro and @decentespresso’s Decent Scale to retrieve recipes and show how much of each ingredient to weigh, displaying real-time macros & calories. Also uses eye-tracking and the @tapwithus Tap Strap for simple menu navigation.
@ConcreteSciFi
Ukaton
1 year
Using our motion module as a @unity controller to orbit the camera and rotate objects
@ConcreteSciFi
Ukaton
4 years
Rewrote the @modelviewer / @Croquet web extension as a bookmarklet so it works on mobile browsers as well!
@ConcreteSciFi
Ukaton
3 years
#AR Remote Assistance! By combining @croquetio 's collaboration platform, @the8thwall 's #WebAR , @ultraleap_devs ' hand tracking, & @feross 's simple-peer, users can receive hands-on training on the web! @aframevr Includes speech transcription and voice spatialization. #voicefirst
@ConcreteSciFi
Ukaton
3 months
Playing with our smart insoles on the Vision Pro. Our demo app includes a Safari Extension that allows web developers to connect to our insoles in Safari via Bluetooth (last clip). Special thanks to @Yosun & @Tahmeed_Rahim for providing the space and Vision Pro to test our app in.
@ConcreteSciFi
Ukaton
4 years
Upgrading #BoseAR from 3DoF to 6DoF! By combining my @Bose AR web sdk with @navisens ’s motion mapping web sdk, I’m able to get my #BoseFrames ’ position in addition to head tracking! Also used @aframevr for displaying orientation 👓 Thanks @umarqattan for testing it out 🤓
@ConcreteSciFi
Ukaton
3 years
Made a web extension that spatializes any audio source in a webpage based on its position onscreen and our motion module's head tracking (when attached to a pair of headphones) #SpatialAudio #WebBluetooth
@ConcreteSciFi
Ukaton
3 years
Using our motion module to rotate a @Sketchfab model
@ConcreteSciFi
Ukaton
3 years
Playing with @ultraleap_devs ’s Leap Motion and my unofficial V4 #LeapMotion web SDK to add simple touchless gestures to webpages
@ConcreteSciFi
Ukaton
2 years
Thanks to @esphome_ ’s ESP Web Tools, we’re able to easily update the firmware for our smart insoles and motion modules on the web! And special thanks to @balloob and @NabuCasa for being so responsive when I reached out for help!
@ConcreteSciFi
Ukaton
2 years
Redid our software, so now our smart insoles and motion modules use the exact same firmware and Web SDK.
@ConcreteSciFi
Ukaton
4 years
Demo'd our #smartshoe insoles at a hardware meetup hosted by @HumanmadeSF ! Used @Croquet to broadcast sensor data to attendees so they can see the demo on their own devices in real-time! Demo includes weight distribution and now foot orientation via @aframevr !
0
7
24
@ConcreteSciFi
Ukaton
3 years
Made a @croquetio adapter for @HaydenLee37 's Networked @aframevr , using @feross 's simple-peer for #webRTC -based spatial voice chat. No server setup required
1
3
23
@ConcreteSciFi
Ukaton
1 year
Added UDP support to our @unity integration for our motion modules and smart insoles, so now developers can connect via Bluetooth or WiFi
@ConcreteSciFi
Ukaton
1 year
Using our motion module as a @unity controller to orbit the camera and rotate objects
0
7
28
0
4
22
@ConcreteSciFi
Ukaton
11 months
Using our smart insole as a sustain pedal for our @tapwithus AR piano #WebXR @aframevr Instead of using our smart insole as a Web BLE dongle like before, we just connected the tap straps to the Quest Pro directly as keyboards, using custom tap mappings for the left/right hands
@ConcreteSciFi
Ukaton
11 months
Ported our @Spectacles x @tapwithus AR Piano to the Quest Pro #WebXR @aframevr However, the oculus browser doesn’t have Web Bluetooth, so we used our motion module as a WebBLE dongle, relaying Bluetooth data from the Tap Strap to the Quest Pro via WebSockets
1
4
12
1
10
22
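Turning insole pressure into a sustain pedal, as described above, amounts to thresholding the pressure readings. A minimal sketch, with hysteresis so small fluctuations around the threshold don't rapidly toggle sustain; the threshold values are assumptions, not taken from the actual demo:

```javascript
// Illustrative thresholds (normalized 0..1); the real values aren't given in the thread.
const PRESS_ON = 0.6;  // engage sustain above this average pressure
const PRESS_OFF = 0.4; // release sustain below this average pressure

function updateSustain(pressureSensors, wasSustaining) {
  // pressureSensors: array of normalized readings from the insole's pressure grid
  const avg = pressureSensors.reduce((sum, p) => sum + p, 0) / pressureSensors.length;
  if (!wasSustaining && avg >= PRESS_ON) return true;  // foot pressed down
  if (wasSustaining && avg <= PRESS_OFF) return false; // foot lifted
  return wasSustaining; // in the hysteresis band: keep current state
}
```

The gap between the on and off thresholds is the usual trick for debouncing a noisy analog signal into a clean binary control.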
@ConcreteSciFi
Ukaton
4 years
Click your heels thrice to order a @lyft home! Using @croquetio to stream #smartShoe motion data from my smartphone to my laptop ( @aframevr for vis), I'm able to collect data remotely from various users and use @ml5js to detect heel clicking! Collaborative #MachineLearning !
0
14
22
@ConcreteSciFi
Ukaton
4 years
Collaborative Maps with @croquetio and @Mapbox ! Post multimedia markers using @WebTorrentApp for file-sharing! Make #WebRTC calls with your friends using @feross 's #simplepeer , spatializing their voice based on their location via the #ResonanceAudio web SDK! [ft. @umarqattan ]
1
3
23
@ConcreteSciFi
Ukaton
1 year
Collaborative XR on the web with the Quest Pro, @aframevr , and @CroquetIO ! #WebXR While walking around in AR on the Quest Pro via Passthrough mode, remote users can join via VR, using a @Polycam3D scan of the area for reference
@ConcreteSciFi
Ukaton
2 years
Experimenting with streaming webcam and smartphone cameras to the Quest Pro via Passthrough in the browser @aframevr @livekitted
1
10
55
1
7
23
@ConcreteSciFi
Ukaton
5 years
Streaming our #ESP32 -based #SmartShoe insoles to a webpage via @AgoraIO ’s #webRTC SDK! Using @PlatformIO_Org for firmware and @nodejs for receiving data using @voodootikigod ’s SerialPort module! Can’t wait for our @PCBWayOfficial PCBs to arrive so we can move forward!
1
4
22
@ConcreteSciFi
Ukaton
2 years
Now we’re able to animate our @readyplayerme in @aframevr over Web Bluetooth instead of WebSockets - Full Body Tracking on the go! Since you can only connect to 7 Bluetooth devices simultaneously, we just connected to 6, and they’d connect to the rest of the 15 to relay the data
@ConcreteSciFi
Ukaton
2 years
Animating a @readyplayerme avatar in @aframevr via full body tracking using our motion modules and smart insoles No cameras or apps needed - just a simple webpage
8
80
419
1
8
21
@ConcreteSciFi
Ukaton
2 years
Added OSC support for our smart insoles and motion modules for applications like @cycling74 ’s #MaxMSP
@ConcreteSciFi
Ukaton
2 years
Added UDP support to our firmware so now we can use applications like #PureData to connect to our smart insoles and motion modules, using an identical SDK to our Bluetooth and WebSocket-based Web SDKs
0
1
2
1
5
22
@ConcreteSciFi
Ukaton
2 years
Ported my @tapwithus web sdk to Snap’s Lens Studio, so now I can move objects in #AR with @SnapAR ’s @Spectacles
@ConcreteSciFi
Ukaton
2 years
#AR skating in the park with our smart insoles and @SnapAR ’s @Spectacles !
0
5
14
1
4
20
@ConcreteSciFi
Ukaton
4 years
Watch videos together! Used @croquetio and @Sam_Potts 's video player to create a #TogetherJS -style shared video player Viewers can change playback or change the video source by pasting a YouTube or Vimeo url, or just drag a video file and share it via @WebTorrentApp
0
5
22
@ConcreteSciFi
Ukaton
3 years
Using our motion modules to detect head orientation relative to device orientation Here we spatialize an audio source so it sounds like it’s coming from your smartphone (using a classic Resonance Audio demo rewritten in @aframevr ) Thanks @jimmy_qattan for posing for our demo!
@ConcreteSciFi
Ukaton
3 years
Motion module firmware update: ✅ Transfer files over Bluetooth #WebBluetooth ✅ Run @tensorflow Lite models on the module itself #ESP32 ✅ Updated our @ml5js -based #MachineLearning kit to convert #tensorflow models to Tensorflow Lite via a @Replit python server
0
3
6
0
3
19
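File transfer over Bluetooth, as in the firmware update above, generally means splitting the file into pieces small enough for a BLE write. A minimal sketch; the 512-byte chunk size is an assumption (the firmware's actual negotiated MTU isn't stated in the thread):

```javascript
// Split a byte buffer into chunks small enough to write to a BLE characteristic.
function chunkForBle(data, chunkSize = 512) {
  const chunks = [];
  for (let offset = 0; offset < data.length; offset += chunkSize) {
    chunks.push(data.slice(offset, offset + chunkSize));
  }
  return chunks;
}

// With Web Bluetooth, each chunk would then be written in order:
// for (const chunk of chunkForBle(fileBytes)) {
//   await characteristic.writeValueWithResponse(chunk);
// }
```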
@ConcreteSciFi
Ukaton
2 years
Redesigned my Pink Trombone Timeline Editor, adding the ability to manipulate speech, pitch, and the timeline using a MIDI keyboard via WebMIDI! (credit to @pishtaq for creating the original Pink Trombone speech synthesizer)
@ConcreteSciFi
Ukaton
5 years
"Hello, World!" with phonetic typography! #VoiceFirst Used @pishtaq 's Pink Trombone speech synthesizer to make a timeline editor! #voiceUI Animate the vocal tract using articulation keyframes, with phoneme morph targets and temporal kerning! Imagine writing dialogue like this
3
10
41
2
5
20
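Driving a speech synthesizer's pitch from a MIDI keyboard, as above, hinges on converting incoming note numbers to frequency. The conversion below is the standard equal-temperament formula (A4 = MIDI note 69 = 440 Hz), not Pink Trombone code, and `setGlottisFrequency` is a hypothetical setter:

```javascript
// Standard equal-temperament conversion from MIDI note number to Hz.
function midiNoteToFrequency(note) {
  return 440 * Math.pow(2, (note - 69) / 12);
}

// Hooking it up with the Web MIDI API (browser only):
// const access = await navigator.requestMIDIAccess();
// for (const input of access.inputs.values()) {
//   input.onmidimessage = ({ data: [status, note, velocity] }) => {
//     if ((status & 0xf0) === 0x90 && velocity > 0) { // note-on
//       setGlottisFrequency(midiNoteToFrequency(note)); // hypothetical
//     }
//   };
// }
```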
@ConcreteSciFi
Ukaton
4 years
Serendipitous Meetups! @Mapbox Using @croquetio 's multi-user platform and @navisens 's location platform, we can be alerted when someone nearby has a common interest! Every day we walk past so many strangers that could've been potential connections!
@ConcreteSciFi
Ukaton
4 years
Collaborative Maps with @croquetio and @Mapbox ! Post multimedia markers using @WebTorrentApp for file-sharing! Make #WebRTC calls with your friends using @feross 's #simplepeer , spatializing their voice based on their location via the #ResonanceAudio web SDK! [ft. @umarqattan ]
1
3
23
2
14
20
@ConcreteSciFi
Ukaton
8 months
AR X-Ray on the Quest Pro! #WebXR @aframevr Look through walls, using a @Polycam3D 3D scan to see what’s on the other side Great for visualizing the layout of a building
@ConcreteSciFi
Ukaton
8 months
AR Minimap on the Quest Pro! #WebXR @aframevr Navigating an area using a miniature @Polycam3D scan of the place, with a real-time marker of your location Great for finding friends and other things
5
78
353
0
3
20
@ConcreteSciFi
Ukaton
2 years
Ported our body tracking demo from WebSockets to Web Bluetooth! @aframevr Unfortunately, we’re only able to connect up to 7 Bluetooth devices simultaneously, so we’re limited to just leg tracking. Still cool we can get leg tracking on our phone without needing a WiFi router
@ConcreteSciFi
Ukaton
3 years
Full body tracking and positioning by combining our motion modules’ orientation for articulation and our smart insoles’ pressure for anchoring #vr Works in the browser via websockets, even on smartphones! @aframevr No cameras required, so you can use your phone while tracking
3
18
112
1
1
18
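The anchoring step described above (orientation articulates the limbs, insole pressure decides which foot is planted) can be sketched as a simple comparison of total pressure per foot. Illustrative only; the real system's sensor layout and tie-breaking aren't given in the thread:

```javascript
// Pick the anchor foot by comparing total insole pressure.
function selectAnchorFoot(leftPressures, rightPressures) {
  const sum = (arr) => arr.reduce((total, p) => total + p, 0);
  const left = sum(leftPressures);
  const right = sum(rightPressures);
  if (left === 0 && right === 0) return "none"; // airborne: keep the last anchor
  return left >= right ? "left" : "right";
}
```

The anchored foot pins the skeleton's position in space while the motion modules' quaternions rotate each limb segment, which is how the tweet gets positioning without cameras.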
@ConcreteSciFi
Ukaton
5 years
Fixed skeletal support for the @LeapMotion v4 JavaScript SDK Also tested out @AgoraIO ’s #webRTC SDK for streaming tracking data directly between peers Repo (WIP) -
1
5
19