Just hit 6000 students on my procedural animation courses! Join us. Lowest price discount (limited time/use) for both courses:
Control Rig:
Procedural Animation for Humans:
Procedural state transitions within Control Rig, based on bone weighting & COM shifting. Can be used at runtime, but I'm wondering if this would be more useful as an animation helper within Unreal, something similar to Cascadeur's tools maybe.
Fully procedural climbing! No keyframes. Only worked on this today for #screenshotsaturday, so it's a little rough still. If you want to learn this stuff, check out my YouTube for tutorials/courses on procedural animation:
My latest course on Control Rig is now live, with a 5 day launch discount for the lowest price possible:
Beginner friendly, all project files included and ready to use, and as always I'm available to answer any questions you may have.
Not sure there'd be any use-case for this, but here is a plane with Nanite displacement, displaying a live scene capture (HDR + depth, with negated depth driving the displacement). Maybe it could be useful for something that needs to wrap around a surface beyond what decals can do.
#UnrealEngine 5.4
A veeery simple proof of concept for animating to a (dynamic) camera. In this example the ears track the cam position, but it could be expanded for adjusting the head position, morph targets, materials (eg. mouth/etc), whatever is needed based on character design 'rules'
Since it's Valentine's day, here's a coupon code for 75% off my procedural animation course for #UnrealEngine.
Give your loved ones the gift of a difficult (and boring at times) 9-hour technical animation lecture series. What more could they want?
@GTAVI_Countdown
Animations that work in sync are definitely not a new thing. They might trigger them more dynamically than a 90s game would, and use motion warping to get them both into the right spots, but playing two animations at once is three-decades-old tech
@Rexona57778086
@Rainmaker1973
Sure, if this were in 3D (e.g. a 3D screen or VR) that might give some extra understanding. But this is literally an identical display, except instead of a clean black background it has a random environment/wall behind it, and the image colour is completely off, as well as translucent.
11 traces * 400 characters * 100fps = 440,000 sphere traces a second. An overview video on performance with Control Rig, and suggestions for how to optimize your procedural animations (if you NEED to).
Procedural animation FAQ, off the top of my head based on questions I get asked, in no particular order:
- Can I combine keyframed animations with procedural? Yes. This is almost always going to be the best solution if you have high quality animations or mocap data. If you can
@Ravenman13
@GTAVI_Countdown
This is relating to the structure of the animations/hierarchy, and also nothing new there in the specifics. There's nothing mentioned in there that hasn't been done for a long long time. But the examples in the original thread are all of two animations playing in sync. Half-life
@bgolus
I think this may be a smart move and one that will have paid off in some way over time, but I doubt it's the primary reason. I think just having the playerbase of CS1.6 (and later, any HL2 owner) forced into Steam got it off to a huge start. They were also the only one really
@kiaran_ritchie
@cinedatabase
That's right, no hand IK, it's just the bone values making them act like that.
Here is a rough outline I gave on discord:
@Tylru
Yeah, this version I've shown here is all done automatically with only initial control. Just passing input poses through it. But conceivably could be done with a specific timeframe and per-transition settings for animation. I was mainly exploring it for how it could be used as
@Rahll
Unless I'm mistaken, your position is that genAI is just stealing and doesn't generate anything "new". So by that logic the only jobs being lost are the jobs of artists who would steal/regurgitate something that has already been done.
@kushirosea
I made a script in blender to do walk cycle generation, using some of the same logic/principles. Basically just keyed the leg points for you for a walk in place. So I imagine this is also possible, but I don't know about in realtime. Could definitely have it autokey it all though
To anyone from Epic working on Sequencer or ControlRig #UnrealEngine5: Will it ever be possible to export metadata from Sequencer? A single "Export Metadata" tickbox here changes lives. Authoring procedural animation SETTINGS which can be keyed and baked to the animation.
Does anyone know how to walk? Legitimate question. Ignore the physics or biomechanics of the actual motion; how do we decide where to put our feet when walking/changing direction/turning?
Imagine you had to make a "footprint simulator" and come up with a ruleset. So when a foot
@DefundABC
@JakeSucky
This type of information culling is a lot more complex than you're assuming. In fact, in the 90s games you're referencing it was much easier with visleafs and closed rooms where the visleafs would pretty much guarantee that someone isn't visible. In an open level with varied
@Unexplained2020
No amount of study or testing or logic can prove anything about consciousness, if you discount the feeling of it. Every single explanation or justification of consciousness *relies* on trusting an individual's subjective feeling that they are conscious. If the same claim was made
@ralphbrooks
Nah none of this theory holds up. I think you're seeing how unreal almost looks photorealistic, and how these videos do, and making some connection there.
Footage of people walking a bit weird, and physics from an engine that isn't particularly great at physics.. It's a big
@skx_doom
I ordinarily wouldn't modify a plugin, but there's an element of assurance that if the developer leaves the planet you could always fix things/update for new engines if you had to. You could always release two, a cheaper feature limited version with no source, and an 'advanced'.
@RobertWJV
@Ravenman13
@GTAVI_Countdown
Don't let the visuals of the animation fool you, imagine it was a 2d game with solid blocks of colour. You wouldn't be impressed that two blocks of colour could turn green at the same time when in close proximity. Whilst another time they both turn purple.
It's dynamic and
@OutoftheboxP
Thanks! Only took 2 hours to put together (from modifying a system I made for multilegged creature walking). Would have taken me a decade to make with keyframed animations.
@80Level
Good on them. Most devs would try to capitalize on the viral nature and rush it out. They seem confident in their own execution enough to not care about the blatant ripoffs circling around.
@protopop
UE doesn't have it out of the box, I made this in control rig (gives direct access to manipulate bones etc). Unity is capable of the equivalent though. I've had some people follow my tutorial/courses, except they followed along in Unity and got similar end results.
@gunsnrosesgirl3
Not even close. Would love to hear an unedited version of this where they just tried to do it on a live stream. Then we could make a better judgement. This edited clip merely shows that *sometimes* orcas will make a two-syllable noise. That is literally all it shows.
@games_inu
You'll learn a lot more doing/exploring what you want when you want to, than resisting and sticking to what you "should" be doing. Get launching and checking things out.
@Kujistudios
@ohle_mathiebe
There's no relation between the texture size and the screen resolution. The texture doesn't span the full UV space in a given area, and the object can be very close to the screen on one part, and very far on the rest. Imagine a 4k texture for a planet the size of earth, if you
@GetsuAizen
@Rainmaker1973
Oh the thing itself is cool. It works as a transparent and quite portable large cheap circular display. I could certainly imagine a use case for that. Just this application of using it to show exactly what you can see on the screen is entirely pointless.
Looks cool on a video,
@joewintergreen
Readability. There's also at least one scene in Dexter where the phone is basically showing the internal monologue/decision making, where the phone says "Call Deb?" or whoever it was, and then he pauses and declines it. Pretty sure no phone interface is asking you to call someone
@Rahll
This is true of most companies. Not specific to AI or even tech companies. You should compare these to the overall stock market to make the point you're trying to make.
@owenferny
Regenerate it 1000 times, still faster than one iteration from an animator. If regenerating it isn't enough and it generated a perfect one that just needs minor tweaks, there's video in-painting and such as an option.
@insaneUEFN
For people not using UEFN's version of Unreal, this is the way you can currently do this (see pic). Basically rotating the direction vector by that quaternion, which has the same result as calculating the X vector of a rotator.
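For anyone who wants the math behind the nodes, the same trick can be sketched in standalone Python (a hypothetical quat_rotate helper, not an engine API): rotating the unit X (forward) vector by a rotation's quaternion lands on the same direction as that rotator's X vector.

```python
import math

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, qx, qy, qz = q
    # t = 2 * (q_vec cross v)
    tx = 2.0 * (qy * v[2] - qz * v[1])
    ty = 2.0 * (qz * v[0] - qx * v[2])
    tz = 2.0 * (qx * v[1] - qy * v[0])
    # v' = v + w * t + (q_vec cross t)
    return (v[0] + w * tx + (qy * tz - qz * ty),
            v[1] + w * ty + (qz * tx - qx * tz),
            v[2] + w * tz + (qx * ty - qy * tx))

# A 90-degree yaw: the forward (unit X) vector rotated by the quaternion
# ends up on +Y, exactly the X axis of the equivalent rotator.
half = math.radians(90.0) / 2.0
q = (math.cos(half), 0.0, 0.0, math.sin(half))
forward = quat_rotate(q, (1.0, 0.0, 0.0))
```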
@UzumakiMenmo
@Kujistudios
@ohle_mathiebe
"obviously the resolution would be different for a game."
"obviously the resolution would be different for a game."
"obviously the resolution would be different for a game."
@DokuGamesLTD
I wouldn't worry too much about the timing distribution of the KS, still try to do what you can to push it. Some big source might pick it up and send more traffic than you've had so far all in one day.
The reason it's usually a good sign if the majority gets funded in the first
@Der_Kevin
Delay the rotation of the character mesh, and use IK within Control Rig to match the feet to the board. Then when he starts going up a ramp, initially it will just be the feet changing heights, before the body 'catches up' and aligns perpendicularly - less 'stuck to board' look
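The "catches up" part can be sketched numerically (names and rates are made-up tuning values, just illustrating the idea): the feet IK targets take the board's pitch immediately, while the body's pitch is interpolated toward it over a few frames.

```python
def approach(current, target, speed, dt):
    """Move current toward target proportionally to the remaining gap
    (simple exponential smoothing, like an 'interp to' node)."""
    return current + (target - current) * min(1.0, speed * dt)

# The board suddenly pitches to 30 degrees going up a ramp.
board_pitch = 30.0
body_pitch = 0.0
for _ in range(60):                      # one second at 60 fps
    feet_pitch = board_pitch             # IK feet match the board instantly
    body_pitch = approach(body_pitch, board_pitch, 8.0, 1.0 / 60.0)
# body_pitch eases up to the board angle over several frames,
# so the character doesn't look welded to the board.
```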
@roydecampagna
On Udemy, here's my last course on the topic: which was a little heavier and more in depth focusing on human animations. This new one is aiming to be more generalized and easier to follow along with.
@PetarPehchevski
Looks great! Just a guess, but are you inputting character velocity rather than calculating it? The forward prediction seems to be ignoring the Z velocity (which would make sense if using the character velocity, which zeros out the Z axis when grounded).
So when going up or down
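One workaround (a hypothetical helper, assuming you can sample the actor position each frame) is to derive velocity from position deltas instead, which keeps the Z component on slopes:

```python
def finite_difference_velocity(prev_pos, pos, dt):
    """Velocity from position deltas. Unlike a grounded movement
    component's reported velocity, this keeps the Z component
    when walking up or down a slope."""
    return tuple((b - a) / dt for a, b in zip(prev_pos, pos))

# Walking down a slope: 75 units forward and 25 units down over 0.25 s.
v = finite_difference_velocity((0.0, 0.0, 100.0), (75.0, 0.0, 75.0), 0.25)
# v includes the downward component a grounded character-movement
# velocity would report as zero, so foot prediction can use it.
```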
@Der_Kevin
Not a question just some general ideas for improving the look/fluidity, I do a lot with procedural animation and this is a prime use-case for it
I didn't set a 1 day limit on this, so it seems the offer is still valid for anyone who hasn't picked it up yet for 2 more weeks. It should be approx $15 (depending on local currency) when using the code below.
Since it's Valentine's day, here's a coupon code for 75% off my procedural animation course for #UnrealEngine.
Give your loved ones the gift of a difficult (and boring at times) 9-hour technical animation lecture series. What more could they want?
@x12341659751
@Surasia_
I think it's some kind of defence mechanism. It's obviously worrying (to everyone) that AI is going to take over every job, but some people seem to deny it by living in the delusion that because it hasn't done something specific yet, it never will. And that keeps them safe.
@Rahll
Yeah, it's almost like we're 1 or 2 years into video gen tech and it's still not perfect. People were making the same silly arguments about image generators a few years ago, not able to comprehend the idea of AIs that could modify specific elements of an image after the fact, etc. It
@AnnoyedNPC
Yeah there's a lot that can be done, I'm working on some ideas (mostly in my head so far) for some animation tools. Sequencer animating is hugely underrated. It started off terrible and maybe gave people a bad impression but it's my go-to now for game animating.
@Warka101
I'm working on a course that shows in detail setting up the underlying system this uses (general multilegged character procedural anim), after that it was just basically swapping it out for a human and some parameter tweaks. I've got a video on YouTube that shows a basic proc
@beckylitv
It's certainly not "ALL" ai.. The voice is pretty clearly either generated or intentionally trying to sound fake, so I guess that part is. Then the talking and head motions are out of sync so that's either really good acting or it is actually lipsync generated face to match the
@rms80
@roydecampagna
Is there no way that this can be crowdsourced in a way, or incentivized? What I'd like to see would be some kind of problem->solution database, where there would be a method posted to do something specific, and if someone can come up with a better way they can add to it, and if
@a01744
Yup, it's IK, but specifically using the FullBodyIK solver, which basically lets it warp the full chains (between arms-pelvis-feet in my case) to get the closest fit, meaning it's doing most of the work for the body rotations and such to reach the 5 target points I specify
@jklover369
None taken, I laughed too. I used the wrong skeletal mesh and it surprised me. Leaned into it a bit after that by tweaking a couple settings on the procedural system but it mostly just popped up as seen here.
@Akari_Enderwolf
It could, but maybe not this specifically, as this is more related to rotational offsets. For a ladder animation the best approach is likely motion matching, e.g. the distance he is up the ladder controls the time position of the animation.
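That distance-to-time mapping can be sketched like this (all numbers are hypothetical; a "cycle" here means one full left+right hand/foot pattern spanning two rungs):

```python
def ladder_anim_time(height, rung_spacing, cycle_duration):
    """Map the distance climbed to a looping animation time, so the
    hands/feet land on rungs no matter how fast the climb is driven."""
    cycles = height / (rung_spacing * 2.0)   # one cycle covers two rungs
    return (cycles % 1.0) * cycle_duration

# 25 cm rungs, 2 s cycle: halfway up a cycle at 25 cm, wrapped at 50 cm.
t = ladder_anim_time(25.0, 25.0, 2.0)
```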
@UnrealEngine
Will the kickoff stream of this also be when the results are announced for the Furious Elegance challenge? Or is there an earlier/later stream for that?
@ohle_mathiebe
@UzumakiMenmo
@ItsDervy
@Kujistudios
I'm not sure it's fair to expect any admissions of guilt, when you misread that it'd be half res. And also that it wasn't for a game. And also said that it would be 1gb. Wasn't that all of your points?
Also your blanket original point of "its poor optimization" sounds like you
@Dorizzdt
The downfall of this approach is that it requires a lot of good animations, which is usually the limiting factor for indies. And I imagine we'll get TONS of games using their sample project locomotion from here on out.
@marcsh
I'd like to see what happens when those larger teams start using this approach, because fully procedural animations are currently just an experimental thing indiedevs do. The tech is there it's just not being explored enough and they're still using (for the most part) traditional
@TheJackyMartin
Favourite: Being able to do what I'd happily be doing for free, as a job.
Least favourite: The general sense of FOMO/lack of time, for all the things I want to do. I see something cool or think of something to experiment with and want to dive into it for weeks, but then feel a
@Rahll
Nonsense, it will replace most jobs. And not in the way that people often assume where now we have docbot5000 who wears a labcoat and replaced a doctor. It will mean that 1 doctor can do the job of 5 doctors, so there's 1/5th the demand for doctors (initially) for example. And
@xrossfader
Wait.. if you were to import an existing image, would it treat it like it's "your artwork" and use it for data? Is this a way for them to bypass any potential laws against scraping content?
@rms80
Remind someone to take over as community manager for this contest:
Been about 9 months with barely any updates; some of us are wondering when they'll be in contact about prizes/etc.
@MichaelG_3D
Would make a nice plugin to automate taking screenshots from a specific camera at regular intervals. Or I guess you could just record a separate window with that view.
Unreal Engine feature I think would be very useful: The ability to one-click expose any variable as a console command. So [category].[variablename]=5 would set that variable at runtime. Would make testing things much easier, especially if the command itself was also replicated
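A toy version of what that could look like (pure Python sketch with made-up names, nothing to do with Unreal's actual console variable system):

```python
class Registry:
    """Maps '[category].[variablename]' strings to exposed variables."""
    def __init__(self):
        self._vars = {}

    def expose(self, name, obj, attr):
        # The hypothetical "one-click expose": register obj.attr by name.
        self._vars[name] = (obj, attr)

    def run(self, command):
        """Handle a '[category].[variablename]=value' console command."""
        name, _, value = command.partition("=")
        obj, attr = self._vars[name.strip()]
        # Cast the string to the variable's current type before setting.
        current = getattr(obj, attr)
        setattr(obj, attr, type(current)(value.strip()))

class AnimSettings:
    step_height = 20.0

registry = Registry()
anim = AnimSettings()
registry.expose("anim.stepheight", anim, "step_height")
registry.run("anim.stepheight=35")   # sets anim.step_height at runtime
```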
@Rahll
As another reminder, the only time people use 'screencap' is to describe a specific image captured from a video/movie. Asking it to construct a 'screencap' and being surprised that its understanding of that is specific captures from movies shows a lack of understanding of AI.
@kiaran_ritchie
@cinedatabase
Yeah that's what I'm planning to use it for, just this was a much quicker way to test how it'd look than setting it up for Sequencer/as an editor tool.
The concern to solve is that the blending isn't accurately controlled, so it may be hard to keyframe it between a very specific
@Jakob_Wahlberg
You're not talking about solo devs. That's like saying you don't see the point of a one man band because it's not crediting the rest of the orchestra.
Usually existing assets get used somewhere, so I could understand your point if you were saying it isn't crediting the assets
@Rahll
Where do you draw the line? Do you agree the line is subjective? EG most tools will utilize some form of AI, and many have done for a while. Is AI based compression for an image or video okay? Or denoising a render? What about filling a section of colour? If the fill tool is
@UNDERDOGSgame
Very cool. I think the legs need some additional velocity prediction (the step radius and the timing window isn't matching). Ideally you want it to end up where the foot reaches forward as far as it lags behind before lifting. So the 'ideal spot' is centered half way between the
@tactigray
I feel like they could have / should have added their own spin on it. This just seems like a blind ripoff using identical marketplace/quixel assets. I guess multiplayer is a distinction but stylistically they could have at least changed *something*. Coulda been desert combat etc
@polygonflow
Please check out the workflow of BSP creation in the old hammer or quake map editors, I'd pay top dollar for that in UE and it fits right in your wheelhouse of workflow tools. Insanely faster and more usable than UE for any hard surface level design.
@ShadowArtGames
Try using the speed of the mouse/swipe to control the distance the sword is held away from the body. So that when it's held full left or full right, it's held more naturally with the sword closer to the body, and when you whip it across it'll extend out fully
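A sketch of that mapping (all thresholds and ranges are made-up tuning numbers): remap the swipe speed into an extension distance, clamped at both ends.

```python
def remap_clamped(x, in_lo, in_hi, out_lo, out_hi):
    """Linearly remap x from [in_lo, in_hi] to [out_lo, out_hi], clamped."""
    t = (x - in_lo) / (in_hi - in_lo)
    t = max(0.0, min(1.0, t))
    return out_lo + t * (out_hi - out_lo)

# Slow drag -> sword held close to the body; fast whip -> arm fully extended.
# Speeds in px/s, extension in cm (arbitrary values for illustration).
swipe_speed = 2600.0
extension = remap_clamped(swipe_speed, 200.0, 3000.0, 15.0, 70.0)
```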
This would allow the animator to modify data which can then be read by controlrig at runtime. EG they choose when procedural animation takes over (keyed on/off). Or specifically design settings (eg a 'walk designer') directly in sequencer as part of the animation file.