
"Roland"

Posted: 06 November, 2020

Tuesday 03.11.2020

I rigged the wasp and set up the animation blueprints and blend-spaces.

Animation-wise it’s trivial. A pose for in-flight, a pose for stopping, and a 15-frame loop for idle.

Blendspaces are great. You set up animations on an axis with a range of, say, -1 to +1. At -1, the brake pose. At 0, the idle. At +1, the in-flight pose. The value passed into the blendspace tells it which animations to interpolate between, and at what weight. In the case of the wasp I use the linear velocity of the physics shape to get a normalised value, so at max speed I’m getting my in-flight pose, and at rest, the idle loop.
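The blendspace asset itself lives in the editor, but the value it consumes has to come from somewhere. As a rough sketch (the class, property names, and default speed here are illustrative, not the actual project code), the normalised speed can be computed in the Anim Instance each frame:

#include "CoreMinimal.h"
#include "Animation/AnimInstance.h"
#include "GameFramework/Pawn.h"
#include "WaspAnimInstance.generated.h"

UCLASS()
class UWaspAnimInstance : public UAnimInstance
{
    GENERATED_BODY()

public:
    // Read by the blendspace: -1 = brake pose, 0 = idle loop, +1 = in-flight pose.
    UPROPERTY(BlueprintReadOnly, Category = "Locomotion")
    float FlightBlend = 0.0f;

    UPROPERTY(EditDefaultsOnly, Category = "Locomotion")
    float MaxFlightSpeed = 600.0f;

protected:
    virtual void NativeUpdateAnimation(float DeltaSeconds) override
    {
        Super::NativeUpdateAnimation(DeltaSeconds);

        const APawn* Pawn = TryGetPawnOwner();
        if (!Pawn || MaxFlightSpeed <= 0.0f)
        {
            return;
        }

        // 0 at rest (idle loop), 1 at max speed (in-flight pose). Driving the
        // brake pose (-1) would need extra logic, e.g. detecting deceleration.
        const float Speed = Pawn->GetVelocity().Size();
        FlightBlend = FMath::Clamp(Speed / MaxFlightSpeed, 0.0f, 1.0f);
    }
};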

My first pass was to let the AIActions write values into the Blackboard for the NPC Blueprint to process and then pass on to the Anim Blueprint. I very quickly retreated from this. There’s duplication, it’s messy to debug, and I don’t believe the AIActions need to know the implementation details of NPCs. I can see cases where I’d want to play specific montages, from AIActions, on NPCs, but even in those cases I think it’ll be better to call a function on the NPC and let it worry about the how.

My wasp is now buzzing around quite happily.

Minus any actual buzz, because I’ve yet to do the audio…

The AIAction that chases the player has a small (0.75-second) delay when entering. I wanted to make sure there was enough time to alert the player, but to do that I needed something more than just an animation and an audio sting, so I added a little Anime-style surprise icon to the UI. It looks kinda cool.

With everything working in the Dojo I dropped my first couple of wasps into the world and everything was great until the map was reloaded, at which point the AI died.

It took a little while to work this out, but it’s obvious now that I see it: BeginPlay is called quite close to instantiation, so during a map load the components of an actor can have BeginPlay called before they’re added to their parent actor. Because I cache references to other components (and parent actors), this is basically a race condition that can ripple all the way down to the AIActions themselves. It’s not noticeable in PIE because the map is already loaded, everything’s already been created, and it isn’t actually the first time BeginPlay has been called on these actors & components. Anyway, it was easy to fix.
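I won’t spell out the exact change, but the general shape of a fix for this sort of ordering problem is to stop assuming references exist at BeginPlay time and resolve them lazily on first use instead. A sketch of that idea, with hypothetical class and member names:

#include "CoreMinimal.h"
#include "Components/ActorComponent.h"
#include "Components/SkeletalMeshComponent.h"
#include "GameFramework/Actor.h"
#include "NPCBrainComponent.generated.h"

UCLASS()
class UNPCBrainComponent : public UActorComponent
{
    GENERATED_BODY()

public:
    // Resolve the sibling component on first use rather than in BeginPlay, so
    // it no longer matters which component initialises first during a map load.
    USkeletalMeshComponent* GetMesh()
    {
        if (!CachedMesh)
        {
            if (const AActor* Owner = GetOwner())
            {
                CachedMesh = Owner->FindComponentByClass<USkeletalMeshComponent>();
            }
        }
        return CachedMesh;
    }

private:
    UPROPERTY(Transient)
    USkeletalMeshComponent* CachedMesh = nullptr;
};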

Wednesday 04.11.2020

I added Visual Logger output to the AI stuff. It’s a bit more useful than my on-screen output, as it lets me draw lines and shapes in the world and play them back at my leisure. There’s a good overview of it here.
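For reference, the calls look something like this (the category and the helper function are illustrative; the macros compile out of shipping builds):

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "VisualLogger/VisualLogger.h"

DEFINE_LOG_CATEGORY_STATIC(LogWaspAI, Log, All);

void LogChase(const AActor* Npc, const FVector& TargetLocation)
{
    // Text entry, shown on the Visual Logger timeline for this actor.
    UE_VLOG(Npc, LogWaspAI, Log, TEXT("Chasing target"));

    // A line segment drawn in the world during playback.
    UE_VLOG_SEGMENT(Npc, LogWaspAI, Log, Npc->GetActorLocation(), TargetLocation,
                    FColor::Red, TEXT("chase"));
}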

I also added a way for AIActions to retrieve the current GameplayTag. I’ve not been using tags up to now, but I’m starting to see how useful they can be. Essentially they’re hierarchical names: foo.bar.baz, which doesn’t sound all that useful, but it’s enough to do some filtering and categorisation with. For example, when computing the score, actions can check the current tag to see if it’s a child of “Idle” and adjust their score to break out and do something. Or the reverse: wait for an idle to time out. That’s useful decoupling.
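A sketch of the sort of child-of-“Idle” check I mean, using FGameplayTag::MatchesTag (the tag names are just examples, not the project’s actual tags):

#include "CoreMinimal.h"
#include "GameplayTagContainer.h"

bool IsIdleState(const FGameplayTag& CurrentTag)
{
    // "NPC.State.Idle" is a placeholder; MatchesTag is true for the tag itself
    // and for any child, e.g. NPC.State.Idle.Sniff.
    static const FGameplayTag IdleRoot =
        FGameplayTag::RequestGameplayTag(FName("NPC.State.Idle"));

    return CurrentTag.MatchesTag(IdleRoot);
}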

I also made a start on re-implementing the Rat, and again hit the issue with UObject-derived blueprints being limited in what functions they can call. This is a serious issue with my system -- I can’t run EQS, raycast, or, well, do anything useful without bouncing up to the NPC -- so before I pulled the trigger on inheriting from AActor, I did some more digging.

UObject-derived Blueprints can’t use Function Libraries by default, because they don’t have a World Context. Although the World Context could be passed down the chain (through the AIController, say) when calling a method, UE’s still going to say this isn’t safe and will give some nasty-looking warnings in the editor, like the image from last week.

The fix turns out to be as simple as implementing:

UWorld* UAIAction::GetWorld() const
{
    // The Class Default Object never has a valid world, so bail out early.
    if (HasAllFlags(RF_ClassDefaultObject)) return nullptr;

    // Cached when the action is created (see below).
    return m_pWorldContext;
}

In my case I’m doing NewObject<>() and then calling a second function to set m_pWorldContext (and some other cached references), but that seems fine. In the editor my AIAction blueprints now have access to the Blueprint Function Library, and anything else I want.
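Roughly, the pattern looks like this (the Initialise name, the owning controller class, and what gets passed are illustrative):

// The second step that sets the cached world (plus any other references):
void UAIAction::Initialise(UWorld* pWorld)
{
    m_pWorldContext = pWorld;
}

// Construction, e.g. from the AIController that owns the actions:
UAIAction* AAIControllerNPC::CreateAction(TSubclassOf<UAIAction> ActionClass)
{
    UAIAction* Action = NewObject<UAIAction>(this, ActionClass);
    Action->Initialise(GetWorld());
    return Action;
}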

As Aladdin would sing: It’s a whole new world…

Thursday 05.11.2020

Teaching during the morning, but had my first run at AI Actions with world context, and yeah, super happy now. I can raycast, project onto the nav mesh, kick off movement, and check state all from within one BP. Was super easy to dev and debug, and the result’s doing exactly what I want.
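In C++ terms the equivalent is roughly this (the action class and helper are purely illustrative; the point is that all of it now works off a plain GetWorld() call):

#include "CoreMinimal.h"
#include "Engine/World.h"
#include "NavigationSystem.h"

bool UAIAction_Chase::FindGroundPointNearTarget(const FVector& Target, FVector& OutPoint) const
{
    UWorld* World = GetWorld();
    if (!World) return false;

    // Raycast straight down to find the ground below the target.
    FHitResult Hit;
    const FVector TraceEnd = Target - FVector(0.f, 0.f, 1000.f);
    if (!World->LineTraceSingleByChannel(Hit, Target, TraceEnd, ECC_Visibility))
    {
        return false;
    }

    // Project the hit onto the nav mesh so any move request is reachable.
    UNavigationSystemV1* NavSys = FNavigationSystem::GetCurrent<UNavigationSystemV1>(World);
    FNavLocation NavLocation;
    if (!NavSys || !NavSys->ProjectPointToNavigation(Hit.Location, NavLocation))
    {
        return false;
    }

    OutPoint = NavLocation.Location;
    return true;
}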

I had to make one tweak to the interface, exposing the Reset event to Blueprints (it was pure C++ before). But otherwise, all good.
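That just means marking the event up so Blueprint subclasses can override it; something like this, though the exact specifiers are an assumption rather than what I actually settled on:

// A BlueprintNativeEvent keeps a C++ default implementation while letting
// Blueprint subclasses override it. (Specifier choice is an assumption.)
UFUNCTION(BlueprintNativeEvent, Category = "AI Action")
void Reset();
virtual void Reset_Implementation();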

I feel like I’ve jumped all the hurdles now. This is definitely going to do what I need.

Friday 06.11.2020

Added an idle state to the Rat. It’ll time out after a few seconds, then randomly decide to idle for a few seconds. It calls into the NPC Blueprint to trigger the anim montage, so that pattern’s working as I expected.
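That call is the same “let the NPC worry about the how” idea from Tuesday: the AIAction just asks the NPC to do it. A sketch of what the NPC side might look like in C++ (in my case it’s a Blueprint function; the class, montage property, and function name here are illustrative):

#include "GameFramework/Character.h"
#include "Animation/AnimInstance.h"

void ANPCCharacter::PlayIdleMontage()
{
    // The NPC owns the animation details; callers never touch the montage.
    if (IdleMontage && GetMesh() && GetMesh()->GetAnimInstance())
    {
        GetMesh()->GetAnimInstance()->Montage_Play(IdleMontage);
    }
}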

The rest of the day was spent scripting the next Vlog and updating social media. I’ll start capturing footage and editing the Vlog tomorrow...
