"First AI"

Posted: 02 October, 2020

Monday 21.09.2020

I have absolutely no idea what I was smoking when I wrote the moving platform code, but it must have been strong. For some reason I did the movement based on distance checks between nodes, rather than lerps. This breaks in all sorts of fun and exciting ways; a frame stutter is enough to make the checks fail, resulting in the platform sailing off into oblivion, which made for some fun on the Twitch stream, if nothing else.
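
For the curious, the lerp version boils down to something like the sketch below. It's a rough illustration rather than the actual project code (class and variable names are made up): the platform's position is purely a function of time along the current segment, so a long frame just means a bigger step, never an overshoot past a check.

```cpp
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "MovingPlatformSketch.generated.h"

UCLASS()
class AMovingPlatformSketch : public AActor
{
    GENERATED_BODY()

public:
    AMovingPlatformSketch()
    {
        PrimaryActorTick.bCanEverTick = true;
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);
        if (Nodes.Num() < 2)
        {
            return;
        }

        // Drive an alpha from elapsed time rather than checking distance to the node.
        SegmentTime += DeltaSeconds;
        const float Alpha = FMath::Clamp(SegmentTime / SegmentDuration, 0.0f, 1.0f);
        SetActorLocation(FMath::Lerp(Nodes[CurrentNode], Nodes[NextNode], Alpha));

        if (Alpha >= 1.0f)
        {
            // Arrived: advance to the next segment and restart the timer.
            CurrentNode = NextNode;
            NextNode = (NextNode + 1) % Nodes.Num();
            SegmentTime = 0.0f;
        }
    }

    UPROPERTY(EditAnywhere)
    TArray<FVector> Nodes;          // world-space path points

    UPROPERTY(EditAnywhere)
    float SegmentDuration = 2.0f;   // seconds to travel each segment

private:
    int32 CurrentNode = 0;
    int32 NextNode = 1;
    float SegmentTime = 0.0f;
};
```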

Anyway, they’ve been fixed. And since I was in there, I implemented the DungeonRoomObject interface and made sure they’re not ticking unless the player’s in the same room.
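
The interface itself isn't shown here, so take this as a guess at its shape: a couple of room-notification hooks that the actor uses to switch its own ticking on and off (method and class names invented for illustration).

```cpp
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "RoomGatedActor.generated.h"

// Guesswork: the post names a DungeonRoomObject interface but not its methods.
class IDungeonRoomObject
{
public:
    virtual ~IDungeonRoomObject() = default;
    virtual void OnPlayerEnteredRoom() = 0;
    virtual void OnPlayerLeftRoom() = 0;
};

// A room-bound actor (e.g. a moving platform) only ticks while the player
// is in its room.
UCLASS()
class ARoomGatedActor : public AActor, public IDungeonRoomObject
{
    GENERATED_BODY()

public:
    virtual void OnPlayerEnteredRoom() override { SetActorTickEnabled(true); }
    virtual void OnPlayerLeftRoom() override    { SetActorTickEnabled(false); }
};
```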

Unfortunately, moving platforms can still cause trouble with the physics. Atm they don’t appear to push the player through anything, but I know from bitter experience that someone will glitch the hell out of these. I’m deferring any other checks until my QA chum starts abusing them…

Tuesday 22.09.2020

I have a cold. I’m not happy.

I’ve not been up for anything heavy, but I did bounce around some bits and bobs:

Phil sent me some ditties. We’re hunting for our equivalent of the little tunes played when you uncover a secret, open a chest, get a key-item, etc. We’re not there yet, but I put some of the examples that fit best into the build, and made a video for Phil to wrap his ears around.

Wednesday 23.09.2020

Still snotty…

I added game state and code for the last “sequence” in the dungeon: heading back to unlock the path to the heart container. It’s all centred in one room, but you’ll need to hit it 3 times to get it. Very simple by Zelda standards.
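
Purely as illustration, the state behind that kind of switch doesn't need to be more than a counter. A hypothetical sketch (none of these names are from the actual game):

```cpp
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "ThreeHitSwitch.generated.h"

DECLARE_MULTICAST_DELEGATE(FOnSwitchUnlocked);

// Hypothetical three-hit switch; the post only says "hit it 3 times", the
// rest is invented for illustration.
UCLASS()
class AThreeHitSwitch : public AActor
{
    GENERATED_BODY()

public:
    FOnSwitchUnlocked OnUnlocked;

    // Call from whatever detects the hit (melee trace, projectile overlap, ...).
    void RegisterHit()
    {
        if (bUnlocked)
        {
            return;
        }
        if (++HitCount >= HitsRequired)
        {
            bUnlocked = true;
            OnUnlocked.Broadcast();  // e.g. open the path to the heart container
        }
    }

private:
    UPROPERTY(EditAnywhere)
    int32 HitsRequired = 3;

    int32 HitCount = 0;
    bool bUnlocked = false;
};
```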

ZTargets can now be temporarily disabled. This is only needed by the Sticky Mitt totems, and even then, only for a few rooms before you collect the Mitt. But it was annoying having them targetable when you couldn’t interact with them.
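
In sketch form that's little more than a flag the targeting search respects; something like this (hypothetical names, not the real ZTarget code):

```cpp
#include "CoreMinimal.h"
#include "Components/ActorComponent.h"
#include "ZTargetComponent.generated.h"

UCLASS()
class UZTargetComponent : public UActorComponent
{
    GENERATED_BODY()

public:
    // The targeting search skips any component where this returns false.
    bool IsTargetable() const { return bTargetable; }

    // The Sticky Mitt totems flip this off until the player has the Mitt.
    void SetTargetable(bool bInTargetable) { bTargetable = bInTargetable; }

private:
    UPROPERTY(EditAnywhere)
    bool bTargetable = true;
};
```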

I also made a new type of door, made of metal bars, that sent me down a little rabbit hole…

I thought it’d be good to do the open / close as animations in Blender, but it ended up being a bit of a pain in the ass. Blender’s export to FBX is, er, fiddly when it comes to importing into UE4. So why not try Epic’s “Send To Unreal” plugin?

This worked a treat; all the animations came in at the correct scales, and the bone transforms were clean. I can see it being a big time-saver, but it’s a change of workflow. Instead of save to .blend -> export to FBX -> import FBX to UAsset, it’s just “Send to Unreal”, and I’m not sure I’m prepared to go without FBX entirely. What if I want to re-import the asset at a later point? Or move it in the content folder?

Seems great for iterating, but I’m going to need a security blanket for a little while longer.

After all that, I ended up doing the opening sequence via a timeline, as my doors aren’t set up for skeletal meshes in code, and meh, wasn’t in the mood…

I made a packaged build and gave it a quick play-through. Spotted some text bugs and realised that none of the builds I’ve made to date have had VSync on by default!

Er…

whistles
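
For the record, and not necessarily the exact fix that went in here: the usual UE4 way to default VSync on is either something like bUseVSync=True in DefaultGameUserSettings.ini, or forcing it through UGameUserSettings at startup, roughly:

```cpp
#include "GameFramework/GameUserSettings.h"

// e.g. in a GameInstance Init override (UMyGameInstance is a placeholder name).
void UMyGameInstance::Init()
{
    Super::Init();

    if (UGameUserSettings* Settings = UGameUserSettings::GetGameUserSettings())
    {
        Settings->SetVSyncEnabled(true);
        Settings->ApplySettings(/*bCheckForCommandLineOverrides=*/ false);
    }
}
```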

Thursday 01.10.2020

Gave my favourite lecture this morning: a 2.5-hour breakdown of some of the design elements in the early 2D Mario games. And yes, Mario 3 is best, don’t @ me.

My cold hasn't subsided, and I had a complete lack of sleep last night, so feeling pretty wiped out. Treating myself to a sickie. I did play the build and bask in its VSyncy-ness for a while.

Friday 02.10.2020

Ok, I legit couldn't put off doing any AI any longer.

I've done a little bit with UE's Behavior Trees (in Psyance) so they're not completely unfamiliar, but er, I've forgotten everything. Took a little while to get going, but the Rats are fairly simple; they just need to run around, occasionally pause, do a sniff, and then run somewhere else.

The first pass at a tree for this ended up as:

They have really simple sight perception, so when they see something they stop, find a new location, and then run away. It's avoidance at its simplest, so I'll have to go in and refine it. Ideally they should be making an Environmental Query to find a position away from the actor they've just sensed. Right now they're picking a location at random, so they often immediately sense the same actor again, which looks a bit weird.
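
As a rough idea of what that refinement could look like without EQS, here's a sketch of a custom task that samples random reachable nav points and keeps one that's further from the sensed actor than the Rat currently is (invented names; not the project's actual task):

```cpp
#include "CoreMinimal.h"
#include "AIController.h"
#include "BehaviorTree/BTTaskNode.h"
#include "BehaviorTree/BehaviorTreeComponent.h"
#include "BehaviorTree/BlackboardComponent.h"
#include "NavigationSystem.h"
#include "BTTask_FindFleePoint.generated.h"

UCLASS()
class UBTTask_FindFleePoint : public UBTTaskNode
{
    GENERATED_BODY()

public:
    virtual EBTNodeResult::Type ExecuteTask(UBehaviorTreeComponent& OwnerComp, uint8* NodeMemory) override
    {
        UBlackboardComponent* Blackboard = OwnerComp.GetBlackboardComponent();
        AAIController* Controller = OwnerComp.GetAIOwner();
        APawn* Pawn = Controller ? Controller->GetPawn() : nullptr;
        if (!Blackboard || !Pawn)
        {
            return EBTNodeResult::Failed;
        }

        const FVector RatLocation = Pawn->GetActorLocation();
        const AActor* Threat = Cast<AActor>(Blackboard->GetValueAsObject(ThreatKey.SelectedKeyName));

        UNavigationSystemV1* NavSys = UNavigationSystemV1::GetCurrent(Pawn->GetWorld());
        if (!NavSys)
        {
            return EBTNodeResult::Failed;
        }

        // Try a handful of random reachable points and keep the first one that
        // is further from the threat than the Rat currently is.
        FNavLocation Candidate;
        for (int32 Attempt = 0; Attempt < 8; ++Attempt)
        {
            if (!NavSys->GetRandomReachablePointInRadius(RatLocation, FleeRadius, Candidate))
            {
                continue;
            }
            if (!Threat || FVector::DistSquared(Candidate.Location, Threat->GetActorLocation())
                         > FVector::DistSquared(RatLocation, Threat->GetActorLocation()))
            {
                Blackboard->SetValueAsVector(DestinationKey.SelectedKeyName, Candidate.Location);
                return EBTNodeResult::Succeeded;
            }
        }
        return EBTNodeResult::Failed;
    }

    UPROPERTY(EditAnywhere)
    FBlackboardKeySelector ThreatKey;       // actor we just sensed

    UPROPERTY(EditAnywhere)
    FBlackboardKeySelector DestinationKey;  // where to run to

    UPROPERTY(EditAnywhere)
    float FleeRadius = 800.0f;
};
```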

It's a first pass, just to dip my toe back in. I'm not entirely happy about how many files this generates, though :(

Anyway, here's a little look-see at them dropping in:
