Cop Out

Been trying to work out Playmaker schemes in my head, how to make things interact. My big crowning achievement last week was adding a sound effect to the player shaking his cup. My thanks go out to the kids’ babysitter, who happened to walk in with a paper coffee cup, which I promptly commandeered to get that ‘authentic’ panhandler sound effect. Since most of my dev time is in the wee hours of the morning, I needed to hide in the basement bathroom to record it, lest I wake anyone…

At some point, I’m going to upload a very pre-alpha build to /r/oculus, just to get some preliminary feedback and perhaps a few pointers on Playmaker and optimization. In the meantime, I’ve been working on another idea: a police officer who interacts with the player – perhaps using the player’s gaze as a variable. Stare at the cop too long and he wanders over to tell you to scram. Flip him the bird and *BAM!* You get beat up and arrested – game over.
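If I were writing it in straight C# rather than PlayMaker, I imagine the gaze check would look something like the sketch below – a raycast out of the headset, a timer that builds while you stare, and a trigger once it crosses a threshold. The class, field names and numbers are just illustrative assumptions, nothing from the actual project:

```csharp
using UnityEngine;

// Rough sketch of the "stare at the cop too long" idea in plain C#.
// In the real project this would be a handful of PlayMaker actions instead;
// the names and numbers here are illustrative assumptions, not project code.
public class CopGazeWatcher : MonoBehaviour
{
    public Transform playerCamera;      // the VR camera (center eye)
    public float gazeSeconds = 3f;      // how long a stare is tolerated
    public float maxGazeDistance = 20f; // ignore stares from across the station

    float gazeTimer;
    bool approaching;

    void Update()
    {
        // Cast a ray straight out of the player's view and see if it hits this cop.
        // Assumes the cop's collider sits on the same GameObject as this script.
        RaycastHit hit;
        bool staring = Physics.Raycast(playerCamera.position, playerCamera.forward,
                                       out hit, maxGazeDistance)
                       && hit.transform == transform;

        // Build the gaze timer up while staring, bleed it off otherwise.
        gazeTimer = staring
            ? gazeTimer + Time.deltaTime
            : Mathf.Max(0f, gazeTimer - Time.deltaTime);

        if (!approaching && gazeTimer >= gazeSeconds)
        {
            approaching = true;
            // Here the cop would start walking over to tell the player to scram,
            // e.g. by triggering an animation state or sending a PlayMaker event.
            Debug.Log("Cop: Hey! Move along.");
        }
    }
}
```

In practice I’d try to wire the same logic up as PlayMaker actions (a raycast, a float add, a float compare) rather than hand-rolled code.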

So, it’s baby steps – I got this model from Yobi3D, rigged it and animated a walk cycle using this tutorial – I think it needs a little tweaking, but I’m generally pleased with the results:


I realized that the tagline of this blog is ‘Game Dev AND Stay at Home Parenting’ and I’ve done little to document the latter part of that clever little phrase…

Just about every day, I try to get up an hour or more before the kids to get some work done. Some days, like today, I can plug away at my secondary character’s animation. It’s nice and quiet; I sip my coffee and try to imagine a well-to-do Upper East Side wench telling off some panhandler on her morning commute – what her posture is like, how she’d thrust an accusatory finger while saying something derogatory.

Other mornings, one or both of my progeny will throw a wrench into the works, wake abysmally early and wreck whatever I had planned. Just last week I heard footsteps on the floor upstairs. I sighed and made a few quick notes on what I was planning to work on, hoping that after I got the elder child off to summer camp, I’d catch up later when son #2 took a nap. It happens. And if those footsteps were any indication, it was my older kid, and we’d get to snuggle on the couch for a bit, watch some PBS Kids and have a quiet moment together before the other one woke up. So I wend my way upstairs and gingerly open the creaky door to the kids’ room…

My son is busy barfing on the floor.

He’s upset; he doesn’t get sick very often and is scared by what’s happening. I’m trying to calm him so he doesn’t choke, trying to hold him in a position where anything else that might come out will land on the hardwood floor instead of his playmat, and keeping him at a bit of a distance so I don’t get barfed on.

And now the other kid is awake – and screaming because he doesn’t know what is going on.

Yeesh – I just wanted to work on a walk cycle.

Offer calming noises to the youngest, strip the older one, grab a handful of paper towels to clean the floor, grab both kids to hold them, sit down on the bed to soothe them, and try to remember if I’ve got the summer camp number saved on my phone and, if I don’t, where it might be – because I’ve gotta call and let them know my kid ain’t coming today.

Some days are like that. Other days it’s full-on Thunderdome, where I’m trying to send a simple message on Reddit asking where a good place is to get feedback on my pre-alpha build, and the two of them are leaping off the furniture Crouching Tiger style with chainsaws, locked in gladiatorial combat…


Anyway – I’ve just recently added PlayMaker to my arsenal, and while it took me a bit to get the hang of it, it looks like it could radically change my game for the better. Now excuse me, my youngest is kicking the side of his cage – I mean crib – demanding release…


Rigging the game

The further I wend my way into this VR experience, the more complicated things become. I was lucky enough that a talented voice actress recorded a few lines of dialogue for me, so I’ve been trying different methods of getting more advanced animation, especially facial expressions, into Unity. I tinkered around a bit with the .MHX2 file format – but getting it to work with blendshapes in Unity seems to be a real Gordian knot to untie.
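For the curious, this is roughly what I’m hoping ends up on the Unity side once the export cooperates: a head that imports as a SkinnedMeshRenderer with named blendshapes I can drive from a script (or, eventually, a PlayMaker action). A minimal sketch with placeholder names, since I don’t yet know what the .MHX2 route will actually spit out:

```csharp
using UnityEngine;

// Minimal test of driving a facial blendshape in Unity, just to show what
// I'm hoping to get out of the .MHX2 route. The mesh and shape names
// ("MouthOpen") are placeholders; whatever the export actually produces
// would go here.
public class BlendshapeTest : MonoBehaviour
{
    public SkinnedMeshRenderer face;          // the character's head mesh
    public string shapeName = "MouthOpen";    // placeholder blendshape name
    [Range(0f, 100f)] public float weight;    // Unity blendshape weights run 0-100

    void Update()
    {
        int index = face.sharedMesh.GetBlendShapeIndex(shapeName);
        if (index >= 0)
        {
            // Slide 'weight' in the inspector to open/close the mouth.
            face.SetBlendShapeWeight(index, weight);
        }
    }
}
```

If the shapes come through at all, lip sync then becomes a matter of animating those weights over time against the recorded dialogue.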

But even as one problem gets stickier, another is getting easier. I’ve decided to add a police officer as an antagonist, and I’m toying with a mechanic where, if the user looks directly at him, he gets aggravated and harasses them. But that’s down the road – I’ve been busy working on one of my favorite parts of 3D: rigging!

Thanks to Yobi3D for the model!

cop model


Push Forwards

I’ve been encountering difficulties with the Rift SDK – my project wouldn’t play in the editor and my builds would crash. Since it’s new tech, getting help is a bit of an uphill battle, as everyone is still figuring it out.

Out of the blue, Valve announced their own SDK plugin for Unity (with UE4 support next week), and I thought I’d give it a shot. Well, it’s a win/lose: I can play in the editor (and in VR!), but my builds only show a black screen (though at least they run without crashing!).

Also – I saw a generous VO actor on Reddit offering to record some dialogue for free, pitched my idea, and she accepted! So I’ll have a few disparaging lines from a snooty secretary type to mess around with. Which means I’ll have to explore the .MHX2 option for MakeHuman export, as the regular .MHX gives me only a single jawbone to try to lip sync with…

Success! and a setback

I finally overcame the problem of animating MakeHuman characters and getting them into Unity. The problem more or less resolved itself when I upgraded to Unity 5. And now my token booth clerk walks, talks and cusses – thanks to dialogue I recorded at 6 AM, when my voice isn’t exactly spry.

Only now, I can’t quite get the Rift integration to work – there are a bunch of coding references I don’t get – and one unhelpful person over at /r/oculusdev suggested I ‘learn to code’ to fix my problem. Yeah pal, as soon as I see a programmer start drawing proportioned, balanced and properly shaded figures or demonstrate how they can make texture maps with alpha masks…

Anyway – neat side note: the texture glitched on my guy, and he has a cyberpunk-ish feel about him now; if I ever do something in a sci-fi setting, I will definitely re-use him.


Model + Paint

Modelling out the token booth for my VR experience “BUM.”

While I don’t consider myself that great at modelling, I feel I get the idea across and am learning every time I make something new. Blender has proven to be a bit challenging, since I’m used to modelling with polys and it constantly wants to convert things to n-gons. Still, I can see the merit in loop cuts and the knife tool – it’ll just take a bit to learn and implement them effectively. It’s also been helpful to convert my old Truespace models over to Blender and see how my skills have improved.

What I truly enjoy is the texture painting – a long time ago I got a copy of 3D Paint, and while it was very basic in its toolset, it did open my eyes to a new way of texturing 3D objects. I like to get my basic UV layout and color areas established in Blender, then pull my texture maps into Photoshop to work out details, then go back into Blender for shadows, dirt and other bits of wear.

The UV maps, plus live preview of the texture in progress.

While the overall goal is to emulate the look of the classic NYC booth, I did need to make adjustments, since a seated character wouldn’t be visible to the player (who is sitting on the floor of the subway station), and I eliminated some extra details, such as a card swiper, so as to avoid ‘dating’ my booth.

More WIP shots and in-engine pics to come.

Attack of the Cousins: Special Edition

I was originally inspired to do this by the dad who did VFX shots of his kids – the original was simply lightsabers comped onto the kids, plus a photo of the remote from Ep. 4 moved around in Premiere.

Then I discovered camera motion tracking in Blender.

It was rough, as this shot was very shaky, jerked around a lot and had no trackable markers in common between the beginning and the end. I managed to bust it up into smaller clips and track those – but even then, they were atrocious to work with. I’m particularly happy with the pit and vaporators at the beginning of the 2nd clip, and I like that Blender added motion blur to match the shot – now if I could only figure out how to make the shadows more intense…


Sometimes the best way to continue a project is to scrap it all and start over. I have been working sporadically on an animation set to the tune “UFO’s, Big Rigs, & BBQ” by Mojo Nixon, featuring a redneck truck driver, a space alien and – of course – BBQ.

I was able to get decent results creating a humanoid figure using Sculptris (an amazing free program) and was delighted to import him into Blender, get him rigged and make him move. Problems arose when I discovered my mesh wasn’t optimal for animating, had some proportions too far out of whack and had limited ranges of motion.

It also didn’t help that my reference material (me, filming myself acting out what I wanted my character to do) was poorly acted, not thought out and too herky-jerky. It’s what I get for guzzling a ton of coffee and filming myself…

Anyway – back to the drawing board. I’m resculpting my character so he animates better, and instead of filming my mouth lip-syncing the song and projecting it onto my mesh, I’m giving him a real mouth and will use shape keys to lip-sync him to the song – more complicated, but closer to what real animators do.

is it redneck-y enough?
Will need further refinement, but getting there…