Workin’ in a Coal Mine

Been a long month of no dev.

The missus went on a vacation with her family and I had charge of the two boys for 10 days, solo. As often as I say that being a SAHD is the toughest thing I’ve ever done, doing it by myself was even harder. My hat’s off to all single parents: you are doing a herculean task under the most trying of circumstances.

On top of the actual time the wife was gone, there was plenty of extra kid-watching time as she was packing, researching side trips, getting paperwork in order, etc. Add to that, Son #2 has been waking earlier, which cuts into my dev time, and my sister-in-law commissioned 3 paintings for her husband – so, well, maybe I can squeeze in 5 minutes of dev stuff here and there.

What I have been working on is a potential “Half Life Stories: VR” game – something to experiment with, and potentially something I can convince my local dev group to push to the finish line and release.

I’ll most likely be seeing my folks in a week, so I can scour their photo collection for more bits to add to the “Shore Road VR” project – which has stalled because I have zero reference photos of the old back dining room and back porch, as well as the upstairs.

I’m also trying to soak up a ton of new knowledge, since I discovered my library grants full access to Lynda.com – tons of Unity, Blender, and Photoshop tutorials.

I just wish I had more time.

We gotta ship SOMETHIN’!!

Sometimes things just seem to fall into place.

I treated myself to a little birthday gift. I had been trying to incorporate Papagayo lip-sync data into the .MHX2 rigs and was getting nowhere trying to get the humanoid models to:

A) accept .dat targeted lip-sync data

B) accept and merge multiple motion capture files

C) export all that to Unity and look nice

For whatever obsessive reason, my brain cannot come up with game ideas that involve characters that are static blobs, robots with very little animation, or monsters with only two simple expressions: meh and pure volcanic rage.

Nope – I have to have humanoid NPCs: somewhat realistic, and certainly with responsiveness and emotion on par with Saturday Morning Cartoons… Talk about a challenge.

So when I encountered lip-sync packages on the Asset Store, I did some research and settled on LipSync Pro. Having a bit of animation experience, I knew the more phonemes a character supports, the more realistic dialog will be, and LSP seemed to fit the bill. It even works with Fuse characters, which aren’t as impressive as the .MHX2 rig – which works beautifully with Tore Knabe’s Realistic Eye Movements.

So I installed LSP, downloaded a Fuse model, and amused myself with some impromptu dialog recorded from my three-year-old. I got big laughs from the kids when I showed them a sample of a truck drivin’ redneck dude asking for treats because he used the potty.

Suffice it to say, I was impressed with the results and began wondering how else I could use this. I started rebuilding my VR Bum sim because I was getting atrocious frame rates, so I’m pulling all my assets back into Blender to see how many polys I can slash off everything and optimize the level. Sovereignty is on hold until I can get some new dialogue recorded, and I have a ton of 2D game design to do before I can start on the platformer based on my kids’ artwork.

I was scrolling through some old projects when I happened to see my old 3D animation work folder, and I searched it for things that I could adapt into a VR diorama. And lo and behold, there was my ‘learn Blender’ project – a music video for Mojo Nixon’s “UFOs, Big Rigs & BBQ”.

So, I’m chopping it down to a minute in length, adding various animations to the Fuse character and a background, and I’ll add some 3D animated video so it looks like the Fuse character is delivering a performance in front of a screen playing highlights of the song.

I asked on /r/vive if people would like to offer feedback on a WIP, and they seemed pretty desperate for any VR content, so that, coupled with the itch/drive to put something out there, will keep me busy for a while. VR is only going to get bigger and more crowded, so the quicker I can get my name out there, the better. Here’s hoping it’s worth it.

Feeling Re-Vive-ed

So, all Dev work has ground to a halt with the arrival of my Vive.

I’ve been delighting in room tracking (even though my office is standing room only), and I’ve been amazed at tracked controllers. (Now I REALLY want a VR version of Thief – after playing the Longbow demo in The Lab, being able to actually use a bow to shoot water arrows, or use a controller as a blackjack, would be indescribably satisfying.)

But every hour spent agog in VR is an hour less spent developing my own work. So, after dragging my feet a bit, I fired up Unity to see what I could do to convert my DK2 projects over to SteamVR.

It’s a completely different way of doing things.

I completely broke all of my experiences, and in the case of my BUM simulator, broke my level geometry – I have to scrap it and rebuild it all from scratch. Which is fine, because I discovered something wonderful.

In the SteamVR examples is an IK scene that allows your controllers to control virtual arms, attached the way real arms would be in the virtual world:

Ye gods – if I incorporate that into my scene, you could shake your cup to rattle it! I am so excited this possibility exists. Now the tricky part: figuring out HOW to make it work. My google-fu yields nothing – not any teases on how that got made, or how I would adapt it to my character, and whether I would have to rebuild my character from scratch, or simply rename parts of it…
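For my own notes, here’s my rough guess at what an arm IK like that is doing under the hood – this is NOT SteamVR’s actual code, just the classic two-bone analytic solve (law of cosines) in a toy 2D Python form, with names I made up:

```python
import math

def two_bone_ik(shoulder, target, upper_len, fore_len):
    """Solve a 2D two-bone arm (shoulder -> elbow -> wrist) so the wrist
    reaches `target`. Returns (shoulder_angle, elbow_bend) in radians,
    where elbow_bend is the deviation from a fully straight arm.
    Out-of-reach targets are clamped so the arm just straightens."""
    dx, dy = target[0] - shoulder[0], target[1] - shoulder[1]
    dist = math.hypot(dx, dy)
    # Clamp the target distance to the reachable range of the two bones
    dist = max(abs(upper_len - fore_len), min(dist, upper_len + fore_len))
    # Law of cosines: interior angle at the elbow
    cos_elbow = (upper_len**2 + fore_len**2 - dist**2) / (2 * upper_len * fore_len)
    elbow_interior = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Angle between the upper arm and the shoulder->target line
    cos_offset = (upper_len**2 + dist**2 - fore_len**2) / (2 * upper_len * dist)
    offset = math.acos(max(-1.0, min(1.0, cos_offset)))
    shoulder_angle = math.atan2(dy, dx) - offset  # pick one of the two elbow poses
    return shoulder_angle, math.pi - elbow_interior
```

In 3D the controller position would give the target and the headset would anchor the shoulder, with an extra "pole" hint to pick which way the elbow points – but the triangle math is the same idea.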

On top of rebuilding everything else too…

A nice learning experience

Last week the Cleveland game devs had a Level Up workshop, focused on Unity and Playmaker – so I signed up, enthusiastic that I could unlock some more of the mysteries of Playmaker and game mechanics.

I was not disappointed. Our moderator, Bill, jumped in with a basic endless runner prototype using primitives and simple FSMs, and soon our cubes were colliding and jumping on command. It got me thinking about the platformer I want to make using my autistic son’s artwork.

I showed the kids my work the next morning, and they gleefully smacked the spacebar to make the little cube hop on command. I started up Audacity and recorded their delighted sounds and snuck them into the FSM. Their eyes grew wide when they heard their own voices yelling ‘jump’ when they hit the spacebar. I made a few more tweaks to reset the level if the player cube got knocked off screen.
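Playmaker builds all of this visually with FSM states, but the core mechanic – jump on input, fall under gravity, restart the level when the player falls off screen – boils down to something like this toy Python sketch (the names and tuning constants are mine, not Playmaker’s):

```python
GRAVITY = -30.0       # units/sec^2, tuned by feel, not physics
JUMP_VELOCITY = 12.0  # upward speed applied on a jump
FLOOR_Y = 0.0
KILL_Y = -5.0         # below this, the player has been knocked off screen

class Runner:
    """Minimal vertical state of an endless-runner player cube."""
    def __init__(self):
        self.reset()

    def reset(self):
        # Equivalent of Playmaker's "restart level" state
        self.y = FLOOR_Y
        self.vy = 0.0
        self.grounded = True

    def jump(self):
        # Only allow a jump from the ground (no double-jump);
        # this is also where the recorded "jump!" sound would fire
        if self.grounded:
            self.vy = JUMP_VELOCITY
            self.grounded = False

    def update(self, dt):
        if not self.grounded:
            self.vy += GRAVITY * dt
            self.y += self.vy * dt
            if FLOOR_Y >= self.y > KILL_Y:
                self.y, self.vy, self.grounded = FLOOR_Y, 0.0, True
        if self.y <= KILL_Y:  # knocked off screen -> restart
            self.reset()
```

Seeing it as a handful of states like this is exactly why the FSM approach clicked for me at the workshop.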

I can see why this sort of game is so popular with beginner devs – it’s quick, easy, and has an iterative cycle that makes it simple to add mechanics and playtest. It makes me wonder whether I should focus on one as a means to generate quick income. The HTC Vive goes on pre-order next week and I could really use the money.

Hello Sunshine!

It’s the Little Things

So, last week – things got a little sidetracked. We were getting ready to go to church and my eldest slipped and cracked his chin on the steps. MIND YOU – he was holding the handrail AND MY HAND when this happened. All the times he has walked down those steps holding a load of toys so large he couldn’t see over them, slid down the steps on his butt, or refused to hold anything – he could have done it blindfolded while juggling chainsaws – nothing. But the minute we take the slightest precautions… BAM – four stitches.

THIS week, my youngest has reverted back to his very needy, meltdowny phase. The running joke is that he clings to me like I’m the last chopper out of Saigon. It had been slowly building for the past few days (maybe based on the amount of attention his brother has been getting), but last night he wanted to take his sippy cup into the bath with him. A small, innocuous request – one I easily agreed was fine. And I told him so.

For reasons that will baffle and confound me to the grave, it led to an hour-and-a-half meltdown that no amount of hugging, cajoling, bribing with sweets, threats of an all-boys-Christian-scared-straight-military-academy, offers to buy him a college education or a mountain of candy, or anything else, would fix. He then proceeded to wake a few times during the night – which led to a co-sleeping session, something I haven’t done with him in many moons.

And yet – game dev MUST continue.

I’ve been listening to a Udemy course on FPS games and Playmaker, and while a lot of these courses are retreads of the ‘here is the interface / options / how to get started’ lectures, I do tease out new nuggets of advice and tips each time I listen to one. This time I was able to put together the physics I needed to make a couple of balloons on strings behave like they should: swaying in the breeze and bouncing off each other in a realistic way. I was so impressed with it, I’m tempted to make it the loading screen…
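In Unity I did this with rigidbodies and joints, but the behavior I was chasing – a buoyant balloon tethered by a string, pushed sideways by a breeze – is essentially a damped pendulum turned upside down. A toy Python sketch of that idea (all constants and names are made up for illustration):

```python
import math

def balloon_sway(wind, lift=12.0, length=1.0, damping=1.5,
                 theta=0.0, omega=0.0, dt=1 / 120, steps=5000):
    """Integrate the swing angle of a balloon tethered ABOVE its anchor.
    Buoyancy (`lift`) is the restoring force pulling it back to vertical;
    a constant horizontal `wind` pushes it sideways. Semi-implicit Euler.
    Returns the final angle from vertical (radians)."""
    for _ in range(steps):
        # angular acceleration: restoring buoyant term vs. wind term,
        # plus simple velocity damping so the sway settles
        alpha = (-lift * math.sin(theta) + wind * math.cos(theta)) / length \
                - damping * omega
        omega += alpha * dt
        theta += omega * dt
    return theta
```

With damping, the angle settles where the two torques balance, i.e. tan(theta) = wind / lift – which matches the gentle steady lean you see on a real balloon string before a gust knocks it around.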

I was also able to attend a ‘Level Up’ lecture with the Cleveland Game Dev group – a once-in-a-while chance to hang with peers in the area who have similar interests, and to learn a bit about making pixel art. I’m still interested in creating a point-n-click or side scroller based on my eldest and his artwork, so it was a welcome break from the kids and the oppressive weather keeping us indoors, with the kids constantly battling each other. I’d love to be able to spend more time with the local game dev group, but the demands of daddyhood often leave me tired, and they only meet in my neck of the woods every so often.

So – what’s next? I’m currently struggling with Playmaker, Mecanim, and getting meaningful acting out of my MakeHuman and Fuse characters, and trying to learn the new Quixel suite – both of which have been daunting, and I’ll be venting about them here soon enough.