I had the good fortune to be interviewed about the game. I was cruising the Cleveland Game Dev posts on Facebook, and someone mentioned that the InfiniteGamer Podcast was a great way to get exposure. Chris from IG agreed, so I asked if he was interested in a game based on a 6 year old’s drawings.
So after a few emails, we agreed to do the interview in the office (after the wife & I furiously scrubbed the cobwebs and kid debris out of the corners!) and eagerly set up a build for him to demo.
He was impressed with the Vive and loved the demo. Chris instantly got the idea of it and how kiddo’s drawings ultimately ‘make’ the game. Clearly a seasoned gamer, Chris launched us into a discussion about the influences – how my son (who was sitting on my lap, playing ‘his’ game as we talked) is influenced by classics like Super Mario and Zelda, and how that in turn shaped the drawings I was using for the demo.
We wrapped after a while and I got back to my routine. As I was making dinner, the wife reminded me that Lent was around the corner – and I decided alcohol had to go. I need to start losing the spare tire and the beer isn’t helping. That, and when I have booze and then go to bed – I snore more, and that keeps the wife up.
Plus I have more energy, which is good – the sis-in-law is having an Easter brunch – and I need to bear down and get her paintings done. I can’t focus on game dev without thinking about the paintings, and I can’t paint and not think about the game. So I have a month to finish them and then I can focus on more important things.
Like finishing the demo. Like getting my Steam page up and running. Like dealing with the kiddos, and the spate of ‘potty accidents’ we have been experiencing. Like getting the attic done.
So, between furiously trying to finish the triptych for my Bro / Sis-in-laws, I’ve been scrambling to get my demo level completed. Hearing that Steam is ending Greenlight, I’ve got even greater incentive to get my Steam page up and running.
I showed off my level during a birthday party for my youngest (all the kids, and most of the adults, know about my Vive and want to play in between cake and presents) and got tons of feedback. It’s amazing how quickly a 9 year old can become a developer’s harshest critic. It also helps when people just ‘get’ the game – the mechanics, the milieu, the theme. One of the toughest aspects to grapple with is allowing the player to turn. Do I keep it static with no turning? Use a control to turn, like Google Earth VR where you can use the grip controls to spin the earth around? Or, as another player suggested, only allow turning when whatever ‘quest’ requires a different point of view?
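Those three options keep rattling around in my head, so here’s a quick sketch of them as a dispatch table – plain Python with invented names and inputs, since the real thing would read from the headset and controllers:

```python
# Three candidate turning schemes, as a dispatch table.
# All names and inputs are invented for illustration only.

def no_turn(state, _inp):
    # Static: the player never turns.
    return state["facing"]

def grip_turn(state, inp):
    # Google-Earth-VR style: a grip drag spins the player (degrees).
    return (state["facing"] + inp.get("grip_delta", 0.0)) % 360

def quest_turn(state, inp):
    # Only snap to a new facing when the active quest demands it.
    return inp.get("quest_facing", state["facing"])

TURN_MODES = {"static": no_turn, "grip": grip_turn, "quest": quest_turn}

def update_facing(mode, state, inp):
    """Pick the turning scheme by name and compute the new facing."""
    return TURN_MODES[mode](state, inp)
```

Making it data-driven like this would at least let playtesters flip between modes without me rebuilding anything.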
Which leads into more questions: how do I guide my player? My son hasn’t been exactly forthcoming with sketch requests. On my demo, I’d love a simple sign at the start that directs the player to go and find the king… except my kids are just like me: stubborn as all get out.
I am taking the various photos of artwork and teasing out all the letters to upload to MyScriptFont, a site where you can generate a TTF font of your handwriting – I’m using it for a GUI during the ‘sock hunting quest’.
So, out of the blue, I get a text from an old friend back in NYC. He had seen the post I put on Facebook and wanted to chat. I was busy packing up kiddo #2 for a playdate at the library (a long overdue one, to boot!) and said I could chat – but quietly.
We got to our playdate, and #2 son was greeted with smiles and a card made just for him (it amazes me just how insanely cute and adorable kids can be right before they spill oil-based enamel paint all over your vintage 1986 IBM ‘M’ keyboard…) and I explained that I needed to take an important call… like the one coming right this second.
My buddy and I exchanged several foul-mouthed greetings (ye gods I miss some aspects of living in the city – cussing is elevated to an artform there…) and he quickly got down to business: he loved the video and has some people he wants to show it to. He needs documentation and whatever else I can put together to see if we can possibly get funding, and wants a broader idea of what I’m trying to accomplish. He likes that it’s for VR, and we talk about full-on VR rigs and mobile headsets.
Jeez, I’m trying to figure out how to make this into a game- that’s what I’m trying to accomplish.
But – we talk, brainstorm, catch up, cuss some more and now I’m committed to creating a prospectus and perhaps more to see if this idea has legs. In the process, we inadvertently come up with a whole slew of ideas, many of which are waaaay beyond my paltry skills (I love the idea that we can use the controllers to ‘draw’ in the world to help solve problems, like drawing a bridge to get across a canyon…) and I relegate them to expansion pack ideas – or sequels.
It’s also putting pressure on me to get a demo level DONE. Looking at what I had in the level already, plus the drawings floating in the file (plus some ideas from the wifey), I took the castle already in my scene, tossed a drawing of a character with a crown into Spriter, and combined other elements to make a ‘quest’ of sorts: collect 5 stinky socks, and the king will let you pass through the castle:
I’m leaning away from taking drawings that #1 son did and extruding them into game levels; it looks wrong somehow and doesn’t convey the same atmosphere the open world does (in the video, the section of white blocky areas bordered in blue is the old style).
I’ve been finding that the sprites in a 3D world are creating a bit of a headache – a lot of the static background props were slowing down my framerate, so I traced them in Inkscape, exported .svg files into Blender, extruded them and slapped the drawings back on as texture maps. That will make it possible to rotate the quest items (socks) – as soon as I can figure out why my axis isn’t aligned properly.
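My hunch on the misaligned axis: the SVG import leaves the object origin at a corner of the curve rather than the sprite’s visual center, so the sock orbits around that corner instead of spinning in place. The geometry in plain Python (not Blender code; the coordinates are made up):

```python
import math

def rotate_point(p, angle, pivot=(0.0, 0.0)):
    """Rotate 2D point p around pivot by angle (radians)."""
    s, c = math.sin(angle), math.cos(angle)
    x, y = p[0] - pivot[0], p[1] - pivot[1]
    return (pivot[0] + x * c - y * s, pivot[1] + x * s + y * c)

# A sprite whose mesh origin sits at the corner (0, 0) but whose
# visual center is at (1, 1): rotating about the origin makes the
# whole sprite orbit, while rotating about its own center spins it.
center = (1.0, 1.0)
orbiting = rotate_point(center, math.pi)                # center swings to (-1, -1)
spinning = rotate_point(center, math.pi, pivot=center)  # center stays at (1, 1)
```

In Blender, the equivalent fix is Object > Set Origin > Origin to Geometry on the extruded mesh before export.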
I’m hoping to have a level properly in shape and running in a week or two, release it as a beta for playtesting and figuring out problems, and within a month get a demo out on Steam.
Right before I was going to back it up. I had been remiss due to the holidays (and more holidays, and back to school, etc.) and had not backed up in what turned out to be a really long time.
So I desperately tried to remake my project from old backups, demos I posted online, random files from my laptop and anything else floating around. I did OK, getting pretty close to my last working version, when I had something of an epiphany / revelation / slap to the head.
It wasn’t very much fun. Well, I told myself – that’s rather the point. It’s about how crappy people can treat each other, and what it’s like to be on the receiving end of people’s disgust and scorn. After replaying it to test if I had missed anything, I came to the ego-bruising realization that it wasn’t very good. It was also slowly dawning on me that I had broken a cardinal rule of game development: start small.
Not that my ‘game’ was a sprawling 60 hour MMORPG with multiple branching storylines; it was nothing more than a static environment with a few interactable NPCs and triggered events. Turns out even that was too complicated.
So – as I was agonizing over what to do next, I was organizing some drawings my eldest had done and I came across this one:
and it just struck a chord with me… My son is 6, on the spectrum (non-verbal), but has drawing talent better than mine was at a similar age. I’ve always envisioned making a 2D platformer with his work, but that is a whole different beast from the 3D VR work I’ve been doing and would require more learning and more time away from other projects – time I really don’t have to spare.
So, as I’m sitting at my desk, my 6 year old wanders into the office and plops down on my lap and grabs my Vive, looking into my eyes and silently prompting me that he wants to look inside. I shift him around and open up a test Unity project (thanks VRTK for having simple projects to tinker with!) and throw the image above on a plane and slap a quick character controller on it and hit play.
Number One son positively howls with delight as he sees his drawing moving around on a flat plane, me controlling it with WASD as he looks around clapping his hands and laughing. I grab my Steam Controller and turn it on, a nice surprise when it controls my character as well, and thrust it into the kid’s hands. He’s even more excited now that he is controlling the action and laughs as he repeatedly steers the character off the ‘world’ and plunges into the depths below. My wife walks in and informs him it’s time for bed, and I get teeth brushed and tuck him in, kissing his little head and smiling at his delight.
I throw the character into Spriter Pro (an impulse purchase off of Humble Bundle) and give him (her?) a quick walk cycle and re-import back into Unity – it’s even better. I put his drawing on Facebook, to see if it resembles anything (particularly, a trademarked character) and people give me answers like, ‘lady bug construction hat’ or ‘African American lady bug superhero’ – so I’m kinda content that he’s not copying a kids book or game he’s played.
I throw in a few more drawings and screen cap the above footage and show it to the gang at the VRTK Slack channel. They LOVE it – I get a ton of ideas thrown at me (including the near-impossible one of making it like Scribblenauts and having the player draw solutions to level puzzles) and I need to copy and paste all of them into a Word doc to keep track of them all.
So – I think I’m breaking another game dev rule: don’t get excited about a new project without finishing your current one. Looks like I enjoy breaking rules and I’m getting sucked into a simpler, more attainable goal.
*Night Before* – “OK, gonna get up at 5:45 AM, coffee and dive right in and get some work DONE!”
6:30 AM – The Next Day.
“Crap – slept in again!” Quick, coffee, take my old man pills (BP & cholesterol) quick check emails, headlines, VR Dev Slack group… “WHAT!?! VR Juggler Sim is getting released NEXT WEEK??” *Sigh* – “That guy started dev’ing like 2 months ago… how the flying @#$% did he get a release out that fast?”
THUD. “Get off of me!” – Oh, yeah, he probably doesn’t have kids. Or an old house that needs more repairs than the Millennium Falcon. Or bills. Probably has his mom bringing him hotpockets and uber caffeinated beverages 6 times a day. Ok, before the AM rush – what did I get done…
Oh, about 3 brush dabs on a texture map I’m trying to tweak and 5 minutes watching a YouTube video explaining how to do animation layer masks so my character can ACTUALLY pick something up rather than have objects attach to the BACK of his hand – because, you know: immersion. I had to rewind the same section half a dozen times because the narrator’s accent is so thick, I can’t tell if he’s from Germany, Bangladesh or Pittsburgh.
Alright – get the kiddos up, fed, dressed, check the Slack channel again; oh look, VRGamerChick and JediDev were up all night playing “Virtual Beer Pong” and laughing their asses off over how the physics got screwed up and launched the ball into the ionosphere… bookmark the screencap to watch later as my lactose-intolerant child is busy pouring his expensive milk substitute on his brother’s head.
Eldest child is off to school – leaving me with the needy 3 year old who wants to go to the library, which is fine – I can ruminate over how to get my grasping animation to work and finally get over this hump that has slowed me down for the past couple of weeks. I contact the dev of the IK plugin and ask for guidance – who promptly tells me that the components I’m combining to make my game work shouldn’t be used together.
Library done and time for lunch – I try watching the tutorial video again, getting nowhere because my 3 year old wants to discuss each and every family member and which of them has a penis and who doesn’t. Just grateful we’re having this discussion in our kitchen, rather than the Toddler Story Room.
#1 Son is due home from school any minute as I try to accomplish something akin to housework and cleaning, assembling something vaguely called ‘dinner’ – which my kids won’t eat because it contains things like ‘nutrients’ and I have a flash of inspiration to check the order of my components and / or check if my animator has ‘root motion’ enabled – it always seems to go back to that stupid checkbox, which I still don’t understand. Quick sneak upstairs while the kids are playing splitscreen Portal 2 and test my hypothesis. I’m starting to hear yelling as I’m waiting for my base stations and HMD to warm up… Jeez guys, just 2 more minutes…
Crap – that doesn’t work either, quick run downstairs to see one child bludgeoning the other with my Steam Controller. Time Outs issued. Threats over removing games. Promises they will be shipped off to the grandparents if they don’t straighten out.
MAN, am I tired. Stumble thru dinner and showers for the kids, snacks and bedtime routines. Guzzle a beer or 4 as I catch up with the wife and her day, hoping to snag just a few more minutes of dev work before my eyes permanently weld shut until I hear the kids screaming again… Nothing more I can glean from the tutorial video, look at my character and start seeing if there are other options I can tweak so I can either have my character grab things or have him float in the air 2 feet below me. OK, things look good on the main camera / rig / mesh / let me drill down to see if… wait. Go back.
YOU MEAN TO TELL ME I HAD TWO ANIMATOR CONTROLLERS ON THE SAME OBJECT?!?!
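Two animator controllers driving the same object fight over the same bones every frame, and whichever evaluates last silently wins. Nothing Unity-specific here, but the shape of the bug fits in a few lines of plain Python:

```python
# Two "controllers" both write to the same value every frame;
# the one that runs last completely overwrites the other.

def walk_controller(pos):
    return pos + 1.0   # tries to move the character forward

def idle_controller(pos):
    return 0.0         # snaps the character back to rest

position = 0.0
for frame in range(60):
    position = walk_controller(position)
    position = idle_controller(position)  # evaluated last, so it wins

# position is still 0.0 -- the walk motion never shows up on screen
```

Presumably deleting the duplicate component (or folding everything into one controller) is the fix.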
Crap. Bedtime. Oh well – I suppose it is progress… one less thing to figure out. Let’s see: gonna get up at 5:45 AM, coffee and dive right in and get some work DONE!
Well, despite the kids being sick, Halloween, the election and my hard drive dying on me – work must continue eh?
I got the Final IK (VRIK) solution sorted out and my character’s upper torso now moves in a fairly close approximation of a human’s. The twist relaxer (found on this page, along with a lot of other helpful stuff!) helped get the arms moving naturally with the user. Animations for actually grabbing things will be coming soon – it sounds like my colleagues in the VRTK Slack channel are struggling with the exact same thing, and I’m hoping their experiments point me in the right direction to gallop in.
Now comes the overly tricky part. Since I’m lousy at coding and have no patience for it, I’ve been leaning heavily on Playmaker to handle my ‘programming’ and getting cheap Udemy courses to figure out how it works. This particular course (from a fellow VRTK’er who actually was homeless) helped tons in integrating Unity’s Mecanim system with Playmaker and suddenly propelled my thinking in whole new directions.
Since the Udemy course deals with things like health managers and damage, I’ve adapted some interactable objects to work with my randomly patrolling cop. Throw something at him and, if it hits, he comes over and commences stomping on the player. This has led to ideas about gamifying the experience: adding a health system or a sanity system, collecting coins as ‘points’, possible upgrades (better clothes provide more sanity, paying for a public shower increases coin payouts from NPCs) and the like. Originally I just wanted a simple experience to try and convey what it feels like to be an ‘undesirable’, to get people to feel empathy; but the lure of making it a playable experience that one can ‘win’ is strong.
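None of those systems exist yet, but sketching the loop helps me see how small it really is. A toy version in Python (every number and name here is invented, not actual game code):

```python
class BumSim:
    """Toy sketch of the sanity / coin loop described above."""

    def __init__(self):
        self.sanity = 100            # drops when NPCs scorn you
        self.coins = 0               # earned by panhandling
        self.payout_multiplier = 1.0 # boosted by looking presentable

    def take_scorn(self, amount):
        self.sanity = max(0, self.sanity - amount)

    def receive_coin(self, base=1):
        self.coins += int(base * self.payout_multiplier)

    def buy_clothes(self, cost=10, sanity_bonus=20):
        # Better clothes restore sanity.
        if self.coins >= cost:
            self.coins -= cost
            self.sanity = min(100, self.sanity + sanity_bonus)

    def pay_for_shower(self, cost=5):
        # A public shower makes NPCs more generous.
        if self.coins >= cost:
            self.coins -= cost
            self.payout_multiplier += 0.5
```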
But, as others wiser than myself have said: Keep It Simple, Stupid. Plenty of indie games have fallen by the wayside because they’re simply too complex in scope to ever complete. Plus, since my hard drive failed, I’m trying to piece my files back together from sporadic backups and hard drives that I’ve replaced but never wiped. In doing so, I’ve come to realize I’ve been working on this (on and off) for 2 years now, and am only just now feeling like I’m on the cusp of being able to show off some work to the dev community.
I’ve also been racing to try and finish the 3 paintings for my sister-in-law – a wireless solution for the Vive has been announced and I would love to get my hands on one (once I get the $$), even if it’s just to see whether it works as promised. I LOVED the feeling of raw potential when I strapped on my DK2 – the early promise of an emergent tech – and I’m certain not having to worry about a cable would further enhance the immersion. And the fun. And the breaking of things.
Like the headline implies, been feeling like I’ve been stuck in the longest traffic jam known to mankind. Side work has taken a lot of my dev hours – 4 paintings and a logo will do that to even the most unencumbered man.
I have had the fortune of joining a dev Slack that centers around a set of tools for VR creation. The VRTK Group has been amazing in their support, ideas, helpful suggestions, weird humor (heck, I even got an amazing voice-over line from one of the devs).
The other, not so great thing about this group is that it’s impacting my wallet. I just found out about Final IK, an amazing rigging tool that just added VR support, when the Slack members started passing around this video:
I just about freaked out, because this kind of player-controlled interaction would be PERFECT for my BUM experience. I could use some of the VRTK scripts for grabbing objects, have the player control the homeless person and grab a cup, and if I get the input from the controller, make a change-rattle sound & trigger NPCs walking nearby to either drop a coin or make a disparaging remark.
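A back-of-the-napkin version of that cup-rattle trigger (the threshold and coin odds are invented; the real thing would hang off VRTK’s controller events and Unity audio):

```python
import random

SHAKE_THRESHOLD = 2.0  # hypothetical angular speed that counts as a "rattle"
COIN_CHANCE = 0.3      # hypothetical odds a passing NPC drops a coin

def on_cup_shake(angular_speed, rng=random.random):
    """Return the events fired when the player rattles the cup near an NPC."""
    if angular_speed < SHAKE_THRESHOLD:
        return []
    events = ["play_rattle_sound"]
    # The nearby NPC either drops a coin or makes a disparaging remark.
    events.append("npc_drops_coin" if rng() < COIN_CHANCE
                  else "npc_disparaging_remark")
    return events
```

Passing the random source in makes the coin-vs-remark roll testable instead of flaky.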
Now, before I get too far ahead of myself – I’m still stuck at getting the Final IK solution to work – as you can see by the video below, it looks like the player is swinging from a Jungle Gym:
not to mention that from the VR vantage point, the player is seeing the inside of the avatar’s head. Not overly selling the idea, but yet again – one step closer.
The missus went on a vacation with her family and I had charge of the two boys for 10 days, solo. As often as I say that being a SAHD is the toughest thing I’ve ever done – doing it by myself was even harder. My hat’s off to all single parents, you are doing a herculean task under the most trying of circumstances.
On top of the actual time the wife was gone, there was plenty of extra kid-watching time while she was packing, researching side trips, getting paperwork in order, etc. That, and Son #2 has been waking earlier – which cuts into my dev time – and my sister-in-law commissioned 3 paintings for her husband… well, maybe I can squeeze in 5 minutes of dev stuff here and there.
What I have been working on is a potential “Half Life Stories: VR” game, something to experiment with and potentially something to try and convince my local dev group to push to finish and release.
I’ll most likely be seeing my folks in a week, so I can scour their photo collection for more bits to add to the “Shore Road VR” project – which has stalled because I have zero reference photos of the old back dining room and back porch, as well as the upstairs.
I’m also trying to soak up a ton of new knowledge, since I discovered my library grants full access to Lynda.com, with tons of Unity, Blender and Photoshop tutorials.
I treated myself to a little birthday gift. I was looking at trying to incorporate Papagayo lip-sync data into the .MHX2 rigs and was getting nowhere trying to get the humanoid models to:
A) accept .dat targeted lip-sync data
B) accept and merge multiple motion caption files
C) export all that to Unity and look nice
For whatever obsessive reason, my brain cannot come up with game ideas that involve characters that are static blobs / robots with very little animation / monsters with one or two simple expressions, meh and pure volcanic rage.
Nope – I have to have humanoid NPCs – somewhat realistic, and certainly with enough responsiveness and emotion on par with Saturday Morning Cartoons… Talk about a challenge.
So when I encountered lip-sync packages on the Asset Store, I did some research and settled on LipSync Pro. Having a bit of animation experience, I knew the more phonemes a character has, the more realistic the dialog will be, and LSP seemed to fit the bill. It even works with Fuse characters, which aren’t as impressive as the .MHX2 rig – which works beautifully with Tore Knabe’s Realistic Eye Movements.
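The reason phoneme count matters: each phoneme group maps to a mouth shape (a viseme), so more groups means finer-grained mouth motion. Roughly the idea, with an invented mapping – LipSync Pro’s actual phoneme set differs:

```python
# Invented phoneme-group -> mouth-shape table; real lip-sync tools
# ship their own sets, but the mapping idea is the same.
VISEMES = {
    "AI": "open_wide",
    "E": "smile_open",
    "O": "rounded",
    "U": "pucker",
    "MBP": "lips_closed",
    "FV": "lip_bite",
}

def mouth_shapes(phoneme_track):
    """Turn a sequence of phoneme groups into the mouth shape per frame,
    falling back to a resting mouth for anything unmapped."""
    return [VISEMES.get(p, "rest") for p in phoneme_track]
```

With only two or three entries in that table, every line of dialog looks like a ventriloquist’s dummy; six-plus starts to read as speech.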
So I installed LSP, downloaded a Fuse model and amused myself with some impromptu dialog recorded from my three year old. I got big laughs from the kids when I showed them a sample of a truck drivin’ redneck dude asking for treats because he used the potty.
Suffice to say, I was impressed with the results and began wondering how else I can use this. I started rebuilding my VR Bum sim because I was getting atrocious frame rates, so I’m pulling all my assets back into Blender to see how many polys I can slash off everything and optimize the level. Sovereignty is on hold until I can get some new dialogue recorded, and I have a ton of 2D game design to do before I can start on my platformer based on my kids’ artwork.
I was scrolling thru some old projects when I happened to see my old 3D animation work folder, and I searched it for things that I could adapt into a VR diorama. And lo and behold, my ‘learn Blender’ project – a music video for Mojo Nixon’s “Ufo’s Big Rigs & BBQ”
So, I’m chopping it down to a minute length – adding various animations to the Fuse character, a background and will add some 3D animated video to it so it looks like the Fuse character is delivering a performance in front of a screen playing highlights of the song.
I asked on /r/vive if people would like to offer feedback on a WIP and they seemed pretty desperate for any VR content, so that, coupled with the itch / drive to put something out there, will have me busy for a while. VR is only going to get bigger and more crowded, so the quicker I can get my name out there, the better. Here’s hoping it’s worth it.
So, all Dev work has ground to a halt with the arrival of my Vive.
I’ve been delighting in room tracking (even though my office is standing room only), I’ve been amazed at tracked controllers (now I REALLY want a VR version of Thief, after playing the Longbow demo in The Lab – being able to actually use a bow to shoot water arrows, or use a controller as a blackjack would be indescribably satisfying)
But with many an hour spent agog in VR, it’s hours less spent developing my own work. So, after dragging my feet a bit, I fired up Unity to see what I could do to convert my DK2 projects over to SteamVR.
It’s a completely different way of doing things.
I completely broke all of my experiences and, in the case of my BUM simulator, broke my level geometry and have to scrap and rebuild it all from scratch. Because I discovered something wonderful.
In the SteamVR examples is an IK scene that allows your controllers to control virtual arms, attached like real arms would be in the virtual world:
Ye gods, if I incorporate that into my scene, you could shake your cup to rattle it! I am so excited this possibility exists. Now the tricky part: figuring out HOW to make it work. My google-fu yields nothing – no teases on how that got made, how I would adapt it to my character, or whether I would have to rebuild my character from scratch or simply rename parts of it…