Like the headline implies, I’ve been feeling like I’m stuck in the longest traffic jam known to mankind. Side work has taken a lot of my dev hours – 4 paintings and a logo will do that to even the most unencumbered man.
I have had the fortune of joining a dev Slack that centers around a set of tools for VR creation. The VRTK Group has been amazing in their support, ideas, helpful suggestions, and weird humor (heck, I even got an amazing voice-over line from one of the devs).
The other, not-so-great thing about this group is that it’s impacting my wallet. I just found out about Final IK, an amazing rigging tool that just added VR support, when the Slack members started passing around this video:
I just about freaked out, because this kind of player controlled interaction would be PERFECT for my BUM experience. I could use some of the VRTK scripts for grabbing objects, I could have the player control the homeless person, grab a cup, and if I get the input from the controller, make a change rattle sound & trigger NPC’s walking nearby to either drop a coin or make a disparaging remark.
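The interaction I have in mind boils down to a small piece of game logic: detect a shake from the controller, play the rattle, and let each nearby NPC roll for a reaction. Here’s a minimal sketch of that flow – plain Python rather than Unity/VRTK, with a made-up speed threshold and made-up probabilities:

```python
import random

SHAKE_SPEED = 1.5  # m/s; hypothetical threshold for what counts as a "rattle" shake

def shake_detected(controller_speed):
    """True when the controller is moving fast enough to count as a shake."""
    return controller_speed >= SHAKE_SPEED

def npc_reaction(rng):
    """Each nearby NPC either drops a coin, makes a remark, or ignores you."""
    roll = rng.random()
    if roll < 0.2:
        return "drop_coin"
    elif roll < 0.5:
        return "disparaging_remark"
    return "ignore"

def rattle_cup(controller_speed, nearby_npcs, rng=random):
    """If the cup was shaken, 'play' the rattle and collect each NPC's reaction."""
    if not shake_detected(controller_speed):
        return None  # no shake, no sound, no reactions
    return {npc: npc_reaction(rng) for npc in nearby_npcs}
```

In Unity the shake test would hang off the tracked controller’s velocity and the reactions off the NPCs’ own state machines, but the branching is the same.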
Now, before I get too far ahead of myself – I’m still stuck at getting the Final IK solution to work – as you can see by the video below, it looks like the player is swinging from a Jungle Gym:
Not to mention that from the VR vantage point, the player is seeing the inside of the avatar’s head. Not overly selling the idea, but yet again – one step closer.
The missus went on a vacation with her family and I had charge of the two boys for 10 days, solo. As often as I say that being a SAHD is the toughest thing I’ve ever done – doing it by myself was even harder. My hat’s off to all single parents, you are doing a herculean task under the most trying of circumstances.
On top of the actual time the wife was gone, there was plenty of extra kid-watching time as she was packing, researching side trips, getting paperwork in order, etc. That, and Son #2 has been waking earlier – which cuts into my dev time – and my sister-in-law commissioned 3 paintings for her husband. Well, maybe I can squeeze in 5 minutes of dev stuff here and there.
What I have been working on is a potential “Half Life Stories: VR” game – something to experiment with, and potentially something to try and convince my local dev group to push to finish and release.
I’ll most likely be seeing my folks in a week, so I can scour their photo collection for more bits to add to the “Shore Road VR” project – which has stalled because I have zero reference photos of the old back dining room and back porch, as well as the upstairs.
I’m also trying to soak up a ton of new knowledge, since I discovered my library grants full access to Lynda.com, and tons of Unity, Blender, Photoshop tutorials.
I treated myself to a little birthday gift. I was looking at trying to incorporate Papagayo lip-sync data into the .MHX2 rigs and was getting nowhere trying to get the humanoid models to:
A) accept .dat targeted lip-sync data
B) accept and merge multiple motion caption files
C) export all that to Unity and look nice
For whatever obsessive reason, my brain cannot come up with game ideas that involve characters that are static blobs / robots with very little animation / monsters with one or two simple expressions, meh and pure volcanic rage.
Nope – I have to have humanoid NPCs – somewhat realistic, and certainly with enough responsiveness and emotion on par with Saturday Morning Cartoons… Talk about a challenge.
So when I encountered lip-sync packages on the Asset Store, I did some research and settled on LipSync Pro. Having a bit of animation experience I knew the more phonemes a character has, the more realistic dialog will be, and LSP seemed to fit the bill. It even works with Fuse characters, which aren’t as impressive as the .MHX2 rig – which works beautifully with Tore Knabe’s Realistic Eye Movements.
So I installed LSP, downloaded a Fuse model and amused myself with some impromptu dialog recorded from my three-year-old. I got big laughs from the kids when I showed them a sample of a truck drivin’ redneck dude asking for treats because he used the potty.
Suffice it to say, I was impressed with the results and began wondering how else I can use this. I started rebuilding my VR Bum sim because I was getting atrocious frame rates, so I’m pulling all my assets back into Blender to see how many polys I can slash off everything and optimize the level. Sovereignty is on hold until I can get some new dialogue recorded, and I have a ton of 2D game design to do before I can start on my platformer based on my kids’ artwork.
I was scrolling through some old projects when I happened to see my old 3D animation work folder, and I searched it for things that I could adapt into a VR diorama. And lo and behold, my ‘learn Blender’ project – a music video for Mojo Nixon’s “UFOs, Big Rigs & BBQ”
So, I’m chopping it down to a minute in length – adding various animations to the Fuse character and a background, and I’ll add some 3D animated video so it looks like the Fuse character is delivering a performance in front of a screen playing highlights of the song.
I asked on /r/vive if people would like to offer feedback on a WIP and they seemed pretty desperate for any VR content, so that, coupled with the itch / drive to put something out there, will have me busy for a while. VR is only going to get bigger and more crowded, so the quicker I can get my name out there, the better. Here’s hoping it’s worth it.
So, all Dev work has ground to a halt with the arrival of my Vive.
I’ve been delighting in room tracking (even though my office is standing room only), and I’ve been amazed at tracked controllers (now I REALLY want a VR version of Thief after playing the Longbow demo in The Lab – being able to actually use a bow to shoot water arrows, or use a controller as a blackjack, would be indescribably satisfying).
But with my many an hour spent agog in VR, it’s hours less spent developing my own work. So, after dragging my feet a bit, I fired up Unity to see what I could do to convert my DK2 projects over to SteamVR.
It’s a completely different way of doing things.
I completely broke all of my experiences – and in the case of my BUM simulator, I broke my level geometry and have to scrap it and rebuild it all from scratch. Because I discovered something wonderful.
In the Steam VR examples is an IK scene that allows your controllers to control virtual arms, attached like real arms would be in the virtual world:
Ye gods, if I incorporate that into my scene, you could shake your cup to rattle it! I am so excited this possibility exists. Now the tricky part – figuring out HOW to make it work. My google-fu yields nothing – not any teases on how that got made, how I would adapt it to my character, or whether I would have to rebuild my character from scratch or simply rename parts of it…
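For what it’s worth, the core math behind a two-bone arm like that is a classic analytic IK step: given the upper-arm and forearm lengths and the distance from shoulder to target, the law of cosines gives the elbow bend. A standalone sketch of just that piece (plain Python, not the SteamVR example’s actual code):

```python
import math

def two_bone_elbow_angle(upper_len, fore_len, target_dist):
    """Interior elbow angle (radians) so the hand reaches target_dist from
    the shoulder. Uses the law of cosines; the reach is clamped to what the
    two bones can physically cover, so out-of-range targets just give a
    fully extended (pi) or fully folded (0) elbow."""
    # Clamp the requested reach to the arm's reachable range.
    d = max(abs(upper_len - fore_len), min(target_dist, upper_len + fore_len))
    cos_elbow = (upper_len**2 + fore_len**2 - d**2) / (2 * upper_len * fore_len)
    # Guard against tiny floating-point overshoot before acos.
    return math.acos(max(-1.0, min(1.0, cos_elbow)))
```

A full solver also has to pick an elbow “pole” direction so the arm bends forward instead of backward, which is where most of the rig-specific fiddling lives.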
Last week the Cleveland game devs had a Level Up workshop, focused on Unity and Playmaker – so I signed up, enthusiastic that I could unlock some more of the mysteries of Playmaker and game mechanics.
I was not disappointed. Our moderator, Bill, jumped in with a basic endless runner prototype using primitives and simple FSMs, and soon our cubes were colliding and jumping on command. It got me thinking about the platformer I want to make using my autistic son’s artwork.
I showed the kids my work the next morning, and they gleefully smacked the spacebar to make the little cube hop on command. I started up Audacity and recorded their delighted sounds and snuck them into the FSM. Their eyes grew wide when they heard their own voices yelling ‘jump’ when they hit the spacebar. I made a few more tweaks to reset the level if the player cube got knocked off screen.
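The whole prototype is really just a tiny state machine. A rough sketch of the same idea outside Playmaker (plain Python; the state names, numbers and sound-clip name are my own stand-ins):

```python
class RunnerCube:
    """Minimal FSM mirroring the workshop prototype: GROUNDED -> JUMPING on a
    'space' event (playing a sound clip), plus a reset if the cube is knocked
    off screen."""

    def __init__(self, floor_y=0.0, kill_y=-5.0):
        self.floor_y = floor_y
        self.kill_y = kill_y          # below this, the level resets
        self.y = floor_y
        self.vy = 0.0
        self.state = "GROUNDED"
        self.sounds_played = []

    def handle_event(self, event):
        # Only a grounded cube may jump; mid-air presses are ignored.
        if event == "space" and self.state == "GROUNDED":
            self.vy = 5.0
            self.state = "JUMPING"
            self.sounds_played.append("kids_yell_jump.wav")  # hypothetical clip

    def tick(self, dt=0.02, gravity=-9.81):
        # Simple Euler step: move, then apply gravity.
        self.y += self.vy * dt
        self.vy += gravity * dt
        # Landing: falling and at/below the floor puts us back on the ground.
        if self.state == "JUMPING" and self.y <= self.floor_y and self.vy < 0:
            self.y, self.vy, self.state = self.floor_y, 0.0, "GROUNDED"
        # Knocked off screen: reset position, like the level-reset FSM.
        if self.y < self.kill_y:
            self.y, self.vy, self.state = self.floor_y, 0.0, "GROUNDED"
```

Swap the `sounds_played` list for an AudioSource and the tick for `FixedUpdate` and it is more or less the same graph Playmaker draws for you.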
I can see how this sort of game is so popular for beginner devs – it’s quick, easy, and has an iterative cycle that makes it easy to add mechanics and playtest. It makes me question whether I shouldn’t focus on one as a means to generate quick income. The HTC Vive is going on pre-order next week and I could really use the money.
So, last week – things got a little sidetracked. Getting ready to go to church, my eldest slips and cracks his chin on the steps. MIND YOU – he was holding the handrail AND MY HAND when this happened. All the times he has walked down those steps holding a load of toys so large he can’t see over them, slid down the steps on his butt, refused to hold anything, and could have done it blindfolded while juggling chainsaws – nothing. But the minute we’re taking the slightest precautions… BAM – four stitches.
THIS week, my youngest has reverted back to his very needy, meltdowny phase. The running joke is that he clings to me like I was the last chopper out of Saigon. It had been slowly building for the past few days (maybe based on the amount of attention his brother has been getting), but last night he wanted to take his sippy cup into the bath with him. A small, innocuous request – one I easily agreed was fine. And I told him so.
For reasons that will baffle and confound me to the grave, it led to an hour-and-a-half meltdown that no amount of hugging, cajoling, bribing with sweets, threats of all-boys-Christian-scared-straight-military-academy, offers to buy him a college education, buy him a mountain of candy, or anything else would fix. He then proceeded to wake a few times during the night – which led to a co-sleeping session, something I haven’t done with him in many moons.
And yet – game dev MUST continue.
I’ve been listening to a Udemy course on FPS and Playmaker, and while a lot of these courses are retreads of the ‘here is the interface / options / how to get started’ lectures, I do tease out new nuggets of advice and tips each time I listen to one. This time I was able to put together the physics I needed to make a couple of balloons on strings behave like they should, swaying in the breeze and bouncing off each other in a realistic way. I was so impressed with it, I’m tempted to make it the load screen…
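The balloon behavior itself is basically a damped pendulum with a periodic “breeze” force pushing it around. A toy version of that physics (plain Python with made-up constants, not my actual Playmaker setup):

```python
import math

def simulate_balloon_sway(steps=2000, dt=0.01, length=1.0,
                          damping=0.3, breeze_amp=0.5, breeze_freq=0.4):
    """Balloon on a string as a damped pendulum: buoyancy pulls it back
    upright, a sinusoidal 'breeze' torque makes it sway. Returns the tilt
    angle (radians) at each step, integrated with a simple Euler scheme."""
    g_eff = 9.81              # effective restoring acceleration from buoyancy
    theta, omega = 0.5, 0.0   # initial tilt and angular velocity
    history = []
    for i in range(steps):
        t = i * dt
        breeze = breeze_amp * math.sin(2 * math.pi * breeze_freq * t)
        # Angular acceleration: restoring term, velocity damping, breeze forcing.
        alpha = -(g_eff / length) * math.sin(theta) - damping * omega + breeze
        omega += alpha * dt
        theta += omega * dt
        history.append(theta)
    return history
```

In Unity you get most of this for free from a ConfigurableJoint or a spring plus drag; the fun part is tuning damping and breeze until two balloons bump each other convincingly.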
I was also able to attend a ‘Level Up’ lecture with the Cleveland Game Dev group – a once-in-a-while chance to hang with peers in the area who have similar interests, and to learn a bit about making pixel art. I’m still interested in creating a point-n-click or side scroller based on my eldest and his artwork, so it was a welcome break from the kids and the oppressive weather keeping us indoors while they constantly battle each other. I’d love to be able to spend more time with the local game dev group, but the demands of daddyhood often leave me tired, and they only meet in my neck of the woods every so often.
So – what’s next? I’m currently struggling with Playmaker, Mecanim and getting meaningful acting out of my MakeHuman and Fuse characters, and trying to learn the new Quixel suite – all of which have been daunting, and I’ll be venting about them here soon enough.
Or: How I Learned to Stop Worrying and Love .MHX2
So I just had a major breakthrough (I’m hoping!) with my NPC interactions. I had glanced once or twice at the work Thomas@MakeHuman had been doing, but in my earlier, more sleep deprived days of DaddyDev, I never quite grasped the concepts he was working on.
Ye gods, I’m so freakin’ thrilled I went back to take a second look.
I took one of my MakeHuman characters (the subway token booth clerk) and, after installing the .mhx2 files in Blender and MakeHuman, re-exported him and brought him into Blender. Granted, I had to grasp that there was an advanced-options switch on the Blender import side – but once I activated the face shapes, I was blown away.
I had visemes (phonemes) already plugged in. I could export expressions from MH. Once I added the MakeWalk addon, I could import Mo-Cap .bvh files and apply them to my rig. Delving into the online manual (and a profound thanks to the dev who still does written manuals, I’m sure I will rant yet again about how much I hate video tutorials) I saw there were even more options to smooth out the mo-cap motions, edit the location / rotation of bones and once I figure it out, append other .bvh files. What really blew me away was the ability to import MOHO (.dat) lipsync files from Papagayo. And I loaded visemes ON TOP of the already loaded .bvh!
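The MOHO (.dat) files themselves are refreshingly simple, which is probably why so many tools read them. As far as I can tell, the layout is a ‘MohoSwitch1’ header followed by one ‘&lt;frame&gt; &lt;phoneme&gt;’ pair per line; here’s a small parser sketch under that assumption:

```python
def parse_moho_dat(text, fps=24.0):
    """Parse a Papagayo MOHO (.dat) switch file into (time_sec, phoneme) pairs.
    Assumes the common layout: a 'MohoSwitch1' header line, then one
    '<frame> <phoneme>' pair per line, with frame numbers at the given fps."""
    keyframes = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("Moho"):
            continue  # skip the header and any blank lines
        frame_str, phoneme = line.split(None, 1)
        keyframes.append((int(frame_str) / fps, phoneme))
    return keyframes
```

From there, each (time, phoneme) pair just becomes a shape-key keyframe on the matching viseme, which is roughly what the .mhx2 importer is doing when it layers the lip-sync on top of a .bvh.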
With the ability to make a character walk, talk and emote, suddenly a whole new facet of storytelling opened up for me. Since I’m not big into programming, I’ve been using Playmaker to handle some of the basic interactivity; and after the dev who made the wonderful Rift demo ‘Coffee Without Words‘ made an asset that mimics human eye movement – my NPCs came alive.
Be forewarned, my demo video is NSFW; I made a MH woman with some extended clothing assets and didn’t realize the bathrobe was translucent, so you see some breasts.
Lastly, after having tinkered with this for a bit, I got an email saying that Adobe had released Fuse, a character creation tool that ties in with Mixamo and the Mecanim animation system Unity now favors. I tried Fuse, made a character, got a walk animation added and plopped it into Unity (and Blender too, just to see what it would do, if I needed to tweak it).
The character worked – it did its walk cycle and looked OK after I changed its skin shader from transparent to opaque. I did notice that it can be lip-synced, but only with a $35 additional purchase, and a new piece of software to learn and try to integrate. When I pulled the Fuse character into Blender, it was huge, distorted and had no face rig that I could discern.
So Fuse wasn’t all that tempting. Considering I’d have to shell out $$ for SALSA, and I suspect Adobe will quickly add a subscription fee like all its other products, I’d like to keep my dev budget as free and OSS as possible. Perhaps I’ll just create and export a bunch of random characters to populate my subway scene with; since MakeHuman doesn’t have the widest range of clothing options, the Fuse NPCs will shake things up a bit visually.
Now I have to sit and watch about 2,000 .bvh animations to find what actions I think will go well with my actors…
So, the wife was kind enough to give me a daddy-cation last month, which I took to go see Steely Dan in NYC.
I also took my nice DSLR to roam the city streets and collect texture maps, a favorite pastime from when I lived there. I’ve collected terabytes of photos that have yet to be tiled, tone mapped and color corrected. Yet another favorite pastime of mine.
Another thing I did was to try and get decent shots for a photogrammetry idea that I had: create a NYC garbage can. I had been reading on the /r/3dscanning subreddit about it and had tried a year or two back with VisualSFM, with mixed results. I recently read about Autodesk’s Memento and decided I’d give that a try.
I was absolutely shocked, when I added the resulting .FBX file to a Unity scene, at the sheer level of detail on the street, let alone the sidewalk. The can itself was problematic because it had intricate detail and ended up with several holes in it. I suppose I could fix those in Blender, but I would have problems getting the textures realigned.
I’ve seen others recreating rooms in VR using this method, and one of the best ones, IMHO, is the 4th Floor Studio Apartment – while not a game level environ per se, a place I loved putting on my goggles and visiting anyway.
So, if you read any dev blogs, they’re always chock full of advice, ranging from the latest tips and tricks on every piece of software available, to breakthroughs and milestones reached; but inevitably there are some rules that people put forth as sacrosanct.
Well, I’ve decided to break a couple.
One that always comes up is: Finish what you are working on before starting something new.
The problem is, I’ve never worked that way. As a painter, I need to have multiple canvases going at once. If I stare at one painting too long, I get drawn into a funk of ‘sameness’ – it’s the same reason I can’t sit and binge-watch a whole season of TV shows. I start over-analyzing and I get resentful.
So, I’m giving myself permission to take a break from BUM and work on an earlier project, an adaptation of the play / short film “Sovereignty” that I had the honor of working on a few years ago. By trying to improve what I started there with what I’ve learned working on BUM, I can tackle the issues that stalled that project, and when I get stuck, I go back to my other project – the two of them become like ladder rungs I can climb in turn. Playmaker hasn’t been the magic bullet I had hoped for breathing life into my characters, and if I continue to be stuck, I might start trading art for programming…
The other rule that people harp on is: Don’t try to do too many projects at once.
This one might apply a bit more, but I can at least start laying the groundwork for something that might be a breakout piece for me. My eldest child is on the Autism spectrum and is fairly non-verbal. In every other aspect, he’s a normal kid – loves climbing, exploring and watching cartoons, has a fondness for sweets and doesn’t like bedtime. He just doesn’t talk much. One thing he truly is engrossed in (and as an artist I’m thrilled by) is drawing. He has gone through several boxes of sidewalk chalk this summer and every chance he gets, he’s outside covering my garage floor in layers upon layers of drawing. When he comes out, he’s covered head to toe in chalk dust.
But his drawings are compelling. Some are obviously cartoons he’s seen or toys or trains, but others are simply things he’s seen in his day-to-day life. And I think they’d make great sprites for a 2D scroller / platformer.
I know, I know – I can’t take on a 3rd project – none of them will ever be finished. But while I’m cleaning up the yard or watching the kids play, I can snap a few photos and throw them into Photoshop to start prepping them to animate. After all, I’ll need another project as soon as I finish one of them, right?
Exactly one year ago, I was on vacation with the wife and kids when I got a text from our wonderful neighbor who keeps an eye on our cat and waters the garden while we are gone – a large box had just been delivered on our porch; perhaps a certain package I’d been waiting for?
I’m an avid long-term gamer (you should have seen my anticipation downloading the Quake 2 demo on dial-up!) and have always been drawn in by the interactivity. I’ve never been a fan of passive TV watching, and after our family got an Atari 2600 one Christmas, it’s been a steady evolution of interactive adventures. Beating “Adventure.” The thrill of solving Infocom’s “Wishbringer” on our family’s first computer. Being blown away by the graphics of “Another World” on the Amiga. Further blown away when I purchased my first PC, an AT&T Globalyst desktop with a CPU clock in megahertz and no math co-processor, and played Quake 1 on a 15-inch monitor at the smallest possible resolution, still loving every second of it (I still often play the Quake 1 soundtrack as background noise when I’m working). Then the Half-Life series, the Thief series, and many more hours than I care to claim.
And yet, I still craved more interactivity. I downloaded mods to make games anaglyph 3D. I added surround sound systems to my gaming rigs. I wanted to mod Wii controllers to implement Johnny Chung Lee’s head tracking. Still, nothing grabbed my attention like the whispers I saw on Reddit about a VR headset that worked. I was instantly hooked and spent a lot of time learning everything about it that I could. When the DK2 was announced, I sold a couple of paintings to purchase it. I think the wife is still a little put off over that decision. And I waited for the day that my shipping notice would arrive.
Of COURSE it had to be while we were on vacation. I gave my neighbor dire warnings on how important this package was – offers of booze, money, my firstborn child – to ensure my delivery was safely in my house, under lock and key until I returned. I’m a safe and careful driver, but on the return drive I fought the urge to floor it all the way home. I found my new toy on the kitchen table and resisted the urge to skip sleeping to play with it, as I had to be up in 6 hours to attend to the kids.
I managed to get a small chunk of free time the next day and hastily set up my Rift, installed the drivers and runtimes, and dove right into the desk demo. And gasped. And laughed. And sat there in bewildered amazement, looking all around me – standing up, kneeling under the desk (and losing tracking in the process) – and reluctantly tore myself away to attend to the real world. I was ecstatic when the wife offered to give me a night off from the kids and I eagerly dove into every demo I could download. I rode rollercoasters, visited haunted dungeons, went flying. But nothing grabbed my attention like a small demo by Brendan Coyle: a simple studio apartment, and the moment I stepped into it –
I felt real presence. I was there. This was the small studio I wanted in NYC. I was there and it was sunny and warm out. I explored the kitchenette, the bathroom, tried to read the titles of the books on the shelf. I wanted to open the door, go down the stairs and to the nearest bodega for a bagel and cream cheese. It was that real.
It didn’t take long for me to delve into gaming on the Rift – I was surprised when Quake 2 made everyone else queasy within seconds, yet I could take hour-long doses of it, even with a tendency toward vertigo; maybe years of twitch gaming had dulled my nausea reflex. I dove into Half Life 2 with a zeal – stopping to look at the decades-old game and the mastery of level design: http://steamcommunity.com/sharedfiles/filedetails/?id=184922180
Little details jumped out at me and again, that sense of immersion gripped me tight and I was willingly held captive as long as I could afford.
Eventually, I wanted to create my own worlds – to take the ideas bubbling in the aether of my imagination and make them whole and walk through them. I picked up Blender (having a little experience in 3D work, I knew the concepts, just needed to learn a new interface) and picked up Unity – both free options to express myself; with little time and less money, FOSS options were my only choice and I try to support these and other endeavors as much as I can. I dove into making my worlds and the day came when I could cram on my Rift and hit Enter and after that all too familiar Health Safety Warning…
Ye Gods… it was a basic polygon world and primitive texture mapped, no fancy normal maps or SSAO shading – but it was mine. And I could walk through it. All the paintings I’ve done that I wished I could walk through – worlds in my head I wish I could visit, all of these were in my grasp, just yearning for my feeble skills to flesh them out and make them tangible, more cogent…
As I played – as I built, as I worked my way through trial and fumble – other things tickled the back of my head. I’ve always been a political animal and very socially aware of the world around me. Being a creature of New York City, I’ve met many souls from various pockets around the planet, lived among more subsections of race, creed and faith than most meet in a lifetime, and have heard their tales – when you hear a taxi driver spill out more truth about the universe than a dozen philosophy professors, it changes you. I’ve always felt that if you could just step into another person’s shoes, see what they see, it would do a great deal to improve how our species relates to others of its kind. If a person could see what it’s like to be told you have cancer, or that you just sold a patent worth millions, or to have to make a choice that benefits your family but wrecks someone else’s; to see an opposite political point of view, a different faith, how the world looks if you have autism. I want people to see other points of view, or use this tool to help those with a phobia get over their fears, or help people revisit places lost (can you imagine the joy of being able to walk around your childhood home that had burnt to the ground?)
This new medium might just be the culmination of a lifetime of my interests: art and computers, 3D animation and Photoshop, video games and experiencing adventures previously unimagined. And I’m so thrilled it’s happening now and I can be a part of it.