Unity Editor Activity and Digging Deeper

More after-hours digging into Unity (slowed down by some necessary after-hours, work-related items…this will probably continue for the balance of the week as we wrap up a tight sprint 🙂).

Cross-Platform VR and Flat Display

I am coming to the conclusion that I need to dig deeper into the prefabs provided with the various VR SDKs that matter to me (Daydream, Vive and Cardboard). It appears that the different vendors have implemented their own event systems, which aren’t necessarily interoperable. If we want to build code that plugs into any of these systems, we’ll need to address this.

I’m not currently sure how best to deal with this. Building wrappers sounds nice for compatibility but has potential limitations. Changing the prefabs in place provides more leverage but means potentially re-making the changes each time a new SDK version appears with other features that we want. This won’t be resolved without some digging into the code and experimentation.
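For what it’s worth, here is a very rough sketch of what the wrapper idea might look like, assuming the game code only needs a pointing ray and a select state; the interface and all of the names in it are mine, not anything from the SDKs.

```csharp
using UnityEngine;

// Hypothetical abstraction the game code could target, with one thin
// implementation per SDK (Daydream controller, Vive controller, mouse).
public interface IPointerSource
{
    Ray PointerRay { get; }     // where the controller (or mouse) is pointing
    bool SelectPressed { get; } // trigger, touchpad click, or mouse button
}

// Example flat-screen implementation backed by the mouse.
public class MousePointerSource : MonoBehaviour, IPointerSource
{
    public Ray PointerRay
    {
        get { return Camera.main.ScreenPointToRay(Input.mousePosition); }
    }

    public bool SelectPressed
    {
        get { return Input.GetMouseButton(0); }
    }
}
```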

Unity Editing Support

I was recently playing with some asset generation scripting. Malcolm had commented that it would be nice to have a way to quickly create a room of any desired size with appropriate wall textures and texture scaling. I whipped up a very quick and simple script to take a wall, floor and ceiling prefab and build a complete room out of the set.
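For the curious, here is a minimal sketch of the kind of thing I mean (a reconstruction for illustration, not the actual script; the prefab sizes, heights and names are all assumptions):

```csharp
using UnityEngine;

public class RoomBuilder : MonoBehaviour
{
    public GameObject wallPrefab;    // assumed to be a one-unit-wide wall section
    public GameObject floorPrefab;   // assumed to be a one-unit-square floor tile
    public GameObject ceilingPrefab; // assumed to be a one-unit-square ceiling tile
    public int width = 4;            // room size in tiles
    public int depth = 6;
    public float height = 3f;

    void Start()
    {
        BuildRoom();
    }

    public void BuildRoom()
    {
        // Lay down floor and ceiling tiles on a simple grid.
        for (int x = 0; x < width; x++)
        {
            for (int z = 0; z < depth; z++)
            {
                Instantiate(floorPrefab, new Vector3(x, 0f, z), Quaternion.identity, transform);
                Instantiate(ceilingPrefab, new Vector3(x, height, z), Quaternion.identity, transform);
            }
        }

        // Walls along the two z edges; the x edges would be handled the same way.
        for (int x = 0; x < width; x++)
        {
            Instantiate(wallPrefab, new Vector3(x, height * 0.5f, -0.5f), Quaternion.identity, transform);
            Instantiate(wallPrefab, new Vector3(x, height * 0.5f, depth - 0.5f), Quaternion.Euler(0f, 180f, 0f), transform);
        }
    }
}
```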

This worked quite nicely (though many details remain to be added, such as the texture management) but had no visibility in the editor: you added an invisible GameObject to the scene, set its parameters, and then when you ran the game you got a magically created room.

I was looking at the editor-side options last night and it appears to be relatively straightforward to make the script do its thing in the editor as well as at run time.
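One way this appears to work (again a sketch, reusing the hypothetical RoomBuilder above) is a small custom inspector placed in an Editor folder that adds a “Build Room” button, so the generation can be triggered while editing rather than only at run time:

```csharp
using UnityEngine;
using UnityEditor;

[CustomEditor(typeof(RoomBuilder))]
public class RoomBuilderEditor : Editor
{
    public override void OnInspectorGUI()
    {
        // Show the normal fields (prefabs, width, depth) and then a button
        // that runs the generation immediately in the editor.
        DrawDefaultInspector();
        if (GUILayout.Button("Build Room"))
        {
            ((RoomBuilder)target).BuildRoom();
        }
    }
}
```

The [ExecuteInEditMode] attribute on the component itself looks like another route to the same end, though I haven’t tried either in anger yet.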

I’m going to play with some of this soon to see what I can put together. Being able to build more complicated, programmatically generated items in the editor would be a great capability to have available. Prefabs are fine for simple things but I’d rather have the option of generating a family of possible items from a script. Editor side visibility is a key part of making something like this useful.

Having fun playing with this stuff as work projects at the moment are in stabilization and thus mostly running down defects and back-filling missed requirements rather than building new architecture and the code to support it.

Unity and Multiple Platforms

I tried to get a single Unity project to build for multiple platforms over the weekend. The code I’m writing is probably best suited to a flat screen with mouse and keyboard quite honestly. That said, I’m interested in working with VR systems and thus intend it to run on the platforms that I have available. This gives me:

  • Flat screen with keyboard and mouse.
  • Google Daydream VR with Daydream controller.
  • HTC Vive with two Vive controllers.
  • Possibly Google Cardboard with no controller (perhaps a game controller as a stretch?).

Multiple Platforms from One Project

Last week Sam had run across a couple of projects that claimed to work across platforms and sent me a link. Over the weekend I pulled the projects down and ran them on the Daydream without problems. They were a bit awkward to use, as neither of them used the Daydream controller effectively (one seemed to be built as a Cardboard app, using the sight axis for selection).

I took the project that had been targeted on the Daydream and re-targeted it to the PC with a Vive headset attached. I got errors (don’t have the exact form) that suggested that I needed to load the Steam VR SDK in order to make things work. In the end I loaded the SDK from the asset store. This did allow the Vive headset to connect but did not display the information panels that the project was designed to show. At that point I shelved the project…I’ll probably return at some point to look deeper into the implementation, but for now I’ve had enough.

Switching Platforms in a Project (the hard way)

Later in the weekend I tried simply creating a project with a few simple assets and adding in the two SDKs. I was able to get the Daydream up and running (tried that first) by making the usual additions to the scene for that SDK and setting up the platform. To get the Vive running I had to delete all of the Daydream items from the scene and add in the usual Vive content. That too worked as expected.

The surprise came when I switched back to Daydream again: I continued to get errors indicating that the Vive input mapper JSON file was missing. It appears that something Vive-related was still attached to the project and looking for Vive-specific assets. This is another area where I may dig deeper at some point. For the moment I’m moving on.

Modular and Layered

I did dig through a good bit of content over the weekend (both video and online blogs/documentation) that suggested Unity supports linking in external code assemblies. This leads me down the road to a more modular approach.

  • Provide multiple .NET Core assemblies of non-Unity-specific code to implement the ‘business logic’ of the game. This keeps the complex code (game rules, world generation and storage, and other such items) in a place where it can be unit tested and managed as pure code assemblies (see the sketch after this list).
  • Manage all Unity assets that aren’t target-specific in free-standing packages that get bolted into a target-specific frame at the last moment.
  • Build the SDK-specific parts of the game as thin wrappers over these other assets. These will be asset packages and projects that contain only target-specific items and user-facing elements.
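As a sketch of what the first bullet might look like in practice, here is the sort of plain C# class I have in mind: no UnityEngine dependency at all, so it can live in its own assembly and be covered by ordinary unit tests (the names and the rule itself are illustrative, not the real game logic):

```csharp
using System;

// Plain value type for a star location; the Unity layer would map these onto
// GameObjects and prefabs at the last moment.
public struct StarPosition
{
    public double X, Y, Z;
    public StarPosition(double x, double y, double z) { X = x; Y = y; Z = z; }
}

public static class TransitRules
{
    // Turns needed for a fleet to travel between two stars at a given speed
    // (distance units per turn), rounded up to whole turns.
    public static int TransitTurns(StarPosition from, StarPosition to, double speedPerTurn)
    {
        if (speedPerTurn <= 0.0)
            throw new ArgumentOutOfRangeException("speedPerTurn");

        double dx = to.X - from.X;
        double dy = to.Y - from.Y;
        double dz = to.Z - from.Z;
        double distance = Math.Sqrt(dx * dx + dy * dy + dz * dz);
        return (int)Math.Ceiling(distance / speedPerTurn);
    }
}
```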

I’m going to head down this road next and try structuring a new version of my Cluster game to support this approach. I started coding up the game generation part on Sunday and will likely try bolting a version of that into a VR frame as soon as it is complete enough to be usable.

Unity Event System Looks Like the Next Stop

A quick perusal of the documentation suggests that I’m going to need a better grasp of the Unity event system to get my handling of VR controller raycasting working as I’d like.

My stars (and anything else that wants to be responsive) will need to handle input events and thus detect when a ray touches the construct (and when it stops touching it). This should allow me to do the sort of ‘hover text’ I want. I’ll likely try to work this up tonight and get my inflating-stars code working with the Daydream (and perhaps the Vive as well).
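A minimal sketch of what I think this looks like, assuming the scene has an EventSystem plus a physics raycaster on the camera (or the SDK’s controller-driven equivalent); the scale factor and names are placeholders:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

public class StarHoverResponder : MonoBehaviour, IPointerEnterHandler, IPointerExitHandler
{
    private Vector3 baseScale;

    void Awake()
    {
        baseScale = transform.localScale;
    }

    public void OnPointerEnter(PointerEventData eventData)
    {
        // The ray (mouse or controller) has touched this star: inflate it and
        // show whatever hover information we want.
        transform.localScale = baseScale * 8f;
    }

    public void OnPointerExit(PointerEventData eventData)
    {
        // The ray has stopped touching the star: restore the original size.
        transform.localScale = baseScale;
    }
}
```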

I’m still wishing for a decent solution to modularity that would let me keep the common assets between the three platforms I’m interested in separate enough to make sharing them easy.

I’m currently maintaining three divergent versions of the code and assets, each configured for a different system. As the game rules implementation develops, this is going to become hard to support.

Unity flat screen mouse pointing is working

I just pushed an update to GitHub that detects mouse-over of stars in the game screen and temporarily bumps the size of the star up by a factor of eight (to show that you pointed at it). I’ve also got some other minor bits working, such as finding a script on an object given the GameObject.
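For reference, the flat-screen version amounts to little more than the classic OnMouseEnter/OnMouseExit callbacks on an object with a collider (a rough approximation of what’s in the repository, not the exact code), and “finding a script on an object” is just GetComponent:

```csharp
using UnityEngine;

public class StarMouseOver : MonoBehaviour
{
    private Vector3 baseScale;

    void Awake()
    {
        baseScale = transform.localScale;
    }

    void OnMouseEnter()
    {
        // Bump the star up by a factor of eight while the mouse is over it.
        transform.localScale = baseScale * 8f;
    }

    void OnMouseExit()
    {
        transform.localScale = baseScale;
    }
}

// Given a GameObject, the attached script comes back from GetComponent:
//     StarMouseOver script = someStarObject.GetComponent<StarMouseOver>();
```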

I need to add code to read in the star names file. I also want to make the Daydream pointer do this same thing to stars you point at. In the longer run, I expect to add a UI panel that displays additional information about the star when you point at it, and may very well magnify the size of the star and any associated information at the same time.

Now on to seeing what I can see with the Daydream…

Got the Static Assets in Place…

Now I’m stitching in some game logic and properties related to game play. Still need to get the systems loaded from the static stuff I’ve added and start building some game play but it feels like things are moving at a decent clip.

I’m also going to be looking at keeping some of the game logic in external, non-game assemblies. Being able to pull in external code that can be unit tested and perhaps even deployed in other areas is useful.

Hoping to get the basics wired in and then get back to interface work. I need to get the VR controls started. A user should be able to switch perspectives between several fixed camera spots with the controller. They need to be able to get a basic view of the state of the board from a distance and be able to view full details of a given star by pointing their controller at it. In more detail, a player should be able to:

  • View basic star information available to a given player from a distance.
  • Assemble a fleet (if fleet markers are available) from ships present at a star.
  • Task an existing fleet with going to another system.
  • Allocate this turn’s production to construction or research.
  • As a side effect of the above, move population from a planet surface to colony transports.
  • Check transit time from one star system to another.
  • See beacon area of effect for your existing beacons.

Looking to store static data strings in a Unity asset

I’m now looking to store static text (potential names to be selected randomly, object configurations, etc.) in an accessible Unity resource/asset. I want the game to access and load this information, and the information to be packaged as part of the game after a build (i.e. in the Daydream *.apk file that gets deployed to the device).

Got it…a text asset in a Resources folder. My list of star names is now in place. Now I need to shuffle them, associate them with stars, and (hopefully) display the names near each star.

Ideally I want the name tags to always be positioned so that the normal of the tag passes through the camera, the vertical axis of the tag passes through the associated star, and a fixed distance is maintained between the bottom edge of the name plate and the center of the star. Ideally the top of the name plate would be ‘up’ to the viewer as well. I’ll have to look into enforcing these constraints once I’ve gotten any sort of name tags working 🙂
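The loading itself ends up being small; a sketch of the approach, assuming a file at Assets/Resources/StarNames.txt with one name per line (the file name and the shuffle details are illustrative):

```csharp
using UnityEngine;

public class StarNameLoader : MonoBehaviour
{
    public string[] LoadAndShuffleNames()
    {
        // Resources.Load pulls the text asset into the build automatically,
        // so the names travel inside the .apk that goes to the device.
        TextAsset namesAsset = Resources.Load<TextAsset>("StarNames");
        string[] names = namesAsset.text.Split('\n');

        // Fisher-Yates shuffle so the names get handed out in random order.
        for (int i = names.Length - 1; i > 0; i--)
        {
            int j = Random.Range(0, i + 1);
            string tmp = names[i];
            names[i] = names[j];
            names[j] = tmp;
        }
        return names;
    }
}
```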

Cluster working on Vive

I’ve got the current bits of Cluster-1 working on the Vive using directions from here. It doesn’t do much yet, but it is nice to look at. I also need to figure out the ‘up’ direction, as the Vive version has the board overhead and the Daydream version has the board in front of the player. I want the initial camera to look down on the board with the stars around your knees. Much more to do, but still moving on.

I also want to make sure that the same project can be built for at least the three platforms…PC screen, Daydream and Vive…and probably Cardboard, from the same baseline.

As things move forward it will become progressively more important to me that I am able to switch between my supported platforms without making serious changes to the base build.

So far it appears that I can disable ‘VR’ support and the Steam VR camera rig to get back to a flat monitor world on the PC side. Next I’ll need to try to integrate the Steam VR and Daydream pieces to find out if I can switch between those with equal facility.
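There also appears to be a run-time switch for this; a sketch, assuming a Unity version where XRSettings is the right namespace (older releases used UnityEngine.VR.VRSettings instead):

```csharp
using UnityEngine;
using UnityEngine.XR;

public class FlatScreenToggle : MonoBehaviour
{
    // Drop back to rendering on the main display instead of the headset.
    public void DisableVR()
    {
        XRSettings.enabled = false;
    }
}
```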

This approach still isn’t perfectly satisfying, as in a more formal build environment the Unity UI would be needed to switch between building plain PC output and PC VR output. I expect that, as this is targeted partly at commercial applications, there must be a way to run a build server that emits packages for a full range of targets without human intervention. Stay tuned; we may dig that out at some point…
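My current guess at what that looks like: a static build method invoked through Unity’s command line, using BuildPipeline.BuildPlayer. The scene list, output paths and targets below are placeholders:

```csharp
using UnityEditor;

public static class BatchBuild
{
    // Scenes to include in the player; placeholder path.
    static readonly string[] scenes = { "Assets/stars.unity" };

    public static void BuildAndroid()
    {
        BuildPipeline.BuildPlayer(scenes, "Builds/Cluster.apk",
            BuildTarget.Android, BuildOptions.None);
    }

    public static void BuildWindows()
    {
        BuildPipeline.BuildPlayer(scenes, "Builds/Cluster.exe",
            BuildTarget.StandaloneWindows64, BuildOptions.None);
    }
}
```

A build server would then run something like Unity -batchmode -quit -projectPath <project> -executeMethod BatchBuild.BuildAndroid for each target it needs to produce.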

Notes on Daydream setup for Unity

Using notes provided by Sam and Malcolm as a springboard…

Download the ADB drivers from clockworkmod.

Grab the GoogleVRForUnity package.

More instructions that I’ve grabbed from a short document that my friends created and shared…I’ll post details if they’re ok with that.

Currently building and will try to launch in Daydream once that is complete.

I may also try the Vive as a target if this goes well. The nascent game isn’t really that exciting, but I’d like to get things basically working in all three environments if I can.

Looks as if my Android SDK location is a problem…it is in my user area and the path has a space in it. I’m trying to get Android Studio to relocate the SDK…currently it is in a hole and not coming out.

Some notes on the Unity API(s)

This is largely notes to myself around the Unity API set. I’m at the point where I want to get some significant code written. I’ve got enough assets in hand to get started on the sample game I’m putting together. I know what needs to get done. Now I need to better understand how to fit that into the frameworks that are currently in place.

Looking at the Unity documentation online, the manual and the API docs. I found some procedural generation links here and here and here that I’ll have to read through in more detail.

Hmm…a best practices guide…that looks interesting too. I took a quick look, though, and these are more advanced topics than I’m looking for right now.

Cluster now has stars

Just a disk of random stars with few additional details, but a nice start. I’ve overpopulated them to make the disk more dramatic at the moment. Just load up stars.unity from the Cluster-1 repository on GitHub and you’ll see them.

A view from afar…
Closer in looking across the disk…

Now to generate planetary systems for each star based on its color and add in a skybox to provide the black of deep space as a background. Note to self…don’t select eight thousand stars by mistake while trying to grab a screen clip…
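Roughly the flavour of the disk generation, as a sketch rather than the code actually in the repository (the prefab and the counts are placeholders):

```csharp
using UnityEngine;

public class StarDiskSpawner : MonoBehaviour
{
    public GameObject starPrefab;  // hypothetical star prefab
    public int starCount = 8000;
    public float diskRadius = 100f;

    void Start()
    {
        for (int i = 0; i < starCount; i++)
        {
            float angle = Random.Range(0f, 2f * Mathf.PI);
            // The square root keeps the distribution uniform over the disk's
            // area rather than bunching stars toward the center.
            float radius = Mathf.Sqrt(Random.Range(0f, 1f)) * diskRadius;
            Vector3 position = new Vector3(radius * Mathf.Cos(angle), 0f, radius * Mathf.Sin(angle));
            Instantiate(starPrefab, position, Quaternion.identity, transform);
        }
    }
}
```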

It would be nice to surround them with coronal particle systems and texture them with sun-spots, but that will (perhaps) come. For now I’m more interested in building out the mechanics.