A quick perusal of the documentation suggests that I’m going to need a better grasp of the Unity event system to get my handling of VR controller raycasting working as I’d like.
My stars (and anything else that wants to be responsive) will need to handle input events and thus detect when a ray touches the construct (and when it stops touching it). This should allow me to do the sort of ‘hover text’ I want. I’ll likely try to work this up tonight and get my inflating stars code working with the Daydream (and perhaps the Vive as well).
I’m still wishing for a decent solution to modularity that would let me keep the common assets between the three platforms I’m interested in separate enough to make sharing them easy.
I’m currently maintaining three divergent versions of the code and assets with each configured for a different system. As the game rules implementation develops, this is going to become hard to support.
I just pushed an update to GitHub that detects mouse-over of stars in the game screen and temporarily bumps the size of the star up by a factor of eight (to show you pointed at it). I’ve also got some other minor bits working, such as finding a script on an object given its GameObject.
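The hover bump is simple state either way; here’s a minimal sketch of the enter/exit logic in plain Java rather than Unity C# (the class and method names are my own invention, not Unity API):

```java
/** Hypothetical sketch of the hover highlight: scale the star up by a
 *  factor of eight on pointer enter and restore it on pointer exit. */
public class HoverScale {
    private static final double HOVER_FACTOR = 8.0;
    private final double baseScale;
    private double currentScale;

    public HoverScale(double baseScale) {
        this.baseScale = baseScale;
        this.currentScale = baseScale;
    }

    public void onPointerEnter() { currentScale = baseScale * HOVER_FACTOR; }
    public void onPointerExit()  { currentScale = baseScale; }
    public double scale()        { return currentScale; }

    public static void main(String[] args) {
        HoverScale star = new HoverScale(1.0);
        star.onPointerEnter();
        System.out.println(star.scale()); // prints 8.0
    }
}
```

In Unity the enter/exit calls would hang off the event system’s pointer callbacks rather than being invoked directly.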
I need to add code to read in the star names file. I also want to make the Daydream pointer do this same thing to stars you point at. In the longer run, I expect to add a UI panel that displays additional information about the star when you point at it and may very well magnify the size of the star and any associated information at the same time.
Now I’m stitching in some game logic and properties related to game play. Still need to get the systems loaded from the static stuff I’ve added and start building some game play but it feels like things are moving at a decent clip.
I’m also going to be looking at keeping some of the game logic in external, non-game assemblies. Being able to pull in external code that can be unit tested and perhaps even deployed in other areas is useful.
Hoping to get the basics wired in and then get back to interface work. I need to get the VR controls started. A user should be able to switch perspectives between several fixed camera spots with the controller. They need to be able to get a basic view of the state of the board from a distance and be able to view full details of a given star by pointing their controller at it.
View basic star information available to a given player from a distance.
Assemble a fleet (if fleet markers are available) from ships present at a star.
Task an existing fleet with going to another system.
Allocate this turn’s production to construction or research.
As a side effect of the above, move population from a planet surface to colony transports.
Check transit time from one star system to another.
See beacon area of effect for your existing beacons.
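Most of those actions are UI plumbing, but the transit-time check is just geometry. A hedged sketch of how it might work (star positions, units, and the per-turn fleet speed are all invented for illustration):

```java
/** Hypothetical sketch: transit time between two star systems.
 *  Positions are 3-D coordinates; speed is distance covered per turn. */
public class TransitTime {
    public static double distance(double[] a, double[] b) {
        double dx = a[0] - b[0], dy = a[1] - b[1], dz = a[2] - b[2];
        return Math.sqrt(dx * dx + dy * dy + dz * dz);
    }

    /** Turns needed, rounded up: a fleet arrives at the start of a turn. */
    public static int turnsToTravel(double[] from, double[] to, double speedPerTurn) {
        return (int) Math.ceil(distance(from, to) / speedPerTurn);
    }

    public static void main(String[] args) {
        double[] sol = {0, 0, 0};
        double[] alpha = {3, 4, 0};                          // distance 5
        System.out.println(turnsToTravel(sol, alpha, 2.0));  // prints 3
    }
}
```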
I’m now looking to store static text (potential names to be selected randomly, object configurations, etc.) in an accessible Unity resource/asset. I want the game to access and load this information and the information to be packaged as part of the game after a build (i.e. in the Daydream *.apk file that gets deployed to the device).
Got it…a text asset in a Resources folder. My list of star names is now in place. Now I need to shuffle them, associate them with stars, and (hopefully) display the names near each star. Ideally I want the name tags always positioned so that the normal of the tag passes through the camera and the vertical axis of the tag passes through the associated star, with a fixed distance maintained between the bottom edge of the name plate and the center of the star. Ideally the top of the name plate would be ‘up’ to the viewer as well. I’ll have to look into enforcing these constraints once I’ve gotten any sort of name tags working 🙂
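The shuffle-and-assign step is straightforward; here’s a sketch in Java (the Unity version would be C#, and the integer star identifiers are hypothetical):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Random;

/** Sketch of pairing stars with randomly drawn, distinct names. */
public class StarNamer {
    /** Assign each star id a distinct name from a shuffled copy of the pool. */
    public static Map<Integer, String> assignNames(List<String> namePool,
                                                   int starCount, long seed) {
        if (starCount > namePool.size())
            throw new IllegalArgumentException(
                "not enough names for " + starCount + " stars");
        List<String> shuffled = new ArrayList<>(namePool);
        Collections.shuffle(shuffled, new Random(seed)); // Fisher–Yates underneath
        Map<Integer, String> names = new LinkedHashMap<>();
        for (int i = 0; i < starCount; i++)
            names.put(i, shuffled.get(i));
        return names;
    }

    public static void main(String[] args) {
        System.out.println(assignNames(
            List.of("Sol", "Vega", "Rigel", "Deneb", "Altair"), 3, 42L));
    }
}
```

Seeding the shuffle keeps a given cluster’s names reproducible across loads, which seems useful for a persistent game board.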
I’ve got the current bits of Cluster-1 working on the Vive using directions from here. Doesn’t do much yet, but nice to look at. I also need to figure out the ‘up’ direction, as the Vive version has the board overhead and the Daydream version has the board in front of the player. I want the initial camera to look down on the board with stars around your knees. Much more to do, but still moving on.
I also want to make sure that the same project can be built for at least the three platforms…PC screen, Daydream, and Vive…and probably Cardboard from the same baseline.
As things move forward it will become progressively more important to me that I am able to switch between my supported platforms without making serious changes to the base build.
So far it appears that I can disable ‘VR’ support and the Steam VR camera rig to get back to a flat monitor world on the PC side. Next I’ll need to try to integrate the Steam VR and Daydream pieces to find out if I can switch between those with equal facility.
This approach still isn’t perfectly satisfying, as in a more formal build environment the Unity UI would be needed to switch between building plain PC output and PC VR output. Since Unity is aimed partly at commercial applications, I expect there must be a way to run a build server that emits packages for a full range of targets without human intervention. Stay tuned and we may dig that out at some point…
More instructions that I’ve grabbed from a short document that my friends created and shared…I’ll post details if they’re ok with that.
Currently building and will try to launch in Daydream once that is complete.
I may also try the Vive as a target if this goes well. The nascent game isn’t really that exciting, but I’d like to get things basically working in all three environments if I can.
Looks as if my Android SDK location is a problem…it is in my users area and the path has a space in it. I’m trying to get Android Studio to relocate the SDK…currently it is in a hole and not coming out.
This is largely notes to myself around the Unity API set. I’m at the point where I want to get some significant code written. I’ve got enough assets in hand to get started on the sample game I’m putting together. I know what needs to get done. Now I need to better understand how to fit that into the frameworks that are currently in place.
Looking at the Unity documentation online, the manual and the API docs. I found some procedural generation links here and here and here that I’ll have to read through in more detail.
Hmm…best practices guide…that looks interesting too. Took a quick look and these are more advanced topics than I’m looking for.
Just a disk of random stars with few additional details, but a nice start. I’ve overpopulated them to make the disk more dramatic for the moment. Just load up stars.unity from Cluster-1 on GitHub and you’ll see them.
Now to generate planetary systems for each star based on its color and add in a skybox to provide the black of deep space as a background. Note to self…don’t select eight thousand stars by mistake while trying to grab a screen clip…
It would be nice to surround them with coronal particle systems and texture them with sun-spots, but that will (perhaps) come. For now I’m more interested in building out the mechanics.
Spent some time after hours doing some work with Unity and Blender with my friends Sam and Malcolm. We’re all trying to become more proficient in putting together virtual spaces and using these tools to build more interesting assets.
I’ve got a Daydream headset now to complement my HTC Vive. I really want to reach the point where I can build things that work on flat PC screens and on both the Vive and Daydream (Cardboard might be nice as well, but it is less interesting as its control capabilities are quite limited).
Malcolm and Sam are focusing on the Daydream for now. We’re all going after different sorts of content. I’m aiming for a computer-mediated, turn-based, multi-player game inspired by some of the old board wargames I played in high school and college.
Malcolm and Sam helped me get my phone and computer loaded with the Daydream development tools and mostly running (I left my Daydream headset at home, so I’ll have to fill in the remaining details later). I still have quite a bit of coding to do in order to get a working initial environment for my game going. I’m also going to have to work to keep the graphical intensity at a level that the phones can handle…I expect the PC and Vive will want different assets for practical purposes.
Once the base game is working, I’ll need to put together a PHP/MySQL back-end to connect the game-board implementation to other players. Details here remain to be worked out, but I expect the initial version will be hosted here with some hand-coded PHP managing the back-end. I am still undecided on how to partition the game rules logic. Ideally the local copy of the game would only handle display and rendering, while the server-side code would handle random number generation and rules logic. Placing that much complexity in the PHP code may not be an ideal solution…I’ll have to play with things more in order to be sure.
Spending some quality time this weekend walking through the deeper layers of Jackson JSON serialization. I’ve used the library casually in the past, but never for anything of any great complexity. It has recently come up as a target for some more complicated (harder to map directly into JSON) data structures. I’m spending a few hours this weekend putting together some sample code to ring the changes on things I might want Jackson to do.
Initial sample code bits are just walking through the basics and making sure I remember what is needed there. I am noticing a few things:
It doesn’t care about Serializable at all. I expect that it uses reflection to walk the object tree and find property accessors directly.
By default it demands getters and setters for every property to be serialized; a constructor with the appropriate arguments is not, on its own, an option. This means that immutable-ish objects are not supported out of the box (annotations such as @JsonCreator apparently allow constructor-based binding, but that’s beyond these first experiments). You must be able to read and write object properties individually to make default Jackson binding work.
It does respect transient so it is easy enough to mark internal implementation details for exclusion from processing.
The object mapper does not like properties with more than one single-argument setter, even when one of the setters exactly matches the type being loaded into the de-serialized object.
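That reflection-driven property discovery can be demonstrated with the JDK’s own bean machinery, which follows the same getter/setter conventions; the Star class below is a toy of my own invention:

```java
import java.beans.BeanInfo;
import java.beans.IntrospectionException;
import java.beans.Introspector;
import java.beans.PropertyDescriptor;
import java.util.Set;
import java.util.TreeSet;

/** Shows the getter/setter discovery a bean mapper relies on,
 *  using the JDK's own Introspector. */
public class BeanProbe {
    public static class Star {
        private String name;
        private int cachedHash; // no accessor methods, so invisible to a bean mapper

        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
    }

    /** Names of readable-and-writable properties, as a bean mapper sees them. */
    public static Set<String> mappableProperties(Class<?> cls) {
        try {
            BeanInfo info = Introspector.getBeanInfo(cls, Object.class);
            Set<String> out = new TreeSet<>();
            for (PropertyDescriptor pd : info.getPropertyDescriptors())
                if (pd.getReadMethod() != null && pd.getWriteMethod() != null)
                    out.add(pd.getName());
            return out;
        } catch (IntrospectionException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(mappableProperties(Star.class)); // prints [name]
    }
}
```

Only `name` shows up: it is the lone field with both a getter and a setter, which matches the behavior I’m seeing from the object mapper.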
Next step is going to be creating a map of string to custom object.
So far I’ve done the easy stuff. Some evening soon I’ll try to find time to add in the more complicated bits. Code is here.
And now a custom object that is a proper map key (implements hashCode and equals) mapped to a custom object.
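The map-key half of that is standard Java: equals and hashCode must agree on the same fields. A sketch with invented field names (and I believe Jackson needs extra help, such as a key deserializer, to read non-string map keys back in):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Objects;

/** Sketch of a value class usable as a map key: equals and hashCode
 *  are both derived from the same two fields. */
public class StarKey {
    private final int sectorX;
    private final int sectorY;

    public StarKey(int sectorX, int sectorY) {
        this.sectorX = sectorX;
        this.sectorY = sectorY;
    }

    @Override public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof StarKey)) return false;
        StarKey k = (StarKey) o;
        return sectorX == k.sectorX && sectorY == k.sectorY;
    }

    @Override public int hashCode() {
        return Objects.hash(sectorX, sectorY);
    }

    public static void main(String[] args) {
        Map<StarKey, String> owners = new HashMap<>();
        owners.put(new StarKey(1, 2), "Sol");
        System.out.println(owners.get(new StarKey(1, 2))); // prints Sol
    }
}
```

Two distinct StarKey instances with the same coordinates hash to the same bucket and compare equal, so the second lookup finds the entry stored under the first.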
I started work as a Principal Software Engineer at KMC Systems in April of 2016. I'm enjoying the work and the team there so far.