Warehouse AR Project (WIP)


Project Statement

The goal of this project is to create an AR framework that would allow a warehouse to set up virtual navigation using augmented reality. In theory, a worker could quickly pull out their phone and have it guide them to the box they need to find.

Running Preview


This preview shows my AR manager spawning, interacting with, and destroying AR objects seamlessly.

What’s done so far?

So far I’ve been able to architect an AR object scene manager that lets me quickly create and manage AR objects within the scene without needing to touch any AR Foundation code. In other words, I have wrapped all of the plugin’s features in my own managers and set them up to manage those objects correctly (located within ARScenemanager.cs). This lets me focus on building the front-end application without worrying about the AR Foundation code.
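The actual contents of ARScenemanager.cs aren’t shown here, but the wrapping idea can be sketched roughly like this, a minimal hypothetical manager where the class shape, field names, and methods are my illustration, not the real code:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Hypothetical sketch: the rest of the app asks this manager for AR
// objects and never touches AR Foundation types directly.
public class ARSceneManager : MonoBehaviour
{
    [SerializeField] private ARRaycastManager raycastManager;

    private readonly List<GameObject> spawnedObjects = new List<GameObject>();
    private static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    // Spawn a prefab where a screen touch hits a detected plane.
    public GameObject SpawnAt(Vector2 screenPoint, GameObject prefab)
    {
        if (!raycastManager.Raycast(screenPoint, hits, TrackableType.PlaneWithinPolygon))
            return null;

        var spawned = Instantiate(prefab, hits[0].pose.position, hits[0].pose.rotation);
        spawnedObjects.Add(spawned);
        return spawned;
    }

    // Destroy every object this manager has spawned.
    public void ClearAll()
    {
        foreach (var obj in spawnedObjects) Destroy(obj);
        spawnedObjects.Clear();
    }
}
```

The benefit of this layering is that if AR Foundation’s API changes (it is still young), only the manager has to change, not the front-end code that calls it.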

I’ve begun work on setting up the mixed reality scene. I’m doing this by building a desktop app that lets the user place the boxes on a warehouse map; those placements are then represented in AR when they pull out their phone inside the warehouse.

Current Roadblocks / issues

There are two major issues that I’m running into currently. The first is that there is no easy way to track a phone’s location within the warehouse. My current solution is to assume that the user always starts the application in a specific place (placing a tape X on the ground where they need to stand in the physical warehouse). This is a quick-and-dirty solution, however, and I’m going to look into improving it in the future.
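The tape-X approach boils down to a one-time calibration: treat the camera’s pose at that known spot as the warehouse map’s origin, then convert map coordinates into AR world positions relative to it. A minimal sketch of that idea (class and method names are my own, not from the project):

```csharp
using UnityEngine;

// Hypothetical calibration helper for the "tape X on the floor" approach:
// the user stands on a known spot, and the camera's pose at that moment
// becomes the warehouse map's origin.
public class WarehouseOrigin : MonoBehaviour
{
    [SerializeField] private Transform arCamera; // the AR session camera

    private Pose origin;
    public bool IsCalibrated { get; private set; }

    // Call when the user confirms they are standing on the taped X.
    public void Calibrate()
    {
        // Flatten the camera's facing direction onto the floor plane so
        // the origin's rotation is pure yaw.
        var forward = Vector3.ProjectOnPlane(arCamera.forward, Vector3.up).normalized;
        origin = new Pose(
            new Vector3(arCamera.position.x, 0f, arCamera.position.z),
            Quaternion.LookRotation(forward, Vector3.up));
        IsCalibrated = true;
    }

    // Convert a 2D warehouse-map coordinate into an AR world position.
    public Vector3 MapToWorld(Vector2 mapPosition)
    {
        return origin.position + origin.rotation * new Vector3(mapPosition.x, 0f, mapPosition.y);
    }
}
```

The obvious weakness, as noted above, is that the whole map is only as accurate as the user’s stance on the X, which is why this is a stopgap rather than a real localization solution.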

The second is that the AR world will shift if the user flicks their phone around too much. I believe there are answers within Unity’s ARCore plugin to fix this, or I may need to rework my architecture. I need to do more research into how Unity’s plugin handles the machine-vision end of AR and scene creation before I go any further.

Code snippets

You can check out all of my code on my GitHub.


Box Voyage!

Projects, Senior year projects

Project statement

Box Voyage is a playful adventure puzzle game where players take their government-mandated vacation on a cheap cruise-in-a-box. Push, grab, and spin anything you can get your hands on as you explore the rooms aboard the ship and attempt to fill your Fun Meter.

  • Explore 7 eccentric, colorful rooms full of puzzles to discover how to have enough Fun
  • Manipulate dozens of tactile objects inspired by old analog toys
  • Dig deeper into the rooms to discover trophies to remind you of your time in the box

Every interaction in the box is bursting with feedback and begging to be inspected. Pushing buttons, flipping switches, launching toy planes, cutting hair, and cooking up octopus are just a few of the ways players can expect to have Fun on the cruise. There’s still a lot to enjoy in this quirky adventure, even if your boss is making you do it.


Project and team information

This is a two-semester project with a 13-person interdisciplinary team.

My Role in the Project

My role on Box Voyage was Gameplay Programmer: I focused on rapidly prototyping and polishing new puzzles and rooms after design created them. This involved a lot of communication with both design and art to figure out exactly how each room would work, how it would look once finished, and what would need to change. I spent the early days of my involvement prototyping new rooms and puzzles, grayboxing each room based on the diagrams designers sent and building its functionality. For the latter half of the project I focused on fixing bugs, hooking finished art assets into the code, and making any tweaks the rooms needed based on testing results (both QA and internal).

Working online due to COVID-19

About halfway through the project we were forced to make the transition to online work due to the COVID-19 pandemic, so we swapped to exclusively online communication. While this was a blocker at first, we quickly made the switch and worked just as efficiently as before!

Project specifications

  • C#
  • Unity
  • Unity’s new input system
  • GitHub repository
  • GitHub bug reporting and tracking
  • GitHub pull requests

Release information

This game is being released in June of 2020 on Itch.io and Steam!

Work in progress projects!

Development Blogs, Projects

This is a list of all of my currently in-progress projects. No guarantee that any of these will ever be finished, but they’re still pretty cool.

Unnamed Jackbox Party Pack Ripoff (Class project)

This one is an exploration into networking. I’m currently working on a RakNet C++ networking plugin that will let a Unity scene receive input from multiple phones and display that input on the “master screen” hosting the Unity program. The catch is that this will all happen in real time, which makes it tricky. More info to come.

AR Board Games (Class project)

This is a Unity 3D project using the alpha AR Foundation package to attempt to simulate playing a board game in AR across two phones. I currently have the AR scene rendering and creating a zone for user input; next I’ll be working on networking this across two phones.

AR interactive reader exploration

Development Blogs

For this project I plan on making an interactive reader that will allow the user to adjust “values” in a text and have those values change the information on the screen… but do that on a physical piece of paper.

This idea is an extension of Bret Victor’s Explorable Explanations project (found here: http://worrydream.com/#!/ExplorableExplanations). Building on Victor’s work, we can potentially read a piece of paper using a phone camera and apply the same logic in AR.


There are two key parts of this that I’ll need to dive into. First, Explorable Explanations is written in HTML and JavaScript… of which I know neither. So I’ll need to find a way to convert the project into something more comfortable for my skills as a programmer. For this, I plan on making a Unity Android app that interfaces with Victor’s project and performs the necessary functions.

Second, I need to get an AR setup in Unity that can read the paper. I’ll have to look into APIs and methods for reading text from a camera, then positioning elements in the game world based on the text’s location, which is a huge task on its own.

I have about 12 weeks to work on this project, so I’ll need to budget my time carefully. I’d like to have Explorable Explanations working with Unity by week 4, which leaves me 8 weeks to get the AR portion of the project done.

Ideas on how to overcome these issues

I have a few ideas on how to deal with the issues noted above. For the HTML and JavaScript part, I’m hoping that once I do a deep dive into Victor’s code I’ll find a way to re-route the JavaScript functionality into C# so it can work with Unity. Using WebGL is another option, since the project already runs on the web. A third option is to write a plugin that displays an HTML text file inside Unity and work from that.

AR will be a bit trickier. Since I don’t have the resources to write a text reader on my own (and doing so would be overkill for this project anyway), I’ll have to create a cheat for our purposes. If I create a syntax that tells the program where specific words are, and that they have a variable attached to them, then we can overlay the Unity text on top of them. The issue is that we’ll have to build an easy way to import and export these variables to make the application usable for a wide market, as well as a syntax that’s dynamic enough to be used in multiple situations but not distracting to the reader.

For a first draft I’m thinking something like this: [ Variable ][#]. Unity will look for the square brackets, then grab the # after them to tell the code which variable it is. This is something I’ll have to tackle more in the future, and it’s likely to change.
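As a rough illustration of what that first-draft syntax might look like in code (the class name, regex, and exact bracket rules are my sketch, not a final design):

```csharp
using System.Collections.Generic;
using System.Text.RegularExpressions;

// Hypothetical first pass at the [ Variable ][#] syntax: find each
// bracketed word and the variable index that follows it.
public static class VariableTagParser
{
    // Matches e.g. "[ speed ][3]": group 1 = the word, group 2 = the variable id.
    private static readonly Regex Tag = new Regex(@"\[\s*(\w+)\s*\]\[(\d+)\]");

    public static List<(string word, int variableId)> Parse(string text)
    {
        var results = new List<(string, int)>();
        foreach (Match m in Tag.Matches(text))
            results.Add((m.Groups[1].Value, int.Parse(m.Groups[2].Value)));
        return results;
    }
}

// Parse("The train moves at [ speed ][0] mph.")
// yields one entry: ("speed", 0).
```

Once the OCR step has located the bracketed spans on the page, a parser like this could map each span to its variable and hand the position off to the AR overlay.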

Another option is to put a barcode over every variable, but that would defeat the purpose, since it would make the text unreadable without a camera. We could potentially put a small QR code after each variable for the program to look for, but that’s another can of worms.

The third option is to use an image-to-text API (such as IronOCR: https://ironsoftware.com/csharp/ocr/). While that would take some of the work out of my own hands, it could be useful if I’m crunched for time.

Quick Notes

Points of exploration: Making a plugin to get HTML & JavaScript code to work with Unity | Augmented reality

Unknowns: Augmented Reality theory | JavaScript | HTML

External factors: Explorable Explanations | OCR (potentially)

Side Tracked!

Projects, Senior year projects

Project statement

Side Tracked is a virtual reality narrative game where you play as a train conductor helping souls reach a conclusion to their past lives. You do this by manipulating the passenger’s emotions, thereby altering the way they interpret their memories. Based on their development throughout the trip, they will end up in either “The Light” or “The Abyss”, and it’s up to you to help them get there!


Project and team information

This game is being made over the course of the semester, with hopes of being chosen to move on to the next semester.

My role in the project

My major contribution to this project is the back-end systems. These include the narrative integration, the modular train track builder (which allows the train to move across the world and tells the game when to play events), and the train movement.
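The track-builder idea, a train following modular track pieces and firing events as it reaches them, can be sketched like this. This is a minimal illustration under my own assumptions about the design; the names and structure are not from the actual project code:

```csharp
using UnityEngine;
using UnityEngine.Events;

// Hypothetical sketch of a modular track: the train walks a list of
// track nodes, and each node can fire an event (e.g. a narrative beat)
// when the train reaches it.
public class TrainTrackFollower : MonoBehaviour
{
    [System.Serializable]
    public class TrackNode
    {
        public Transform point;      // where this piece of track ends
        public UnityEvent onReached; // hooked up in the Inspector
    }

    [SerializeField] private TrackNode[] track;
    [SerializeField] private float speed = 5f;

    private int current;

    private void Update()
    {
        if (current >= track.Length) return;

        var target = track[current].point.position;
        transform.position = Vector3.MoveTowards(transform.position, target, speed * Time.deltaTime);

        // Close enough to the node: fire its event and move to the next one.
        if (Vector3.Distance(transform.position, target) < 0.01f)
        {
            track[current].onReached?.Invoke();
            current++;
        }
    }
}
```

Because each node is just a transform plus an event, designers can rearrange the route and retime narrative events entirely in the editor, without touching code.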

Project specifications

  • C#
  • Unity
  • Oculus Rift
  • GitHub repository

Build download coming soon!