Stage light synchronization in .NET

Was up until 1:00 working on a lighting synchronization program in C# (what will become the basis for Concertroid’s lighting synchronization system) whose purpose is to play an audio file and send commands to stage lights via DMX simultaneously. I’m using Chauvet Intimidator Spot LED 250 fixtures with an ENTTEC USB DMX Pro MKII controller. The original purpose for the system was to power a concert I was planning, although I am hoping to debut the technology at a Halloween party in a couple of weeks (a couple of weeks?! and I just finished a very rudimentary synchronization app?! with only one song’s lighting sequence finished that I consider “good”… I must be insane…)

I still need to learn more about properly managing multiple threads running asynchronously. I have to figure out the best way to indicate when the next frame of lighting commands should be sent to the DMX controller. Right now it’s working fine, but the slightest delay will knock it out of sync, because the lighting thread keeps time by subtracting the time it was started from the current time.

Hey, I wrote this in one night; it’s not perfect. At least it plays music and controls lights ;) My goal is to modify the audio engine with a function that retrieves a timestamp based on the current audio frame. This will ensure that the timestamp reflects the exact position of the audio thread, even when playback is paused. Depending on how much delay this may (or may not) introduce, I might also add an event handler for a “tick” event that gets called every time the timestamp changes (so the lighting thread doesn’t have to poll the audio thread for the timestamp).
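A minimal sketch of the idea. The class and member names here (AudioClock, AdvanceSamples, and so on) are my own placeholders, not the actual engine’s API — the point is just that the clock advances only when the audio thread actually plays samples:

```csharp
using System;

// Hypothetical audio clock: derives the timestamp from how many samples
// the audio thread has actually rendered, so pausing playback pauses the
// clock too (unlike "DateTime.Now - startTime", which keeps running).
class AudioClock
{
    private long samplesPlayed;        // advanced only by the audio thread
    private readonly int sampleRate;   // e.g. 44100

    // Raised whenever the timestamp advances, so the lighting thread
    // can subscribe instead of polling.
    public event Action<TimeSpan> Tick;

    public AudioClock(int sampleRate) { this.sampleRate = sampleRate; }

    // Called by the audio thread after each buffer is rendered.
    public void AdvanceSamples(int count)
    {
        samplesPlayed += count;
        Tick?.Invoke(CurrentTime);
    }

    // Timestamp derived from the audio position, not the wall clock.
    public TimeSpan CurrentTime =>
        TimeSpan.FromSeconds((double)samplesPlayed / sampleRate);
}
```

The lighting thread would then fire DMX frames from the Tick handler, keeping lights locked to the audio even across pauses.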

Eventually I would like to design a 100%-independent multiple thread coordination system that will assist with controlling lights, animations, and audio playback all simultaneously.

More on this later…

Receipt printing and thinking outside the box

That concludes the work on Sydne Monitoring Service. The system can now print receipts for your sales, complete with your company logo and a “thank you, come again” message at the bottom.

This was difficult to achieve for a number of reasons. I had never worked with receipt printers before, and I’d only barely touched actual printing in .NET. It turns out that composing a page is fairly simple, provided you’re comfortable drawing a printed page exactly like you would draw a window on the screen.

The tricky part is that you have to be constantly aware of how big your target page is. Receipt printers don’t take 8.5″×11″ pages, and they most certainly don’t take A4; typical receipt paper is only 58 or 80 millimeters wide. Being creative enough to get the most out of your receipt and still make it look “good” is a challenge.

I’ve wasted countless strips of paper trying to lay out the design of the receipt. Now that I’m writing this, it occurs to me that I could’ve done the majority of the work in a properly sized Windows Form and then just transferred the rendering code to the print routine. Of course, all of the best insight occurs AFTER the fact.
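For anyone curious what “drawing a printed page like a window” looks like, here’s a rough sketch using .NET’s PrintDocument. The 80mm paper size (315 hundredths of an inch), the company name, and the layout are all made up for illustration:

```csharp
using System.Drawing;
using System.Drawing.Printing;

// Sketch: composing a receipt with PrintDocument. The PrintPage handler
// receives a Graphics object you draw on exactly as in a Form's Paint
// event -- the catch is staying inside the narrow receipt width.
var doc = new PrintDocument();

// PaperSize is measured in hundredths of an inch; 80mm is about 315.
// A height of 0 stands in for continuous-roll paper here.
doc.DefaultPageSettings.PaperSize = new PaperSize("Receipt80mm", 315, 0);

doc.PrintPage += (sender, e) =>
{
    float y = 0;
    using (var font = new Font("Courier New", 8))
    {
        // Each DrawString call is just like drawing on a window,
        // advancing y by the line height as we go.
        e.Graphics.DrawString("ACME Widgets, Inc.", font, Brushes.Black, 0, y);
        y += font.GetHeight(e.Graphics);
        e.Graphics.DrawString("Thank you, come again!", font, Brushes.Black, 0, y);
    }
};

// doc.Print(); // uncomment to send the page to the receipt printer
```

Prototyping this same drawing code in a 315-unit-wide Form first, as mentioned above, would have saved a lot of paper.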

In the end, the receipt turned out well, I think. And of course, once the product is officially released and a company finds it worthy enough to purchase a support contract ;) I would be happy to help with any tweaks that might be necessary to get the output rendered just right for a particular business.

Integrating desktop software with the Web

I am currently working with a friend on an online inventory tracking and sales management system. It’s designed to be hosted on a Web server so you can make sales and access your history from anywhere.

One of the defining characteristics of a point-of-sale system is the cash register. A typical modern PC-based cash register employs a cash drawer, touch screen monitor, receipt printer, pole display, and barcode scanner. Of course, all of these peripherals interact with each other via software installed on the client PC itself. There is usually no remote Web page being used as the primary method of operation.

Sydne is by design a Web application, and must jump through hoops in order to interact with an application running on the desktop. In addition to figuring out how to communicate with the application, it also must be able to gracefully ignore when the application isn’t running (for example, when the user logs into the Web site from somewhere other than the cash register).

Sydne Monitor provides a VERY simple, lightweight HTTP server that runs on a configurable port and awaits the Web application’s commands. Sydne Monitor serves a script at http://localhost:port/Sydne.js that is included in all pages of the Web application. Before that script is loaded, however, the Web application pulls in a stub Sydne.js from the Web server itself, so that calls to a monitor that isn’t running don’t cause the rest of the JavaScript to fail.

When a Sydne Monitor command is executed, it spawns an XMLHttpRequest that attempts to access a special file on the lightweight HTTP server. The server detects that incoming request, gathers any parameters it might need, and then proceeds to perform the requested action for the client PC, such as reading an RFID card or opening the cash drawer. It responds to the request with a JSON packet that indicates whether the operation succeeded and, if it did, any values that are a result of the operation (such as the RFID card data).
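The monitor side of that exchange could be sketched like this in C#, using HttpListener. The port, the command paths, and the JSON shapes are assumptions for illustration, not the real Sydne protocol:

```csharp
using System;
using System.Net;
using System.Text;

// Maps a requested command path to a JSON result. In the real monitor
// this is where the hardware work (RFID reader, cash drawer) would go.
static string HandleCommand(string path)
{
    switch (path)
    {
        case "/OpenCashDrawer":
            // ...pulse the cash drawer kick line here...
            return "{\"success\":true}";
        case "/ReadRFIDCard":
            // ...poll the RFID reader here...
            return "{\"success\":true,\"card\":\"0123456789\"}";
        default:
            return "{\"success\":false,\"error\":\"unknown command\"}";
    }
}

// Minimal request loop: the Web page's XMLHttpRequest hits one of the
// special paths, and we answer with the JSON packet.
static void RunServer(int port)
{
    var listener = new HttpListener();
    listener.Prefixes.Add($"http://localhost:{port}/");
    listener.Start();
    while (listener.IsListening)
    {
        HttpListenerContext ctx = listener.GetContext();
        byte[] body = Encoding.UTF8.GetBytes(
            HandleCommand(ctx.Request.Url.AbsolutePath));
        ctx.Response.ContentType = "application/json";
        ctx.Response.OutputStream.Write(body, 0, body.Length);
        ctx.Response.Close();
    }
}
```

Keeping HandleCommand separate from the socket loop also makes the command dispatch easy to test without any hardware attached.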

Sydne Monitor also by default spawns a window that can be moved onto another monitor for use as a pole display. The display I’m using is a MIMO 710 S as recommended by Dale Harris, who has been developing (non-Web-based) point-of-sale software for years.

Will take pictures of the setup I’m using and post them soon!

Hello world!

Well, it looks like I’ve become part of the WordPress master race. ;)

I’m not sure how much of my content I’ll attempt to backport from my outdated Google-hosted blog, but I would like to move away from that now that I have my own dedicated server.

I’ll be writing about software development, Web site design, computers, music, pretty much anything that interests me that I feel like ranting about. Here and there, especially on programming posts, I might cover some topics other developers would find of interest. If you find something interesting (or boring, for that matter), don’t hesitate to let me know!

Concertroid! Engine – on a translucent screen! (courtesy of Re:VB-P)

Re:VB-P, developer of the similar VOCALOID-inspired concert animation software AniMiku, has agreed to do a test run of my software on his own translucent screen and projection system!

Collaborating with him on this project, I’m learning a lot of things about 3D modeling and animation that I would’ve probably never understood otherwise. While he demoed his projection system running my software, I took a couple of pictures to see the difference between his software, my software, and the official concert renderings used by Crypton Future Media, in order to better understand where my program needs improvement.

Here’s what I will be working on in the near future:

  • Cleaner edges (antialiasing)—Most noticeable in the Blu-ray release of the 2010 VOCALOID concert by Crypton Future Media: the lack of antialiasing in 3D models causes the edges of the character to appear jagged. Though not usually noticeable from the audience, this becomes much more apparent during filming, especially in high-definition close-ups!
  • Improved rendering—Using the default settings already provides a much better render of the character than MikuMikuDance, which applies a more anime-like shader and defaults to drawing edge lines throughout the model, something that simply doesn’t look right in a concert. However, there is always room for improvement, and adding support for HLSL/GLSL shaders is something to look forward to in the future.
  • Faster rendering—The software currently takes around 50ms to render a single frame (roughly 20 frames per second) on my ASUS computer with integrated graphics and 4GB of memory. While I plan on having a much better dedicated rendering computer for any concerts I present, I would like to fix all the things my program might not be doing “The Right Way” that are causing it to be so slow.

And finally, some screenshots. Special thanks to my friend and colleague Re:VB-P, developer of the AniMiku virtual concert animation software, for providing the screen and taking the time to demo my software on his setup. AniMiku has been used for live events around the world by the VOCALOID producer group Vocalekt Visions. Re:VB-P also helped identify a crippling bug that never showed up on any of my computers, but sometimes happened on other systems—particularly those with third-party graphics hardware.

Concertroid! rendering with lights on


Concertroid! rendering with lights off


Universal Editor module for VMD file format

Finally finished the excruciatingly painful analysis of the VMD file format (versions used in both V1.30 and V7.39 of MikuMikuDance). I’m rather proud of it because it’s the first time in a long while that I’ve actually figured something out just by tweaking values, not actually looking at other people’s (free) source code.

I added support for the VMDMotionDataFormat in Universal Editor, so hopefully when I get around to finishing the animation system for my renderer it will be able to read (and convert!) VMD files generated from both versions.

This was excruciatingly painful primarily because of the way the bone frames are stored. Interestingly, the only structural difference between the V1.30 and V7.39 file formats (besides the model and bone names themselves) is the field size of the model name: 10 bytes in the old version, 20 bytes in the new. But the way the frames are stored is hideous.

For each keyframe, it stores the bone data block (name of bone, position and rotation of bone, and interpolation data) for each bone affected at the keyframe. This is not that big of a deal… except that the keyframes aren’t actually stored in the order they should appear during playback. They’re stored apparently randomly, though presumably (I haven’t tested it) in the order that the user adds them to the timeline. I ended up making a class RawFrameData to store the frame index and bone information, storing all of that information in a System.Collections.Generic.List<RawFrameData>, then sorting that list in ascending order by the frame index.
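The read-then-sort step described above might look like this. The field names and types in RawFrameData are my guesses from the description, not the exact layout I settled on:

```csharp
using System.Collections.Generic;

// Holds one raw bone data block as read from the VMD file:
// the frame index it belongs to, plus the bone's transform data.
class RawFrameData
{
    public uint FrameIndex;
    public string BoneName;
    public float[] Position;      // x, y, z
    public float[] Rotation;      // quaternion x, y, z, w
    public byte[] Interpolation;  // interpolation curve data
}

class VmdReader
{
    // Frames appear in the file in (apparently) insertion order, so we
    // read every block first, then sort by frame index to recover
    // playback order.
    public static List<RawFrameData> SortFrames(List<RawFrameData> frames)
    {
        frames.Sort((a, b) => a.FrameIndex.CompareTo(b.FrameIndex));
        return frames;
    }
}
```

List&lt;T&gt;.Sort with a comparison delegate keeps this to one line; a stable sort would additionally preserve file order among blocks sharing a frame index, if that ever matters.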

I then created three classes: a MotionFrame class, a MotionAction (abstract) class, and a MotionBoneRepositionAction class which inherits from MotionAction.

MotionFrame has a frame index and collection of MotionActions (so that other actions, such as texture changes, can be recorded on the keyframe as well, though these aren’t supported by the VMD file format and will require a new Concertroid-specific file format to store them). The MotionBoneRepositionAction represents a bone position/rotation change, and stores all the data that was originally present in the VMD file.
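In outline, the hierarchy looks something like this (member names are assumptions; only the three class names come from the design above):

```csharp
using System.Collections.Generic;

// Base class for anything that can happen on a keyframe. Future action
// types (texture changes, etc.) subclass this, even though the VMD
// format itself can only express bone repositions.
abstract class MotionAction { }

// A bone position/rotation change, carrying the data that was
// originally present in the VMD file.
class MotionBoneRepositionAction : MotionAction
{
    public string BoneName;
    public float[] Position;      // x, y, z
    public float[] Rotation;      // quaternion x, y, z, w
    public byte[] Interpolation;
}

// A keyframe: its index plus every action that fires on it.
class MotionFrame
{
    public uint Index;
    public List<MotionAction> Actions = new List<MotionAction>();
}
```

Because MotionFrame holds a list of the abstract base type, a future Concertroid-specific file format can add new action subclasses without touching the frame structure.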

A quick note: the model and bone names between the PMD models are bound to be different, and the model and bone names in the model for the single-model MikuMikuDance V1.30 are hardcoded, so if I develop a converter, it will have to feature some way of allowing the user to map old VMD bone names to new VMD bones and vice versa. Also, supplying an XML configuration file with the bone mappings already laid out for some of the more common models (hanaminasho, kio, animasa, etc.) would probably not be hard to do, assuming I can find the time to work on them. This would also allow you to convert VMD motion data between models that have different bone names…

Anyway, no screenshot for now because there really isn’t anything to show. Once I develop the user interface for the converter, I’ll post that. But since I’ve completed VMD support (I checked by converting World is Mine from V7.39 to V1.30 and it played perfectly!) I am not going to do much more with it at this moment.

Right now, my focus is on getting my renderer to animate with the given animations and/or pose files. So until next time, thanks for reading!

Ren’py Data Manager

It’s a small utility that allows you to back up your save data for Ren’py-based games with ease. Currently it only works with Katawa Shoujo, since that’s the visual novel I’ve been playing at the moment ;p But as I get more involved with other VNs and learn how they structure their save files, perhaps I will add support for them as well.

It’s really great if you like to play on multiple computers but hate having to restart the game from the beginning. It also features the ability to back up the persistent data (like which extras you’ve unlocked).


Unavailable as of the blog migration; check back soon.


Note: Right now the binaries have only been tested on Windows. My Linux system is down at the moment, so I haven’t had time to test on Linux; however, the software is written in C#, which is cross-platform and can be compiled under Linux with Mono. The data path is currently found based on the Windows version of Ren’py, though! The data path may be different on a Linux system; since I haven’t tested it, I don’t know what it’s supposed to look like. A simple switch(Environment.OSVersion.Platform) should take care of these different cases, though.
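That switch could be sketched like this. The Windows path follows the %APPDATA%\RenPy\&lt;game&gt; layout the utility already uses; the ~/.renpy path for Unix-likes is an assumption I haven’t been able to verify on a live Linux system yet:

```csharp
using System;
using System.IO;

class RenpyPaths
{
    // Picks the Ren'py save-data directory for the current platform.
    // The Unix branch is untested and may need adjusting.
    public static string GetDataPath(string gameName)
    {
        switch (Environment.OSVersion.Platform)
        {
            case PlatformID.Unix:
            case PlatformID.MacOSX:
                // Assumed: ~/.renpy/<game> on Linux/macOS.
                return Path.Combine(
                    Environment.GetFolderPath(Environment.SpecialFolder.Personal),
                    ".renpy", gameName);
            default:
                // Windows: %APPDATA%\RenPy\<game>.
                return Path.Combine(
                    Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData),
                    "RenPy", gameName);
        }
    }
}
```

Centralizing the path logic in one method means adding a new platform later is a one-case change.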

Screenshots of what I have so far with Concertroid

No animation just yet. Still trying to figure out the bone positioning and mapping the bone movements to the vertices of the model, with some consultation from my buddy over at AniMiku, who is undertaking a project quite similar to mine, except his is done in DirectX, and actually works at the moment…

I’m not content to just re-build MikuMikuDance in OpenGL instead of DirectX, though. Here’s a trio of screenshots that showcase some of the rather interesting new features you will find in the Concertroid! animation system:

Nicer rendering and animated textures: You won’t see this in the preview, because there are no animations, but if you look closely at the skirt and armband textures between this screenshot and the following screenshot, you’ll notice that they’re different – changing every frame as the software is doing live rendering.


Lights-off mode for added effect: This feature was introduced in the 2011 Sapporo VOCALOID concert done by 5pb. and Marza Animation Planet, although they made a few mistakes. The lighting actually (in my opinion) got worse in the 2012 concert, but regardless of that, their biggest mistake was that they did not use it enough. The first few songs featured lights-off mode, and everything else they just lazily faded out. Since Concertroid! is a live rendering system, there will be no need to worry about that. If there is a delay specified within a certain threshold between two songs in the set list, the lights will go off and the model will still appear to be on the stage.


Bone view: This is only a feature for me to see the bones as I am developing the animation system, but it is a feature I will keep in Caltron (the underlying .NET rendering library I am developing alongside Concertroid!) in case other developers might find use for it.


I hope you’ve enjoyed this latest update; look forward to more coming in the next few months (hopefully, if I can get all these features sorted out)!

FINALLY…… you’ll never believe this

TL;DR: I fixed the problem; here’s the result.

CREngine fixed texture rendering issue

For those who want to know the dirty details… I was playing around with Caltron, the rendering engine for PolyMo Live! (now Concertroid!), when I started noticing a peculiar bug in the method that applies the textures. Colors in normal GDI are encapsulated by System.Drawing.Color in the .NET Framework, a structure I use to store color values in non-Caltron projects (like Universal Editor). These Colors represent color values using four Int32s (something I’ve never understood, since whether they’re Int32s or Bytes, each channel’s practical range is still only 0–255).

Well, OpenGL expects color values as floating-point fractions from 0.0 to 1.0, something I knew full well when I started getting into it. However, as I soon learned, I was forgetting a crucial part of the texture applicator: the RGB color values were not being divided by 255. So I added the necessary division, and voilà! The entire thing went black. See, I had forgotten ANOTHER crucial part of the renderer. I had forgotten to flip the light switch.
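The conversion in question is small enough to show in full. This is a generic sketch of the fix, not Caltron’s actual method:

```csharp
using System.Drawing;

class ColorConvert
{
    // OpenGL wants each channel as a float in [0.0, 1.0], while
    // System.Drawing.Color hands back 0-255 values -- so every channel
    // must be divided by 255 before being passed to glColor/glLight.
    public static float[] ToGLColor(Color c)
    {
        return new float[]
        {
            c.R / 255f,
            c.G / 255f,
            c.B / 255f,
            c.A / 255f,
        };
    }
}
```

Forgetting the division means every nonzero channel is ≥ 1.0 and clamps to full intensity, which is exactly the washed-out-texture symptom that kicked this whole bug hunt off.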

Yes, the light had been configured with the correct color values and position and whatnot, but I had forgotten to glEnable() the light. Anyway, now that I’ve fixed both problems (and found an interesting phenomenon where the light position wasn’t quite correct and was actually illuminating the INSIDE of Miku instead of the outside), I feel it’s time to update the blog. So enjoy the latest picture! I should have a test program (and complete source code) released soon, but I found another bug where the software will run on my machine but no one else’s. I think it’s due to the Universal Editor library doing something it shouldn’t be doing (namely, attempting to read from or write to a protected location)… so I should have that fixed in no time.

TGA image loader for Universal Editor

Caltron Image Viewer displaying a TGA image using Universal Editor


Believe it or not, the textures in those last screenshots showing PolyMo Live! displaying (somewhat flawed) textures on the model were not in the de facto TGA file format (Truevision “Targa”); they were in Windows Bitmap (BMP) format. I had been converting all the textures to BMP because it was easier to load BMP files using the .NET System.Drawing API, which (being based on Windows’ own GDI system) does not support more esoteric file formats like TGA. Fortunately, TGA textures now load PERFECTLY… whether they’ll display correctly on the model is another story, but the screenshot above proves that the textures load just fine when displayed in a System.Windows.Forms.PictureBox – so any errors with the model textures are an OpenGL issue and NOT a Universal Editor issue!!!