I’ve been a bit quiet on the blogging front for the past few months mainly due to being insanely busy at work – why is it that so many projects have a delivery date in August or September meaning you end up losing half your summer – or is that just my experience?
What I’ve been working on has been possibly the most interesting job I’ve ever had the pleasure of doing – it’s one of those where you can’t distinguish whether what you are doing is a job or a hobby and it consumes your entire life – but in a good way!
Back in March we got a lead from a university looking to put together a complete car environmental simulation system. You would drive a mass-produced car into a simulation environment providing three things: a 360-degree screen to simulate the visual surroundings; a system to simulate the operation of the car without it actually running (you push the accelerator pedal, and the rev counter responds as if you were driving for real); and the bit we were interested in, which was to provide signals to stimulate the car's infotainment system. It was a project we desperately wanted to win purely from an interest point of view, and happily we did.
Infotainment appears to be quite a regularly used buzzword in the automotive industry these days, and fairly obviously merges the words 'Information' and 'Entertainment'. Even just 10 or 20 years ago a standard car would have a radio (possibly with RDS) and a CD player, and that was about it; if it was a bit more upmarket, or a slightly electrically suspect French car, it might also have a separate health check and management system to tell you when the car needed a service or a bulb was out.
The trend now is towards ever more integrated car control and entertainment systems that, as well as a stereo, have built-in GPS, USB, wireless networking and Bluetooth for your phone. There appears to be no sign of this trend letting up, with car manufacturers on the one hand looking to pack more and more technology and connectivity into their cars, and governments on the other legislating to make cars ever cleverer and safer through technology.
In the context of this project, our challenge was to provide a system that could stimulate a car, or provide connectivity for it, over AM, FM, DAB, DVB-T, Bluetooth, Zigbee, GPS, WLAN, ITS, BroadR-Reach, CAN, LIN and FlexRay, and also carry out tasks such as simulating a mobile phone call or text.
Almost all of these standards are already covered by National Instruments toolkits, so we were able to make use of their GPS Toolkit, FM-RDS, and the Modulation Toolkit for FM and AM, plus off-the-shelf drivers for CAN, LIN and FlexRay.
For DAB and DVB-T we were able to use Maxeye's DAB toolkit; for simulating mobile comms we made use of an incredible bit of software called Eggplant; and for the rest we used a mix of bespoke drivers and .NET libraries.
Hardware-wise, we used three PXI chassis containing signal generators, a VST (vector signal transceiver), and serial and DAQ interfaces.
For the software we made what some people may consider quite a bold choice and used Actor Framework. As I have alluded to before on this blog, it is quite a divisive topic among LabVIEW users, who either seem to hate it and won't touch it with a barge pole, or absolutely love it. Or, to be totally accurate, love it for a few weeks, hate it for a week or two, and then love it again; what I am maybe trying to say is that it is not without its frustrations at times!
Overall though, I think it’s fantastic – although I find it quite difficult to articulate why I feel this way about it.
Almost all of what you can do with Actor could be achieved with daemons, queues, notifiers and user events. What I would say makes Actor so appealing is the elegance with which you can develop very complicated, multi-process, yet utterly robust architectures with rock-solid inter-process communication, whilst actually writing very little code yourself. It is also incredibly easy to make an Actor Framework application fully remote-controllable (again with very little additional code), and to create highly reusable components using abstract message classes.
An example of this elegance is error handling. A typical way I have done this before would be to have a single error-handler daemon that is passed errors from other processes via a queue; you would then have to insert an error handler VI in every process to transmit those errors to the daemon. Works fine, great. With Actor, you instead create a base class that inherits from Actor and override its Handle Error.vi method to decide what to do with certain errors. You then inherit from this class in all your other Actors, and without writing any additional code all of your asynchronous processes use a robust and uniform error-handling strategy. What's more, it is then very easy to escalate errors to, say, your main Actor, and if necessary carry out recovery actions or stop and restart processes with ease.
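Actor Framework is graphical, so I can't paste the LabVIEW version here, but the inherit-and-override error pattern translates to any object-oriented language. Here's a rough Python analogy (all class and message names are made up for illustration, and a real actor would loop in its own thread rather than process one message):

```python
import queue

class Actor:
    """Minimal actor sketch: each actor owns an inbox of messages."""
    def __init__(self, name, error_queue=None):
        self.name = name
        self.inbox = queue.Queue()
        self.error_queue = error_queue  # the supervising actor's error queue

    def handle_error(self, err):
        # Default policy, defined once in the base class: escalate upwards.
        if self.error_queue is not None:
            self.error_queue.put((self.name, err))

    def run_once(self):
        msg = self.inbox.get()      # a "message" is just a callable here
        try:
            msg(self)
        except Exception as err:
            self.handle_error(err)  # uniform handling, inherited for free

class GPSActor(Actor):
    """Optionally overrides the policy: log locally, then defer to base."""
    def handle_error(self, err):
        print(f"[{self.name}] error: {err}")
        super().handle_error(err)

def bad_message(actor):
    raise RuntimeError("no GPS fix")

# Usage: the error surfaces in the supervisor's queue automatically.
errors = queue.Queue()
gps = GPSActor("GPS", error_queue=errors)
gps.inbox.put(bad_message)
gps.run_once()
source, err = errors.get_nowait()
```

Every subclass of the base actor picks up the escalation policy without writing a line of error-handling code, which is the property that makes the LabVIEW version so tidy.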
It's possibly a blog post in itself, and it's one of those things that you will either get or not. It's certainly not the be-all and end-all of how to do things, but it works for me.
On top of using Actor Framework, because we were using multiple PXI chassis, each running software to physically output signals, we had to decide on a scheme for communicating with these remote systems. GPS is a good example: we wanted to control many of the GPS settings from our main GUI, but the actual GPS transmitter (RFSG) was on a different PXI chassis.
There were a number of options for this, the obvious ones being TCP/IP, Network Shared Variables, or Network Streams. A further option that came to light was LNAs (Linked Network Actors). This network-stream-based library allows you to communicate between Actors across a network, and after some testing we decided to use it.
As with the rest of Actor it has proved absolutely bulletproof; in over five months of development we have not, as far as I am aware, seen a single dropped transaction.
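For readers outside LabVIEW: the core idea of a network stream, and hence of LNA, is a lossless, ordered pipe that carries whole messages between machines. A crude stand-in for that idea is length-prefixed framing over TCP. This Python sketch is my own illustration, not how LNA is implemented; the message content is invented:

```python
import json
import socket
import struct
import threading

def send_msg(sock, payload: dict):
    # Prefix each message with a 4-byte big-endian length, so the
    # receiver knows exactly where one message ends and the next begins.
    data = json.dumps(payload).encode()
    sock.sendall(struct.pack(">I", len(data)) + data)

def recv_exact(sock, n):
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("peer closed mid-message")
        buf += chunk
    return buf

def recv_msg(sock) -> dict:
    (length,) = struct.unpack(">I", recv_exact(sock, 4))
    return json.loads(recv_exact(sock, length))

# Usage: a loopback round trip, standing in for a command sent from
# the GUI chassis to the remote GPS chassis.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

def serve():
    conn, _ = server.accept()
    msg = recv_msg(conn)
    send_msg(conn, {"ack": msg["msg"]})  # acknowledge the command
    conn.close()

t = threading.Thread(target=serve)
t.start()
client = socket.create_connection(("127.0.0.1", port))
send_msg(client, {"msg": "Set GPS Position", "lat": 52.2, "lon": 0.1})
reply = recv_msg(client)
t.join()
client.close()
server.close()
```

What LNA adds on top of this raw transport is the Actor-shaped part: the thing travelling down the pipe is a message object that the receiving Actor dispatches exactly as if it had been sent locally.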
As with any project, we have encountered a few issues. The biggest one without a doubt has been horrendously slow performance at edit time. This has been partly unavoidable, as we have been using some very big toolkits (the GPS and DAB ones especially are huge, which given what they are doing is hardly surprising).
Undoubtedly, though, some of this performance was due to our use of Actor, and specifically the fact that whilst Actor out of the box gives you incredibly robust messaging between processes through its message classes, those same concrete classes make it difficult to loosely couple components. For instance, we were finding that loading the GPS Actor would also pull in all of its sibling Actors such as Radio, Zigbee and so on.
There is a solution, though, and that is to use abstract message classes. I wish I'd done more research into this before I started the project, as it isn't a new idea in Actor. When we finally got around to implementing it we saw a tenfold, maybe greater, improvement in load times, and if I get another project suited to Actor then I will implement this from day one.
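The decoupling trick is the same dependency inversion you'd use in any OO language: the sender depends only on an abstract message, and the concrete implementation lives with (and is supplied by) the receiving Actor. A minimal Python analogy, with all names hypothetical:

```python
from abc import ABC, abstractmethod

class SetPositionMsg(ABC):
    """Abstract message: the caller depends only on this interface,
    never on the concrete GPS actor class or its siblings."""
    @abstractmethod
    def do(self, lat: float, lon: float) -> str: ...

class MainController:
    """Stands in for the main GUI actor; statically linked only to
    the abstract message, so loading it drags in nothing else."""
    def __init__(self, msg: SetPositionMsg):
        self.msg = msg

    def move(self) -> str:
        return self.msg.do(52.2, 0.1)

class GPSSetPositionMsg(SetPositionMsg):
    """Concrete message, owned by the GPS component and injected
    at startup, inverting the dependency."""
    def do(self, lat, lon):
        return f"GPS set to {lat},{lon}"

# Usage: the concrete class is wired in at runtime.
result = MainController(GPSSetPositionMsg()).move()
```

In LabVIEW the payoff is at edit time: because the caller's VIs reference only the abstract class, opening them no longer loads every concrete sibling Actor, which is where the load-time improvement came from.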
So anyway, here are some screenshots of the (almost) finished product, if anyone is interested, to put some of what I've said in this post into context.