Thursday, September 15, 2016

Hydrus Project Update


(This should have been published months ago (Nov. 2016), but I was so busy with the project itself that I kinda forgot it was in the queue.)

A lot has been happening on the Hydrus Project front. The deadlines are getting tighter and there is still a ton of stuff to do for the whole team. In this post I'll go through what is new, what was accomplished, and what the main challenges were in getting there. Finally, I'll summarize what is left to be done and ponder over any loose ends.

Some of the boat's electronics, as assembled in our test bench

 

System Board

In an effort to consolidate the electrical and electronic design of the boat, a new system board is being finalized. It plays more of an integration role, presenting a ton of connectors for the different sensors and subsystem modules. It also contains an instrumentation amplifier for the pH probe, and a slave Arduino Nano for dealing with the ultrasonic sensor array. The Nano can also be used for interfacing with other, less important sensors or other platform features, should the need arise.
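
To give an idea of what the Nano's job might look like, here is a rough sketch of a slave firmware that polls HC-SR04-style ultrasonic sensors and serves the latest readings over I2C. The pin numbers, the I2C address, the register layout and even the choice of I2C are assumptions for illustration, not our actual design.

    // Hypothetical slave sketch: poll ultrasonic sensors, serve readings over I2C.
    #include <Arduino.h>
    #include <Wire.h>

    const uint8_t kSlaveAddress = 0x28;          // made-up address
    const uint8_t kTrigPins[] = {2, 4, 6};
    const uint8_t kEchoPins[] = {3, 5, 7};
    const uint8_t kNumSensors = sizeof(kTrigPins);

    volatile uint16_t distancesMm[kNumSensors];  // latest reading per sensor

    void onRequest() {
      // Master reads all readings back as little-endian 16-bit values.
      for (uint8_t i = 0; i < kNumSensors; ++i) {
        Wire.write(lowByte(distancesMm[i]));
        Wire.write(highByte(distancesMm[i]));
      }
    }

    void setup() {
      for (uint8_t i = 0; i < kNumSensors; ++i) {
        pinMode(kTrigPins[i], OUTPUT);
        pinMode(kEchoPins[i], INPUT);
      }
      Wire.begin(kSlaveAddress);    // join the bus as a slave
      Wire.onRequest(onRequest);
    }

    void loop() {
      for (uint8_t i = 0; i < kNumSensors; ++i) {
        // 10 us trigger pulse, then time the echo (timeout ~25 ms, about 4 m).
        digitalWrite(kTrigPins[i], HIGH);
        delayMicroseconds(10);
        digitalWrite(kTrigPins[i], LOW);
        unsigned long echoUs = pulseIn(kEchoPins[i], HIGH, 25000UL);
        // Speed of sound ~0.343 mm/us, halved for the round trip.
        distancesMm[i] = (uint16_t)(echoUs * 0.1715f);
        delay(30);  // let echoes die down before firing the next sensor
      }
    }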

The whole project is being done in KiCad, an open-source EDA suite. A custom component footprint for the GPS module had to be drawn, but all in all, most needed stuff was already in place. I really enjoy working with KiCad. It is simple, yet featureful, rarely getting in my way. And the 3D preview is a very nice bonus. 

The hardest part of the layout process was my insistence on using only a single copper side. This is very important for me since we do not have a PCB prototyping machine, so the production techniques at our disposal are rudimentary. At best. But after a lot of swapping components around and painstakingly tracing everything by hand, it was possible to arrive at a viable layout, with only a single jumper resistor.

KiCad has come a long way since I first tried it many years ago

Construction of the board is expected to begin shortly, using PCB homebrewing techniques. If only we could make it look as good as in the 3D preview, with solder masks and silk layers and all... boy I'd be really happy.

Boat Frame

The plug for the hull mold is being fabricated out of a styrofoam model given to us by one of our Joinville colleagues, who just graduated as a naval engineer. Everything is being done in the models lab, LabMod. The styrofoam model was covered in plaster and sanded to a smooth finish, converting it into a solid plug. It was then used to laminate a female fiberglass mold. We are now in the process of using the mold to derive both of our hulls.

Êmili is the one making sure our hull comes out nice and strong. The team expects the final boat frame to look pretty good, not bad at all for a bunch of mechatronics nerds. We are still figuring out the best configuration for the middle platform, and should be able to integrate the electromechanicals and electronics into the frame in a matter of weeks.

The hull plug during sanding...

 
...and the hulls after being laminated inside the negative

Software

The station desktop program can now edit navigation routes and has received general polish. It did not change much though. The meatier part is the new simulation mode. I'll talk about it shortly.

Most changes went into the drone firmware, as it acquired more capabilities and is now approaching a water-ready state. The original software architecture did not change at all, since it has proven to be a viable base for implementing all the required features for the prototype. 

One step back is that it was not possible to use a separate thread for driving the I2C bus and, by extension, the OLED screen. This is unfortunate, as the screen update rate was already quite slow. Now it has to be driven opportunistically, during slack time in the main thread. This will be investigated further, if time allows. It is not an essential feature of the project, but if the need to remove the screen ever comes, its bling factor will surely be missed.
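
The "opportunistic" part boils down to something like the sketch below: only push a chunk of the screen buffer when there is enough slack before the next task release. All of the names, and the 2 ms transfer estimate, are illustrative stand-ins rather than the actual firmware code.

    #include <stdint.h>

    // Stand-ins for the real firmware pieces (illustrative only).
    void runDueTasks();               // cyclic executive: run whatever is due
    uint32_t millisNow();             // current time in ms
    uint32_t nextTaskReleaseTime();   // when the next task becomes due
    bool screenIsDirty();
    void pushNextScreenChunk();       // one I2C page write to the OLED

    void mainLoop() {
      const uint32_t kChunkTransferMs = 2;  // rough cost of one page write
      for (;;) {
        runDueTasks();

        // Only touch the (slow) I2C screen if there is slack before the
        // next release; push a single page so we never block for long.
        if (nextTaskReleaseTime() - millisNow() > kChunkTransferMs &&
            screenIsDirty()) {
          pushNextScreenChunk();
        }
      }
    }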

Control and Simulation

None of the above would amount to much if the boat could not fulfill its main purpose: autonomous navigation! Alright, we'd still have a cool boat. Anyway, we want a control loop running in the firmware.

I decided to implement it as a sort of state-machine control loop multiplexer thing. For every state there is a basic behavior that, under a certain condition, may advance the mission towards the next logical step. This made the control much simpler to implement, compared to a monolithic, very complex controller.
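
In rough code terms, the pattern looks something like this. The states, thresholds and helper names are invented for illustration; the real firmware has its own set.

    enum class MissionState { Idle, GoToWaypoint, Measure, ReturnHome };

    struct Pose { double lat, lon, heading; };

    struct Controller {
      MissionState state = MissionState::Idle;

      // Called every control tick: run the current state's behavior and
      // check whether its transition condition advances the mission.
      void step(const Pose& pose, double distToWaypointM, bool missionStarted,
                bool measurementDone, bool atHome) {
        switch (state) {
          case MissionState::Idle:
            holdPosition();
            if (missionStarted) state = MissionState::GoToWaypoint;
            break;
          case MissionState::GoToWaypoint:
            steerTowardsWaypoint(pose);            // basic behavior for this state
            if (distToWaypointM < 3.0) state = MissionState::Measure;
            break;
          case MissionState::Measure:
            holdPosition();
            if (measurementDone) state = MissionState::ReturnHome;
            break;
          case MissionState::ReturnHome:
            steerTowardsHome(pose);
            if (atHome) state = MissionState::Idle;
            break;
        }
      }

      // Per-state behaviors; stubs here, each one a simple controller in practice.
      void holdPosition() {}
      void steerTowardsWaypoint(const Pose&) {}
      void steerTowardsHome(const Pose&) {}
    };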

There is also a simulation mode embedded into the station application, which can override some sensors and actuators according to a simplistic model, for testing.
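
For a sense of how simplistic that model can be, picture something like the fragment below: integrate the thruster commands into a pose and feed it back as if it were GPS/compass data. The constants and the structure here are entirely made up.

    #include <cmath>

    struct SimState { double x = 0, y = 0, heading = 0; };

    // left/right thruster commands in [-1, 1]; dt in seconds.
    void stepModel(SimState& s, double left, double right, double dt) {
      const double kMaxSpeed = 1.5;   // m/s at full throttle (made up)
      const double kTurnRate = 0.8;   // rad/s at full differential (made up)
      const double forward = kMaxSpeed * 0.5 * (left + right);
      const double turn    = kTurnRate * 0.5 * (right - left);
      s.heading += turn * dt;
      s.x += forward * std::cos(s.heading) * dt;
      s.y += forward * std::sin(s.heading) * dt;
    }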

Next Steps

This project is big. Like, huge. A lot of stuff. I had no idea what I was throwing myself into. But yeah, that's life. We'll manage.

Huge thanks to my teammates for putting up with everything and remaining calm, collected and motivated.

The time for presenting everything is approaching fast. Let's see how this will turn out. Until next time!

Monday, June 27, 2016

The Hydrus Project


It has become traditional for the Brazilian Symposium on Computing Systems Engineering to be held together with the Intel Embedded Systems Contest. For the contest's 2016 edition, I have teamed up with a couple of friends, Guilherme Pangratz and Êmili Bohrer, and my mentor, Giovani Gracioli, to come up with an exciting entry. Back in April the project was approved, and development is going full steam, so I thought it was time to blog about it.

The proposal


The state of basic sanitation in Brazil is dire in many places. Looking for ways to improve our ability to detect irregular sewage dumping and other changes in water quality in bays and reservoirs, we decided to create an aquatic drone that navigates and measures water quality autonomously. We named the drone and the project after Hydrus, the "male water snake" constellation of the southern sky.

Our drone is to be based on the Galileo Gen 2 board, and should be able to leave its base station, autonomously go to set waypoints, acquire some data, and return home.

Current status

Base station application prototype
  • Hardware: We have created power, GPS and frontend boards for the boat. The frontend has indicator LEDs and a shutdown button, as well as an AMOLED screen for displaying essential system information. The power board contains a very basic power backbone for distributing power to the electronic motor controllers, a battery level sensing circuit, and a simple +5V output for powering the sensors.
  • Software: The base for our firmware is a cyclic executive task scheduler with a scheduling table and a global blackboard data structure (a sketch of the pattern follows this list). There are tasks for system management, navigation, sensing, and communication. Parsing of GPS navigational data is done, as well as motor control and vectoring. Communication with the base station is partially implemented.
  • Base station software: As of now, the application for the base station can render the current drone position and display the essential information about the drone in real time. It can render a preset route atop the map. Missing is the ability to edit the route from the map itself.
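
As promised above, here is a minimal sketch of the cyclic-executive-with-table idea. The task list, periods and blackboard fields are illustrative only, not our actual firmware.

    #include <stdint.h>

    struct Blackboard {            // global shared state, read/written by tasks
      double latitude, longitude;
      double batteryVolts;
      uint8_t motorLeft, motorRight;
    };
    Blackboard g_blackboard;

    struct TaskEntry {
      uint32_t periodTicks;        // run every N ticks
      uint32_t offsetTicks;        // phase within the period
      void (*run)(Blackboard&);    // task body
    };

    void taskNavigation(Blackboard&) { /* read GPS, update setpoints */ }
    void taskSensing(Blackboard&)    { /* sample pH, battery, sonar */ }
    void taskComms(Blackboard&)      { /* talk to the base station */ }

    const TaskEntry kSchedule[] = {  // the scheduling table
      {10, 0, taskNavigation},       // every 10 ticks
      {50, 2, taskSensing},          // every 50 ticks, offset by 2
      {20, 5, taskComms},
    };

    void tick(uint32_t tickCount) {  // called at a fixed rate (e.g. 100 Hz)
      for (const TaskEntry& t : kSchedule) {
        if (tickCount % t.periodTicks == t.offsetTicks)
          t.run(g_blackboard);
      }
    }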

What's next

There is still a ton of work ahead of us! We haven't yet received the sensors we ordered; when they arrive, they will need to be integrated and tested. We are finishing work on a test frame, so that we can implement the navigation next, based on sensor fusion algorithms and a control state machine. Exciting stuff!

Wednesday, March 16, 2016

dw Engine Update Two

dw Engine is my project on QML-based game facilities that extend Qt into a 2D game engine for desktop and mobile. I have been taking my sweet time bringing the engine to higher standards of ease of use and performance, and I would like to share some of that work, along with some thoughts on Qt's performance and suitability for such a project.

New Graphics and New Test Level

Tiles were painted in Krita
I decided to take the plunge and convert the demonstration game to HD. Along with that came a new, original test level and new sprites for the common gameplay objects, most painted in Krita and a few rendered in Blender. Sonic's 60-something poses still need to be remade though. Maybe I can get some help with that...

 

Texture and Spritesheet Management

The most noticeable problem during the transition to HD was the constant uploading of textures to the GPU. It seemed to happen whenever an Image item was made visible, or when the properties of an AnimatedSprite object changed. This was destroying the framerate on mobile, and highlighted the need for more explicit control of the texture lifecycle if one wants to make an action game in QML. The solution was to create specialized classes (dwTextureCache, dwTexture, and dwImageItem) for managing and displaying textures on the screen. The system already supports online conversion of textures to 16 bpp and allows for future extension to support texture compression. This will be important on mobile, because the game's rendering performance is constrained mostly by memory bandwidth limitations, and compression helps with that greatly.
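
The core idea is simpler than it sounds. This is not the actual dwTextureCache, just a stripped-down sketch of the concept: create each GPU texture once per window and hand out shared references, instead of letting every item upload its own copy. Error handling, alpha formats and eviction are omitted.

    #include <QHash>
    #include <QImage>
    #include <QQuickWindow>
    #include <QSGTexture>
    #include <QSharedPointer>
    #include <QString>

    class TextureCache {
    public:
      // Meant to be called from updatePaintNode(), i.e. on the render thread.
      QSharedPointer<QSGTexture> fetch(QQuickWindow* window, const QString& path) {
        auto it = m_cache.find(path);
        if (it != m_cache.end())
          return *it;                    // already uploaded: reuse it

        QImage image(path);
        // Optional 16 bpp conversion to cut memory bandwidth on mobile
        // (RGB16 here; images with alpha would need ARGB4444 instead).
        image = image.convertToFormat(QImage::Format_RGB16);
        QSharedPointer<QSGTexture> tex(window->createTextureFromImage(image));
        m_cache.insert(path, tex);
        return tex;
      }

    private:
      QHash<QString, QSharedPointer<QSGTexture>> m_cache;
    };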

New spritesheet system is simple to use
There was also the need to replace the usage of AnimatedSprite. It has a cumbersome interface and most of the problems of Image. So a new subsystem was introduced, with a Spritesheet class and a cache for it (just like with the textures), that describes the animations contained in a texture atlas in terms of sequences. Sequences can have various properties, such as parametric animation speeds and automatic transitions to other sequences. Spritesheets are described by a JSON file in a simple format.

Sprites are a specialized subclass of dwImageItem that keeps track of the animation data and is updated by a centralized animation controller. The controller keeps track of multiple animation domains and allows animations to be controlled globally.

Level Editor

The in-game level editor that was mostly stuck in development hell was rethought and is now good enough for my use case. It allows editing of the object layout for a stage, and has specialized modes for adding tile and geometry object types. It is modular and is not even loaded if a level is not started in debug mode; otherwise, it can put the entire field in "edit mode" instantly with a single press of the escape key.
Editing the test stage mid-gameplay

Other Small Stuff

Support for game controllers was added on all desktop platforms, courtesy of SDL2. This was thoroughly tested using my trusty 360 controller, and briefly with a couple of other controllers as well. It should work on mobile too, but this was not tested.

Moreover, there was a need to render water as a simple colored quad that would multiply the colors underneath it, simulating what a 90's console could accomplish via palette changes mid-hblank. However, there are no blending modes available in QtQuick besides common alpha blending. The way to circumvent this was to create a special QSGMaterialShader subclass that executes custom OpenGL code upon activation, and change the blending mode this way. Thanks to Giuseppe D'Angelo from KDAB for this great tip! To use this material an entire chain of classes had to be created culminating in a special node type. I plan to extend this special node in the future to allow it to render arbitrary textures in arbitrary blending modes, and maybe even point particles.
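
For the curious, the gist of the trick in the Qt 5 scene graph looks roughly like the sketch below: a QSGMaterialShader can run raw GL when it is activated, so multiplicative blending is set there and Qt's default restored on deactivation. The shader sources are the bare minimum, and the full material/node chain (plus the uniform plumbing in updateState()) is omitted; this is the idea, not the engine's actual class.

    #include <QOpenGLContext>
    #include <QOpenGLFunctions>
    #include <QSGMaterialShader>

    class MultiplyBlendShader : public QSGMaterialShader {
    public:
      const char* vertexShader() const override {
        return "attribute highp vec4 vertex;\n"
               "uniform highp mat4 qt_Matrix;\n"
               "void main() { gl_Position = qt_Matrix * vertex; }\n";
      }
      const char* fragmentShader() const override {
        return "uniform lowp vec4 color;\n"
               "uniform lowp float qt_Opacity;\n"
               "void main() { gl_FragColor = color * qt_Opacity; }\n";
      }
      const char* const* attributeNames() const override {
        static const char* names[] = {"vertex", nullptr};
        return names;
      }

      void activate() override {
        // dst = dst * src: the "90's palette darkening" effect.
        QOpenGLContext::currentContext()->functions()->glBlendFunc(GL_DST_COLOR,
                                                                   GL_ZERO);
      }
      void deactivate() override {
        // Restore the scene graph's default premultiplied-alpha blending.
        QOpenGLContext::currentContext()->functions()->glBlendFunc(
            GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
      }
    };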

QtQuick and Performance

Sometimes I wonder if I have chosen the right platform for this project by choosing Qt. Certainly QtQuick is more than sufficient for creating casual games and even some more involved examples, but is it ready for a platformer with sprawling levels and tens of objects onscreen at once? After having to implement a lot of custom infrastructure that replaces core use cases of QtQuick (images, sprites, etc), I would say no, it is not. 

Object creation is very slow, frame sync is a bit wonky on all platforms I test except for Android, the JavaScript engine is a source of major and unexplainable frame time variability, and so on and so forth. I still need to implement some sort of object pooling, otherwise frame skips are going to be a frequent and sore sight on an eventual mobile release.

However, even with all those shortcomings, QtQuick and QML allowed me to accomplish far more than I initially expected when beginning this project. If one understands some of the QML engine's inner workings and good patterns for performance, the achieved level of productivity can be high. And there is the indisputable truth, right here in front of me, that the QSG renderer can churn out 150 frames per second at 1080p on Intel graphics with Mesa. It is a 2D game, sure, but these are performance levels that I would expect from a game engine.

I guess at some point I should go see what can be done with Qt3D... :)

Saturday, January 2, 2016

Microcontroller and Instrumentation Experiments

Holidays are still going strong here in Rio. But I guess as a result of some subconscious New Year resolutions, it is finally time to clear up the publishing queue a little bit. Blogging has been slow because things have been moving past me alarmingly fast.

Off the top of my head, the first thing that comes up is that I presented a paper at SBESC 2015 in Foz do Iguaçu (link to it when the proceedings are out). A huge shout-out to my teacher and mentor Giovani Gracioli is in order: every single meetup or thing we do together is a lesson. Thank you so much.

This post is not about proper research activities though; that is coming up at a later date. For now I'd like to ramble about some experiments done in two courses: Microcontrollers and Instrumentation. Working within the limitations of TI's Tiva C Series LaunchPad as the interface board to the PC, my tasks were to interface with various low-cost sensors, try to build valid instruments out of each setup, and test them. Being the Qt junkie that I am, for each of them there is a Qt/QML computer program that provides the interface. Here they are:

Sound Frequency Meter


Trying to make sense of noisy input
The first project goal was to construct a useful measurement instrument out of a microphone breakout board not unlike this one. It did not have an analog output though, so some analysis and pin soldering was needed. I decided to go for a frequency meter application because the necessary algorithms (FFTs) are readily available in good-quality implementations. The chosen library for analysing the audio data was libfftw.

In the board firmware, a simple program samples the ADC at 22050 Hz via a timer. The most significant 8 bits are then sent to the PC over serial. It is a crude mechanism; an improvement would be using the USB device interface for proper sound capture. However, the computer application takes care of interpreting the data as audio and analysing it. There are options to normalize the input for display and also to turn off the Hann window applied to the input.
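
The PC-side analysis boils down to something like this sketch: apply a Hann window to a block of 8-bit samples and pick the dominant FFT bin with FFTW. The block size and sample rate follow the text; the rest (function name, centering around 128, building with "g++ demo.cpp -lfftw3 -lm") is illustrative, not the actual program.

    #include <cmath>
    #include <cstdint>
    #include <vector>
    #include <fftw3.h>

    double dominantFrequencyHz(const std::vector<uint8_t>& samples,
                               double sampleRate = 22050.0) {
      const int n = static_cast<int>(samples.size());
      std::vector<double> in(n);
      fftw_complex* out = fftw_alloc_complex(n / 2 + 1);

      for (int i = 0; i < n; ++i) {
        const double centered = (samples[i] - 128.0) / 128.0;  // map to [-1, 1]
        const double hann = 0.5 * (1.0 - std::cos(2.0 * M_PI * i / (n - 1)));
        in[i] = centered * hann;                               // windowed sample
      }

      fftw_plan plan =
          fftw_plan_dft_r2c_1d(n, in.data(), out, FFTW_ESTIMATE);
      fftw_execute(plan);
      fftw_destroy_plan(plan);

      int peakBin = 1;                         // skip the DC bin
      double peakMag = 0.0;
      for (int k = 1; k <= n / 2; ++k) {
        const double mag = out[k][0] * out[k][0] + out[k][1] * out[k][1];
        if (mag > peakMag) { peakMag = mag; peakBin = k; }
      }
      fftw_free(out);
      return peakBin * sampleRate / n;         // bin index -> frequency
    }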

All the analyzed data is drawn via QML's Canvas element and some simple JavaScript: both the input waveform and its representation in the frequency domain.

A report (in Portuguese, sorry) was written with more details.

Rotation Speed Meter


The complete test setup
For a second Instrumentation task, I was given a very simple rotary quadrature encoder and a brushed permanent-magnet servomotor to go with it. My task was to assemble the two together somehow and use the encoder to measure the current speed of the motor.

The coupling of the encoder to the motor was something quickly put together over the course of a day in the fabrication lab. The most interesting part of the assembly process was the manual machining of the axle coupling on a lathe.

The capture board was required to do more this time, taking care of debouncing and interpreting the state changes of the input signal. There are hardware peripherals on the Tiva (the QEI module) specifically for interfacing with such encoders; however, they could not be used successfully because the cheap encoder was too noisy for the QEI to handle. So debouncing was implemented in software, and the output to the PC took the form of 'L' and 'R' character pulses to indicate that the encoder moved. The implementation is based on some Arduino code I found around the web. It was used because I thought the state transition scheme of its state machine was very elegant and (at a cursory glance) performant.
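
The idea behind that kind of table-driven decoder is roughly the following sketch (my own simplified version, not the code I borrowed). Each 4-bit index is (previous AB << 2) | current AB, and the table says whether that transition is one step one way (+1), the other way (-1), or invalid/noise (0) and therefore ignored, which is where much of the robustness comes from.

    #include <stdint.h>

    static const int8_t kQuadTable[16] = {
       0, -1, +1,  0,
      +1,  0,  0, -1,
      -1,  0,  0, +1,
       0, +1, -1,  0,
    };

    struct QuadDecoder {
      uint8_t prevAB = 0;

      // Feed the current A/B levels; returns -1, 0 or +1 steps.
      int8_t update(bool a, bool b) {
        const uint8_t curAB = (a << 1) | b;
        const int8_t step = kQuadTable[(prevAB << 2) | curAB];
        prevAB = curAB;
        return step;   // the caller emits 'L' or 'R' on -1/+1
      }
    };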

The PC application reused the skeleton of the previous one and is much simpler this time, counting pulses over a period of time and estimating RPM accordingly. The report only goes as far as this point in development. However, as some sort of project epilogue, an H-bridge came into play, and some rudimentary speed control for the motor was assembled and implemented. That was the most fun part of it all.

Camera Interfacing


Using friends as models
For the Microcontrollers course, my task was to interface with a VGA camera with a parallel interface (and I²C for control), the OV7670 module. The module does not have a framebuffer, and the transfer data rate is too much for the serial interface. Since I was trying to avoid all the hassle of the USB interface because I didn't have much time, I decided to see what could be done without it, and without DMA, to keep things simple. It turns out it was possible to fit an 8 bpp QQVGA picture in the device's RAM as a framebuffer. With it, the serial data rate no longer needed to be very fast (and unreliable). Capturing could then be done in a capture-then-transmit fashion over serial.
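
The overall shape of that scheme is something like the sketch below: grab one 160x120, 8 bpp frame into RAM at full camera speed, then stream it out over the much slower UART. The low-level pin reads and the UART call are placeholders for illustration, not real TivaWare calls or my actual firmware.

    #include <stdint.h>

    const int kWidth = 160, kHeight = 120;
    static uint8_t framebuffer[kWidth * kHeight];   // ~19 KB, fits in on-chip RAM

    // Placeholders for the real hardware access.
    void waitForVsync();
    void waitForPixelClock();
    uint8_t readPixelBus();          // the 8 parallel data lines from the OV7670
    void uartSendByte(uint8_t b);

    void captureFrame() {
      waitForVsync();                               // start of a new frame
      for (int i = 0; i < kWidth * kHeight; ++i) {
        waitForPixelClock();
        framebuffer[i] = readPixelBus();            // must keep up with the camera
      }
    }

    void transmitFrame() {
      for (int i = 0; i < kWidth * kHeight; ++i)
        uartSendByte(framebuffer[i]);               // slow, but timing no longer matters
    }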

The PC viewfinder used the same serial access class wrapped in a camera interface, but entirely different UI code, using QGraphicsView, which was deemed easier for this application. A button can be pressed to save the image too.

Around 1.7 fps was achieved using this scheme. Can we do better with the Tiva? Using USB and DMA, I believe so. But is the Tiva's main processor fast enough to stream a full VGA frame, in color, at 30 fps? I wouldn't bet on it. Hardware engineers are probably cringing right about now thinking of this setup; this is the kind of task an FPGA is expected to excel at. Then again, the fact that it ultimately works is in itself an interesting thing.

Source Code


The source code for these experiments is in this repository. Hopefully it can help someone who needs to quickly hack something together using Qt and the serial port to interface with custom hardware.

That's it for today. Happy New Year!!! :D