Friday, January 20, 2017

The Hydrus Project wins Intel's Embedded Systems Competition

Me, our friend Eloi, Êmili, Guilherme, and Professor Giovani

After many months of hard work and dedication, our team accomplished its ultimate goal, and we won the competition!

The Hydrus project (as per these posts) was successfully completed a single day before the competition. The last test happened on the last possible day. All of our presentation material was put together and edited right before the date. We lost pieces of the boat in transportation, etc. Towards the end of our endeavor, everything was insanely chaotic. But we thrived.

Personally, I am hugely grateful to team members Êmili and Guilherme, who always put in a lot of effort and whose belief in the project never faltered, even at times when I seemed aimless, or just plain going crazy.

In the end, it was a success. Not everything worked, and our execution was not perfect. Nevertheless, we ended up with a perfectly workable system, and we like the design of the software and the boat itself.

Here are the final report and presentation slides. Also a short video demonstrating it in motion:

I have chosen to use the boat as the subject matter of my term paper. During this process, it should be revamped to work with other control units (such as the Raspberry Pi) and also made able to be driven by EPOS, the Embedded Parallel Operating System by LISHA.

Oh yeah, one last thing: the prize! We were awarded a trip to visit an Intel facility in the United States. Can't wait!!

Thursday, September 15, 2016

Hydrus Project Update

(This should have been published months ago (nov. 2016), but I was so busy with the project itself that I kind of forgot it was in the queue.)

A lot has been happening on the Hydrus Project front. The deadlines are getting tighter and there is still a ton of stuff to do, for the whole team. In this post I'll go through what is new, what was accomplished, and what the main challenges were in getting there. Finally, I'll summarize what is left to be done and ponder over any loose ends.

Some of the boat's electronics, as assembled in our test bench


System Board

In an effort to consolidate the electrical and electronic design of the boat, a new system board is being finalized. It has more of an integration role, presenting a ton of connectors for the different sensors and subsystem modules. It also contains an instrumentation amplifier for the pH probe, and a slave Arduino Nano for dealing with the ultrasonic sensor array. The Nano can also be used for interfacing with other, less important sensors or other platform features, should the need arise.

The whole project is being done in KiCad, an open-source EDA suite. A custom component footprint for the GPS module had to be drawn, but all in all, most of what we needed was already in place. I really enjoy working with KiCad. It is simple yet featureful, and it rarely gets in my way. And the 3D preview is a very nice bonus.

The hardest part of the layout process was my insistence on using only a single copper side. This is very important for me since we do not have a PCB prototyping machine, so the production techniques at our disposal are rudimentary. At best. But after a lot of swapping components around and painstakingly tracing everything by hand, it was possible to arrive at a viable layout with only a single jumper resistor.

KiCad has come a long way since I first tried it many years ago

Construction of the board is expected to begin shortly, using PCB homebrewing techniques. If only we could make it look as good as in the 3D preview, with solder masks and silk layers and all... boy I'd be really happy.

Boat Frame

The plug for the hull mold is being fabricated out of a styrofoam model given to us by one of our Joinville colleagues, who had just graduated as a naval engineer. Everything is being done in the models lab, LabMod. The styrofoam model was covered in plaster and sanded to a smooth finish, converting it into a solid plug. It was then used to laminate a female fiberglass mold. We are now in the process of using the mold to derive both of our hulls.

Êmili is the one making sure our hull comes out nice and strong. The team expects the final boat frame to look pretty good, not bad at all for a bunch of mechatronics nerds. We are still figuring out the best configuration for the middle platform, and should be able to integrate the electromechanicals and electronics into the frame in a matter of weeks.

The hull plug during sanding...

..and the hulls after being laminated inside the negative


Software

The station desktop program can now edit navigation routes and has received general polish. It did not change much otherwise. The meatier part is the new simulation mode, which I'll talk about shortly.

Most changes went into the drone firmware, as it acquired more capabilities and is now approaching a water-ready state. The original software architecture did not change at all, since it has proven to be a viable base for implementing all the required features for the prototype. 

One step back is that it was not possible to use a separate thread for driving the I2C bus and, by extension, the OLED screen. This is unfortunate, as the screen update rate was already quite slow. Now it has to be driven opportunistically, during slack time in the main thread. This will be investigated further, if time allows. It is not an essential feature of the project, but if the need to remove the screen ever comes, its bling factor will surely be missed.

Control and Simulation

Nothing of the above would amount to much if the boat could not fulfill its main purpose: autonomous navigation! Alright, we'd still have a cool boat. Anyway, we want a control loop running in the firmware.

I decided to implement it as a sort of state-machine control loop multiplexer thing. For every state, there is a basic behavior that, under a certain condition, may advance the state machine to the next logical step. This made control much simpler to implement than a monolithic, very complex controller.
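As a sketch, the idea looks something like this (the states, names, and thresholds here are illustrative, not the actual firmware's):

```cpp
// Hypothetical navigation states for the boat's control loop.
enum class NavState { Idle, GoToWaypoint, Acquire, ReturnHome };

struct NavContext {
    double distanceToWaypoint; // metres, from sensor fusion
    bool sampleDone;           // water-quality sample finished
    bool atHome;
};

// Each tick runs the behavior for the current state and, when its
// completion condition holds, advances to the next logical state.
NavState step(NavState s, const NavContext& ctx) {
    switch (s) {
    case NavState::Idle:
        return NavState::GoToWaypoint;       // mission started
    case NavState::GoToWaypoint:
        return ctx.distanceToWaypoint < 2.0  // close enough?
                   ? NavState::Acquire : s;
    case NavState::Acquire:
        return ctx.sampleDone ? NavState::ReturnHome : s;
    case NavState::ReturnHome:
        return ctx.atHome ? NavState::Idle : s;
    }
    return s;
}
```

Each state's behavior (thrust vectoring, sampling, etc.) stays small and self-contained, which is exactly what makes this simpler than one big controller.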

There is also a simulation mode embedded into the station application, which can override some sensors and actuators according to a simplistic model, for testing.

Next Steps

This project is big. Like, huge. A lot of stuff. I had no idea what I was throwing myself into. But yeah, that's life. We'll manage.

Huge thanks to my teammates for putting up with everything and remaining calm, collected and motivated.

The time for presenting everything is approaching fast. Let's see how this will turn out. Until next time!

Monday, June 27, 2016

The Hydrus Project

It has become traditional for the Brazilian Symposium on Computing Systems Engineering to be held together with the Intel Embedded Systems Contest. For the contest's 2016 edition, I have teamed up with a couple of friends, Guilherme Pangratz and Êmili Bohrer, and my mentor, Giovani Gracioli, to come up with an exciting entry. Back in April, the project was approved, and development is going on full-steam, so I thought it was time to blog about it.

The proposal

The state of basic sanitation in Brazil is dire in many places. By investigating ways of improving our ability to detect irregular sewage dumping or other changes in water quality in bays and reservoirs, we decided to create an aquatic drone that navigates and measures water quality autonomously. We named the drone and the project after Hydrus, the "male water snake" constellation of the southern sky.

Our drone is to be based on the Galileo Gen 2 board, and should be able to leave its base station, autonomously go to set waypoints, acquire some data, and return home.

Current status

Base station application prototype
  • Hardware: We have created power, GPS and frontend boards for the boat. The frontend has indicator LEDs and a shutdown button, as well as an AMOLED screen for displaying essential system information. The power board contains a very basic power backbone for distributing power to the electronic motor controllers, a battery level sensing circuit, and a simple +5V output for powering the sensors.
  • Software: The base for our firmware is a cyclic executive task scheduler with a scheduling table, and a global blackboard data structure. There are tasks for system management, navigation, sensing, and communication. Parsing of GPS navigational data is done, as well as motor control and vectoring. Communication with the base station is partially implemented.
  • Base station software: as of now, the application for the base station can render the current drone position and display essential information about the drone in real time. It can render a preset route atop the map. Missing is the ability to edit the route from the map itself.

What's next

There is still a ton of work ahead of us! We haven't yet received the sensors we ordered; when they arrive, they will need to be integrated and tested. We are finishing work on a test frame so that we can implement navigation next, based on sensor fusion algorithms and a control state machine. Exciting stuff!

Wednesday, March 16, 2016

dw Engine Update Two

dw Engine is my project on QML-based game facilities that extend Qt to form a 2D game engine for desktop and mobile. I have been taking my sweet time bringing the engine to higher standards of ease of use and performance, and I would like to share some of it, along with some thoughts on Qt's performance and suitability for such a project.

New Graphics and New Test Level

Tiles were painted in Krita
I decided to take the plunge and convert the demonstration game to HD. Along with that came a new, original test level and new sprites for the common gameplay objects, most painted in Krita and a few rendered in Blender. Sonic's 60-something poses still need to be remade, though. Maybe I can get some help with that...


Texture and Spritesheet Management

The most noticeable problem during the transition to HD was the constant uploading of textures to the GPU. It seemed to happen whenever an Image item was made visible, or when the properties of an AnimatedSprite object changed. This was destroying the framerate on mobile, and highlighted the need for more explicit control of the texture lifecycle if one wants to make an action game in QML. The solution was to create specialized classes (dwTextureCache, dwTexture, and dwImageItem) for managing and displaying textures on the screen. It already supports on-the-fly conversion of textures to 16bpp and allows for future extension to support texture compression. This will be important on mobile, because the game's rendering performance is constrained mostly by memory bandwidth limitations, and compression helps with that greatly.
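The caching idea behind dwTextureCache can be sketched like this (a simplified stand-in, not the engine's actual classes): textures are keyed by source path and shared, so making an item visible never triggers a redundant upload.

```cpp
#include <map>
#include <memory>
#include <string>

// Stand-in for a GPU texture; in the real engine this would wrap the
// uploaded scene-graph texture. Names here are illustrative only.
struct Texture {
    std::string source;
    explicit Texture(std::string s) : source(std::move(s)) {}
};

// Cache keyed by source path: each texture is uploaded once and shared
// by every item displaying it; it is freed when the last user lets go.
class TextureCache {
public:
    std::shared_ptr<Texture> acquire(const std::string& path) {
        if (auto it = cache_.find(path); it != cache_.end())
            if (auto tex = it->second.lock())
                return tex;                         // already "on the GPU"
        auto tex = std::make_shared<Texture>(path); // simulate an upload
        cache_[path] = tex;
        ++uploads_;
        return tex;
    }
    int uploads() const { return uploads_; }
private:
    std::map<std::string, std::weak_ptr<Texture>> cache_;
    int uploads_ = 0;
};
```

The weak_ptr map is the key design choice: the cache never keeps a texture alive by itself, so the texture lifecycle follows the items that actually display it.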

New spritesheet system is simple to use
There was also the need to replace the usage of AnimatedSprite. It has a cumbersome interface and most of the problems of Image. So a new subsystem was introduced, with a Spritesheet class and a cache for it, just like with the textures, that describes the animations contained in a texture atlas in terms of sequences. Sequences can have various properties, such as parametric animation speeds and automatic transitions to other sequences. Spritesheets are described by a JSON file in a simple format.
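For illustration, a spritesheet description in this spirit might look like the JSON below (the field names are hypothetical, not the engine's actual schema):

```json
{
  "texture": "sonic_atlas.png",
  "frameSize": [64, 64],
  "sequences": {
    "idle": { "frames": [0, 1, 2], "fps": 8, "loop": true },
    "spin": { "frames": [10, 11, 12, 13], "fps": 30, "next": "idle" }
  }
}
```

The "next" field is what enables automatic transitions between sequences, and a per-sequence speed can be made parametric (e.g. tied to the player's velocity).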

Sprites are a specialized subclass of dwImageItem that keeps track of the animation data and is updated by a centralized animation controller. The controller keeps track of multiple animation domains and allows animations to be controlled in a global fashion.

Level Editor

The in-game level editor, which was mostly stuck in development hell, was rethought and is now good enough for my use case. It allows editing of the object layout for a stage, and has specialized modes for adding tile and geometry object types. It is modular and is not even loaded if a level is not started in debug mode, but otherwise it can put the entire field in "edit mode" instantly with a single press of the Escape key.
Editing the test stage mid-gameplay

Other Small Stuff

Support for game controllers was added on all desktop platforms, courtesy of SDL2. This was thoroughly tested with my trusty 360 controller, and fleetingly with a couple of other controllers as well. It should work on mobile too, but that was not tested.

Moreover, there was a need to render water as a simple colored quad that multiplies the colors underneath it, simulating what a 90's console could accomplish via palette changes mid-hblank. However, there are no blending modes available in QtQuick besides common alpha blending. The way to circumvent this was to create a special QSGMaterialShader subclass that executes custom OpenGL code upon activation, changing the blending mode that way. Thanks to Giuseppe D'Angelo from KDAB for this great tip! To use this material, an entire chain of classes had to be created, culminating in a special node type. I plan to extend this node in the future to allow it to render arbitrary textures with arbitrary blending modes, and maybe even point particles.
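The blend mode itself is simple; in fixed-function OpenGL terms it corresponds to glBlendFunc(GL_DST_COLOR, GL_ZERO), which is what the custom shader subclass can set before its draw call. A CPU-side sketch of the math it performs:

```cpp
#include <array>

using Color = std::array<float, 3>; // normalized RGB

// Multiplicative blending: result = src * dst. With
// glBlendFunc(GL_DST_COLOR, GL_ZERO) the GPU computes exactly
// src*dst + dst*0 per channel, darkening whatever is underneath --
// the palette-trick water look.
Color multiplyBlend(const Color& src, const Color& dst) {
    return { src[0] * dst[0], src[1] * dst[1], src[2] * dst[2] };
}
```

Multiplying by a bluish water color leaves blue channels mostly intact while darkening red and green, which reads as "underwater" without any extra texture work.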

QtQuick and Performance

Sometimes I wonder if I have chosen the right platform for this project by choosing Qt. Certainly QtQuick is more than sufficient for creating casual games and even some more involved examples, but is it ready for a platformer with sprawling levels and tens of objects on screen at once? After having to implement a lot of custom infrastructure replacing core pieces of QtQuick (images, sprites, etc.), I would say no, it is not.

Object creation is very slow, frame sync is a bit wonky on all platforms I test except Android, the JavaScript engine is a source of major and unexplainable frame-time variability, and so on and so forth. I still need to implement some sort of object pooling; otherwise frame skips are going to be a frequent and sore sight in an eventual mobile release.
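A minimal sketch of the kind of object pool I have in mind (illustrative only, not engine code): pre-allocate, hand objects out, and recycle them instead of paying QML's creation cost mid-gameplay.

```cpp
#include <memory>
#include <vector>

// A minimal object pool: instead of creating/destroying objects (rings,
// particles...) at runtime, pre-allocate up front and recycle.
template <typename T>
class Pool {
public:
    explicit Pool(size_t n) {
        for (size_t i = 0; i < n; ++i)
            free_.push_back(std::make_unique<T>());
    }
    // Hand out an object; grow only as a fallback when the pool is dry.
    T* take() {
        if (free_.empty())
            free_.push_back(std::make_unique<T>());
        auto obj = std::move(free_.back());
        free_.pop_back();
        return live_.emplace_back(std::move(obj)).get();
    }
    // Return an object to the free list for reuse.
    void give(T* obj) {
        for (auto it = live_.begin(); it != live_.end(); ++it)
            if (it->get() == obj) {
                free_.push_back(std::move(*it));
                live_.erase(it);
                return;
            }
    }
    size_t freeCount() const { return free_.size(); }
private:
    std::vector<std::unique_ptr<T>> free_, live_;
};
```

In a QML engine the pooled objects would be pre-instantiated components that get reset and re-parented, but the bookkeeping is the same.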

However, even with all those shortcomings, QtQuick and QML allowed me to accomplish far more than I expected when beginning this project. If one understands some of the QML engine's inner workings and good patterns for performance, the achieved level of productivity can be high. And there is the indisputable truth, right here in front of me, that the QSG renderer can churn out 150 frames per second at 1080p on Intel graphics with Mesa. It is a 2D game, sure, but these are performance levels that I would expect from a game engine.

I guess at some point I should go see what can be done with Qt3D... :)

Saturday, January 2, 2016

Microcontroller and Instrumentation Experiments

Holidays are still going strong here in Rio. But I guess as a result of some subconscious new year resolutions, it is finally time to clear up the publishing queue a little bit. Blogging has been slow because things have been moving past me alarmingly fast.

Off the top of my head, the first thing that comes up is that I presented a paper at SBESC 2015 in Foz do Iguaçu (link to it when the proceedings are out). A huge shout-out to my teacher and mentor Giovani Gracioli is in order: every single meetup or thing we do together is a lesson. Thank you so much.

This post is not about proper research activities, though; that is coming at a later date. Now I'd like to ramble about some experiments done in two courses: Microcontrollers and Instrumentation. Working within the limitations of TI's Tiva C Series LaunchPad as an interface board with the PC, my tasks were to interface with various low-cost sensors, trying to build valid instruments out of the setup, and to test them. Being the Qt junkie that I am, for each of them there is a Qt/QML computer program that provides the interface. Here they are:

Sound Frequency Meter

Trying to make sense of noisy input
The first project goal was to construct a useful measurement instrument out of a microphone breakout board not unlike this one. It did not have an analog out, though, so some analysis and pin soldering was needed. I decided to go for a frequency meter application because the necessary algorithms (FFTs) are readily available in good quality. The library chosen for analyzing the audio data was libfftw.

In the board firmware, a simple program samples the ADC at 22050 Hz via a timer. The most significant 8 bits are then sent to the PC over serial. It is a crude mechanism; an improvement would be to use the USB device interface for proper sound capture. The computer application then takes care of interpreting the data as audio and analyzing it. There are options to normalize the input for display and to turn off the Hann window used to filter the input.
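To illustrate the analysis step, here is a dependency-free sketch: Hann-window the samples, then pick the strongest bin of a (naive) direct DFT. The real application uses libfftw for the transform; the windowing and peak search are the same idea.

```cpp
#include <cmath>
#include <vector>

// Estimate the dominant frequency of a block of samples: apply a Hann
// window, then search the magnitude spectrum of a direct DFT.
double dominantFrequency(const std::vector<double>& x, double sampleRate) {
    const double pi = std::acos(-1.0);
    const size_t n = x.size();
    std::vector<double> w(n);
    for (size_t i = 0; i < n; ++i) // Hann window tapers block edges
        w[i] = x[i] * 0.5 * (1.0 - std::cos(2.0 * pi * i / (n - 1)));
    size_t best = 0;
    double bestMag = -1.0;
    for (size_t k = 1; k < n / 2; ++k) { // skip the DC bin
        double re = 0, im = 0;
        for (size_t i = 0; i < n; ++i) {
            double ph = 2.0 * pi * k * i / n;
            re += w[i] * std::cos(ph);
            im -= w[i] * std::sin(ph);
        }
        double mag = re * re + im * im;
        if (mag > bestMag) { bestMag = mag; best = k; }
    }
    return best * sampleRate / n; // bin index -> Hz
}
```

With a 256-sample block at 22050 Hz the frequency resolution is about 86 Hz per bin; an FFT replaces the O(n²) inner loops but returns the same spectrum.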

All the analyzed data is drawn via QML's Canvas element and some simple JavaScript, both the input waveform and its representation in the frequency domain.

A report (in Portuguese, sorry) was written with more details.

Rotation Speed Meter

The complete test setup
On a second Instrumentation task, I was given a very simple rotary quadrature encoder and a brushed permanent-magnet servomotor to go with it. My task was to assemble the two together somehow and use the encoder to measure the current speed of the motor.

The coupling of the motor was something quickly put together in the course of a day in the fabrication lab. The most interesting part of the assembly process was manually machining the axle coupling on a lathe.

The capture board was required to do more this time, taking care of debouncing and interpreting the state changes of the input signal. The Tiva has hardware peripherals (QEI) specifically for interfacing with such encoders, but they could not be used successfully because the cheap encoder was too noisy for the QEI to handle. So debouncing was implemented in software, and the output to the PC took the form of 'L' and 'R' character pulses indicating that the encoder moved. The implementation is based on some Arduino code I found around the web; I used it because I thought the state transition scheme of its state machine was very elegant and (cursorily) performant.
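The usual noise-tolerant way to decode such a signal is a state-transition table, along these lines (a generic sketch, not the exact code mine was based on):

```cpp
#include <cstdint>

// Classic quadrature decoding. The index is the previous 2-bit (A,B)
// state concatenated with the current one; the entry is the step
// direction. Zero entries cover both "no movement" and invalid
// double transitions (contact bounce), which is what makes this
// scheme inherently noise-tolerant.
int quadratureStep(uint8_t prev, uint8_t curr) {
    static const int table[16] = {
         0, +1, -1,  0,
        -1,  0,  0, +1,
        +1,  0,  0, -1,
         0, -1, +1,  0,
    };
    return table[((prev & 0x3) << 2) | (curr & 0x3)];
}
```

The firmware would emit an 'R' for each +1 and an 'L' for each -1; the PC side then only has to count characters per time window to estimate RPM.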

The PC application used the skeleton of the previous one and is much simpler this time, counting pulses over a period of time and estimating RPM accordingly. The report only goes as far as this point in development. However, as some sort of project epilogue, an H-bridge came into play, and some rudimentary speed control for the motor was assembled and implemented. That was the most fun part of it all.

Camera Interfacing

Using friends as models
For the Microcontrollers course, my task was to interface with a VGA camera with a parallel interface (and I²C for control), the OV7670 module. The module does not have a framebuffer, and the transfer data rate is too much for the serial interface. Since I didn't have much time and wanted to avoid all the hassle of the USB interface, I decided to see what could be done without it, and without DMA, to keep things simple. It turns out it was possible to fit an 8bpp QQVGA picture in the device's RAM as a framebuffer. With it, the serial data rate did not need to be very fast (and unreliable). Capturing could then be done in a scan-then-transmit serial fashion.
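The memory budget works out as a quick back-of-envelope check (assuming the TM4C123's 32 KiB of SRAM):

```cpp
#include <cstddef>

// Can one 8bpp QQVGA frame live entirely in the Tiva's SRAM?
constexpr size_t kWidth = 160, kHeight = 120; // QQVGA resolution
constexpr size_t kBytesPerPixel = 1;          // 8 bits per pixel
constexpr size_t kFrameBytes = kWidth * kHeight * kBytesPerPixel; // 19200
constexpr size_t kSramBytes = 32 * 1024;      // TM4C123GH6PM SRAM

static_assert(kFrameBytes <= kSramBytes,
              "one QQVGA frame fits, with ~13 KiB left for everything else");
```

Full QVGA at 8bpp would already need 76800 bytes, more than double the SRAM, which is why QQVGA was the practical ceiling for this approach.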

The PC viewfinder used the same serial access class wrapped in a camera interface, but entirely different UI code, using QGraphicsView, which was deemed easier for this application. A button can be pressed to save the image, too.

Around 1.7 fps was achieved with this scheme. Can we do better with the Tiva? Using USB and DMA, I believe so. But is the Tiva's main processor fast enough to stream a full VGA frame, in color, at 30fps? I wouldn't bet on it. Hardware engineers are probably cringing right about now thinking of this setup; this is the kind of task an FPGA is expected to excel at. Then again, the fact that it ultimately works is in itself an interesting thing.

Source Code

The source code for these experiments is in this repository. Hopefully it can help someone who needs to quickly hack something together using Qt and the serial port to interface with custom hardware.

That's it for today. Happy New Year!!! :D

Friday, November 13, 2015

TampereGotchi on GitHub

It's about time! For almost a year after the project's completion at the University of Tampere, the source code was left bitrotting on my (and my colleagues') hard drives. Since then, I remembered countless times that I had to upload it, blog about it, and fix a couple of things, but I never had the time. Hopefully it's not too late.

TampereGotchi is a Finland-themed clone of the most popular franchise of virtual pets. Basic actions like feeding, cleaning, and playing games with your pet are implemented, as well as sending and receiving pets to and from a server and sharing them via a code. The game is not finished and likely never will be! But it is a great testbed for playing with Qt 5 and QML.

I am particularly proud of the polish that went into the main screen and some other tidbits, and the spaceship minigame. Also, Joona and Ammar did a great job with the sprites. Thanks to our PMs and teachers at the University of Tampere who made this school project something special and very fun to work on!

The repository with the code and assets is available on GitHub. Hopefully some of you will clone it and mess with it a bit.

An APK built for armv7 processors is available here.

Friday, October 9, 2015

Back to Brazil: LISHA and other tidbits

The exchange period is finally over. Truth be told, it has been over for almost two months! But it feels like only now have I got my bearings back. It is not easy leaving Finland; what an awesome country!

Just as I was doing before leaving more than a year ago, I am back at work at LISHA, the Hardware and Software Integration Laboratory at the Federal University of Santa Catarina (UFSC), whilst working on graduating as a Mechatronics engineer. Under the supervision of Prof. Dr. Giovani Gracioli, my current field of research is resource synchronization protocols for multiprocessor real-time operating systems (RTOSs), or more specifically, for EPOS.

Working at such a high level of research is very challenging. Reading and reviewing articles that represent the top contributions in the field of RTOSs is mandatory background research, and my current focus. Soon we shall move on to algorithm modelling.

On a lighter note, I am also taking regular courses from the programme, and my favorite so far has been Microcontrollers. We are exploring, one by one, every peripheral and interface of the Tiva C Series TM4C123GH6PM microcontroller (using the LaunchPad development platform). It struck me how much functionality is crammed into that IC. The CCS development environment could deliver a better experience, especially on Linux, but thankfully one only needs a standard gcc cross-compiler toolchain (arm-none-eabi) to develop for the board. TivaWare, which includes all the necessary makefiles, also helps a lot.

I plan to blog about the course's final project and any other stuff that ends up being done with the board. Also, if there is any progress on dw Engine, I shall post about it. A friend and I are in the process of sketching out an original level and shipping a demo. Let's see what happens!

Saturday, June 20, 2015

Presenting neiatree

Mandatory Screenshot
neiatree is an asset tree processing tool for games or other multimedia applications. I made it after I couldn't find something similar that was simple to set up and use the way I wanted, so why not scratch my own itch? With this naming convention following neiasound, I guess I'm starting a neiasomething library collection. neiaframeworks, perhaps? Hmm...

So, about neiatree. It lets you process a directory tree into another, optionally transforming the files along the way. Inspired by make, it also keeps track of source file modification dates and updates only what has changed. So you can, for example, compress textures and sounds for your game as a build step that runs automatically before or after a build or a run. Want to use different compression settings? No worries: change the rules and clean your destination folder. If nothing changed, the overhead on project build time is negligible: a tenth of a second or so.
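The staleness rule is the same as make's; distilled into a sketch (illustrative, not neiatree's actual code):

```cpp
#include <cstdint>

// Minimal file metadata for the staleness check.
struct FileInfo {
    bool exists;
    int64_t mtime; // modification time, seconds since epoch
};

// make-style rule: process a source asset only when the destination is
// missing or older than the source. This is why an unchanged tree costs
// almost nothing: every pair fails this check and is skipped.
bool needsRebuild(const FileInfo& src, const FileInfo& dst) {
    if (!src.exists) return false; // nothing to build from
    if (!dst.exists) return true;  // never built
    return src.mtime > dst.mtime;  // source changed since last build
}
```

Changing a rule's settings defeats this check, which is why cleaning the destination folder is the way to force a full reprocess.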

It is easy to integrate into your toolchain. The only build dependency is Qt 5.4, and you can use an older version with simple tweaks.

Licensed under 2-clause BSD. Go check the GitHub repository!

Tuesday, June 16, 2015

Intel RealSense for Mobile Devices: Aftermath

Geting, Rauli, Me :)
It is hard to believe the project is over already, and that I did not post anything about it apart from a passing mention when the Konecranes App project was concluded. Perhaps it is because our great PM Cyndi took care of keeping a nice and steady blog about all of our activities, productive and social. Here it is.

I can't think of anything but praise for the team. We simply worked great together, top to bottom, in a fashion I have only fleetingly experienced before. Even if the project had turned out to be crap, at least we would have made some good friends. As it happens, our project made the top-4 shortlist, and we competed head-to-head for the win! Nope, we did not win. So close...

What we ended up with as deliverables was mainly documentation of ideas for future RealSense applications, and a proof-of-concept implementation of 3D camera usage for unique party experiences. Those are licensed to Intel, so, meh, I can't post anything here ;)

If I had to distill one main project takeaway from a technical perspective, it would be that Intel is right on the money. "Future" mobile experiences will be defined by contextual computing, and quality of implementation matters. Specialized hardware with good software middleware is a must for developers, given the sheer number of heterogeneous solutions that need to come together to give a contextual computing architecture shape and protocols. This is a very exciting time for developers. Our camping knives are being sharpened for us; our tents will pitch themselves.

Let's explore! 

Monday, June 1, 2015

Projective Game Platform for Public Spaces

Hello everyone! :)

My second and final semester at the University of Tampere has just come to an end. I am very thankful for being invited to this great institution. The ambiance and facilities are just as excellent as the staff and the quality of teaching. This post is about my project for the Human-Technology Interaction Project course at the School of Information Sciences.

It is a proof of concept for a gaming platform that could be deployed in public spaces, using multiple projectors and the original Microsoft Kinect. Not terribly original, but functional, and quite fun! Check out footage from our user testing (that means a party in my room :D):

It depends on Qt 5.4, OpenAL (it embeds a version of neiasound), OpenNI2, and NiTE2. Code is available here, on GitHub.

Sunday, May 10, 2015


Presenting neiasound

Today is a very important day for me. I am finally releasing my first open source library into the wild!

neiasound is an OpenAL wrapper for Qt-style applications, ready to be integrated into an engine's main loop. Doing positional audio is stupidly easy with it. It also includes facilities for reading wav and ogg files, and optionally supports libsndfile for reading flac and many more formats. It is also easy to implement your own audio stream decoder interface. There is support for streaming dynamic playlists with intros and seamless looping.

I have been developing and dogfooding this lib for quite a while now, and I am most happy to share it with the world! In fact, dw Engine is my third project to use it, and it has never stopped evolving. What is missing, but coming, is support for more straightforward usage of EFX effects and extensions, plus minor cleanups.

In Android projects, it is compatible with the standard OpenAL Soft port, and OpenAL-MOB. I recommend OpenAL-MOB for a better experience and reduced latency. If you disable HRTFs, I suspect performance is the same or better.

neiasound is made available under the 2-clause BSD License. Give it a try on your next project!

Thursday, April 30, 2015

Maker Faire UK

First time in Maker Faire!
Well, it has been a while. This semester has been quite busy and there is still a lot of stuff to be done. Also, I got confirmation that I'll have a summer internship at Demola. So... more work for me. Fantastic, really!

However, this weekend I managed to take a break from school and Finland to accompany my friend Juha to Maker Faire UK in Newcastle. A huge thanks to him for making this possible :D

Together, at the stand, we demonstrated his homebrew pick-and-place machine and software, and I would like to take a moment to say how awesome it is. Between sending a board layout to a factory to get just a couple of prototypes and hand-soldering hundreds of absurdly small SMD components, electronics designers have never had many options for producing more complex prototypes. This machine addresses those use cases perfectly: it is cheap and built with a designer's workflow in mind. Don't hesitate to check out the website, even if only to appreciate its ingenuity.

On another note, I won SODAQ's mini hackathon challenge, and now I have a board of my own! I look forward to playing with it and coming up with something cool. The code for my entry, a Bluetooth Low Energy-controllable Game of Life, can be found here.
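For reference, the core of any Game of Life implementation is a single neighbor-counting step like the one below (a generic sketch, not my SODAQ entry's code; there, BLE would only toggle cells while the board steps the grid):

```cpp
#include <vector>

using Grid = std::vector<std::vector<int>>; // 1 = alive, 0 = dead

// One generation of Conway's Game of Life on a non-wrapping grid:
// a cell survives with 2-3 live neighbours, is born with exactly 3.
Grid lifeStep(const Grid& g) {
    const int h = (int)g.size(), w = (int)g[0].size();
    Grid next(h, std::vector<int>(w, 0));
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            int n = 0; // count live neighbours, clipping at the edges
            for (int dy = -1; dy <= 1; ++dy)
                for (int dx = -1; dx <= 1; ++dx) {
                    if (dx == 0 && dy == 0) continue;
                    int ny = y + dy, nx = x + dx;
                    if (ny >= 0 && ny < h && nx >= 0 && nx < w)
                        n += g[ny][nx];
                }
            next[y][x] = (n == 3 || (g[y][x] && n == 2)) ? 1 : 0;
        }
    return next;
}
```

On a small board the grid fits comfortably in RAM, and the double-buffered `next` grid keeps updates from contaminating the neighbor counts of the current generation.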

As for general stuff, my other projects go on. I should have something to share after the semester ends. Cheers!

Thursday, February 19, 2015

dw Engine Update One

We have Fire Shield! Normal shield! Bridges! Running on water! Rain! Particle effects! Cool stuff! Basically, the engine is ready for its inevitable one/two-zone demo. The first video is a bit older, and sucks for being filmed off-screen; it's the Linux version. The other is more recent, from the Android build. The capture is choppy, though, so you'll have to take my word for it that the framerate is good.

So, the engine has evolved by leaps and bounds since the last hackish screenshot on the previous post. Player physics were ported to C++ but remained functionally the same; only some bug fixes were applied, particularly around rolling and pushing. Now we have a much-needed BVH, less boilerplate code in level object components, and small miscellaneous optimizations all around. The Android version runs at a solid 60fps most of the time (Moto X 2013), even with reflections and fancy stuff. It drops frames when there are lots of objects on screen, though, particularly when rings scatter. If one prizes accurate Genesis emulation, that is a feature! (I'm kidding, of course; more optimizations are needed.)

The BVH is of the hierarchical-circles type, implemented as two classes in native code, and it performs well enough, activating/deactivating graph leaves by emitting signals. This is essential for decent performance on Android. Zones can now be of arbitrary size, within floating-point precision limits. I did not see the need for anything more complex, such as AABBs.
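The principle can be sketched like this (illustrative names; the real implementation emits Qt signals instead of setting a flag): every inner node's circle encloses its children, so whole subtrees are culled with one cheap test.

```cpp
#include <memory>
#include <vector>

// A node in a hierarchy of bounding circles: a leaf is a level object,
// an inner node's circle encloses all of its children's circles.
struct CircleNode {
    float x = 0, y = 0, r = 0;
    std::vector<std::unique_ptr<CircleNode>> children;
    bool active = false;
};

// Circle-vs-circle overlap without a sqrt: compare squared distances.
bool overlaps(const CircleNode& a, float x, float y, float r) {
    float dx = a.x - x, dy = a.y - y, rr = a.r + r;
    return dx * dx + dy * dy <= rr * rr;
}

// Activate every leaf whose circle touches the camera circle; a whole
// subtree is skipped as soon as its enclosing circle misses.
void activate(CircleNode& node, float camX, float camY, float camR) {
    if (!overlaps(node, camX, camY, camR)) return;
    if (node.children.empty()) node.active = true; // leaf: wake object
    for (auto& c : node.children) activate(*c, camX, camY, camR);
}
```

Circles need only one comparison and no per-axis logic, which is plenty when objects are roughly round anyway; that is the trade-off against AABBs mentioned above.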

There is a thread in Sonic United about the project. More timely updates are to be found there, for the interested parties. I shall work on releasing a demo (with an original zone) in my spare time; I just hope there is enough time to present it at SAGE 2015. Zomg, spriting is hard! Not sure if there are two acts this year like last year, so fingers crossed.

Smart Crane Monitoring App and RealSense for Mobile Devices

One final presentation at Konecranes HQ in Hyvinkää marked the end of the SCMA project with Konecranes. It was a pleasure to work with the company, and special thanks go to our awesome Demola facilitator, Ville Korpiluoto. Cheers!

The project results were licensed to the company and development will continue internally at Konecranes. It has taught me a little bit more about native mobile app development and respective interfaces. I am also proud of the graphical design of the app, which was drawn from scratch. In retrospect, the whole project was a very positive experience. See previous posts for more material.

The spring semester here in Finland has brought another interesting project to work on, in partnership with Intel: Real Sense for Smart Devices. This one has a bigger team and very interesting technology to play with. I would love to try and get some game up and running with it :D

Now I'm off to relax a little here in Scotland, finishing a week-long UK trip with a great view of Edinburgh Castle. Student life ain't easy...

Monday, January 5, 2015

Smart Crane Monitoring App Update

This is another blog post reporting project progress. On the 18th of December we had a review meeting with Konecranes, where we revised the use cases and set the final goals for implementation. During the holidays it was a bit difficult to work, given the atmosphere and constant distractions; I'm glad I still got to enjoy Christmas and New Year's in Tampere. Ultimately, work got done and the UI was finished.

Inspection Checklist UI

Our next meeting will be on January 7th, as the project nears its conclusion. On the agenda, we will focus on the results so far and on how the final presentations will be conducted.

That was it for this blog post. I will be back with the video of the final presentation on the 15th. Later!

Friday, December 12, 2014

Smart Crane Monitoring App

Hello everyone! This is an update on the Smart Crane Monitoring App project at Demola Tampere.

About the Project

More information about the project can be found here. Previous blog posts about the project can be found in the same page.

Our Progress

At the beginning of the project, one of the first things we worked on was the project playbook. It was reviewed in a meeting at Konecranes HQ, where we also showed a draft application demoing sensor interaction. We even got to play with the industrial cranes in the testing facility (photos were not allowed). We had lots of fun!

Our team at Konecranes Headquarters

Next, we designed the user interface and defined the application's main interaction patterns and functionality. We held weekly Skype meetings with our client to deliver progress updates. The skeleton of the app was laid out, and choices concerning tools and libraries were made.

In the last weeks

A lot of work went into documenting use cases for the application, as these are among the main deliverables of the project. We faced some delays in this process, but implementation marched on.

Qt 5.x was chosen as the main library for app development, with the QtSql and QtMultimedia modules enabled, as well as the new WebEngine module in Qt 5.4. The zbar and qchart libraries will be used for reading QR codes and plotting graphs, respectively. The application runs smoothly on Android.
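As a sketch, that module setup amounts to a few lines in the qmake project file. The module names below are the standard Qt 5.4 ones; the target name, sources, and the zbar linker flag are illustrative guesses, not taken from the actual project (qchart would be linked in a similar way):

```pro
# Illustrative sketch, not the real project file
QT += quick sql multimedia webengine   # QtSql, QtMultimedia, Qt WebEngine

TARGET = scma-app                      # hypothetical target name
SOURCES += main.cpp

# zbar for QR-code reading; plotting library added similarly
LIBS += -lzbar
```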

The backend uses MySQL Server 5 for all storage tasks. There is currently no managed interface to the database: access control and data committing are done solely through the MySQL driver, using users and privileges from the DB system itself. All operations performed by the client application are plain SQL queries.
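Roughly, then, the client's database work is plain SQL on top of MySQL's own privilege system. A made-up illustration (the user, schema, and table names here are invented, not from the project):

```sql
-- Invented names for illustration; access control comes from MySQL itself
CREATE USER 'inspector'@'%' IDENTIFIED BY 'secret';
GRANT SELECT, INSERT ON scma.inspection_items TO 'inspector'@'%';

-- The app then issues queries like this one directly through the driver
SELECT item_name, status FROM scma.inspection_items WHERE crane_id = 7;
```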
Some screenshots of the app running in the IDE and on target devices:
Development Environment

App in action

Tuesday, December 2, 2014


The dw game engine showing off my programmer art.
I would like to show off a little side project of mine: the dwEngine. It is a game engine based on QtQuick2 that makes use of many of its features, benefiting from the great performance of QSG. It integrates Box2D on the C++ side for collision detection, raycasting, and physics. As a bonus, it is being developed in parallel with its Android version, since porting is so easy.

My current goal is to get a complete Sonic the Hedgehog-style engine working, replicating the original physics as closely as possible. The Sonic Physics Guide at Sonic Retro is a great resource, and the results so far are almost impossible to tell apart from the original games.

I shall publish a video once I have a more complete test level and a better recording setup. Submitting a demo to SAGE 2015 is my ultimate goal; then I'll decide where to take the engine from there.

Till next time!

Bonus screenshot: Debug mode on Android.

Sunday, April 27, 2014

Tutorial: Using Hadoop with Eclipse on Ubuntu 14.04

  1. Install the JDK with the following command in the terminal:
    sudo apt-get install openjdk-7-jdk
  2. Download Hadoop
  3. Extract it to your home folder
  4. Edit the ~/.bashrc file and add the following lines to the top of the file
    (adjust the line if your Hadoop path is different):
    export PATH=$PATH:/home/your-username/hadoop-2.4.0/bin/
    export JAVA_HOME=/usr/lib/jvm/default-java/
  5. Log out, then log back in
  6. Hadoop is ready to use!
Using it with Eclipse:
JARs that need to be added to a MapReduce project in Eclipse
  1. Create a new Java Project;
  2. Open the properties of the newly created project;
  3. In the "Java Build Path" section, under the "Libraries" tab, click "Add external JARs..." and add the JARs. To use MapReduce, for example, see the figure above. They are all in the Hadoop folder, under share/hadoop/*;
  4. Not all of the JARs are useful or relevant to your project, but it does no harm to add them all if you like;
  5. You can start coding. Compilation happens automatically, and the output goes to the bin/ folder at the project root.
To run your project, go to its folder in the terminal and execute:
HADOOP_CLASSPATH=bin hadoop [path.to.the.Main.class] [input] [output]
HADOOP_CLASSPATH=bin hadoop org.camargo.hadoop.wc.WcMain ./data/teste.txt ./out

Warning: you need to delete the output folder between runs; Hadoop generally refuses to write to an existing folder.

Saturday, April 19, 2014

Download: Raspbian Wheezy with Qt 5.2

I decided to make life easier for anyone who would have to go through the same ordeal I did, and publish a Raspbian image with the packages from the repository. About this image:
  • It can fit on a 2GB SD card! (this required reworking the partition table of the standard Raspbian image)
  • It must be decompressed before use (lzma format);
  • It does not include the boot partition. You must overwrite the root partition (the second partition) of a valid Raspbian SD card with this ext4 filesystem.
  • I repeat: this is NOT a disk image, it is a filesystem image.
  • It only has the root user; the password is "pi";
  • A few things were removed, for example lxde and the graphical interface, documentation files, and the swapfile was cut down to almost nothing (8MB); after expanding the fs onto a larger partition, you can reinstall everything via apt-get;
  • It includes an example for testing Qt5, in the /root folder;
  • Don't forget to run resize2fs or raspi-config after copying the filesystem, to make the most of the space available on the SD card.
Using this image is not complicated, if you know what you are doing. If anyone asks, I'll put together a little tutorial ;)

Happy coding!

Friday, February 28, 2014

Tutorial: Qt 5.2 on the Raspberry Pi (Outdated)

After quite a bit of struggle, I managed to get a program written in Qt5 running on the Raspberry Pi with hardware acceleration via OpenGL ES 2.0. The main difficulty is that there are no ready-made packages in the official Raspbian Wheezy repository. Arch Linux carries a recent Qt 5.2 in its repositories, but it was not compiled specifically for the Pi, so the QPA platform plugins are missing. Still, there are several alternatives for running hardware-accelerated Qt5.

From the simplest approach to the most complicated, we can:
  • Use a precompiled tarball provided by Heater, a community member;
  • Use Sebastien Noel's repository;
  • Upgrade the Raspbian distribution to the Jessie release; or
  • Compile Qt5 ourselves.
Compiling Qt5 for the Pi requires either building on the device itself or using a cross-compilation toolchain, which is quite complicated. Building Qt5 on the Pi takes more than two days, a large swapfile, and an external hard drive, not to mention a lot of patience. So we will rule that option out.

Important: all of the commands listed below must be run as root (use sudo -i to enter root mode). Some operations require considerable disk space, so an 8GB SD card is recommended. Keep an eye on the space available in / with the df command. Don't forget to expand the filesystem during initial setup!

Using Heater's tarball

There is a precompiled Qt 5.2 package provided by a community member. More details about the package can be found in this post. The installation process is very simple: just extract the package, then configure a few environment variables afterwards.

Let's download the file, extract it, and remove the archive to save space:
 tar xvzf qt5.2-rpi.tgz
 rm qt5.2-rpi.tgz

Then we move the Qt base directory to /opt, creating the necessary symlinks in /usr/local:
 mv qt5 /opt
 ln -s /opt/qt5 /usr/local/qt5

Now we need to configure the environment variables. We do this system-wide by editing /etc/profile. Add these lines to the end of the file:
 export PATH=${PATH}:/usr/local/qt5/bin
 export LD_LIBRARY_PATH=${LD_LIBRARY_PATH}:/usr/local/qt5/lib
 export QT_PLUGIN_PATH=/usr/local/qt5/plugins

Everything should be ready! Reboot the Pi to put the new settings into effect.

Using Sebastien Noel's repository

To make life easier, I recommend using the image I made available here. It will be outdated as soon as Raspbian Jessie is officially released, but from what I could test it is stable and works very well.

dist-upgrade to Raspbian Jessie

This approach may look simpler, but since it is a testing repository there are no stability guarantees, the update takes quite a while, and it will probably remove packages you may be using. Always back up before upgrading the distribution! Our luck is that the Pi cannot be bricked, you just reflash the SD card, but reconfiguring the whole system is laborious.

First, we update the system the usual way, with
 apt-get update && apt-get upgrade

Next, let's update the repository list. Edit the /etc/apt/sources.list file and replace every occurrence of wheezy with jessie. Then we upgrade the distribution:
 apt-get update && apt-get upgrade && apt-get dist-upgrade

This will take a while, and every now and then apt fires off a confirmation prompt, so stay alert! When everything is done, reboot the Pi (the reboot command). If no problems show up, we can now install Qt5, like so:
 apt-get install qt5-base qtdeclarative5-*

Compiling and running a program

Now that we have a Qt build in hand, we can compile and test a program! A good choice is the Qt5 Cinematic Experience demo by the QUIt Coding group, which demonstrates several of the library's capabilities and even has a version optimized for the Raspberry Pi! To get the application's code:
 tar xvzf Qt5_CinematicExperience_rpi_1.0.tgz
 rm Qt5_CinematicExperience_rpi_1.0.tgz

This example cannot be built out-of-source, because of the version of its project files, so qmake (followed by make) is run in the project folder itself:
 cd Qt5_CinematicExperience_rpi_1.0
 qmake
 make

If all goes well, we can now run the demo. From the text console, we can use the eglfs (EGL Fullscreen) platform plugin to run the program straight off the framebuffer, in full screen:
 ./Qt5_CinematicExperience -platform eglfs

All that's left is to enjoy Qt5's native performance and the Raspberry Pi's excellent GPU. Happy coding!
A bit hard to see because of the glare, but this should be the final result...