
February 22, 2018

Project FireBot: What has happened so far...

This is the first entry on this blog. A lot has happened since I started this project in late 2016 (technically I was already working on a more basic version of it in 2014, but it was very primitive and I stopped working on it): I have taught myself the basics of ROS, I have convinced a group of companies to sponsor my project, and I've now finally received the last parts. Setting everything up is still in progress, since some of the ROS drivers I installed are still acting up, but I'm sure I'll be able to fix those soon enough (more on that later).

The Turtlebot 2 arrives at my school

The Project: What it's about

As the name implies, this project has to do with fire and robots, but that is rather general. More concretely, I'm attempting to build and program autonomous fire-fighting robots that can be placed in highly sensitive areas, for example archives or museums, and that try to extinguish any fire before it becomes uncontrollable.

The focus is on minimizing damage, which is why the robot will extinguish with gas (in testing I will likely use compressed nitrogen, though a finished product should use INERGEN® – a gas developed for firefighting that is highly effective and unreactive). In case the robot is unable to control the fire, it should be able to transmit its sensor data, including accurate real-time maps (2D and basic 3D), real-time video, real-time thermal video, and fire/victim locations, to firefighters so they can adequately prepare and execute a rescue and extinguishing attempt.

The FireBots (as I call them) would transmit this data over a so-called MANET (mobile ad-hoc network), which should keep working even if the building loses power, because the robots themselves act as relays.
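To illustrate the relay idea, here is a toy Python sketch of flooding over UDP broadcast: every robot rebroadcasts messages it hasn't seen yet, so data can hop from robot to robot without any fixed infrastructure. The port and message format are made up for this example, and a real system would use a proper MANET routing protocol (something like OLSR or batman-adv) instead of naive flooding:

```python
# Toy sketch of the relay idea: every robot rebroadcasts UDP packets it has not
# seen before, so data can hop across robots without fixed infrastructure.
# Port number and message format are placeholders, not part of the real system.
import json
import socket
import uuid

PORT = 50000                         # hypothetical port shared by all FireBots
BROADCAST = ("255.255.255.255", PORT)

seen = set()                         # message IDs we have already forwarded

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))

def publish(payload):
    """Originate a new message (e.g. a fire location) into the mesh."""
    msg = {"id": uuid.uuid4().hex, "payload": payload}
    seen.add(msg["id"])
    sock.sendto(json.dumps(msg).encode(), BROADCAST)

publish({"status": "online"})        # announce ourselves once at startup

while True:
    data, addr = sock.recvfrom(65535)
    msg = json.loads(data)
    if msg["id"] in seen:
        continue                     # already relayed, drop to avoid loops
    seen.add(msg["id"])
    print("received from %s: %s" % (addr, msg["payload"]))
    sock.sendto(data, BROADCAST)     # act as a relay for the other robots
```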

The Project: Finding sponsors

Luckily, I had a slight advantage in finding sponsors: my school already has a group of companies that funds a variety of trips, equipment and so on. I arranged to present my project at one of their meetings, and I managed to impress them enough to receive the amount I required for this project (altogether it ended up being a little under 3000€).

I will attach my presentation of the project at the end of this post. It contains a detailed description of the project and my original cost calculations (which have since changed drastically, as switching distributors and some tips from an expert in the field made me rethink the sensors I wanted).

The Project: The hardware

Currently, my robot is equipped with an Intel® NUC Kit NUC5I5RYK (link here), an RPLIDAR A2 (link here), a SEEK Thermal Compact (link here) and an Intel RealSense R200 (link here; only the robotic development kit is still available).

The assembled Turtlebot 2 with sensors (without the R200)

The LIDAR is meant for creating highly accurate 2D maps of the robot's surroundings (using Google Cartographer) and for guiding the robot during navigation by providing precise, real-time data. Since some obstacles may not be visible to the 2D LIDAR but could still hinder the robot, one of the R200's (many) tasks will be to provide additional 3D mapping data (using OctoMap). The SEEK Thermal's job is pretty obvious: it is a far-infrared camera, meaning it can detect heat. The NUC is the brain of the robot, and with its built-in i5-6260U @ 2.9 GHz and 8 GB of RAM it's powerful enough to handle pretty much anything I will be throwing at it. The 500 GB SSD also makes sure that I won't run out of disk space any time soon.
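To give a rough idea of how this sensor data flows through ROS, here's a minimal sketch of a monitoring node. The /scan topic is the usual RPLIDAR driver default, while the thermal topic name is an assumption; I haven't confirmed what the SEEK driver actually publishes on this robot:

```python
#!/usr/bin/env python
# Minimal sketch of how the sensor data comes together in ROS. Topic names are
# assumptions based on common driver defaults, not verified on the FireBot.
import rospy
from sensor_msgs.msg import LaserScan, Image

def on_scan(scan):
    # The RPLIDAR delivers one LaserScan per revolution; Cartographer and
    # gmapping consume exactly this message type to build the 2D map.
    valid = [r for r in scan.ranges if scan.range_min < r < scan.range_max]
    if valid:
        rospy.loginfo("closest obstacle: %.2f m", min(valid))

def on_thermal(img):
    # The SEEK Thermal driver is assumed to publish an Image whose pixel
    # values correspond to temperature; here we only note that a frame arrived.
    rospy.loginfo("thermal frame received: %dx%d" % (img.width, img.height))

if __name__ == "__main__":
    rospy.init_node("firebot_sensor_monitor")
    rospy.Subscriber("/scan", LaserScan, on_scan)            # RPLIDAR A2
    rospy.Subscriber("/seek/image_raw", Image, on_thermal)   # assumed topic
    rospy.spin()
```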

My work setup (remote connection via TeamViewer is currently extremely laggy, even though the connection speed is 200+ Mbps, which is why I need a monitor)

What to expect & Conclusion

Basically, I've got a good idea of what I want and need from this project, and I have the hardware to do it. Now the only two obstacles I face are knowledge and time. My knowledge of ROS, even after extensive research, is still fairly limited, and each new problem can take me hours or days to fix. For example, I am currently struggling to get gmapping to accept my LIDAR as an input: the LIDAR is publishing data without issue, but gmapping can't use it because it has "no frame of reference". I know that I have to modify the Turtlebot's URDF file, but that has turned out to be somewhat of a challenge.

Each of these problems, in turn, takes away from my other limiting factor: time. In case you don't know, I'm a high school student nearing my diploma (I'm in 11th grade), which means that next year more of my time will be taken up by studying. Also, the competition is in February 2019, and while that might seem like plenty of time, juggling school, another competition (CanSat) and a life really diminishes the time I have for this project (even though this and school are what I'm currently focusing on). However, I am confident that these problems will be solved.
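While I fight with the URDF, one possible workaround is to publish a static transform from the Turtlebot's base to the LIDAR frame, which should give gmapping the frame of reference it is asking for. The frame names and the mounting offset in this sketch are assumptions, not measured values:

```python
#!/usr/bin/env python
# Sketch of a workaround for the missing frame of reference: publish a static
# transform from the Turtlebot base to the RPLIDAR frame. Frame names and the
# mounting offset are assumptions; adding the sensor to the URDF is the proper
# long-term fix.
import rospy
import tf2_ros
from geometry_msgs.msg import TransformStamped

if __name__ == "__main__":
    rospy.init_node("laser_tf_publisher")

    t = TransformStamped()
    t.header.stamp = rospy.Time.now()
    t.header.frame_id = "base_link"   # Turtlebot base frame
    t.child_frame_id = "laser"        # frame_id the RPLIDAR driver stamps on /scan
    t.transform.translation.x = 0.0   # assumed mounting position (metres)
    t.transform.translation.y = 0.0
    t.transform.translation.z = 0.2
    t.transform.rotation.w = 1.0      # identity rotation

    broadcaster = tf2_ros.StaticTransformBroadcaster()
    broadcaster.sendTransform(t)      # static transforms are latched
    rospy.spin()
```

The same thing can also be done from the command line with tf2_ros's static_transform_publisher; either way, fixing the URDF remains the cleaner, permanent solution.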

This blog will contain sporadic updates on the project, mainly when I reach certain milestones. If you want to follow the project more closely, or just want to check on my progress, you can take a look at my Trello to-do board (at the time of writing, it is still fairly empty). My Twitter is probably not going to be active any time soon, but you can follow me anyway here.