
Blog

Hello there! I am Aris, a recent graduate from the Mechanical Engineering and Aeronautics Department at the University of Patras, Greece.

For the last couple of years, as a member of the Robotics Group at the University of Patras, I have been working with Bitcraze products, including the Crazyflie 2.1 and the Lighthouse Positioning System, exploring swarming scenarios. In my diploma thesis I investigated the autonomous transportation of an object by a swarm of two rope-tethered quadcopters, in which the proposed path planning algorithm enables the swarm to approach, grasp, lift and transport the load.

The swarm grasping the load.
Trajectories Simulation

I’m excited to join Bitcraze as part of the team, where I can further develop my skills and gain valuable work experience. My principal objective is to improve the Lighthouse Positioning System and contribute to the development of a new solution for tracking in larger volumes. During this time, I look forward to deepening my knowledge of Bitcraze’s products and learning how they are applied in the real world.

After a busy fall of testing and fine-tuning, we’re thrilled to announce that the Brushless is now available! Our team has put in a lot of effort to ensure it meets our high standards, and we can’t wait for you to experience it.

If you’re curious to see it in action, we’ve featured the Brushless in our recent Christmas video, where it showcases its capabilities by navigating through Christmas obstacles with precision.

For those interested in its application in research, our latest blog post demonstrates how the Brushless can be used in academic settings. It’s exciting to see the potential it holds for various fields!

If you need anything to keep your Brushless flying, all spare parts are already stocked in our store. Additionally, many of our bundles now offer Brushless versions, providing more options to suit your needs.

We’re eager to hear your thoughts and feedback as you explore the capabilities of our latest drone. Your insights are invaluable to us and help drive our continuous improvement.

We look forward to seeing what you’ll achieve with the Brushless!

Robotics and Simulation at FOSDEM 25

Arnaud will be at FOSDEM on the 1st and 2nd of February 2025 in Brussels, Belgium. He’s co-hosting the robotics and simulation dev-room with Kimberly! If you’re in Brussels, we’ll be happy to meet you.

Drones can perform a wide range of interesting tasks, from crop inspection to search-and-rescue. However, to make drones practically attractive they should be safe and cheap. Drones can be made safer by reducing their size and weight. This causes less damage in a collision with people or the environment. Additionally, being cheap means that the drones can take more risk – as it is less expensive to lose one – or that they can be deployed in larger numbers.

To function autonomously, such a drone should at least have some basic navigation capabilities. External position references such as GPS or UWB beacons can provide these, but such a reference is not always available. GPS is not accurate enough in indoor settings, and beacons require prior access to the area of operation and also add an additional cost.

Without these references, navigation becomes tricky. The typical solution is to have the drone construct a map of its local environment, which it can then use to determine its position and trajectories towards important places. But on tiny drones, the on-board computational resources are often too limited to construct such a map. How, then, can these tiny drones navigate? A subquestion of this – how to follow previously traversed routes – was the topic of my MSc thesis under supervision of Kimberly McGuire and Guido de Croon at TU Delft, and my PhD studies. The solution has recently been published in Science Robotics – “Visual route following for tiny autonomous robots” (TU Delft mirror here).

Route following

In an ideal world, route following could be performed entirely by odometry: the measurement and recording of one’s own movements. If a drone measured the distance and direction it traveled, it could simply perform the same movements in reverse and end up back at its starting place. In reality, however, this does not quite work. While current-day movement sensors such as the Flow deck are certainly accurate, they are not perfect. Every measurement includes a small error, and in order to traverse longer distances, multiple measurements are summed, which causes the error to grow impractically large. It is this integration of errors that stops drones from using odometry over longer distances.
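To get a feel for why summed measurement errors become impractical, here is a tiny illustrative simulation of dead-reckoning drift. The numbers are arbitrary and not from the study; the point is only how the accumulated error scales with route length.

```python
import random
import statistics

def mean_endpoint_error(n_steps, noise_std=0.01, trials=500):
    """Dead reckoning: each odometry step adds a small Gaussian error;
    the endpoint error is the sum of all the per-step errors."""
    errors = []
    for _ in range(trials):
        drift = sum(random.gauss(0.0, noise_std) for _ in range(n_steps))
        errors.append(abs(drift))
    return statistics.mean(errors)

random.seed(0)
short_route = mean_endpoint_error(10)
long_route = mean_endpoint_error(1000)  # a 100x longer route
print(short_route, long_route)
# the error grows with route length (roughly with its square root)
```

A 100x longer route does not give a 100x larger error, but it still grows without bound, which is exactly the problem for long routes.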

The trick to traveling longer distances is to prevent this buildup of errors. To do so, we propose letting the drone perform ‘visual homing’ maneuvers. Visual homing is a control strategy that lets an agent return to a location where it has previously taken a picture, called a ‘snapshot’. To find its way back, the agent compares its current view of the environment to the snapshot it took earlier. The trick here is that the difference between these two images grows smoothly with distance. Conversely, if the agent can find the direction in which this difference decreases, it can follow that direction to converge back to the snapshot’s original location.
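The homing principle can be illustrated with a toy sketch in which the image difference is modeled simply as the distance to the snapshot location (in reality it is computed from the panoramic images themselves). The agent samples a few candidate directions and keeps the one that most reduces the difference; everything here is invented for illustration.

```python
import math

def image_difference(pos, snapshot):
    """Stand-in for the pixel-wise image difference, which in practice
    grows smoothly with distance to the snapshot location."""
    return math.dist(pos, snapshot)

def homing_step(pos, snapshot, step=0.1):
    """Try small moves in 8 directions and keep the one that most
    reduces the image difference (gradient-descent style)."""
    best, best_diff = pos, image_difference(pos, snapshot)
    for angle in range(0, 360, 45):
        cand = (pos[0] + step * math.cos(math.radians(angle)),
                pos[1] + step * math.sin(math.radians(angle)))
        d = image_difference(cand, snapshot)
        if d < best_diff:
            best, best_diff = cand, d
    return best

pos, snap = (2.0, 1.5), (0.0, 0.0)
for _ in range(100):
    pos = homing_step(pos, snap)
print(pos)  # converges close to the snapshot location
```

Because the difference decreases smoothly toward the snapshot, this greedy descent converges to within roughly one step size of the original location.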

The difference between images smoothly increases with their distance.

So, to perform long-distance route following, we command the drone to take snapshots along the way, in addition to odometry measurements. Then, when retracing the route, the drone routinely performs visual homing maneuvers to align itself with these snapshots. Because the error after a homing maneuver is bounded, there is no longer a growing deviation from the intended path! This means that long-range route following is possible without excessive drift.
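The effect of the snapshots can be illustrated with a small made-up simulation (the values are arbitrary, not from the experiments): odometry drift accumulates over each route segment, while a homing maneuver at each snapshot resets the error to a small bounded value.

```python
import random

def worst_case_error(n_segments, drift_std=0.05, homing_error=0.02,
                     use_homing=True):
    """Track the largest position error along a route. Odometry drift
    accumulates per segment; a homing maneuver at each snapshot
    resets the error to a small bounded value."""
    err, worst = 0.0, 0.0
    for _ in range(n_segments):
        err += random.gauss(0.0, drift_std)  # drift during the segment
        worst = max(worst, abs(err))
        if use_homing:
            # converging on the snapshot bounds the error again
            err = random.uniform(-homing_error, homing_error)
    return worst

random.seed(42)
odometry_only = worst_case_error(200, use_homing=False)
with_homing = worst_case_error(200, use_homing=True)
print(odometry_only, with_homing)
```

With homing, the worst-case error stays bounded by a single segment's drift plus the homing accuracy, no matter how long the route is.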

Implementation

The article mentioned above describes the strategy in more detail. Rather than repeat what is already written, I would like to give a bit more detail on how the strategy was implemented, as this is probably more relevant for other Crazyflie users.

The main difference between our drone and an out-of-the-box one is that our drone needs to carry a camera for navigation. Not just any camera: the method under investigation requires a panoramic camera so that the drone can see in all directions. For this, we bought a Kogeto Dot 360, a cheap aftermarket lens for an older iPhone that provides exactly the field-of-view we need. After a bit of dremeling and taping, it is also suitable for drones.

ARDrone 2 with panoramic camera lens.

The very first visual homing experiments were performed on an ARDrone 2. The drone already had a bottom camera, to which we fitted the lens. Using this setup, the drone could successfully navigate back to the snapshot’s location. However, the ARDrone 2 hardly qualifies as small: it is approximately 50 cm wide, weighs 400 grams and carries a Linux computer.

To prove that the navigation method would indeed work on tiny drones, the setup was downsized to a Crazyflie 2.0. While this drone could take off with the camera assembly, it would become unstable very soon as the battery level decreased; the camera was just a bit too heavy. Another attempt was made on an Eachine Trashcan, heavily modified to support the camera, a Flow deck and custom autopilot firmware. While this drone had more than enough lift, the overall reliability of the platform never became good enough to perform full flight experiments.

After discussing the above issues, I was very kindly offered a prototype of the Crazyflie Brushless to see if it would help with my experiments. And it did! The Crazyflie Brushless has more lift than the regular platform and could maintain a stable attitude and height while carrying the camera assembly, all with a reasonable flight time. Software-wise it works pretty much the same as the regular Crazyflie, so it was a pleasure to work with. This drone became the one we used for our final experiments, and was even featured on the cover of the Science Robotics issue.

With the hardware finished, the next step was to implement the software. Unlike the ARDrone 2, which had a full Linux system with reasonable memory and computing power, the Crazyflie only has an STM32 microcontroller that is also tasked with flying the drone (plus an nRF SoC, but that is out of scope here). The camera board developed for this drone features an additional STM32. This microcontroller performed most of the image processing and visual homing tasks at a framerate of a few Hertz. However, the resulting guidance also has to be followed, and this part is more relevant for other Crazyflie users.

To provide custom behavior on the Crazyflie, I used the app layer of the autopilot. The app layer allows users to create custom code for the autopilot, while keeping it mostly decoupled from the underlying firmware. The out-of-tree setup makes it easier to use a version control system for only the custom code, and also means that it is not as tied to a specific firmware version as an in-tree process.

The custom app performs a small number of crucial tasks. Firstly, it is responsible for communication with the camera. This was performed over UART, as this was already implemented in the camera software and the bus was not used for other purposes on the Crazyflie. Over this bus, the autopilot could receive visual guidance from the camera and send basic commands, such as starting and stopping image captures. Pprzlink was used as the UART protocol, a leftover from the earlier ARDrone 2 and Trashcan prototypes.

The second major task of the app is to make the drone follow the visual guidance. This consisted of two parts. Firstly, the drone should be able to follow visual homing vectors. This was achieved using the Commander Framework, part of the Stabilizer Module. Once the custom app was started, it would enter an infinite loop which ran at a rate of 10 Hertz. After takeoff, the app repeatedly calls commanderSetSetpoint to set absolute position targets, which are found by adding the latest homing vector to the current position estimate. The regular autopilot then takes care of the low-level control that steers the drone to these coordinates.
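The setpoint computation itself is simple enough to sketch; the function and variable names below are illustrative (the firmware app does this in C via commanderSetSetpoint), but the arithmetic is the same: the absolute target is the current estimate plus the latest homing vector.

```python
def homing_setpoint(position_estimate, homing_vector, height):
    """Absolute position target for the autopilot: the current
    position estimate plus the latest visual homing vector.
    In the firmware app this is sent with commanderSetSetpoint
    at 10 Hz; names here are illustrative."""
    x, y = position_estimate
    dx, dy = homing_vector
    return (x + dx, y + dy, height)

# drifting estimate (1.0, 0.5), homing vector (-0.25, 0.25), height 0.5 m
setpoint = homing_setpoint((1.0, 0.5), (-0.25, 0.25), 0.5)
print(setpoint)  # (0.75, 0.75, 0.5)
```

The regular position controller then does the actual steering, so the app never has to touch low-level control.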

The core idea of our navigation strategy is that the drone can correct its position estimate after arriving at a snapshot. So secondly, the drone should be able to overwrite its position estimate with the one provided by the route-following algorithm. To simplify the integration with the existing state estimator, this update was implemented as an additional position sensor – similar to an external positioning system. Once the drone had converged to a snapshot, it would enqueue the snapshot’s remembered coordinates as a position measurement with a very small standard deviation, thereby essentially overwriting the position estimate but without needing to modify the estimator. The same trick was also used to correct heading drift.
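The reason a measurement with a very small standard deviation effectively overwrites the estimate can be seen from a scalar Kalman measurement update. This is a simplified one-dimensional model of what the estimator does internally, not the actual firmware code.

```python
def kalman_update(estimate, est_var, measurement, meas_std):
    """Scalar Kalman measurement update: a measurement with a tiny
    standard deviation gets a gain close to 1, pulling the estimate
    almost entirely onto the measurement."""
    meas_var = meas_std ** 2
    gain = est_var / (est_var + meas_var)
    new_estimate = estimate + gain * (measurement - estimate)
    new_var = (1.0 - gain) * est_var
    return new_estimate, new_var

# drifted estimate (2.3) vs. the snapshot's remembered coordinate (1.0)
est, var = kalman_update(2.3, 0.5, 1.0, meas_std=1e-4)
# the same measurement with a large std dev only nudges the estimate
soft, _ = kalman_update(2.3, 0.5, 1.0, meas_std=1.0)
print(est, soft)  # essentially 1.0 vs. somewhere in between
```

Enqueuing the snapshot coordinates with a near-zero standard deviation thus replaces the position estimate without any modification to the estimator itself.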

The final task of the app was to make the drone controllable from a ground station. After some initial experiments, it was determined that fully autonomous flight during the experiments would be the easiest to implement and use. To this end, the drone needed to be able to follow more complex procedures and to communicate with a ground station.

Because the cfclient provides most of the necessary functions, it was used as the basis for the ground station. However, the experiments required extra controls that were of course not part of a generic client. While it was possible to modify the cfclient, an easier solution was offered by the integrated ZMQ server. This server allows external programs to communicate with the stock cfclient over a TCP connection, and among other things lets them send control values and parameters to the drone. Since the drone would be flying autonomously, low-frequency updates would suffice, so the choice was made to let the ground station set parameters provided by the custom app. To simplify usability, a simple GUI was made in Python using the CFZmq library and Tkinter. The GUI requests foreground priority so that it is shown on top of the regular client, making it easy to use both at the same time.
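As a minimal sketch of what such a parameter-set request could look like: the JSON message below is an assumption about the cfclient ZMQ server's schema, so check the cfclient ZMQ documentation for the current format and port before relying on it.

```python
import json

def build_param_msg(name, value):
    """Build a parameter-set request for the cfclient ZMQ server.
    The schema ("version"/"cmd"/"name"/"value") is an assumption;
    verify it against the cfclient ZMQ documentation."""
    return {"version": 1, "cmd": "set", "name": name, "value": value}

# "app.experimentState" is a hypothetical parameter exposed by the app
msg = build_param_msg("app.experimentState", 1)
payload = json.dumps(msg)  # sent over a ZMQ REQ socket in practice
print(payload)
```

On the firmware side, the custom app only has to declare the parameter; the existing parameter subsystem handles the transport.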

Cfclient with experimental overlay (bottom right).

To perform more complex experiments, each experiment was implemented as a state machine in the custom app. Using the (High-level) Commander Framework and the navigation routines described above, the drone was able to perform entire experiments from take-off to landing.
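An experiment state machine of this kind can be sketched as a simple transition table. The states and events below are invented for illustration; the real app's states differ, but the structure is the same.

```python
from enum import Enum, auto

class ExpState(Enum):
    IDLE = auto()
    TAKEOFF = auto()
    RECORD_ROUTE = auto()
    RETRACE = auto()
    LAND = auto()
    DONE = auto()

def step(state, event):
    """Hypothetical transition table for one experiment run."""
    table = {
        (ExpState.IDLE, "start"): ExpState.TAKEOFF,
        (ExpState.TAKEOFF, "at_height"): ExpState.RECORD_ROUTE,
        (ExpState.RECORD_ROUTE, "route_done"): ExpState.RETRACE,
        (ExpState.RETRACE, "home_reached"): ExpState.LAND,
        (ExpState.LAND, "on_ground"): ExpState.DONE,
    }
    return table.get((state, event), state)  # ignore unexpected events

s = ExpState.IDLE
for ev in ["start", "at_height", "route_done", "home_reached", "on_ground"]:
    s = step(s, ev)
print(s.name)  # DONE
```

Driving the whole run from a table like this keeps the flight logic easy to inspect and makes it trivial to trigger state changes from a ground-station parameter.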

While the code is very far from production quality, it is open source and can be viewed here to see how everything was implemented: https://github.com/tomvand/2020-visualhoming-crazyflie . The PCB used to fit Crazyflie decks to the Eachine Trashcan can be found here: https://github.com/tomvand/cf-deck-carrier .

Outcome

Using the hardware and software described above, we were able to perform the route-following experiments. The drone was commanded to fly a preprogrammed trajectory using the Flow deck, while recording odometry and snapshot images. Then, the drone was commanded to follow the same route in reverse, by traveling short sections using dead reckoning and then using visual homing to correct the incurred drift.

As shown in the article, the error with respect to the recorded route remained bounded. Therefore, we can now travel long routes without having to worry about drift, even under strict hardware limitations. This is a great improvement to the autonomy of tiny robots.

I hope that this post has given a bit more insight into the implementation behind this study, a part that is not often highlighted but very interesting for everyone working in this field.

2024 is almost over and 2025 is approaching fast, so it is a good time for us to think about what is coming for Bitcraze in 2025.

Projects

2025 will be the year of the Crazyflie 2.1 Brushless! We are finally releasing it early January. The end of the development phase took longer than expected (who would have thought manufacturing would be hard … :-), but it is now ready and we are very excited to see what awesome projects the community will come up with using the Crazyflie Brushless.

With the Crazyflie 2.1 Brushless released, we will have more time to dedicate to other projects. Below is a list of fields we think we might look at. No strong promises here, though, and if you think we should focus on anything specific, do not hesitate to drop a comment under this post, a post on GitHub Discussions, or just send us an email. We have way too many ideas, but we are always open to more :-).

We still have the Lighthouse deck V2 on the back burner; this is likely something we will look more at soon. As a reminder, the new deck is going to support Lighthouse systems with 16 base stations, which will increase the space that can be covered by a Lighthouse system.

We also have a Wi-Fi camera deck in early prototyping, which we have been showing at various conferences over the last few years. We are still working on finding the perfect Wi-Fi chip/processor/camera combo to make it a reality.

We (i.e. I) also have a lot of ideas on how to improve communication with the Crazyflie. Crazyradio 2.0 has a lot of untapped capabilities that we can use to implement better, easier-to-use and more reliable communication protocols, so this is definitely something we want to have a look at.

Finally, I apparently managed to sell the Rust programming language a bit too well to the team. Some of us are even more enthusiastic than I am about it! This, together with increased frustration with distributing Python and PyQt6 to various platforms, will likely make us experiment even more with Rust in the future. The first target in sight is to write a Rust library for the Crazyflie, together with bindings for various languages including Python and C/C++. This means we would be able to have only one library for most use cases of the Crazyflie, both in Python and in ROS.

Conferences

Arnaud is part of the organizing committee of the Robotics and Simulation devroom at FOSDEM, one of the biggest open-source conferences in Europe. It is an awesome community-driven conference, so if you are not too far away and are interested in open source, please join us in Brussels the 1st and 2nd of February 2025!

As usual, we will likely participate in a couple of robotics conferences during the year. We are not yet sure which ones though, so stay tuned for more information on this blog.

Team

The team has been evolving in 2024: we are now 6 in the Malmö office, Mandy is working from Shenzhen in China handling production, and Joe is doing a post-doc in Stockholm in close collaboration with us.

We are actively looking to hire 2 more team members, one in technical support and one in sales. Our goal is to build a team where all of us can focus on our strengths to develop the Crazyflie ecosystem even further and faster.

We wish you a great new year filled with hacking and exciting new discoveries!

As 2024 comes to an end, it’s the perfect time to reflect on what we’ve accomplished over the past year. A major highlight has been our work on the Crazyflie 2.1 Brushless. We’re thrilled that it will be available early in the new year! While much of our efforts focused on refining and preparing the platform as a whole, we also introduced some standout features like support for contact charging on a charging pad, perfecting the specially optimized motors, and propeller guards to enhance safety for both users and the drone.

Finalizing the integration of the Crazyflie 2.1 Brushless into our software ecosystem and expanding its documentation were key steps in preparing for its launch. These efforts ensure compatibility, improve the user experience, and make the platform more accessible to the community. We’re looking forward to a smooth launch and to seeing how the community will utilize the new platform!

This year, we introduced updates to the Crazyflie 2.1 kit, making the 47-17 propellers the new default and including an improved battery. These upgrades enhance flight performance and endurance, culminating in the release of the Crazyflie 2.1+—an optimized iteration of our established platform.

The Crazyflie 2.1 Brushless featured on the cover of Science Robotics vol. 9, no. 92

Community

In 2024, Bitcraze had an action-packed year, engaging with the robotics community through numerous conferences, workshops, and live events.

In May, we attended ICRA 2024 in Yokohama. We collected several research posters that now proudly feature at the office. Kimberly presented at the Robotics Developer Day, where she won Best Speaker Award for her impressive live hardware demos with ROS2. We co-organized the ‘Aerial Swarm Tools and Applications’ workshop at RSS 2024 in Delft. Arnaud and Kimberly shared insights on demo-driven development on an episode of OpenCV Live!. Additionally, we had a booth at ROSCon ’24 in Odense, connecting with the vibrant ROS community and showcasing our latest developments.

And don’t forget the developer meetings, where we shared some more behind the scenes information and collected invaluable feedback from the community.

We also released a new edition of our research compilation video, showcasing some of the coolest projects from 2023 and 2024 that highlight the versatility and impact of the Crazyflie platform in research.

Team

In the past year, Bitcraze saw significant changes within the team. In February, Rik rejoined the team. Tove started at Bitcraze in April. Mandy, with whom we’ve already worked extensively over the years, joined as our production representative in Shenzhen. At the end of the year, we said goodbye to Kimberly, whose contributions will be deeply missed. Additionally, we had Björn with us for a few months, working on his master’s thesis on fault detection, and Joe continued his industrial postdoc at Bitcraze that began in December 2023. Looking ahead, Bitcraze is hiring for two new roles: a Technical Sales Lead and a Technical Success Engineer, to support our ongoing projects and customer collaborations.


As we close the chapter on 2024, we’re proud of the progress we’ve made, the connections we’ve strengthened, and the milestones we’ve reached. With exciting launches, new faces on the team, and continued collaboration with our community, we’re ready to soar to even greater heights in 2025. Thank you for being part of our journey!

This Christmas, Bitcraze is sending out a callout to Santa. As it turns out, one of our Brushless prototypes has a lifelong dream of becoming one of Santa’s reindeer. In a hopeful attempt to fulfill its wishes, we shot a video to prove that it’s ready for Santa’s most elite aerial team!

Imagine a tiny, determined drone with big dreams, practicing its sleigh route moves with the intensity of an Olympic athlete. Our little Brushless is proving it has what it takes to join the North Pole’s premier delivery squad.

Going through small openings, avoiding obstacles, and flying with perfect precision are skills that any good reindeer should have. Here, the Brushless accomplishes all this in autonomous flight, and in a much smaller and more practical package than Rudolph and his fellow reindeer.

Of course, there’s some technical magic behind this Christmas miracle. For this project, we relied on the stock firmware and Python library, taking advantage of the new spiral and constant-velocity features (check out the GitHub PR here). These features added variety and fluidity to the maneuvers, moving beyond straight lines and making the flight more interesting. By using the high-level commander, we took a simpler approach compared to trajectory optimization, which we’ve used in past Christmas videos. Trajectory optimization would have been far more difficult for this project due to the unique challenges of the flight path, namely its length and the need for pinpoint accuracy near obstacles and through gates.
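For a feel of the geometry involved, here is a small sketch that samples points along a spiral segment by linearly interpolating radius and height over the turns. The parameters are made up, and the video itself used the firmware's new spiral primitive through the high-level commander rather than discrete waypoints.

```python
import math

def spiral_points(r0, r1, height0, height1, turns, n):
    """Sample n+1 points along a spiral: the radius goes from r0 to
    r1 and the height from height0 to height1 over the given number
    of turns."""
    pts = []
    for i in range(n + 1):
        t = i / n
        angle = 2.0 * math.pi * turns * t
        r = r0 + (r1 - r0) * t
        z = height0 + (height1 - height0) * t
        pts.append((r * math.cos(angle), r * math.sin(angle), z))
    return pts

pts = spiral_points(r0=0.5, r1=1.0, height0=0.4, height1=1.2,
                    turns=2, n=100)
print(pts[0], pts[-1])  # starts at radius 0.5, ends at radius 1.0
```

Having this as a single primitive in the firmware means the onboard planner produces the smooth curve itself, instead of the ground station streaming many small go-to commands.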

Positioning relied on four Lighthouse base stations, which we used to manually locate the Christmas wreaths by holding the drone within each one to log their exact coordinates. This project also gave us the opportunity to further integrate the Brushless into our firmware and Python libraries, setting the stage for a smoother launch in the new year. The Brushless impressed us yet again during this project. Even though we’ve tested it extensively in the past, seeing it navigate tight gates with such precision and handle the demanding flight path reinforced just how capable it is. Working with it in this setting has made us even more excited to release it soon and share its potential with everyone.

Santa, if you’re reading this, we think we’ve found your next top reindeer recruit. You can watch the full audition tape here or below:

And if you think what you just saw is a pretty straightforward and easy path, think again! This year’s blooper video highlights the resilience of the Crazyflie 2.1 Brushless and the fast, iterative workflow we used for this project. Since testing reliability and resilience was a key goal, we adopted a workflow that allowed for quick scripting, flying, and adjusting, often completing the cycle in just minutes. This approach made crashes more likely, especially during the spiral sections where the drone struggled to keep up and started cutting corners. While we resolved the issue by slowing those sections down, we suspect that more aggressive tuning of the Mellinger controller could have helped the drone maintain speed without cutting corners. The Brushless managed some impressive recoveries, but even minor collisions usually meant it couldn’t keep pace with the rest of the trajectory. After all the trial and error, we had a stable and reliable setup that not only performed well for the demo but also flew beautifully when we showed it to our families at the Christmas party.

Here is what our Brushless could endure during training:


Merry Christmas from all of us at Bitcraze – where even our prototypes have holiday dreams!

Hi everyone! I have a bit of news to share… I’ve decided to leave Bitcraze at the end of 2024. But not before sharing my latest Fun Friday project, which I’ve tried my best to finish up before leaving for my Christmas holiday in December.

Frankensteining the Pololu Robot with the Crazyflie Bolt

During the ROSCon talk about the Lighthouse system (see the recording here), I already showed a small example of how the Lighthouse system could be used on other robots as well. Here you see a Pololu RPI 2040 (the hyper edition, of course) with a slimmed-down Crazyflie Bolt and a Lighthouse deck. The UART2 port on the Bolt (the pinout is the same as the Crazyflie’s) is interfacing with the UART0 connection on the Pololu (pinout). The Pololu’s 3v3 is connected to the Bolt’s vUSB and GND to GND (obviously), so 4 wires in total. On paper, the 3v3 port does not supply quite enough power for the Crazyflie, but in practice it seemed to be enough; as long as the Crazyflie Bolt doesn’t have motors connected, it should be fine. If anyone would like to do a driving-flying hybrid with this combo, though, you might need to check the specifications a bit more closely. For now, just ignore the red low-battery LED on the Bolt, but if you see it restarting, then perhaps give the Pololu a fresh set of batteries.

Since the Pololu RPI 2040 doesn’t have any wireless communication, this is done through the Crazyflie Bolt and the Crazyradio. I’ve made an app-layer variant for the Bolt to forward state estimates and velocity commands; it did require an extra logging variable in the firmware itself, but this allows me to control the Pololu through the cfclient! Since it’s using velocity commands, the mobile app is out, though; if anyone is interested in getting this rolling, let me know. The screen shows the current X, Y, Z, and yaw estimates of the Bolt transferred to the Pololu, along with the commands that I’ve given it.
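As a sketch of the forwarding idea, here is a hypothetical fixed-size UART frame carrying the state estimate. The actual project uses its own message format (see the repositories below); this only shows the pack/unpack principle for four little-endian 32-bit floats.

```python
import struct

# Hypothetical wire format for forwarding the Bolt's state estimate
# to the Pololu over UART: four little-endian floats (x, y, z, yaw).
FMT = "<4f"

def pack_state(x, y, z, yaw):
    """Serialize one state estimate into a 16-byte frame."""
    return struct.pack(FMT, x, y, z, yaw)

def unpack_state(frame):
    """Recover (x, y, z, yaw) from a received frame."""
    return struct.unpack(FMT, frame)

frame = pack_state(1.0, -0.5, 0.25, 1.5)
print(len(frame), unpack_state(frame))
```

In a real link you would add a sync byte and a checksum so the receiver can resynchronize after a dropped byte.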

I would have liked to connect this to a differential-drive controller to make use of the position setpoints, but unfortunately the AA batteries ran out at the office and I was unable to complete this by my last day. It would have been great to use the Lighthouse positioning for this. Perhaps in the coming months I can continue with it and have my cats chase an autonomous robot around the house, who knows! If anyone is interested in playing around with this, these are the repositories/branches for both the Bolt and the Pololu:

What is next?

First of all, I’ll take a long holiday in the US, first visiting New York (first time) before I hop over to Tulsa and Santa Barbara to visit family. Early 2025 I’ll be taking a long break, or a mini sabbatical of sorts, where I plan to work on some personal projects but mostly have a breather. I haven’t had a break like this in over 15 years, and given a tough 2023, I can definitely say that I’ve deserved some time off. What will happen after, I will hopefully figure out then, but for sure I will be continuing to co-lead the Aerial Robotics Interest Group at ROS and helping out in support of the Crazyswarm2 project.

I’d like to thank my colleagues at Bitcraze for an amazing 5 years here in Malmö, Sweden, and everyone that I was able to meet through them. I’ve learned a lot in terms of joint software development, code maintenance, community interaction, and, most importantly, having fun during work. I also will never forget the support I received while I was going through cancer treatment, and for that I’m very grateful. I wish you all the best and I hope the Crazyflie continues to thrive, saving more PhD projects as it did mine. Thank you.

For the last couple of months we have been working really hard on finalizing the Crazyflie 2.1 Brushless, and we’re happy to say that we’re finally nearing the release date! The first units are scheduled to be finished during December, with shipping in January. Make sure to sign up for the Crazyflie 2.1 Brushless product notifications so you don’t miss out once it’s available!

We’re also working hard to offer everything that complements the Brushless. You’ll have access to spare parts and the option to choose your platform for some bundles. Hopefully, the charger will also be available soon, allowing you to charge the Brushless effortlessly!

November is always a tough month in Sweden, when the darkness deepens and the cold begins to bite. We had our first snow last week, a sudden reminder that winter has arrived. So instead of letting the gloom settle in, we decided to turn to what makes us feel good: pretty lights, pretty trees, and pepparkakor!

I realized that, although we talked about it last year, we never fully showed our big new flight arena once it was up and running. It made an appearance in our latest Christmas video, but was never fully revealed before. Capturing a 110-square-meter space in a single photo is no small feat, but here is my best shot:

It felt like the right time to make the office a little more wintery. It may be a little early, but we couldn’t resist the charm of festive decorations and a cozy atmosphere to brighten up the dark days, especially now that we’re more settled into our massive flight arena: a space this large calls for many more Christmas lights! Of course, there’s more to it than just creating a Netflix-Christmas-movie vibe. We’re also gearing up for two big events: our annual Christmas party, which will be hosted here, and the filming of our newest Christmas video!

Speaking of Christmas videos, that’s exactly what we’ll be diving into during our next dev meeting! We recently had a great time revisiting how we’ve used demos to guide development; if you missed that particular dev meeting, you can see it here. We thought it would be a great idea to dedicate our next session to exploring all the Christmas videos we’ve created over the years. It’s fascinating to see how our Christmas projects have evolved over time, and we hope you’ll join us to reminisce on Wednesday the 11th of December at 15.00 (CET). You’ll find all the info here.

It’s been a while since I last talked about hiring! We successfully onboarded our most recent recruit, and now it’s time to start planning for the future.

One of our challenges as a team is that we’re very heavy on engineers and developers. While that’s fantastic for building products, it means we lack expertise in other important areas. That’s why we’re now shifting our focus to bringing in talent to help fill those gaps. We’ve partnered with a recruitment agency once again to help us find the right people for the job.
We’re currently hiring for two distinct roles. Here’s what we’re looking for!

Technical sales lead

You will be responsible for developing and implementing sales strategies while exploring both new and existing markets. You’ll take the lead in driving sales and acquiring new customers, becoming the company’s go-to expert on marketing and sales tactics. Your day-to-day tasks will include supporting business development, optimizing sales processes, and proposing effective marketing strategies. This role is perfect for someone with a background in technical sales with a strong strategic mindset and a sense of responsibility.

You can read more about it here.

Technical success engineer

We’re looking for a Technical Success Engineer to provide our customers with technical guidance and product expertise. This role involves offering first-line support, creating documentation and tutorials, and assisting with tech-focused sales efforts. The goal is to ensure a smooth and seamless customer experience while building strong client relationships. It’s an ideal position for a “social developer”—someone with a solid technical background who also excels in communication and enjoys engaging with others.

You can read more about it here.

Both positions are full-time and based at our office in Malmö, Sweden. If you’re curious about why you should join our team, I’ve already shared some of the many reasons why I love being part of Bitcraze.

If you’re interested or have any questions, please send an email to fredric.vernqvist@techtalents.se or contact us at contact@bitcraze.se.