MAV Assisted Fabrication Strategy
for Large Scale Fiber-Composite Structures

Aim

 

Applying flying robots to the construction of wound fiber-composite structures at full architectural scale, moving out of lab-like environments and into real-world fabrication scenarios.

Every step of the creation process, from design idea to physical fabrication, is supported by a practical solution, implemented as tools, processes and workflows in the developed prototype of the construction system.

Flying Robots for Architectural Fabrication

Methods

Developing a prototype of a construction system involves implementing multiple tools, processes and workflows that work together seamlessly. The system includes solutions for every step of the build process, from a design idea to the actual physical realization of the structure.

 

The diagram below shows the aspects of the aerial fabrication system that were inspired by various state-of-the-art industrial and academic approaches and implemented within the scope of the construction system prototype.

 

A Total Station geodetic instrument was employed both for environmental sensing, via its 3D laser-scanning function, and for navigation of the vehicle, by means of kinematic prism-reflector tracking.

The Robot Operating System (ROS) is used for high-level programming of the aerial vehicle's functionality and allows seamless integration of the vehicle's autopilot into the design and control workflow.

Design of the produced structures and generation of their fabrication instructions are performed using a standard computational design workflow in Rhinoceros and Grasshopper.

Simulation of filament interaction makes it possible to preview the geometries achievable with a given type and placement of formwork; it is implemented with the Kangaroo physics engine in Grasshopper.

Similar to jogging an industrial robot, the aerial vehicle can be manually overridden by the user via a radio remote controller.

The proposed material system originates in a series of coreless fiber winding experiments by the ICD/ITKE institutes and focuses on such important aspects as filament contact area and fiber pressure in order to achieve doubly curved structural surfaces.

Industrial helicopter power-line stringing serves as a conceptual example of a filament-stringing process; it illustrates that, for the best control over the material, the filament must be kept under constant tension.

A custom-built unmanned aerial vehicle, a multirotor, is used as a programmable robotic manipulator, which can be both manually controlled by a user and autonomously driven by a computer program.

 

Common Industrial Practices

Industrial Surveying
Machine Guidance
ROS Robot Programming
Control Interface
Robot Jogging
BIM
Physical Form-Finding
Helicopter Stringing
Core-less Winding
Industrial Robot

Extracted Principles

Laser Scanning
UAV Guidance
ROS UAV Programming
Grasshopper Interface
Manual Control
Rhino Design Interface
Computer Simulation
UAV Stringing
Aerial Winding
Flying Robot
 

UAV Prototypes

 

The main part of the hardware development is the design of several aerial vehicle prototypes. The vehicles were built from scratch, using both off-the-shelf components and custom-made parts.

A “Pixhawk” autopilot platform {fig.01} was chosen as the core of the system, as it has a growing developer community, substantial online documentation, and is fully open source in both hardware and software. In general terms, Pixhawk is an embedded computer system with sensors and hardware interfaces for controlling an autonomous vehicle.

Electronics test rig

 

Pixhawk is used in conjunction with the onboard computer “Odroid XU4”, an ARM single-board computer with an 8-core Samsung Exynos 5422 processor, 2 GB of RAM and an eMMC storage card. It is roughly as fast as a modern high-end smartphone and has proven to work in similar setups.

Various external sensors were used in the early stages of the project for testing different navigation methods: a uEye 1221LE global-shutter camera for testing computer-vision navigation techniques; a PX4 Flow sensor, a position-estimation sensor based on optical flow and sonar; a TeraRanger One infrared rangefinder for precisely estimating the distance to the ground; and a uBlox M8T GNSS receiver with a Tallysman TW4721 antenna for testing Real-Time Kinematic satellite positioning. In the final setup, however, none of these sensors were used.

The propulsion system (motors, ESCs, battery) was purchased from “Hobbyking” and is standard for DIY quadrotors. The frame consists of carbon-fiber plates purchased online and aluminium arms from a local hardware store.

Other onboard peripherals are important as well. A WiFi adapter provides the connection to the ground computer, used for accessing the onboard computer's terminal, uploading code and flight programs, and observing the autopilot state. Two radio modules are installed: a FrSky receiver for manual remote control and a Laird RM024 for fast and robust data exchange. A USB-serial converter connects the radio module to the Odroid. Two power modules complete the setup: a 5V 5A supply for the Odroid, and a dedicated Pixhawk power module, which not only powers the autopilot but also reports the battery's voltage and current.

A larger quadrotor prototype was designed in Solidworks and custom-built mostly from carbon-fiber sheets and POM plates milled on a CNC. Other hardware included 3D-printed parts and aluminium spacers turned on a lathe. The new prototype integrated a number of hardware features intended to benefit the vehicle's performance in a fabrication process: angled arms for more clearance, to avoid entangling the fiber; a magnetic gripper and attachment bracket that prevents the mounted manipulator from decoupling under strong horizontal forces; a swappable battery module for fast battery replacement; charging pads for recharging the battery while landed; propeller ducts for safety and for keeping the filament away from the rotors; sensors relocated to the ends of the motor arms to clear space for a center-mounted effector; and a thrust-vectoring system.

[1] 15”CF propeller, [2] brushless motor, [3] propeller guard, [4] tilt mechanism, [5] charging pad, [6] tilting servo motor, [7] global shutter camera, [8] swappable battery module, [9] TeraRanger distance sensor, [10] electro-permanent magnet, [11] protective cover, [12] PX4 Flow sensor, [13] power module, [14] motor controller, [15] WiFi adapter, [16] WiFi antenna, [17] onboard computer, [18] electronics compartment, [19] remote control radio receiver, [20] Pixhawk autopilot, [21] data radio antenna

Thrust Vectoring

Although carbon-fiber filament is light (3 g/m for a 50k tow), the resin-wetted fiber suspended in the air has considerable weight. This creates a lateral force acting on the vehicle that grows with the size of the structure: the larger the constructed structure, the more lateral thrust must be provided. With conventional multirotors, the only way to create this thrust is to tilt the frame and thereby redirect the thrust vector, which for larger, less agile vehicles is problematic due to their slower dynamic response.
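As a rough static estimate, the weight of the suspended wet fiber can be related to the tilt angle a conventional multirotor would need in order to hold position. The numbers below (vehicle mass, wet-fiber weight factor) are illustrative assumptions, not measured project values:

```python
import math

def required_tilt_deg(vehicle_mass_kg, span_m, fiber_g_per_m=3.0, wet_factor=2.0):
    """Tilt angle a conventional multirotor needs in hover to counter a
    lateral pull equal to the weight of the suspended wet fiber.

    Crude static sketch: real filament tension also depends on sag and
    winding forces; wet_factor (resin weight) is an assumed value.
    """
    g = 9.81
    fiber_weight_n = span_m * (fiber_g_per_m / 1000.0) * wet_factor * g
    vehicle_weight_n = vehicle_mass_kg * g
    # In hover, vertical thrust equals weight, so tan(tilt) = lateral / vertical
    return math.degrees(math.atan2(fiber_weight_n, vehicle_weight_n))

# The required tilt grows with the span, i.e. with structure size:
for span in (5, 15, 30):
    print(span, "m ->", round(required_tilt_deg(3.0, span), 2), "deg")
```

Even under these mild assumptions the trend is clear: the required tilt grows with the span, which is what motivates decoupling translation from attitude via thrust vectoring.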

 

Another way to gain control over the translational behaviour of the vehicle is thrust vectoring, which allows the vehicle to move in space without changing its attitude. A thrust-vectoring system was implemented in the large quadrotor prototype. A series of tests showed that this approach provides more flight stability than the conventional setup, since only the translational behaviour needs to be controlled while the frame stays parallel to the ground.

In a fabrication setup, where the multirotor needs to manipulate the filament, thrust vectoring proves to be an important factor for larger vehicles, providing safer and more predictable behaviour.

Software Workflow

The developed workflow consists of a number of design, control and programming software environments. The system is divided into two platforms: the autonomous vehicle and the ground computer.

The ground computer runs two operating systems simultaneously: native Windows and Ubuntu in a virtual machine (VMware Workstation). Windows runs Rhino with Grasshopper and QGroundControl. Rhino and Grasshopper serve as the main fabrication visualization tool, used for simulating and parametrically creating designs and for generating fabrication instructions for the aerial vehicle. The ground-control-station software (QGroundControl) visualizes vehicle states and sensor data in real time and provides user-friendly control over the system parameters that influence many aspects of the vehicle's operation.

 

 

The autonomous vehicle has two main pieces of hardware: the Pixhawk autopilot and the onboard computer. The Odroid XU4 onboard computer runs Ubuntu with the Robot Operating System (ROS). ROS is “a flexible framework for writing robot software. It is a collection of tools, libraries, and conventions that aim to simplify the task of creating complex and robust robot behavior across a wide variety of robotic platforms” (http://www.ros.org/about-ros/). In this project, ROS is used as the high-level control mechanism for the vehicle: it runs navigation algorithms and control packages with advanced logic and external interfaces, and serves as a bridge to the autopilot.

The Pixhawk autopilot runs the NuttX real-time operating system with the PX4 firmware, which “is a collection of guidance, navigation and control algorithms for autonomous drones” (http://dev.px4.io/). It “consists of two main layers: The PX4 flight stack, an autopilot software solution and the PX4 middleware, a general robotics middleware which can support any type of autonomous robot” (http://dev.px4.io/). The Pixhawk directly controls the propulsion hardware of the vehicle.

 

For fabrication-process control, Rhino Grasshopper is used together with a custom ROS control package. A Grasshopper node consisting of an HTTP client (the Bengesht plugin by Behrooz Tahanzadeh) communicates with the ROS control package through a Rosbridge server node. GhPython is used to generate and parse the JSON messages exchanged with ROS. This allows seamless integration of ROS with the designer-friendly Grasshopper environment.
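The message exchange can be sketched in plain Python, since Rosbridge wraps ROS traffic in a small JSON envelope with an "op" field. The topic name and message fields below are illustrative stand-ins, not the project's actual message definitions:

```python
import json

def make_publish_msg(topic, msg_dict):
    """Wrap a message dict in the Rosbridge 'publish' envelope, as a
    GhPython script would before sending it to the bridge."""
    return json.dumps({"op": "publish", "topic": topic, "msg": msg_dict})

def parse_incoming(raw):
    """Extract topic and message payload from a Rosbridge JSON string
    received from the bridge."""
    data = json.loads(raw)
    return data.get("topic"), data.get("msg")

# A flight-program setpoint as it might be sent from Grasshopper
# (topic name and fields are hypothetical):
raw = make_publish_msg("/flight_program", {"x": 1.2, "y": 0.4, "z": 2.0, "yaw": 90.0})
topic, msg = parse_incoming(raw)
```

Because the payload is plain JSON, the same round-trip works unchanged in GhPython on the Windows side and in the ROS control package on the Ubuntu side.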

 

Developing UAV Guidance System 

Machine guidance and control is one of the most important aspects of implementing an automated construction process, which first of all requires the ability to position machines on the construction site robustly and precisely. In most cases, a Differential Global Navigation Satellite System (DGNSS) is used for positioning and guiding the vehicles, achieving accuracies on the order of a few centimeters. Where the highest accuracy is required, a tracking Total Station instrument is used.

This project takes advantage of the precision and robustness of the Leica Multi Station MS60 instrument. It operates in a wide range of atmospheric conditions, tolerates dust and rain, and is built for the harsh conditions of a construction site. It can automatically lock onto a target and keep tracking it in motion at distances of up to 500 m, measuring the target position with millimeter precision. If the target is occluded, it automatically performs a search and re-locks onto it. Moreover, it is capable of 3D-scanning the environment, providing a precise pointcloud. The instrument is also programmable, via the Leica GeoCOM interface, which allows its functionality to be extended with custom external applications. Communication with the instrument takes place over a hardware serial connection using an ASCII-based protocol, which makes it compatible with virtually any platform, be it a microcontroller, a smartphone, or a Windows or Linux PC.

A dedicated ROS node, running on the ground computer's Ubuntu system, is used for communicating with and controlling the Multi Station. The tracking node is based on an existing Python implementation of the GeoCOM serial communication protocol (by Marcel Schoch), slightly modified to work with the MS60 for receiving the coordinates of the tracked Leica Mini 360 reflector.
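GeoCOM's ASCII replies follow a "%R1P,...:returncode,parameters" pattern, with the header separated from the payload by a colon. A minimal parser sketch, assuming the payload carries the reflector's coordinates (the sample values are hypothetical):

```python
def parse_geocom_reply(line):
    """Parse an ASCII GeoCOM reply of the general form
    '%R1P,0,0:RC,p1,p2,...'. The first payload field is the return
    code; here the following three fields are assumed to be the
    measured prism coordinates."""
    header, _, payload = line.strip().partition(":")
    if not header.startswith("%R1P"):
        raise ValueError("not a GeoCOM reply: " + line)
    fields = payload.split(",")
    rc = int(fields[0])                       # 0 means success
    coords = tuple(float(v) for v in fields[1:4])
    return rc, coords

# Hypothetical reply carrying the tracked reflector position:
rc, (e, n, u) = parse_geocom_reply("%R1P,0,0:0,12.3456,7.8901,1.2345")
```

In the actual node this parsing would sit behind a serial read loop; the command codes and parameter order must come from the GeoCOM reference manual.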

 

This data is then fed to another custom ROS “sender” node, which sends the coordinates to the vehicle over a second serial connection and the dedicated radio module (Laird RM024). On the vehicle side, a “receiver” ROS node receives the coordinates from the same type of radio module connected over USB and publishes the data for the onboard control node to use.

At the same time, Grasshopper also receives the measured point, via a Rosbridge websocket server node and a websocket client node in Grasshopper, allowing flight trajectories to be traced, recorded and evaluated in real time.

Another important point is achieving the correct reference-frame orientation of the Total Station. The autopilot uses an internal compass sensor to determine its orientation in space, its heading. The autopilot navigation frame is North-East-Down (NED), meaning its X and Y axes point along the magnetic North and East directions.

 

However, ROS internally operates with a different convention, ENU (East-North-Up), so in order to send measured coordinates to the autopilot in the correct format, the instrument's orientation frame has to be converted to the ROS ENU convention.
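For a position vector, the ENU-NED conversion is only an axis swap and a sign flip, sketched below following the standard ROS/PX4 frame definitions:

```python
def enu_to_ned(e, n, u):
    """Convert a point from ENU (East-North-Up), used by ROS and the
    Total Station setup here, to NED (North-East-Down), used by the
    autopilot: swap the horizontal axes, negate the vertical one."""
    return (n, e, -u)

def ned_to_enu(n, e, d):
    """Inverse conversion; applying both in sequence is the identity."""
    return (e, n, -d)
```

Orientation (attitude) conversions additionally involve a rotation of the yaw reference, which Mavros handles internally; only the position case is shown here.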

 

A slight misalignment of the frames (a few degrees) is not critical, though; it only causes small deviations in the trajectory of the vehicle. It is therefore possible to use the system without reference points in the surrounding environment to orient the instrument: the orientation can simply be approximated.

Problems might arise with this approach when the Earth's magnetic field is distorted by nearby large metallic structures, such as buildings or towers. In that case, the onboard compass heading could be distorted enough for the operation to become unreliable or imprecise. The solution for such a scenario would be an additional orientation-estimation source, such as a computer-vision system.

Slight frame misalignment causing deviations in vehicle trajectory

A “Position Controller” module of the autopilot, based on the Proportional-Integral-Derivative (PID) control method, is responsible for calculating the magnitude and direction of the force the vehicle needs to produce in order to reach the target setpoint. These controllers need to be tuned properly to achieve acceptable motion accuracy. If the controllers are untuned, the vehicle tends to oscillate around a target setpoint without ever reaching it. In the worst case, the amplitude of the oscillations grows and the vehicle becomes completely unstable. In this project, the PID controllers were tuned heuristically, which proved sufficient for eliminating oscillations.

Plot of vehicle trajectories projected onto the X and Y axes (red and green curves). Black lines: target setpoints. Untuned position controller on the left, tuned on the right. Vertical scale in meters.
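The qualitative difference between an untuned and a tuned controller can be mimicked with a crude one-dimensional stand-in: a unit point mass driven toward a setpoint by a proportional force, with the derivative term acting as damping. The gains below are illustrative, not the actual PX4 parameters:

```python
def simulate_axis(kp, kd, setpoint=1.0, steps=4000, dt=0.01):
    """Simulate one translational axis as a PD-controlled unit point
    mass (semi-implicit Euler integration). Returns the absolute
    position error at every step."""
    x, v, errors = 0.0, 0.0, []
    for _ in range(steps):
        force = kp * (setpoint - x) - kd * v  # P on error, D damps velocity
        v += force * dt                       # unit mass: acceleration == force
        x += v * dt
        errors.append(abs(setpoint - x))
    return errors

# A well-damped run settles on the setpoint; with no damping the mass
# keeps oscillating around it, as in the left-hand plot:
tuned = simulate_axis(kp=4.0, kd=3.0)
untuned = simulate_axis(kp=4.0, kd=0.0)
```

The real controller is cascaded (position, velocity, attitude loops) and includes an integral term, but the oscillation-versus-settling behaviour shown here is the same effect the tuning process eliminated.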

 

Automated Control

The onboard vehicle-control package is built as a separate Python ROS node. It implements various navigation, control and safety functions and serves as a hub between the Grasshopper user-control interface and the autopilot.

The continuous inputs to this node are the stream of vehicle coordinates from the Multi Station and multiple streams from the autopilot, such as states, sensor data and the current mode. At startup, the node also waits for the flight program to be received from Grasshopper. This program has a custom format and consists of XYZ position and orientation target setpoints in order of execution.
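The project's program format is custom and not specified in detail here, so the line layout below is a hypothetical stand-in; it only illustrates the idea of an ordered list of position-plus-orientation setpoints:

```python
def parse_flight_program(text):
    """Parse a flight program into an ordered list of setpoints.

    Hypothetical format: one 'x y z yaw_deg' line per target, with
    '#' comments and blank lines ignored. The actual on-wire format
    used in the project may differ."""
    program = []
    for line in text.strip().splitlines():
        if not line.strip() or line.lstrip().startswith("#"):
            continue  # skip blanks and comments
        x, y, z, yaw = (float(v) for v in line.split())
        program.append({"pos": (x, y, z), "yaw": yaw})
    return program

program = parse_flight_program("""
# x y z yaw_deg
0.0 0.0 2.0  90
1.5 0.0 2.0 180
""")
```

Keeping the setpoints in execution order lets the control node step through them one at a time, advancing only once the current target is reached.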

 

The control node receives the vehicle's position from the Multi Station and sends it to the autopilot over a serial connection. The data is not passed on directly, however, but only after being processed by several functions that perform safety checks for data consistency, engaging a failsafe manual-control mode if needed.

The node is responsible for aligning and offsetting the coordinate frames of the Multi Station so that the measured position is interpreted correctly by the autopilot. This makes it possible to disregard the autopilot's starting position (which resets to zero at system startup) and to always work in the same world reference frame. It is done by offsetting both the Multi Station position measurements and the command target setpoints by the initial position of the autopilot's position estimate. The target setpoints, the measured prism position and the fabrication model inside Rhino are all drawn in the ENU world frame, which works for both Rhino and the Multi Station, avoiding unnecessary frame rotations.
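The offsetting logic amounts to subtracting the autopilot's boot-time world position from every measurement and every setpoint, so both stay consistent in the autopilot's local frame. A minimal sketch, with illustrative coordinates:

```python
def world_to_local(world_pt, autopilot_origin_world):
    """Shift a world-frame (Multi Station / Rhino, ENU) point into the
    autopilot's local frame, which starts at zero on boot. Applying
    the same offset to measurements and setpoints keeps them
    consistent with each other."""
    return tuple(w - o for w, o in zip(world_pt, autopilot_origin_world))

# Hypothetical case: the autopilot booted while the prism was at
# (10.0, 5.0, 0.5) in the world frame.
origin = (10.0, 5.0, 0.5)
local_measurement = world_to_local((12.0, 5.0, 1.5), origin)
local_setpoint = world_to_local((12.0, 7.0, 2.5), origin)
```

Since only a translation is involved (the rotation having been handled by the ENU alignment of the instrument), the transformation is trivially invertible for visualizing autopilot data back in the world frame.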

The custom control node communicates with a Mavros node, which in turn sends data to the autopilot. Mavros is a ROS package that provides simple means of communicating with and controlling the vehicle. Besides one-time commands such as “land” or “change mode”, two continuous data streams are sent to the autopilot over Mavros: the external position estimate from the Multi Station and the target setpoint the vehicle needs to reach. Through Mavros, the control node can also obtain and use any data from the autopilot, including its current orientation, state, mode, battery charge level and more. This data can then be passed through the websocket bridge over WiFi to Grasshopper for visualization. In addition, Mavros provides a WiFi UDP link to QGroundControl.

The externally estimated position that the PX4 autopilot receives is processed by a module called the “Local Position Estimator” (running on the Pixhawk), which is responsible for fusing multiple position sources (if present) with an Extended Kalman Filter to obtain a more precise estimate. In the final setup, however, no other sensors were required, as the Multi Station provides measurements with millimeter accuracy.

Integrating into the Environment

 

In addition to the Multi Station “tracking” program, another ROS node was implemented that uses the scanning capability of the instrument to obtain a pointcloud of the surrounding environment. This makes it possible to integrate the fabricated structure into its surroundings and to set safe flight boundaries for the construction vehicle.

 

The node again uses GeoCOM to access the instrument's internal functions, in this case for controlling the motors that point the measuring laser in a given direction, reading the direction angles and triggering the coordinate-measurement function. The data is streamed in real time from the ROS node to Grasshopper over the local network, using the websocket approach described above.
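Each scan measurement consists of two direction angles and a slope distance, which are converted into a Cartesian point in the instrument frame. Angle conventions differ between instruments; the zenith-referenced vertical angle assumed below is an illustration, not necessarily the MS60's convention:

```python
import math

def polar_to_enu(hz_rad, v_rad, dist_m):
    """Convert one scan measurement (horizontal angle, vertical angle,
    slope distance) into an ENU point in the instrument frame.

    Assumes the vertical angle is measured from the zenith, so v = 90
    degrees corresponds to a horizontal sight line."""
    horiz = dist_m * math.sin(v_rad)   # projection onto the ground plane
    e = horiz * math.sin(hz_rad)
    n = horiz * math.cos(hz_rad)
    u = dist_m * math.cos(v_rad)
    return (e, n, u)
```

Streaming points through this conversion as the motors sweep the laser yields the pointcloud that Grasshopper accumulates and displays.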

Having a 3D model of the environment provides a degree of awareness that makes it possible to integrate the fabricated structures into real-world spaces. Additionally, knowing where the obstacles are helps avoid prolonged occlusion of the tracked target, by optimizing the flight paths. The current implementation uses this advantage to place the target setpoints, where the vehicle needs to stop, in locations where they are not occluded by the formwork.

Application of fiber-composites

 

Fiber-composite material systems have long been used in industry in a process called filament winding. This process usually involves winding filaments, continuously impregnated with resin, over a mold, followed by curing. It is mostly used to create engineering structures such as pipes, poles, storage tanks and pressure vessels.

 

Recently, filament winding has also been adapted for architectural fabrication in a process called “core-less filament winding”. Instead of using a mold, the fibers are wrapped between several reconfigurable frames, with the shape of the final piece emerging from the interaction of the intersecting fibers, the geometry of the frames, the winding locations and their order.

 

The work by the ICD and ITKE institutes, led by Professors Achim Menges and Jan Knippers, has demonstrated in a series of Research Pavilion projects that fiber winding can be successfully applied to creating large architectural objects.

 

Institute of Building Structures

and Structural Design

Institute for Computational Design

ICD/ITKE Research Pavilion 2013-14


 

Interacting with Filament Materials

 

Industrial examples of handling string-like materials with aerial vehicles exist, but they are few. In the power-line construction industry, a common method is “stringing”: pulling a conductor cable with a helicopter through a sequence of “catcher” mechanisms on transmission towers. The conductor spool is fixed to the ground, and the helicopter dispenses the wire by applying force while flying forward with a strongly tilted attitude.

This industrial practice shows that precise control over the position of the filament in space is possible when it is manipulated by a flying vehicle, provided the filament is kept under constant tension. This is beneficial in scenarios where a filament needs to be caught by a specific hooking node on the formwork of the fabricated structure.

Powerline helicopter stringing, Meridian Helicopter’s Aerial Stringing

The core-less filament winding process mentioned above showcased the importance of the “winding syntax” and the resulting “fiber-fiber interaction”. A number of factors influencing the shape and structural performance of the fabricated part stand out: the pressure that fibers apply to other fibers, forming doubly curved surfaces defined by tension forces; the differentiation of fiber orientation, which gives the structure anisotropic properties; and the orientation of the fibers relative to the stress directions.

 

In order to test the described filament-interaction methods as well as the developed MAV guidance system, an experiment with two poles carrying hooking nodes (screws) was set up. The quadrotor, with a thread spool mounted between the rear motors, was programmed to fly around the poles in a way that always kept the filament in tension, by constantly increasing the distance from the last attachment point. Since the spool has a certain unwinding friction, a sufficient pulling force was achieved by making the quadrotor face away from the attachment point.

This experiment showed that with the described techniques, predictable material behaviour can be achieved: thread-like filaments can be spanned between two objects in a straight line, without slack, with an accuracy of 5-10 cm. It also proved that the developed guidance system can operate fully autonomously for long periods of time, without user intervention.

Testing Large-Scale Trajectories

 

In order to test the guidance system at a larger scale, path-tracing experiments were set up. Color information for every point of the trajectory was integrated into the path program created in Grasshopper, which was then sent to and stored in the vehicle's onboard control package. While the flight program ran, the colors were passed as a separate data stream (a ROS topic) to an Arduino Micro ROS node connected over USB. The Arduino board then interpolated the received color messages and drove a bright RGB LED.
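The interpolation the Arduino node performs between consecutive color messages is, in essence, a per-channel linear blend. A sketch of the idea in Python (the actual node runs Arduino C++; the colors below are arbitrary examples):

```python
def lerp_color(c0, c1, t):
    """Linearly interpolate between two RGB colors (0-255 per channel)
    for a blend factor t in [0, 1], rounding to integer PWM values."""
    return tuple(round(a + (b - a) * t) for a, b in zip(c0, c1))

# Halfway between red and blue:
mid = lerp_color((255, 0, 0), (0, 0, 255), 0.5)
```

Interpolating between sparsely received messages lets the LED fade smoothly even when the color stream arrives at a much lower rate than the PWM update loop.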

 

Simulating Filament Structures

 

Simulating wound fiber-composites is challenging due to the highly differentiated material stiffness throughout the structure. In some cases, however, it is still useful to roughly approximate filament behaviour in a physics simulation, in order to know whether the key factor for structural integrity, the “fiber-fiber interaction”, can be achieved with a given formwork setup.

For this project, the Kangaroo 2 physics engine for Grasshopper (by Daniel Piker) was used to simulate filament structures. Unlike its first version, which required a full reset, Kangaroo 2 can add new objects to a running simulation. This makes it possible to visualize the emerging geometry by successively spanning new filaments between the nodes, either automatically, at a preset time interval following a predefined syntax, or manually, by letting the user draw the feeding path in the viewport.

The setup has proven able to simulate large structures, 10 m in diameter, with over 100 spanned filaments, each consisting of around 100 segments with a collision radius of 5 mm.
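As a much-simplified illustration of this kind of solver-driven form-finding (not the Kangaroo implementation, and without the fiber-fiber collisions that make the real simulation meaningful), a single filament can be modeled as a chain of point masses and springs relaxed under gravity between two anchors. All constants are arbitrary:

```python
def relax_filament(n_nodes=21, span=2.0, iters=2000, dt=0.02):
    """Relax a chain of unit point masses joined by springs between two
    fixed endpoints, under gravity, using damped semi-implicit Euler
    integration. Returns the node coordinates after relaxation."""
    seg = span / (n_nodes - 1)               # rest length of each segment
    xs = [i * seg for i in range(n_nodes)]
    ys = [0.0] * n_nodes
    vx = [0.0] * n_nodes
    vy = [0.0] * n_nodes
    k, g, damp = 500.0, 9.81, 0.9            # arbitrary illustrative constants
    for _ in range(iters):
        fx = [0.0] * n_nodes
        fy = [-g] * n_nodes                  # gravity on every node
        for i in range(n_nodes - 1):
            dx, dy = xs[i + 1] - xs[i], ys[i + 1] - ys[i]
            d = (dx * dx + dy * dy) ** 0.5
            f = k * (d - seg)                # spring force along the segment
            fx[i] += f * dx / d; fy[i] += f * dy / d
            fx[i + 1] -= f * dx / d; fy[i + 1] -= f * dy / d
        for i in range(1, n_nodes - 1):      # endpoints stay anchored
            vx[i] = (vx[i] + fx[i] * dt) * damp
            vy[i] = (vy[i] + fy[i] * dt) * damp
            xs[i] += vx[i] * dt; ys[i] += vy[i] * dt
    return xs, ys

xs, ys = relax_filament()
```

Running the relaxation makes the chain sag into a catenary-like curve between the fixed ends, which is the basic behaviour the formwork simulation builds on when many such filaments interact.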

 
 

Winding Experiments

 

Based on the described developments, a full-scale filament winding experiment was set up. It uses a freestanding configuration of four winding frames (nodes): one 4 m high in the center and three 1.5 m high, equally spaced around it to form a triangle with 7 m sides.

After arranging the frames as desired, the MS60 Total Station was used to scan the environment to determine both the boundaries of the fabrication space and the precise positions of the frames themselves. The pre-modeled frame geometries were then aligned with the scan data.

Next, a flight program following the desired syntax, including the target points on the path as well as orientations, could be created. In this experiment the paths were drawn manually in the Rhino viewport using auto-generated guides. The resulting flight path was then converted into custom-formatted instructions for the vehicle and uploaded to the onboard system.

Once the Multi Station guidance is set up and the vehicle is receiving its measured position, the user can activate the automatic flight mode, and the quadrotor starts executing the program.

Constructed flight paths

Desired vehicle orientations at targets of the constructed path

Resulting flight trajectories

Resulting vehicle orientations during flight

Final result of the experiment

In another winding experiment, two formwork frames were placed so as to produce a funnel-shaped wound structure. This setup aims to show the capability of the guided vehicle to maneuver freely around and through structures, providing a design flexibility that no other fabrication machine offers.


Outlook

 

To summarize, MAV-assisted fabrication of large-scale fiber-composite structures with unmanned flying vehicles has great potential, not only as a showcase of what flying robots can do in construction in general, but also for exploring new ways of applying composite materials in the architectural fabrication of the near future.

 