MicroMakers II is a program for Bulimba State School students to invent their own gadget, device or digital product to solve a problem or build a game. As part of the MicroMakers II program, students, parents and teachers from Bulimba State School visited the ARC Centre of Excellence for Robotic Vision (The Centre) at QUT, Gardens Point Campus. The Centre is the world's first research centre focused on creating robots that can see.
Sue Keay, Chief Operating Officer of The Centre, took us on a guided tour of the facility, and we were able to meet and interact with a number of the robots there.
We, the students, parents and educators, learned a lot on our visit to The Centre. It was surprising to see how much Australian roboticists have achieved in teaching robots to "see". We learned about machine learning: robots can learn to identify objects, e.g. crown-of-thorns starfish or ripe capsicums. The idea that robots can "learn", and that the learning progresses from identifying images of objects, to 3D prints of objects, to the actual objects themselves, was astounding. Workers of the future will either be developing these technologies or be replaced by them. But the future is now.
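The staged learning described above can be sketched in a few lines of code. The example below is not The Centre's actual software; it is a toy "nearest-centroid" classifier in Python, with made-up two-number features standing in for what a robot's camera would measure, showing how adding labelled examples in stages (images first, then more realistic examples) refines what the machine identifies.

```python
# Conceptual sketch only: a toy classifier that "learns" in stages,
# loosely mirroring how robots like COTSbot were trained.
from statistics import mean


def centroid(samples):
    """Average each feature across a list of feature tuples."""
    return tuple(mean(f) for f in zip(*samples))


class SimpleClassifier:
    def __init__(self):
        self.examples = {}  # label -> list of feature tuples

    def learn(self, label, samples):
        # Each new training stage simply adds more labelled examples.
        self.examples.setdefault(label, []).extend(samples)

    def identify(self, sample):
        # Choose the label whose centroid is closest to the sample.
        def dist(label):
            c = centroid(self.examples[label])
            return sum((a - b) ** 2 for a, b in zip(sample, c))
        return min(self.examples, key=dist)


clf = SimpleClassifier()
# Stage 1: learn from (invented) features taken from images.
clf.learn("starfish", [(0.9, 0.1), (0.8, 0.2)])
clf.learn("coral",    [(0.1, 0.9), (0.2, 0.8)])
# Stage 2: refine with features from more realistic examples.
clf.learn("starfish", [(0.85, 0.15)])

print(clf.identify((0.88, 0.12)))  # → starfish
```

A real system uses camera images and far more sophisticated models, but the principle is the same: the robot is never told the rules, it infers them from labelled examples, and each training stage adds harder, more realistic examples.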
Here are the robots we met at The Centre:
- A drone with heat sensors for surveying koala populations as part of environmental planning for road construction projects.
- "Casper", a Guiabot by Adept: a robot that reads road signs made for humans, e.g. speed signs, rather than relying on computer signals. This technology helps improve the artificial intelligence behind autonomous vehicles.
- COTSbot (COTS: crown-of-thorns starfish) by QUT. Agriculture and mining on the east coast of Australia have led to increased sediment in the Great Barrier Reef. This environmental change has encouraged the population of COTS, a native organism in the Reef, to flourish. A controlled population of COTS is part of the natural ecosystem; however, overpopulation is destructive to the Great Barrier Reef because COTS eat coral. Currently, the only means of controlling the COTS population is for human divers to inject the starfish with vinegar, which kills them. COTSbot was developed by QUT to navigate the Great Barrier Reef and to detect and inject COTS with vinegar. COTSbot used machine learning to learn to identify COTS: it started with images of COTS, progressed to 3D prints of COTS, and was then taken to the Great Barrier Reef to learn to identify real COTS. COTSbot can now identify COTS with 99% accuracy and also recognises that 3D prints are not real.
- Robotic boats that work together as a system called "Inference", developed by QUT. These boats stay in the ocean or other water bodies long-term, collecting water data.
- Harvey the harvester by QUT (based on a UR5 arm) is a capsicum-picking robot. 30% of crops in Australia are wasted due to a lack of workers at the exact time of year when crops need to be harvested. Harvey has been trained to identify and pick ripe capsicums. As with COTSbot, Harvey used machine learning to learn this: it started with images of capsicums, then plastic capsicums, and progressed to real capsicums. Harvey can now also differentiate between plastic and real capsicums. Harvey uses a suction cup to take hold of a ripe capsicum, and a cutter on the robotic arm then cuts the stem.
- AgBot by QUT is a 3 m wide weed-removing robot. It instantaneously determines what type of plant it sees, whether a weed or a cotton plant. If the robot identifies a weed, it determines which method of removal to use: digging it out or spraying it. The roboticists are also working on a third weed-removal option: microwaves.
- Baxter is a factory-worker robot with useful working arms. Baxter is known for being safe to work alongside humans, i.e. it does not require a cage to separate it from them. We humans use non-verbal (facial) cues to signal that we are about to do something, and Baxter does the same: prior to moving either arm, Baxter uses facial expressions on its tablet 'face' to cue that it is about to move. The arms are fitted with sensors so that when a moving arm meets resistance it automatically retracts, so as not to harm the obstruction. Baxter can be programmed with code, or you can manually guide Baxter through movements for it to memorise. A QUT team used Baxter to compete at the international Amazon Picking Challenge. Amazon fulfilment centres are almost fully automated: the shelves of goods are robotic and 'move' to the human packers according to customer orders. E.g. a customer orders a book, a cup and a toy; the shelves holding those items travel to the human packer, who grabs the items needed as each shelf presents itself. The role of the human packer is difficult to automate because items look different depending on the direction from which they are viewed, and human hands are very dexterous and challenging to imitate with robotics. The Amazon Picking Challenge is a competition that requires teams of roboticists to replace the human packer with a robot: the robot must pick up requested objects from Amazon shelves within a set time period, and points are lost if it drops an item. The QUT team used suction-cup 'hands' on Baxter. It was the team's first time competing at the Challenge, and they placed 6th out of 16 teams.
- A snake robot that helps orthopaedic surgeons perform knee surgery. The snake robot is a tiny camera at the end of a partially flexible cable whose movements can be controlled by the operator. Like a human surgeon performing a knee operation, the snake robot stops drilling when the camera's vision is obstructed.

The last robots we met were three Nao robots. Nao robots are renowned for being cute and friendly, and they are used for research and educational purposes. One Nao was programmed to walk when its hand is held, and the others were programmed to say funny comments and strike poses.
The crowd favourites among our students were Baxter, Harvey the capsicum-picking robot and the Nao robots. The students enjoyed interacting with the robots, seeing what current technology can do and learning that machines, like students, can also learn.
These robots have the potential to support and improve operations in various industries, e.g. the environment, agriculture, logistics and manufacturing. It will only be a matter of time before working prototypes like these become commercially available and commonplace in our home and work environments.