Demonstration

Current Progress

The idea of this demonstration is to condense the entire IoT scenario into a small area, exercise all of the functions implemented by the modules described above within that area, and feed the results back to the visualization platform in real time. The demonstration uses an automatic tracking robot that runs autonomously on a designed map, and it covers the following features:

  • Automatic Tracking: A pre-set track is drawn on the map; the robot follows it with its line-follower sensor and demonstrates the other functions along the way.

  • Environmental Monitoring: Different environmental conditions are set up along the map, including temperature, humidity, magnetic field strength and noise level, and their real-time changes are reflected on the visualization interface.

  • Positioning: Three BLE positioning nodes and UWB ranging nodes set up three regions around the map and determine which region the robot is currently in. The robot's real-time heading is also shown on the visualization platform as a compass (a heading-computation sketch follows this list).

  • Traffic Counter: A counting gate measures distance to detect each time the robot passes through and increments the count.

  • Fall Detection: The robot demonstrates the balance and fall alarms and eventually re-balances itself with the robotic arm. The whole process is visible in the cloud.
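As a small illustration of the compass display mentioned in the positioning item above, the sketch below shows one way the heading could be derived from a horizontal two-axis magnetometer reading. The readMagX()/readMagY() helpers, the update period and the pin-free structure are placeholders, not the actual SensorTile driver.

```cpp
// Hypothetical sketch: deriving the compass heading shown on the
// visualization platform from a two-axis magnetometer reading.
// readMagX()/readMagY() are placeholders for the real sensor driver.
#include <math.h>

const float kPi = 3.14159265f;

float readMagX() { return 12.0f; }  // placeholder magnetometer X (uT)
float readMagY() { return 30.0f; }  // placeholder magnetometer Y (uT)

// Convert the horizontal field components into a 0-360 degree heading.
float headingDegrees(float mx, float my) {
  float deg = atan2(my, mx) * 180.0f / kPi;
  if (deg < 0) deg += 360.0f;
  return deg;
}

void setup() {
  Serial.begin(115200);
}

void loop() {
  float heading = headingDegrees(readMagX(), readMagY());
  Serial.println(heading);   // forwarded to the cloud for the compass widget
  delay(500);
}
```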

Demo Video

This video shows all the features currently included in the project. The demo scheme and the props and scenes used are described in detail below.

DemoVideov2.1.mp4

Latest Progress - 04 Jul 2019

Optimized robot structure.

Demo Structure

Demo Site

The project demonstration site uses a map with a pre-set track, and the robot car travels along this track. Three BLE positioning nodes and UWB positioning nodes are placed around the map; when the robot approaches any one of them during the journey, that node determines the area in which the robot is located. The people-counting module is placed on the map, and the counter is incremented by one each time the robot passes the counting position. During the journey, the robot periodically uses its robotic arm to demonstrate the fallen state and the fall alarm function.

Tracking Robot

An automatic tracking robot is used in the project demo to demonstrate the functions of the current project. The car uses an Arduino Mega Pi as its controller, which connects to the tracking sensor and to the motors that drive the wheels and the arm. The controller and motors are battery powered.
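The actual firmware is not reproduced here; the following is a minimal sketch of what the line-following loop could look like, assuming two digital IR tracking sensors and PWM speed control of the left and right wheel motors. All pin numbers and speed values are illustrative.

```cpp
// Illustrative line-follower loop (not the actual MegaPi firmware).
// Assumes two digital IR sensors over the track and PWM speed pins for
// the left and right motors; all pin numbers and speeds are placeholders.
const int LEFT_SENSOR  = 2;   // HIGH when the sensor sees the black line
const int RIGHT_SENSOR = 3;
const int LEFT_MOTOR   = 5;   // PWM speed pins
const int RIGHT_MOTOR  = 6;

void setup() {
  pinMode(LEFT_SENSOR, INPUT);
  pinMode(RIGHT_SENSOR, INPUT);
  pinMode(LEFT_MOTOR, OUTPUT);
  pinMode(RIGHT_MOTOR, OUTPUT);
}

void loop() {
  bool leftOnLine  = digitalRead(LEFT_SENSOR)  == HIGH;
  bool rightOnLine = digitalRead(RIGHT_SENSOR) == HIGH;

  if (leftOnLine && rightOnLine) {        // centred on the track: go straight
    analogWrite(LEFT_MOTOR, 180);
    analogWrite(RIGHT_MOTOR, 180);
  } else if (leftOnLine) {                // drifting right: slow the left wheel
    analogWrite(LEFT_MOTOR, 90);
    analogWrite(RIGHT_MOTOR, 180);
  } else if (rightOnLine) {               // drifting left: slow the right wheel
    analogWrite(LEFT_MOTOR, 180);
    analogWrite(RIGHT_MOTOR, 90);
  } else {                                // line lost: stop until it is found
    analogWrite(LEFT_MOTOR, 0);
    analogWrite(RIGHT_MOTOR, 0);
  }
}
```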

The car is also equipped with the IoT node development board, the SensorTile board and the BLE positioning node, which serve as the environment-sensing module and the BLE positioning module. A power bank powers these development boards. When the robot is started, it follows the pre-set trajectory and periodically uses the robotic arm to demonstrate the fallen state.
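As a rough sketch of how the sensing boards could report the environment variables, the loop below samples placeholder temperature, humidity, magnetic-field and noise readings every two seconds and prints them as one CSV line for the IoT node to upload. The read helpers stand in for the real SensorTile drivers, and the period is an assumption.

```cpp
// Illustrative environment-sampling loop for the sensing node.
// The read*() helpers are placeholders for the real SensorTile drivers;
// readings are printed over serial for the IoT node to upload.
unsigned long lastSample = 0;
const unsigned long SAMPLE_PERIOD_MS = 2000;

float readTemperatureC() { return 24.5f; }  // placeholder
float readHumidityPct()  { return 40.0f; }  // placeholder
float readMagneticUT()   { return 35.0f; }  // placeholder
float readNoiseDb()      { return 52.0f; }  // placeholder

void setup() {
  Serial.begin(115200);
}

void loop() {
  unsigned long now = millis();
  if (now - lastSample >= SAMPLE_PERIOD_MS) {
    lastSample = now;
    // One CSV line per sample: temperature, humidity, magnetic field, noise.
    Serial.print(readTemperatureC()); Serial.print(',');
    Serial.print(readHumidityPct());  Serial.print(',');
    Serial.print(readMagneticUT());   Serial.print(',');
    Serial.println(readNoiseDb());
  }
}
```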

Visualization Interface

Data visualization delivers large amounts of data in a clearer, more understandable way than traditional charts and data dashboards. It helps identify and diagnose business problems and opportunities by using an interactive video wall. Data visualization has become an integral part of big data analytics.

The visualization platform registers the database used to store the data as a data source, extracts the data that needs to be displayed, and presents it in a suitable format. AliCloud DataV is used as the visual display platform.
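The actual table schema is not documented here; purely as an illustration, a record that the uplink task might write into the database for DataV to query could be assembled as follows, with hypothetical field names and values.

```cpp
// Hypothetical record format for the cloud database that DataV reads from.
// Field names and values are assumptions, not the project's actual schema.
#include <cstdio>

int main() {
  char record[128];
  float temperature = 24.5f, humidity = 40.0f;
  int region = 2, passCount = 7;

  // One JSON document per sample; DataV charts these fields once the
  // database holding them is registered as a data source.
  std::snprintf(record, sizeof(record),
                "{\"temperature\":%.1f,\"humidity\":%.1f,"
                "\"region\":%d,\"pass_count\":%d}",
                temperature, humidity, region, passCount);
  std::puts(record);
  return 0;
}
```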

Functions

Fall Detection

The robot periodically demonstrates the fall function. When the robot falls, the visualization interface immediately displays an abnormal-status alarm. The robot then stands up again using the robotic arm and the abnormal condition is cleared.
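The project's own fall-detection algorithm is not reproduced here; the sketch below shows one simple way the fallen state could be detected from accelerometer readings, raising the alarm when the estimated tilt exceeds a threshold and clearing it once the arm has righted the robot. The readAccel*() helpers and the 60-degree threshold are assumptions.

```cpp
// Illustrative fall-detection check: estimate tilt from accelerometer
// readings and raise/clear an alarm flag. readAccel*() are placeholders
// for the real IMU driver on the sensing board.
#include <math.h>

float readAccelX() { return 0.0f; }  // placeholder, in g
float readAccelY() { return 0.0f; }  // placeholder, in g
float readAccelZ() { return 1.0f; }  // placeholder, in g

const float FALL_TILT_DEG = 60.0f;   // assumed threshold for "fallen"
bool fallen = false;

void setup() {
  Serial.begin(115200);
}

void loop() {
  float ax = readAccelX(), ay = readAccelY(), az = readAccelZ();
  // Angle between the gravity vector and the robot's upright axis.
  float tilt = acos(az / sqrt(ax * ax + ay * ay + az * az)) * 180.0f / 3.14159265f;

  if (!fallen && tilt > FALL_TILT_DEG) {
    fallen = true;
    Serial.println("FALL");      // abnormal status pushed to the cloud
  } else if (fallen && tilt < FALL_TILT_DEG / 2) {
    fallen = false;              // robot has righted itself with the arm
    Serial.println("RECOVERED"); // clears the alarm on the dashboard
  }
  delay(200);
}
```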

Indoor Positioning

The robot passes through the three areas A1, A2 and A3 on the map, each of which is deployed with BLE and UWB positioning nodes. As the robot moves between areas, the visualization interface reflects its current position on the map through BLE positioning and UWB ranging, and shows its real-time heading through the compass module.
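As a simplified illustration of the region decision (the real system combines BLE positioning with UWB ranging), the helper below assigns the robot to the region whose node reports the strongest BLE signal; the dBm values in the example are made up.

```cpp
// Illustrative region decision from BLE signal strength: assign the robot
// to the region whose positioning node reports the strongest RSSI.
#include <cstdio>

// Returns 1, 2 or 3 for regions A1, A2 and A3.
int strongestRegion(int rssiA1, int rssiA2, int rssiA3) {
  int region = 1, best = rssiA1;
  if (rssiA2 > best) { region = 2; best = rssiA2; }
  if (rssiA3 > best) { region = 3; }
  return region;
}

int main() {
  // Made-up scan results in dBm: A1 is strongest, so region 1 is reported.
  std::printf("robot is in region A%d\n", strongestRegion(-52, -70, -81));
  return 0;
}
```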

People Traffic Counter

A gate is set up on the map with the people-counting module in place. The robot simulates the flow of people passing through the gate: each time it passes, the visualization interface updates the cumulative pass count and the trend over time.
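The counting logic could be as simple as edge detection on the measured gate distance, as in the sketch below; readDistanceCm() and the 30 cm trigger distance are placeholders for the actual ranging module and its calibration.

```cpp
// Illustrative gate counter: count a pass each time the measured distance
// drops below the gate threshold and then rises again (simple edge
// detection with hysteresis). readDistanceCm() is a placeholder.
float readDistanceCm() { return 120.0f; }  // placeholder ranging read

const float GATE_THRESHOLD_CM = 30.0f;     // assumed trigger distance
bool objectInGate = false;
unsigned long passCount = 0;

void setup() {
  Serial.begin(115200);
}

void loop() {
  float d = readDistanceCm();

  if (!objectInGate && d < GATE_THRESHOLD_CM) {
    objectInGate = true;               // robot has entered the gate
  } else if (objectInGate && d > GATE_THRESHOLD_CM + 10.0f) {
    objectInGate = false;              // robot has left: count one pass
    passCount++;
    Serial.println(passCount);         // pushed to the cloud for the trend chart
  }
  delay(50);
}
```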