
Gravity Gesture & Face Detection Sensor


The Gravity Offline Edge AI Gesture & Face Detection Sensor is a compact vision module that performs real-time gesture and face detection directly on-device. It integrates a small camera and an onboard AI processor capable of recognizing five predefined hand gestures and detecting up to 10 faces or upper-body targets at distances up to several meters. Because all inference runs locally, the sensor offers low latency and avoids any cloud or network dependency.

The sensor supports both I2C and UART interfaces and operates from a 3.3 V–5 V supply, making it compatible with platforms such as Arduino, ESP32, and Raspberry Pi. This makes the module useful for touchless control, presence detection, and other privacy-preserving edge-AI applications.

In the following you will learn how to connect the sensor module to an Arduino UNO or an ESP32, how to program it for gesture and face detection, and how to build a room occupancy counter with it.

Required Parts

For this tutorial you will need the Gravity Gesture & Face Detection Sensor from DFRobot. Furthermore, you will need a microcontroller. I am using an Arduino UNO and an ESP32-C3 SuperMini but other Arduino or ESP32 boards will work fine as well. The only requirement is support for an I2C (or UART) interface.

- Gravity Gesture & Face Detection Sensor
- Arduino UNO
- USB cable for Arduino UNO
- ESP32-C3 SuperMini
- OLED display
- USB-C cable
- Dupont wire set
- Breadboard

Makerguides is a participant in affiliate advertising programs designed to provide a means for sites to earn advertising fees by linking to Amazon, AliExpress, Elecrow, and other sites. As an Affiliate we may earn from qualifying purchases.

Hardware of the Gravity Gesture & Face Detection Sensor

The Gravity sensor integrates a compact camera and an embedded AI processor that handles all gesture recognition and human-presence detection locally. This means that no external computer or cloud connection is required: the sensor performs inference in real time, directly on the module itself. The picture below shows the front and back of the sensor module:

Front and Back of the Gravity Gesture & Face Detection Sensor

Gesture Recognition Capabilities

The module is capable of recognizing five predefined hand gestures within a detection range of about 0.5 to 3 meters. The supported gestures include a thumbs-up 👍, an “OK” sign 👌, an open-palm “stop” gesture ✋, a “victory” sign ✌️, and a “call-me” gesture 🤙.

Gestures (source)

Upon successful gesture recognition, the sensor provides both a digital output (via interrupt) and a visual indication: an RGB LED changes color depending on the gesture recognized. You can see the gestures and corresponding colors printed on the board:

Face and Upper-Body / Presence Detection

Beyond hand gestures, the sensor supports detection of human faces or upper-body presence (head-and-shoulders). It can detect and track up to 10 distinct faces or bodies simultaneously. For each detected person, the module can output positional coordinates (X/Y within the camera frame) and a confidence score, enabling more advanced applications such as people-counting, presence-based automation, or location-aware interactions.

Field of View, Optics and Detection Range

The onboard camera has a diagonal field of view (FOV) of approximately 85°, which gives a relatively wide scene coverage suitable for detecting people or gestures across a room rather than only in a narrow central spot. The specified effective detection range for both gestures and presence/face detection spans from roughly 0.5 m up to 3 m. The camera’s focal length is 1.56 mm, which is optimized for such short-to-medium range, wide-angle detection use.

Electrical and Communication Interfaces

The sensor is designed for broad compatibility with typical embedded platforms. It operates on a supply voltage between 3.3 V and 5 V, with a 3.3 V logic level. Its typical operating current is around 100 mA, making it suitable even for low-power or battery-based systems.

For data output and integration, the module supports two communication interfaces: I2C and UART. The user can select the interface via a mode switch on the board. The default I2C address is 0x72. If using UART, the default baud rate is 9600 bps using a Modbus-RTU protocol.

Note that there is a switch on the board that needs to be set to the chosen communication interface (I2C or UART):

I2C/UART interface and switch

Additionally, the module offers an interrupt (INT) output pin on a 2.54 mm header that goes low when a gesture is recognized. This is useful for triggering external logic or microcontroller actions without continuous polling.
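As a minimal sketch of this interrupt-driven approach (the wiring of INT to Arduino pin 2 is my own assumption, and the library calls follow the examples later in this tutorial), you could set a flag in an interrupt service routine and read the sensor only when a gesture has actually occurred:

```cpp
#include "DFRobot_GestureFaceDetection.h"

// Assumption: the sensor's INT pin is wired to Arduino pin 2 (interrupt-capable on the UNO).
const int INT_PIN = 2;
DFRobot_GestureFaceDetection_I2C gfd(0x72);

volatile bool gestureFlag = false;

void onGesture() {
  gestureFlag = true;  // keep the ISR short; read the sensor in loop()
}

void setup() {
  Serial.begin(115200);
  gfd.begin(&Wire);
  pinMode(INT_PIN, INPUT_PULLUP);
  // INT goes low on gesture recognition, so trigger on the falling edge
  attachInterrupt(digitalPinToInterrupt(INT_PIN), onGesture, FALLING);
}

void loop() {
  if (gestureFlag) {
    gestureFlag = false;
    Serial.print("Gesture: ");
    Serial.println(gfd.getGestureType());
  }
}
```

This way the microcontroller is free to do other work between gestures instead of polling the I2C bus continuously.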

Physical Dimensions and Form Factor

The sensor board itself measures 42 mm × 32 mm, making it compact enough for embedding into small projects or housings. Mounting holes are provided: the spacing for mounting is 25 mm × 35 mm, and the mounting holes have a diameter of 3.1 mm. The board uses a PH2.0-4P connector (or optionally standard 2.54 mm pin-header holes) for power and data lines.

Technical Specification

The following table summarizes the technical features of the Gravity Offline Edge AI Gesture & Face Detection Sensor:

| Parameter | Description |
|---|---|
| Model | Gravity Offline Edge AI Gesture & Face Detection Sensor (V1.0) |
| Processing method | On-device AI inference for gesture, face, and upper-body detection |
| Camera FOV | Approximately 85° diagonal |
| Focal length | 1.56 mm |
| Detection range | 0.5 m to 3 m for gestures and face/presence detection |
| Gesture recognition | Five predefined gestures (OK, thumbs-up, victory, stop, call-me) |
| Face / presence detection | Up to 10 faces or upper-body targets simultaneously with position and confidence output |
| Interfaces | I2C (default address 0x72) and UART (9600 bps, Modbus-RTU) |
| Interrupt output | Active-low interrupt pin triggered on gesture recognition |
| Operating voltage | 3.3 V to 5 V (3.3 V logic level) |
| Operating current | ~100 mA typical |
| Status indication | RGB LED with gesture-dependent color feedback |
| Connector | PH2.0-4P or 2.54 mm header pads |
| Board dimensions | 42 mm × 32 mm |
| Mounting | 3.1 mm holes with 25 mm × 35 mm spacing |
| Software support | Arduino / ESP32 library, MakeCode, Mind+ |

Connecting Gravity Gesture & Face Detection Sensor to Arduino

You can connect the Gravity Gesture & Face Detection Sensor via I2C or UART. We are going to use the I2C interface, since it is faster and allows us to connect multiple devices to the same bus. The following wiring diagram shows you how to connect the sensor to an Arduino UNO:

Connecting Gravity Gesture & Face Detection Sensor to Arduino UNO

Start by connecting GND of the Arduino to GND of the sensor. Next, connect the 5V output (or the 3.3V output) to the VCC pin of the sensor. Finally, connect SDA/A4 to D/T and SCL/A5 to C/R.

Make sure that the communication switch on the sensor board is set to I2C and not UART!

Install Libraries

Before you can use the Gravity Gesture & Face Detection Sensor you need to install the DFRobot_GestureFaceDetection library. Open the LIBRARY MANAGER, enter “DFRobot_GestureFaceDetection” in the search bar and press INSTALL to install the library as shown below:

You probably will see a dialog asking you to install library dependencies. Just press INSTALL ALL:

Now you are ready to try out some code examples.

Code Example: Gesture Recognition

This first code example shows you how the gesture recognition works:

#include "DFRobot_GestureFaceDetection.h"

DFRobot_GestureFaceDetection_I2C gfd(0x72);

void setup() {
  Serial.begin(115200);
  gfd.begin(&Wire);
  gfd.setGestureDetectThres(60);
  gfd.setDetectThres(100);
  Serial.println("running...");
}

void loop() {
  static char text[100];

  uint16_t gestureType = gfd.getGestureType();
  uint16_t gestureScore = gfd.getGestureScore();

  if (gestureType > 0) {
    sprintf(text, "Gesture: %d, score: %d\n", gestureType, gestureScore);
    Serial.print(text);
    delay(1000);
  }
  delay(100);
}

Includes

We begin by including the DFRobot_GestureFaceDetection library that provides the gesture and face-detection functionality.

Objects

Next we create an instance of the gesture and face detection class. This object will later be used to initialize communication, configure thresholds and retrieve detection results.

DFRobot_GestureFaceDetection_I2C gfd(0x72);

The constructor receives the I2C address of the DFRobot sensor. The default address is 0x72 but you can configure the I2C address in the range 0x01 – 0xF6 by calling the setDeviceAddr(addr) function.
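As a minimal sketch of an address change, based on the setDeviceAddr(addr) function mentioned above (the new address 0x55 is just an example): after the change takes effect, the object must be constructed with the new address on subsequent boots.

```cpp
#include "DFRobot_GestureFaceDetection.h"

DFRobot_GestureFaceDetection_I2C gfd(0x72);  // current (default) address

void setup() {
  gfd.begin(&Wire);
  gfd.setDeviceAddr(0x55);  // example: assign a new address in the range 0x01-0xF6
}
```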

Setup Function

The setup block initializes the serial port, starts communication with the gesture sensor and configures its detection threshold. Everything inside setup runs once when the board powers up or resets.

void setup() {
  Serial.begin(115200);
  gfd.begin(&Wire);
  gfd.setGestureDetectThres(60);
  Serial.println("running...");
}

The gesture detection threshold ranges from 0 to 100 and determines the sensitivity of gesture detection. A lower value increases sensitivity but can result in more false detections, while a higher value requires clearer gestures and reduces the number of false detections.

Loop Function

The loop function executes continuously, polling the sensor for new gesture information and printing results when something is detected.

The program requests the current gesture type from the sensor.

uint16_t gestureType = gfd.getGestureType();

The returned value identifies which gesture, if any, the sensor detected. A value of zero means no gesture was recognized. The following table shows the IDs for the different gesture types returned by the getGestureType() function and the corresponding color of the on-board LED:

| ID | Gesture | Icon | LED Color |
|---|---|---|---|
| 1 | Thumbs-up | 👍 | Blue |
| 2 | OK | 👌 | Green |
| 3 | Stop | 🤚 | Red |
| 4 | Victory | ✌️ | Yellow |
| 5 | Call-me | 🤙 | Purple |

For each gesture we then retrieve the gesture score by calling:

uint16_t gestureScore = gfd.getGestureScore();

This score indicates how confident the sensor is in the detected gesture. Higher scores represent more confident detection. The confidence score is in the range 0…100.

If a gesture was detected, the sketch formats a text message and prints it to the Serial Monitor.

if (gestureType > 0) {
  sprintf(text, "Gesture: %d, score: %d\n", gestureType, gestureScore);
  Serial.print(text);
  delay(1000);
}

The conditional statement verifies that gestureType is greater than zero. The sprintf function writes the gesture information into a character buffer, which is then printed. A one-second delay gives the user time to read the output and prevents excessive repeat messages.

Output on Serial Monitor

If you run the program and make gestures in front of the camera you should see detection results being printed to the Serial Monitor:

Gesture Detection Results on Serial Monitor

Code Example: Face Detection

The following code example demonstrates how to perform face detection:

#include "DFRobot_GestureFaceDetection.h"

DFRobot_GestureFaceDetection_I2C gfd(0x72);

void setup() {
  Serial.begin(115200);
  gfd.begin(&Wire);  
  gfd.setFaceDetectThres(60);
}

void loop() {
  static char text[100];
  
  if (gfd.getFaceNumber() > 0) {
    uint16_t faceScore = gfd.getFaceScore();
    uint16_t faceX = gfd.getFaceLocationX();
    uint16_t faceY = gfd.getFaceLocationY();

    sprintf(text, "Face:(x=%d, y=%d, score=%d)\n", faceX, faceY, faceScore);
    Serial.print(text);
    delay(1000);
  }
  delay(100);
}

Includes and Objects

As before we start by including the DFRobot_GestureFaceDetection library and creating the gesture & face detection object gfd:

#include "DFRobot_GestureFaceDetection.h"

DFRobot_GestureFaceDetection_I2C gfd(0x72);

Setup Function

The setup block initializes the serial port, starts communication with the sensor and configures the detection threshold for faces:

void setup() {
  Serial.begin(115200);
  gfd.begin(&Wire);  
  gfd.setFaceDetectThres(60);
}

The detection threshold ranges from 0 to 100; lower values make faces easier to detect, but at the price of more false detections.

Loop Function

The loop function runs continuously and checks for face detections. If the number of faces detected is greater than zero we retrieve the confidence score and the x,y location of the detected face:

  if (gfd.getFaceNumber() > 0) {
    uint16_t faceScore = gfd.getFaceScore();
    uint16_t faceX = gfd.getFaceLocationX();
    uint16_t faceY = gfd.getFaceLocationY();

Next we print this information to the Serial Monitor:

    sprintf(text, "Face:(x=%d, y=%d, score=%d)\n", faceX, faceY, faceScore);
    Serial.print(text);

If you point the camera of the sensor to your face you should see detection results printed out:

Face Detection Results on Serial Monitor

Code Example: Room Occupancy Counter

In this final example we will build a room occupancy sensor that counts the people in a room and displays the counter on an OLED. You could build this with an Arduino as well, but I am going to use an ESP32-C3 SuperMini for a change.

The following wiring diagram shows you how to connect the Face detection sensor and the OLED to the ESP32:

Connecting OLED and Sensor to ESP32-Mini

Both the face detection sensor and the OLED are connected to the I2C bus, which sits on pins 8 (SDA) and 9 (SCL) of the ESP32 SuperMini. Similarly, VCC and GND of the sensor and the OLED are connected to 3.3V and GND of the SuperMini. The picture below shows this wiring on a breadboard:

OLED and Sensor connected to ESP32-Mini

And here is the code for counting the number of persons in a room:

#include "DFRobot_GestureFaceDetection.h"
#include "Adafruit_SSD1306.h"

DFRobot_GestureFaceDetection_I2C gfd(0x72);
Adafruit_SSD1306 oled(128, 64, &Wire, -1);

void setup() {
  Wire.begin();
  gfd.begin(&Wire);
  gfd.setFaceDetectThres(60);

  oled.begin(SSD1306_SWITCHCAPVCC, 0x3C);
  oled.setRotation(3);
  oled.setTextSize(6);
  oled.setTextColor(WHITE);
}

void loop() {
  uint16_t nFaces = gfd.getFaceNumber();
  oled.clearDisplay();
  oled.setCursor(10, 20);
  oled.printf("%d", nFaces);
  oled.display();
  delay(100);
}

Includes

The code starts by including the libraries for the Face detection sensor and the OLED. If you haven’t used the Adafruit_SSD1306 library before, you will need to install it via the LIBRARY MANAGER:

Install Adafruit_SSD1306 library via LIBRARY MANAGER

Objects

Next we create the objects for the sensor and the OLED:

DFRobot_GestureFaceDetection_I2C gfd(0x72);
Adafruit_SSD1306 oled(128, 64, &Wire, -1);

Since both devices are connected to the same I2C bus, they need to have different I2C addresses. The sensor uses address 0x72 and the OLED typically uses address 0x3C, so there is no conflict. If nothing appears on the display or the sensor doesn't work, check the I2C addresses and make sure they are not the same.
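If you are unsure which addresses are actually present on the bus, a generic I2C scanner sketch will list every responding device. This is standard Wire-library code, not specific to this sensor:

```cpp
#include <Wire.h>

void setup() {
  Serial.begin(115200);
  Wire.begin();  // on the ESP32-C3 SuperMini you may need to pass the pins: Wire.begin(8, 9)
}

void loop() {
  Serial.println("Scanning I2C bus...");
  for (uint8_t addr = 1; addr < 127; addr++) {
    Wire.beginTransmission(addr);
    if (Wire.endTransmission() == 0) {  // device acknowledged at this address
      Serial.print("Device found at 0x");
      Serial.println(addr, HEX);
    }
  }
  delay(5000);
}
```

With both devices wired up you should see two addresses reported, e.g. 0x3C and 0x72.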

Setup Function

In the setup function we initialize the sensor and the OLED. Note that the I2C address of the OLED is set here and not in the constructor:

void setup() {
  Wire.begin();
  gfd.begin(&Wire);
  gfd.setFaceDetectThres(60);

  oled.begin(SSD1306_SWITCHCAPVCC, 0x3C);
  oled.setRotation(3);
  oled.setTextSize(6);
  oled.setTextColor(WHITE);
}

Loop Function

The Gravity Gesture & Face Detection Sensor makes it very simple to count persons. We simply call the getFaceNumber() function to get the number of detected faces/persons. Note, however, that face detection is limited to 10 faces. Once we have the number of faces (nFaces), we simply display it on the OLED.
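In practice the raw count can flicker from frame to frame, for example when someone turns their head away from the camera. One simple remedy, a sketch of my own rather than a library feature, is to display the most frequent count over the last few readings:

```cpp
#include <cstdint>

// Return the most frequent value among the last n readings (a simple mode filter).
// Values are limited to 0..10 because the sensor tracks at most 10 faces.
uint16_t smoothedCount(const uint16_t* readings, int n) {
  int freq[11] = {0};
  for (int i = 0; i < n; i++) {
    if (readings[i] <= 10) freq[readings[i]]++;
  }
  uint16_t best = 0;
  for (uint16_t v = 1; v <= 10; v++) {
    if (freq[v] > freq[best]) best = v;
  }
  return best;
}
```

In the loop you would keep a small ring buffer of getFaceNumber() results and display smoothedCount() instead of the raw value.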

void loop() {
  uint16_t nFaces = gfd.getFaceNumber();
  oled.clearDisplay();
  oled.setCursor(10, 20);
  oled.printf("%d", nFaces);
  oled.display();
  delay(100);
}

The small size of the OLED, the ESP32 SuperMini, and the face detection sensor allows you to build a very compact room occupancy counter. And since the ESP32 supports Wi-Fi, you could easily send the person count to a server as well, for instance, to control the temperature in a room.
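As a sketch of the Wi-Fi idea, the ESP32 could POST the count to a server using the HTTPClient library from the ESP32 Arduino core. The SSID, password, and server URL below are placeholders you must replace with your own:

```cpp
#include <WiFi.h>
#include <HTTPClient.h>

// Placeholder credentials and endpoint -- replace with your own.
const char* WIFI_SSID  = "your-ssid";
const char* WIFI_PASS  = "your-password";
const char* SERVER_URL = "http://192.168.1.100/occupancy";

void sendCount(uint16_t nFaces) {
  if (WiFi.status() != WL_CONNECTED) return;
  HTTPClient http;
  http.begin(SERVER_URL);
  http.addHeader("Content-Type", "application/json");
  // Send the count as a small JSON payload, e.g. {"count":3}
  http.POST(String("{\"count\":") + nFaces + "}");
  http.end();
}

void setup() {
  WiFi.begin(WIFI_SSID, WIFI_PASS);
  while (WiFi.status() != WL_CONNECTED) delay(250);
}
```

You would call sendCount(nFaces) from the loop, ideally only when the count changes, to keep network traffic low.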

Conclusions

In this tutorial you learned how to connect the Gravity Gesture & Face Detection Sensor to an Arduino UNO or an ESP32. You also learned how to recognize gestures, how to detect faces, and how to build a room occupancy counter.

I recommend that you read the DFRobot Wiki for more information regarding the Gravity Gesture & Face Detection Sensor. Also have a look at their GitHub repo for more code examples.

The gesture detection of the sensor is pretty reliable and quick. In contrast to simpler gesture sensors that only detect swipes up or down, the Gravity sensor recognizes more complex gestures, but it is also a bit bigger and its camera draws more power.

For information on smaller, simpler gesture sensors, see our PAJ7620U2 Gesture Sensor with Arduino and APDS-9960 Gesture and Color Sensor with Arduino tutorials.

If you want to detect objects instead of faces, have a look at the Train an Object Detection Model with Edge Impulse for ESP32-CAM, Object Detection with ESP32-CAM and YOLO, and Getting Started with HUSKYLENS 2 and Arduino/ESP32 tutorials.

Finally, for another solution for an occupancy sensor, I suggest the Edge AI Room Occupancy Sensor with ESP32 and Person Detection tutorial.

If you have any questions or suggestions feel free to leave them in the comment section.

Happy Tinkering ; )