Static Gesture Classification Logic

Function: classify_gesture in hand_landmark.py

```python
import numpy as np

def classify_gesture(gesture_embedding, gesture_classifier_model, gesture_labels,
                     confidence_threshold=0.5):
    """Classify a static gesture from a hand-landmark embedding.

    Returns a (label, confidence) pair, or ("None", 0.0) when no gesture
    clears the confidence threshold or an error occurs.
    """
    try:
        # Run the classifier; the model returns a dict of output tensors.
        classification_results = gesture_classifier_model([gesture_embedding])
        raw_scores = list(classification_results.values())[0][0]
        # Apply a sigmoid activation to turn raw logits into confidences.
        final_probabilities = 1.0 / (1.0 + np.exp(-np.asarray(raw_scores)))
        predicted_class = int(np.argmax(final_probabilities))
        confidence = float(final_probabilities[predicted_class])
        if confidence > confidence_threshold:
            return gesture_labels[predicted_class], confidence
        return "None", 0.0
    except Exception as e:
        print(f"Error in gesture classification: {e}")
        return "None", 0.0
```
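The function can be exercised without the real network by substituting a stub model. The sketch below is illustrative only: the stub classifier, its `"output"` key, and the gesture labels are assumptions standing in for the actual model and label set.

```python
import numpy as np

# Hypothetical label set; the real model defines its own eight gestures.
GESTURE_LABELS = ["ONE", "TWO", "THREE", "FOUR", "FIVE", "FIST", "OK", "PEACE"]

def stub_classifier(batch):
    """Stand-in for the real classifier: returns raw logits shaped [1, 8]."""
    logits = np.zeros((1, 8), dtype=np.float32)
    logits[0, 5] = 3.0  # strongly favor index 5 ("FIST")
    return {"output": logits}

def classify_gesture(gesture_embedding, gesture_classifier_model, gesture_labels,
                     confidence_threshold=0.5):
    # Same logic as the function above.
    try:
        classification_results = gesture_classifier_model([gesture_embedding])
        raw_scores = list(classification_results.values())[0][0]
        final_probabilities = 1.0 / (1.0 + np.exp(-np.asarray(raw_scores)))
        predicted_class = int(np.argmax(final_probabilities))
        confidence = float(final_probabilities[predicted_class])
        if confidence > confidence_threshold:
            return gesture_labels[predicted_class], confidence
        return "None", 0.0
    except Exception as e:
        print(f"Error in gesture classification: {e}")
        return "None", 0.0

embedding = np.zeros((1, 128), dtype=np.float32)  # dummy [1, 128] embedding
print(classify_gesture(embedding, stub_classifier, GESTURE_LABELS))
```

With the stub, sigmoid(3.0) ≈ 0.953 at index 5 beats the 0.5 threshold, so the call returns ("FIST", ≈0.953).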
  • The gesture embedding (shape [1, 128]) is fed to the classifier model.
  • The model outputs a raw score vector (shape [1, 8]), one logit per supported gesture.
  • A sigmoid activation converts these raw logits into per-gesture confidences.
  • The gesture with the highest confidence is selected if it exceeds the threshold, and the result is stored in the HandRegion object.
  • Recognized gestures can then be mapped to custom user-defined actions.
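The gesture-to-action mapping mentioned above can be as simple as a dictionary lookup keyed by the returned label. This is a minimal sketch; the action names and the `dispatch` helper are hypothetical, not part of the original code.

```python
# Hypothetical mapping from recognized gesture labels to user actions.
GESTURE_ACTIONS = {
    "FIST": "pause_playback",
    "FIVE": "resume_playback",
    "OK": "confirm",
}

def dispatch(gesture, confidence, confidence_threshold=0.5):
    """Return the action for a recognized gesture, or None for
    low-confidence, unmapped, or "None" results."""
    if confidence <= confidence_threshold:
        return None
    return GESTURE_ACTIONS.get(gesture)

print(dispatch("FIST", 0.95))  # a confident, mapped gesture
print(dispatch("None", 0.0))   # the fallback result is ignored
```

Because `classify_gesture` already returns ("None", 0.0) on low confidence, the same threshold check here is redundant but harmless defense in depth.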