┌──────────────────────────┐
│         SAMI UI          │
│     (PyQt5 Frontend)     │
└────────────┬─────────────┘
             │
             │ UI Events (buttons, sliders, behavior selection)
             V
┌──────────────────────────┐
│       SAMIControl        │
│  (Robot Controller API)  │
│  - Loads configs         │
│  - Validates angles      │
│  - Runs behaviors        │
│  - Sends serial packets  │
│  - Interfaces audio      │
└────────┬────────┬────────┘
         │        │
  Serial │        │ Audio
 Packets │        │
         V        V
┌──────────────────────┐    ┌──────────────────────┐
│   Arduino Firmware   │    │    Audio Manager     │
│ - Joint motion       │    │ - Voice profiles     │
│ - Relay control      │    │ - Plays WAV files    │
│ - Face/eyes emotes   │    └──────────────────────┘
└──────────────────────┘
Config Files (Editable by Users):
• Joint_config.json => Joint names, IDs, limits
• Emote.json => Eye/emote IDs
• behaviors/*.json => Motion/audio/emote sequences
• audio/*.wav => Voice files
SAMIControl is the main control interface: everything the robot does goes through it.
SAMI UI is a PyQt5 UI for interacting with SAMI manually.
The Audio Manager handles voice playback based on behavior definitions, resolving a voice profile (e.g. "Matt") and a ClipName to an actual filename.

When the robot starts:
1) SAMIControlUI initializes the serial port
2) Loads joints from Joint_config.json
3) Loads emotes from Emote.json
4) Creates the UI (joint dropdown, behavior list)
5) SAMI is ready to use
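Steps 2 and 3 can be sketched as plain JSON parsing. The helper names below are assumptions (the real SAMIControlUI also opens the serial port and builds the PyQt5 widgets), and the sketch assumes Joint_config.json holds a list of joint entries in the format shown in section 4.1:

```python
import json

# Hypothetical helpers illustrating how the two config files might be parsed.
def load_joint_config(text):
    """Parse Joint_config.json content into {JointName: joint_entry}."""
    return {j["JointName"]: j for j in json.loads(text)}

def load_emotes(text):
    """Parse Emote.json content into {EmoteName: EmoteID}."""
    return json.loads(text)["Emotes"]

# Sample content matching the documented formats.
joints = load_joint_config(
    '[{"JointName": "LeftShoulder", "JointID": 9,'
    ' "HomeAngle": 180, "MinAngle": 30, "MaxAngle": 190}]')
emotes = load_emotes('{"Emotes": {"Neutral": 1, "Happy": 2, "Sad": 3}}')

print(joints["LeftShoulder"]["JointID"])  # 9
print(emotes["Happy"])                    # 2
```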
4.1 Joint_config.json
Defines every joint the robot exposes:
{
  "JointName": "LeftShoulder",
  "JointID": 9,
  "HomeAngle": 180,
  "MinAngle": 30,
  "MaxAngle": 190
}
What you can change: the joint name, joint ID, home angle, and the min/max angle limits.
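SAMIControl validates requested angles against these limits before sending a serial packet. A minimal sketch of that check (the function name is an assumption; the real validation lives inside SAMIControl):

```python
def clamp_angle(joint, angle):
    """Clamp a requested angle into the joint's [MinAngle, MaxAngle] range.
    Hypothetical helper mirroring SAMIControl's angle validation."""
    return max(joint["MinAngle"], min(joint["MaxAngle"], angle))

# Entry in the Joint_config.json format shown above.
left_shoulder = {"JointName": "LeftShoulder", "JointID": 9,
                 "HomeAngle": 180, "MinAngle": 30, "MaxAngle": 190}

print(clamp_angle(left_shoulder, 200))  # 190 (above MaxAngle)
print(clamp_angle(left_shoulder, 10))   # 30  (below MinAngle)
print(clamp_angle(left_shoulder, 90))   # 90  (already in range)
```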
4.2 Emote.json
Defines facial expressions used by behaviors:
{
  "Emotes": {
    "Neutral": 1,
    "Happy": 2,
    "Sad": 3
  }
}
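Behaviors reference emotes by name; SAMIControl resolves each name to the numeric ID the firmware expects. A minimal sketch of that lookup (the function name is an assumption):

```python
EMOTES = {"Neutral": 1, "Happy": 2, "Sad": 3}  # loaded from Emote.json

def emote_id(name):
    """Resolve an emote name to its numeric ID, failing loudly on typos.
    Hypothetical helper; the real resolution happens inside SAMIControl."""
    if name not in EMOTES:
        raise ValueError(f"Unknown emote: {name!r}")
    return EMOTES[name]

print(emote_id("Happy"))  # 2
```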
4.3 behaviors/*.json
Each behavior is a list of keyframes, executed in order:
{
  "HasAudio": "True",
  "HasEmote": "True",
  "HasJoints": "True",
  "AudioClip": "trivia_question_1",
  "Expression": "Happy",
  "JointAngles": {
    "LeftShoulder": 150,
    "RightShoulder": 45
  },
  "WaitTime": 1000
}
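One way the keyframe fields above might be interpreted at playback time. This is a sketch under stated assumptions: the callback names are made up, real playback in SAMIControl drives the serial port, audio, and emote hardware, and WaitTime is taken to be milliseconds:

```python
import time

def run_behavior(keyframes, move_joint, play_clip, show_emote):
    """Execute keyframes in order: audio, emote, joints, then wait."""
    for kf in keyframes:
        if kf.get("HasAudio") == "True":
            play_clip(kf["AudioClip"])
        if kf.get("HasEmote") == "True":
            show_emote(kf["Expression"])
        if kf.get("HasJoints") == "True":
            for name, angle in kf["JointAngles"].items():
                move_joint(name, angle)
        time.sleep(kf.get("WaitTime", 0) / 1000.0)  # assumed milliseconds

# Record calls instead of driving hardware.
log = []
run_behavior(
    [{"HasAudio": "True", "HasEmote": "True", "HasJoints": "True",
      "AudioClip": "trivia_question_1", "Expression": "Happy",
      "JointAngles": {"LeftShoulder": 150, "RightShoulder": 45},
      "WaitTime": 0}],
    move_joint=lambda n, a: log.append(("joint", n, a)),
    play_clip=lambda c: log.append(("audio", c)),
    show_emote=lambda e: log.append(("emote", e)),
)
print(log[0])  # ('audio', 'trivia_question_1')
```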
Keyframe options that can be customized: