# GRA Workshop

This repository contains information and files relevant to Grand River Academy's visit to the ATR Lab. ATR is a research laboratory focused on immersive robotics.

Below is an overview of the project requirements, along with a simple diagram for wiring the IMU and two servos into the Arduino board.

**Hot glue can solve EVERYTHING!**
## Arduino Project

### Part 1: Pan-tilt movement with IMU

- Hardware requirements: Arduino, wires, breadboard, 3D-printed holder, servo motors (SG90), MPU6050
- Software: Arduino IDE
- Wiring: use Circuit IO to wire
- Implementation:
  - Use the Arduino native example sketch `Servo` -> `Sweep` for servo control.
  - Download the IMU library from i2cdevlib, and copy the `I2Cdev` and `MPU6050` folders into your `Arduino` -> `libraries` folder. Relaunch the Arduino IDE and use the example sketch `MPU6050_DMP6` for IMU control.
- Note:
  - Based on the roll, pitch, and yaw values from the IMU, map each value to a different motor, and control the motors as the IMU values change.
If you want to combine ROS with your project, you can jump to the ROS Project section from here.
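The mapping step in the note above can be sketched in Python. This is a minimal illustration of the idea, not the actual Arduino sketch; the input range of -90..90 degrees and the 0..180 servo range are assumptions you should tune for your IMU mounting and servos:

```python
def map_range(x, in_min, in_max, out_min, out_max):
    """Equivalent of Arduino's map(), plus clamping to the input range."""
    x = max(in_min, min(in_max, x))
    return (x - in_min) * (out_max - out_min) / (in_max - in_min) + out_min

def imu_to_servo(yaw_deg, pitch_deg):
    """Map IMU yaw to the pan servo and pitch to the tilt servo.
    Assumes the IMU reports roughly -90..90 degrees and the SG90 takes 0..180."""
    pan = map_range(yaw_deg, -90, 90, 0, 180)
    tilt = map_range(pitch_deg, -90, 90, 0, 180)
    return pan, tilt
```

On the Arduino side the same two lines become `map()` calls whose results you pass to `Servo.write()`.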
### Part 2: Camera feedback with motion detection

Ignore the OLED part if you don't have one.

- Hardware requirements: Arduino, wires, breadboard, 3D-printed holder, servo motors (SG90), MPU6050, camera, OLED screen
- Software: Arduino IDE
- Wiring: use Circuit IO to wire
- Implementation:
  - Go to the Arduino IDE and click `Sketch` -> `Include Library` -> `Manage Libraries`. Search for `adafruit ssd1306` and `adafruit gfx` and install them. Relaunch the Arduino IDE, and use the example sketch `Adafruit SSD1306` -> `ssd1306_128x32_i2c` to start.
  - Use the native camera software on your PC to access the camera feed (basic).
  - Write OpenCV code in Python to access the camera (advanced).
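If you try the advanced option, the core of simple motion detection is frame differencing: compare consecutive frames and flag motion when they differ enough. A minimal sketch of that idea, using plain 2D lists as grayscale frames so it runs without OpenCV (with OpenCV you would feed it frames from `cv2.VideoCapture`; the threshold of 10 is an arbitrary starting point):

```python
def mean_abs_diff(prev, curr):
    """Average absolute pixel difference between two same-sized grayscale frames."""
    total, count = 0, 0
    for row_p, row_c in zip(prev, curr):
        for p, c in zip(row_p, row_c):
            total += abs(p - c)
            count += 1
    return total / count

def motion_detected(prev, curr, threshold=10):
    """Report motion when the frames differ by more than `threshold` on average."""
    return mean_abs_diff(prev, curr) > threshold
```

The averaging makes the detector robust to a few noisy pixels; raise the threshold if your camera is grainy.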
- Note:
  - You can find the basics of OLED in the `PPT` folder.
  - To draw a face on the OLED, use the Pixilart website. Cycle through different face drawings in `loop()` to animate the emotion.
  - Change the face emotion on the OLED based on the angle change per second in roll, pitch, and yaw (as in nodding or shaking the head). For example, if the yaw changes by more than 40 degrees between consecutive seconds, the person is shaking their head "no", so you can switch the OLED face to an angry emotion.
  - Use a human mannequin to attach the IMU to and simulate human movement.
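The "shake = no" rule above can be sketched as a small classifier. This assumes one yaw sample per second; the 40-degree threshold comes from the note:

```python
def head_gesture(yaw_samples, threshold=40):
    """Classify a 1 Hz yaw trace: 'no' if yaw jumps more than
    `threshold` degrees between consecutive samples, else 'neutral'."""
    for a, b in zip(yaw_samples, yaw_samples[1:]):
        if abs(b - a) > threshold:
            return "no"  # head shake detected -> switch the OLED face to angry
    return "neutral"
```

The same comparison on pitch instead of yaw would catch nodding ("yes").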
### Part 3: Arm control

- Hardware requirements: Arduino, wires, breadboard, 3D-printed holder, servo motors (SG90), MPU6050
- Implementation: the concept is the same as head control; you will need to add more motors and one more IMU to simulate the motion of each joint.
- Note:
  - Find and 3D print a servo arm model (you can come to the ATR Lab with the model you find, and we can print it for you).
  - If you don't want to 3D print, you can build a holder yourself out of cardboard, or hot glue the parts together.
  - Chain the IMUs together, and look into example code for how to read data from two IMUs.
  - You can use something like a wooden human mannequin to attach your IMUs to, which makes programming easier.
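One detail worth planning for once you have two IMUs: each IMU reports an absolute orientation, but a joint servo such as the elbow needs the angle *between* the two segments. A minimal sketch of that computation (the 90-degree servo center is an assumption for an SG90 mounted mid-range):

```python
def joint_angle(upper_pitch, forearm_pitch):
    """Relative elbow angle from two absolute IMU pitch readings, in degrees."""
    return forearm_pitch - upper_pitch

def to_servo(angle, center=90):
    """Shift a relative joint angle onto a 0-180 servo, centered at `center`."""
    return max(0, min(180, center + angle))
```

Without the subtraction, raising the whole arm would wrongly bend the elbow too, because both IMUs see the same pitch change.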
### Part 4: Upper body control

This brings the project to the next level: a combination of all the work above.

- Note:
  - Find and 3D print a full humanoid robot (you can come to the ATR Lab with the model you find, and we can print it for you).
  - Chain multiple IMUs together.
  - Use a multi-motor driver to control multiple motors.
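Driving many motors from many IMU axes is mostly bookkeeping: a routing table from each (IMU, axis) pair to a motor channel on the driver board. A sketch of that pattern (the IMU names and channel numbers are placeholders for whatever your build uses):

```python
# Hypothetical routing table: (imu_id, axis) -> motor channel on the driver board.
CHANNEL_MAP = {
    ("head", "yaw"): 0,
    ("head", "pitch"): 1,
    ("left_arm", "pitch"): 2,
}

def route(readings):
    """Turn a dict of IMU readings (degrees) into per-channel servo angles (0-180)."""
    commands = {}
    for (imu, axis), channel in CHANNEL_MAP.items():
        angle = readings.get((imu, axis), 0)  # missing readings default to neutral
        commands[channel] = max(0, min(180, angle + 90))  # center servos at 90
    return commands
```

Keeping the wiring in one table makes it easy to add joints later without touching the control loop.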
## ROS Project

ROS Project parts 1 and 2 are based on Arduino Project part 1, and ROS Project part 3 is based on Arduino Project part 2.

### Part 1: Read IMU data and control the servo from the terminal

- Note:
  - Go through the ROS Arduino tutorial.
  - Look into the examples `ros_lib` -> `pub_sub` and `ServoControl`.
### Part 2: Remote control

Separate the IMU and the motors onto two Arduinos.

- Note:
  - Use one Arduino to send IMU data through ROS, and another to receive the data and control the motor.
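You can prototype the sender/receiver split on one machine before wiring two boards: one process publishes yaw, the other subscribes and maps it to a servo command. A sketch using plain UDP sockets standing in for ROS topics (the port number is arbitrary, and this only illustrates the data flow, not the rosserial API):

```python
import socket

PORT = 9999  # arbitrary local port standing in for a ROS topic

def send_yaw(yaw):
    """'IMU Arduino' side: publish one yaw reading."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.sendto(str(yaw).encode(), ("127.0.0.1", PORT))

def receive_servo_command(sock):
    """'Motor Arduino' side: read one yaw reading and map it to a 0-180 angle."""
    data, _ = sock.recvfrom(64)
    yaw = float(data.decode())
    return max(0, min(180, int(yaw) + 90))
```

With ROS, `send_yaw` becomes a publisher on a topic and `receive_servo_command` a subscriber callback, but the mapping logic stays the same.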
### Part 3: Camera view

- Note:
  - Look into OpenCV with ROS and ROS `image_view` to receive the camera image through ROS.
Please feel free to contact atrlab.kent.help@gmail.com if you have any questions, or you can always drop by the lab between 12 and 5 pm.