Microsoft Kinect 2 Renato Mainetti Alessandro Tironi renato.mainetti@unimi.it alessandro.tironi@studenti.unimi.it Lab 06
Outline: Introduction and Theory: Kinect vs other NUI devices; Kinect history and how it works; avateering (rotations vs positions); quaternions; the Kinect skeleton. Let's CODE!: Microsoft Kinect SDK; Kinect Unity integration; avateering in Unity; Kinect as NUI for our AAA game (assembly line); Kinect & VR. Kinect 2 - VR 2
NUI: In computing, a natural user interface (NUI) is a user interface that is effectively invisible, and remains invisible as the user continuously learns increasingly complex interactions. The word "natural" is used because most computer interfaces rely on artificial control devices whose operation has to be learned. (Examples: multi-touch, speech recognition, gesture/motion recognition.) 3
Kinect vs other NUI devices: obtrusiveness, precision, usability, etc.
Kinect 1 Hardware (PrimeSense) RGB: 640x480 @ 30fps, 1280x960 @ 12fps Depth: 320x240 @ 30fps 4-mic array (16 kHz) Servo for tilt regulation 5
Kinect 1 (Xbox 360) vs Kinect 1 (for Windows) The Windows version introduced: near mode, seated (sitting) tracking, RGB camera exposure control, etc., plus different use licenses 6
How does it work? V1 Proprietary technology: Structured light Depth from focus Stereo principles 7
Kinect 2 Hardware (3DV Systems, Canesta) RGB: 1920x1080 @ 30fps Depth & IR cam: 512x424 @ 30fps 4-mic array (48 kHz) USB 3.0 8
How does it work? V2 ToF (Time of Flight): 9
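The core of the ToF principle fits in one line of arithmetic: the sensor emits modulated IR light and measures, per pixel, how long it takes to bounce back, so the distance is half the round-trip time multiplied by the speed of light. A minimal sketch in plain C# (the timing value is purely illustrative; Kinect 2 actually measures phase shift of modulated light rather than a raw pulse time):

```csharp
// Time-of-flight distance: the IR light travels to the target and back,
// so the one-way distance is half the round trip.
const double SpeedOfLight = 299792458.0;  // m/s
double roundTripSeconds = 13.3e-9;        // illustrative: ~13.3 ns round trip
double distanceMeters = SpeedOfLight * roundTripSeconds / 2.0;  // about 2 m
```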
Kinect 1 Skeleton: 20 joints 2 bodies 10
Kinect 2 Skeleton: 25 joints 6 bodies 11
Skeleton - demo 12
Colour and Depth demo (alignment) 13
Face tracking demo Orientations Expression etc 14
Kinect HeartRate: 15
Audio direction demo 16
Speech recognition demo SIT - ROLL - BARK 17
3D scanning (and retopology) 18
Kinect Studio 19
Mocap / Avateering Brekel PRO body V2 20
Avateering (why rotations are so important) The right approach is to apply the evaluated rotations to the avatar's skeleton. If we instead translate the avatar's joints to the Kinect joint positions, the mesh can get stretched, depending on how the skeleton was built and skinned to the mesh. 21
Quaternions: A generalization of complex numbers Very powerful for describing rotation and orientation Does not suffer from the gimbal-lock problem Excellent for interpolation 22
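Unity exposes this interpolation directly: Quaternion.Slerp blends two orientations along the shortest arc, which is what makes quaternions convenient for smoothing noisy sensor rotations. A minimal Unity sketch (the class name and smoothing factor are illustrative choices, not part of the lab code):

```csharp
using UnityEngine;

public class JointSmoother : MonoBehaviour
{
    // Illustrative smoothing factor; tune per application.
    [Range(0f, 1f)] public float smoothing = 0.25f;

    private Quaternion current = Quaternion.identity;

    // Blends the previously applied orientation toward the newly measured one.
    public Quaternion Smooth(Quaternion measured)
    {
        current = Quaternion.Slerp(current, measured, 1f - smoothing);
        return current;
    }
}
```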
Kinect's rotations Every frame, Kinect returns the rotation quaternion of each joint (leaf joints excluded). Each quaternion represents the absolute rotation of the parent bone (e.g. the ElbowRight quaternion represents the right upper-arm rotation). The Y axis is always parallel to the bone 23
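Because the SDK reports absolute orientations, a local (parent-relative) rotation, when one is needed, can be recovered by multiplying by the inverse of the parent's orientation. A hedged sketch using Unity's Quaternion type (the helper name is hypothetical):

```csharp
using UnityEngine;

public static class BoneMath
{
    // Both arguments are absolute orientations, e.g. taken from Kinect's
    // joint orientations after conversion to Unity's coordinate system.
    public static Quaternion LocalRotation(Quaternion parentAbsolute, Quaternion childAbsolute)
    {
        // q_local = q_parent^-1 * q_child: the rotation that takes the
        // parent bone frame into the child bone frame.
        return Quaternion.Inverse(parentAbsolute) * childAbsolute;
    }
}
```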
Microsoft Kinect SDK 2.0 The SDK gives you the ability to access all the Kinect s features (frame by frame): Color sensor (RGB image) Depth sensor (Float depth image) Audio samples Body tracking (Joint Pos (X,Y,Z), Bone Quaternion (X,Y,Z,W)) Other features: Face tracking Gesture Recognition 24
Sensor initialization and joint evaluation Let's try showing the movements of the joints in a Unity scene. Remember to deallocate each frame after using it (in C#, use the using statement). The 3D coordinates can be retrieved through the Body.Joints dictionary: CameraSpacePoint p = body.Joints[JointType.SpineBase].Position; Pay attention to the reference axis system! (Kinect is right-handed, Unity is left-handed.) 25
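Frame acquisition with proper deallocation looks roughly like this; `bodyFrameReader` and `bodies` are the fields initialized in KinectManager, while the method name is illustrative:

```csharp
using Windows.Kinect;  // Kinect SDK 2.0 Unity wrapper namespace

void PollBodies()
{
    // AcquireLatestFrame may return null when no new frame is available.
    using (BodyFrame frame = bodyFrameReader.AcquireLatestFrame())
    {
        if (frame == null) return;
        // Copies the tracked bodies into our preallocated array;
        // the frame itself is disposed at the end of the using block.
        frame.GetAndRefreshBodyData(bodies);
    }

    foreach (Body body in bodies)
    {
        if (body == null || !body.IsTracked) continue;
        // Right-handed Kinect coordinates, in meters, camera-relative.
        CameraSpacePoint p = body.Joints[JointType.SpineBase].Position;
    }
}
```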
Sensor initialization KinectManager.InitializeSensor(): // Gets and opens the default sensor. Sensor = KinectSensor.GetDefault(); if (!Sensor.IsOpen) Sensor.Open(); // Sets the body frame reader. bodyFrameReader = Sensor.BodyFrameSource.OpenReader(); // Initializes the bodies array. bodies = new Body[bodyFrameReader.BodyFrameSource.BodyCount]; 26
From positions to rotations In another scene we show how the rotations measured by the Kinect change over time. We convert from the sensor's reference system to Unity's by negating the Z and W components of the quaternion; additional rotations can be applied by multiplying the quaternion q K obtained from the Kinect by another quaternion q offset 27
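This sensor-to-Unity conversion can be sketched as a small helper (the class and parameter names are illustrative; the orientation type is the Kinect SDK's Vector4 as exposed by the Unity wrapper):

```csharp
using UnityEngine;

public static class KinectToUnity
{
    // Kinect is right-handed, Unity left-handed: negating Z and W of the
    // quaternion mirrors the rotation into Unity's coordinate system.
    public static Quaternion Convert(Windows.Kinect.Vector4 k, Quaternion qOffset)
    {
        Quaternion qK = new Quaternion(k.X, k.Y, -k.Z, -k.W);
        // Optional additional rotation applied on top of the converted one.
        return qK * qOffset;
    }
}
```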
Avateering We import an already-skinned avatar from Blender and apply the rotations to its joints. If everything is done correctly we should see the mesh move without noticeable deformations (candy-wrapper artifacts and the like); otherwise offsets must be applied, as seen earlier. 28
How to use the package in our game scene We structure the scene as a client/server system in which the server (the KinectManager) contacts the clients (in our case, the avatar), sending them, frame by frame, the acquired data. For each frame it receives, the avatar applies the rotations it contains to the corresponding joints, after the appropriate processing. KinectManager.Update(): // Executes each of the elaboration tasks in the given order. IEnumerator<KinectService> taskEnumerator = Tasks.GetEnumerator(); while (taskEnumerator.MoveNext()) taskEnumerator.Current.Elaborate(bodies); 29
How to use the package in our game scene Avateering.Elaborate(): foreach (JointType currentJoint in Enum.GetValues(typeof(JointType))) { // Gets the quaternion acquired from Kinect (Z and W negated for Unity). Quaternion newOrientation = new Quaternion(orientations[currentJoint].Orientation.X, orientations[currentJoint].Orientation.Y, -orientations[currentJoint].Orientation.Z, -orientations[currentJoint].Orientation.W); // Applies the correction from a joint-to-correction dictionary. newOrientation *= adjustRotations[currentJoint]; // Applies the rotation. jointMap[currentJoint].transform.rotation = Quaternion.Euler(offsetRotation) * newOrientation; } 30
Setting up the scene to hit the cubes We want the avatar to keep the red cubes from reaching the end of the conveyor belt by punching or kicking them. To do this, we attach a collider to the joints involved. 31
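Attaching the colliders can be done once at startup; below is a sketch that adds a kinematic trigger sphere to each hand and foot joint object. `jointMap` is assumed to be the same joint-to-GameObject dictionary used by the avateering code; the method name and sizes are illustrative:

```csharp
using UnityEngine;
using Windows.Kinect;

void AddHitColliders()
{
    JointType[] hitJoints = { JointType.HandLeft, JointType.HandRight,
                              JointType.FootLeft, JointType.FootRight };
    foreach (JointType jt in hitJoints)
    {
        GameObject go = jointMap[jt];
        SphereCollider col = go.AddComponent<SphereCollider>();
        col.radius = 0.1f;     // illustrative size, in meters
        col.isTrigger = true;  // we only want collision events, not a physics response
        Rigidbody rb = go.AddComponent<Rigidbody>();
        rb.isKinematic = true; // the joint is driven by Kinect, not by physics
    }
}
```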
Setting up the scene to hit the cubes To compute the direction of the punch/kick, we keep a buffer with the position of each joint of interest (hands and feet) in the previous frame, so that the resulting vector can be computed when a collision is registered. KinectPositionBuffer.Elaborate(): foreach (JointType jointType in bufferedJoints) { CameraSpacePoint acquiredPosition = playerBody.Joints[jointType].Position; Vector3 newPosition = new Vector3(acquiredPosition.X, acquiredPosition.Y, acquiredPosition.Z); // Updates buffers. bufferOld[jointType] = bufferNew[jointType]; bufferNew[jointType] = newPosition; } 32
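When a collision is registered, the punch/kick direction is then just the difference of the two buffered positions. A hedged sketch (the method name is illustrative; the buffer fields follow the snippet above):

```csharp
using UnityEngine;
using Windows.Kinect;

// Direction of motion of a buffered joint between the last two frames,
// used to push the cube when a collision is registered.
Vector3 PunchDirection(JointType jointType)
{
    Vector3 delta = bufferNew[jointType] - bufferOld[jointType];
    // Normalize only if the joint actually moved, to avoid a degenerate vector.
    return delta.sqrMagnitude > 1e-6f ? delta.normalized : Vector3.zero;
}
```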
Kinect and VR You can try this combination for your final project in the lab! 33