
The Azure Kinect Viewer, found under the installed tools directory as k4aviewer.exe (for example, C:\Program Files\Azure Kinect SDK vX.Y.Z\tools\k4aviewer.exe, where X.Y.Z is the installed version of the SDK), can be used to visualize all device data streams and to play back recordings made with Azure Kinect Recorder.

Hover your cursor over a pixel in the depth window to see the value reported by the depth sensor at that point. Depth visualized in 3D lets you move around in the image using the instructed keys.

The color camera view has its own window, and you can control the RGB camera settings from the configuration window during streaming.

The IMU window has two components, an accelerometer and a gyroscope. The top half is the accelerometer and shows linear acceleration in meters/second². It includes acceleration from gravity, so if the device is lying flat on a table, the Z axis will probably show around -9.8 m/s². The bottom half is the gyroscope portion and shows rotational movement in radians/second.

The microphone view shows a representation of the sound heard on each microphone. If there is no sound, the graph is shown as empty; otherwise, you'll see a dark blue waveform with a light blue waveform overlaid on top of it. The dark wave represents the minimum and maximum values observed by the microphone over that time slice, and the light wave represents the root mean square of the values observed over the same slice.
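To make the relationship between those two waveforms concrete, here is a small MATLAB sketch (the sample vector is hypothetical, not data taken from the viewer) that computes the values the dark and light waves summarize for a single time slice:

% Hypothetical microphone samples for one time slice, normalized to [-1, 1].
slice = 0.1 * randn(480, 1);

% Dark wave: the minimum and maximum sample values in the slice.
darkWave = [min(slice), max(slice)]

% Light wave: the root mean square of the samples in the slice.
lightWave = sqrt(mean(slice .^ 2))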
Azure Kinect Viewer is open source and can be used as an example for how to use the APIs. For more information about Azure Kinect Viewer, watch the How to use Azure Kinect video.

For the earlier Kinect V2 sensor, skeletal data can be viewed from MATLAB through the kinect adaptor for the Image Acquisition Toolbox. The example below creates color and depth video input objects and then reads body-tracking metadata from the depth stream.
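A minimal sketch of that setup, assuming the color device is at ID 1, the depth device is at ID 2, and body tracking is switched on through an EnableBodyTracking property on the depth source (the original does not show this creation step, so the property name is an assumption):

% Create video input objects for the Kinect V2 color and depth streams;
% leaving off the semicolons displays their summaries.
colorVid = videoinput('kinect', 1)
depthVid = videoinput('kinect', 2)

% Body tracking is reported through the depth source's metadata
% (property name assumed, see the lead-in above).
depthSrc = getselectedsource(depthVid);
depthSrc.EnableBodyTracking = 'on';

% Start both objects; the 'immediate' trigger begins logging frames.
start([colorVid depthVid]);

Displaying the two video input objects prints summaries like the ones below.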

Summary of Video Input Object Using 'Kinect V2 Color Sensor'.

   Acquisition Source(s):  Kinect V2 Color Source is available.

  Acquisition Parameters:  'Kinect V2 Color Source' is the current selected source.
                           10 frames per trigger using the selected source.
                           'BGR_1920x1080' video data to be logged upon START.

Summary of Video Input Object Using 'Kinect V2 Depth Sensor'.

   Acquisition Source(s):  Kinect V2 Depth Source is available.

  Acquisition Parameters:  'Kinect V2 Depth Source' is the current selected source.
                           'Depth_512x424' video data to be logged upon START.

      Trigger Parameters:  1 'immediate' trigger(s) on START.

Once frames have been acquired, the example pulls images and body-tracking metadata out of the two objects and looks up the joints of the tracked bodies (the getdata output variable names here are illustrative):

% Get images and metadata from the color and depth device objects.
[colorFrameData, colorTimeData, colorMetaData] = getdata(colorVid);
[depthFrameData, depthTimeData, depthMetaData] = getdata(depthVid);

% Extract the 90th frame and tracked body information.
lastColorFrame = colorFrameData(:, :, :, 90);
lastframeMetadata = depthMetaData(90);

% Find the indexes of the tracked bodies.
anyBodiesTracked = any(lastframeMetadata.IsBodyTracked ~= 0)
trackedBodies = find(lastframeMetadata.IsBodyTracked)

% Get the joint indices of the tracked bodies with respect to the color
% image.
colorJointIndices = lastframeMetadata.ColorJointIndices(:, :, trackedBodies);

% These are the order of joints returned by the kinect adaptor:
%   SpineBase = 1       SpineMid = 2      Neck = 3          Head = 4           ShoulderLeft = 5
%   ElbowLeft = 6       WristLeft = 7     HandLeft = 8      ShoulderRight = 9  ElbowRight = 10
%   WristRight = 11     HandRight = 12    HipLeft = 13      KneeLeft = 14      AnkleLeft = 15
%   FootLeft = 16       HipRight = 17     KneeRight = 18    AnkleRight = 19    FootRight = 20
%   SpineShoulder = 21  HandTipLeft = 22  ThumbLeft = 23    HandTipRight = 24  ThumbRight = 25

% Create skeleton connection map to link the joints.
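The original stops at that comment, so here is a minimal sketch of what the connection map and skeleton overlay could look like. The joint pairs are derived from the joint numbering above, and the plotting approach (imshow plus line) is an assumption, not the article's own code:

% Each row links two joints by the indices listed above,
% e.g. [4 3] connects Head to Neck.
SkeletonConnectionMap = [ 4  3;                        % Head - Neck
                          3 21;                        % Neck - SpineShoulder
                         21  2;  2  1;                 % spine down to SpineBase
                         21  5;  5  6;  6  7;  7  8;   % left arm
                         21  9;  9 10; 10 11; 11 12;   % right arm
                          1 13; 13 14; 14 15; 15 16;   % left leg
                          1 17; 17 18; 18 19; 19 20];  % right leg

% Draw each tracked body's skeleton over the 90th color frame.
imshow(lastColorFrame);
hold on;
for body = 1:numel(trackedBodies)
    for k = 1:size(SkeletonConnectionMap, 1)
        j1 = SkeletonConnectionMap(k, 1);
        j2 = SkeletonConnectionMap(k, 2);
        X = [colorJointIndices(j1, 1, body), colorJointIndices(j2, 1, body)];
        Y = [colorJointIndices(j1, 2, body), colorJointIndices(j2, 2, body)];
        line(X, Y, 'LineWidth', 2, 'Color', 'r');
    end
end
hold off;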
