Azure Kinect SDK examples. Samples for Azure Kinect: installing the SDK in a non-default location will result in compile failures when CMake is unable to locate the SDK. The download link is below. Intel RealSense, for example, provides many sample recordings for download, also with humans in view, to test their Body Tracking SDK; it would be vastly beneficial to include such examples for Azure Kinect as well, since it would be helpful for projects like VR/AR and much more (sample recording: OneKinect_Recording_RGB_DEPTH_IR). If a new sample is not currently available, this function will block until the timeout is reached. When exchanging raw image data between different SDKs or systems, it is important to declare the step (stride) of each image. One reported issue concerns convert_2d_to_3d(); a related request from the tracker asks Microsoft to please provide an SDK API, or at least a sample. The "green screen" code is copied from the Azure Kinect SDK examples; however, this project is based on OpenCV 4. The Azure Kinect Sensor SDK is a cross-platform (Linux and Windows) user-mode SDK to read data from your Azure Kinect device. The Body Tracking SDK installs by default to C:\Program Files\Azure Kinect Body Tracking SDK. Example mounting setup: an Azure Kinect placed 2.6 m (8.5 ft) above the ground, with the sensors pointing down at a table (D 2 m x W 1.5 m). Open the Examples solution.
Gets the next sample in the streamed sequence of IMU samples from the device. Third-party assets are also available, such as the Body Tracking package for Orbbec Femto Bolt, Mega, and Azure Kinect from LightBuzz. Each pixel of the transformed point cloud will be an XYZ set of 16-bit values, therefore its stride must be 2 (bytes) * 3 (x, y, z) * the width of the depth image in pixels. The recorder has two modes: capture mode and playback mode. Which installation you need depends on what sensor you have at your disposal. As a historical example, Philipp Robbel of MIT combined a Kinect with an iRobot Create to map a room in 3D and have the robot respond to human gestures. A Unity example uses the Azure Kinect DK and works with both the Sensor SDK and the Body Tracking SDK; it works with the Azure Kinect (aka Kinect4Azure, K4A) sensor only and requires at least Unity 2019. Only a couple of native libraries (DLLs) used by the K4A asset are changed to work with either sensor family. A Python 3 wrapper for the Azure Kinect Sensor SDK is available. For Azure Kinect, download and install the Azure Kinect Sensor SDK, as described in the "Azure-Kinect SDKs" section below. The objective of this repository is to combine the strong points of these components; a related tutorial shows how to drive characters in Unity using the Azure Kinect Body Tracking SDK. Currently, the Sensor SDK provides only raw IMU data. The capture API will buffer at least two camera capture intervals. The goal of the support thread is to provide basic support to users of the "Azure Kinect Examples for Unity" package. A Spectacular AI C++ plugin is available for the Microsoft Azure Kinect DK. The depth engine must be in your %PATH% or located next to k4a.dll.
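The stride rule above can be checked with a few lines of arithmetic (plain Python, no SDK required; the depth widths used below are only illustrative):

```python
BYTES_PER_VALUE = 2  # each X, Y, Z component is a 16-bit value
COMPONENTS = 3       # x, y, z per pixel

def point_cloud_stride_bytes(depth_width_px: int) -> int:
    """Bytes per row of a transformed point-cloud image:
    2 (bytes) * 3 (x, y, z) * depth-image width in pixels."""
    return BYTES_PER_VALUE * COMPONENTS * depth_width_px

# e.g. for a 640-pixel-wide depth image:
print(point_cloud_stride_bytes(640))  # prints 3840
```

Declaring this stride explicitly avoids row-alignment mismatches when handing the buffer to another library.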
Required DLLs include k4abt.dll. A good way to capture images is using the Azure Kinect DK recorder. The Azure Kinect SDK enables you to get the most out of your Azure Kinect; this GitHub repository contains code samples that demonstrate how to use Microsoft's Azure Kinect DK Sensor and Body Tracking SDKs. GetColorControl(ColorControlCommand command) gets an Azure Kinect color sensor control value. There are multiple examples of how to use both SDKs. The Azure Kinect device was released on June 27, 2019, at a price of US$400, further enabled by the release of the Kinect SDK by Microsoft. The kit includes a 12-megapixel RGB camera supplemented by a 1-megapixel depth camera for body tracking. The Azure Kinect Body Tracking SDK was developed using advanced machine-learning AI algorithms. A sample application available here demonstrates its use. (Translated from a Japanese write-up: "I got an Azure Kinect DK, but it is hard to find uninterrupted time to work with it. Unity is difficult as a first-time user. I finally managed to display a point cloud from the depth data.")
k4abt.dll lives, for example, in C:\Program Files\Azure Kinect SDK\sdk\windows-desktop\amd64\release\bin; if you modify the Windows PATH, close MATLAB and open it again in order to detect the change. On Linux, be sure to set up udev rules to use the Azure Kinect SDK without being root. Opening a device from the Python wrapper: import k4a, then open a device using the static function k4a.Device.open(). Currently there are no examples of tracking a 3D object using the SDK. Building the Python wrapper generates the k4a wheel in a build\ folder. One bug report notes: "While trying to reproduce those spammy log entries, I generated other entries with log levels demonstrating ambiguities in the two buckets. Not sure if it is a hardware defect." If the depth map has been transformed to the color camera perspective, the camera argument should be Color.
The Azure Kinect transformation example shows the transformations between a color image, a depth image, and a point cloud. In the Sensor SDK, k4a_imu_sample_t::gyro_sample holds the gyro sample in radians per second. This documentation also covers the C++ wrapper; for the most detailed documentation of API behavior, see the documentation for the C functions that the C++ classes wrap. When the device hangs, the only ways to get it working again are restarting the PC or power-cycling the Kinect. The calibration_registration example in the SDK returns a JSON file for the raw calibration of two cameras, which would make sense to use as the first argument of a registration step. NOTE: The Azure Kinect DK has been retired and is no longer available. Copy the required files from the Azure Kinect Sensor SDK or Azure Kinect Body Tracking SDK directly under the Unity folder. Add an environment variable AZUREKINECT_BODY_SDK and set it to the Body Tracking SDK installation path (no trailing slash). To get started with a library, see the README.md file located in the library's project folder; library project folders are grouped by service in the /sdk directory. Platform for Situated Intelligence supports using a Microsoft Azure Kinect camera (see also Microsoft Kinect v2 support). The Azure Kinect Body Tracking SDK uses the Azure Kinect DK's 1-megapixel time-of-flight depth camera to extract the position and orientation of 32 key points/joints for each person standing in front of the camera; the Azure Kinect DK is the successor to the Microsoft Kinect line of sensors. Example recorder command: k4arecorder.exe -c 3072p -d PASSIVE_IR. This is the free version of the "Azure Kinect Examples for Unity" package. The OpenCV compatibility example shows how to convert the Azure Kinect calibration type k4a_calibration_t into the corresponding data structures of OpenCV. (See also the talk "3D Skeletal Tracking on Azure Kinect" by Zicheng Liu, Principal Research Manager, Microsoft.)
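Since gyro_sample is reported in radians per second, converting a reading to degrees per second is a one-liner (pure math; the sample values below are made up for illustration):

```python
import math

def gyro_to_deg_per_s(gyro_rad_per_s):
    """Convert a gyro sample from radians/s (the unit of
    k4a_imu_sample_t::gyro_sample) to degrees/s."""
    return tuple(math.degrees(v) for v in gyro_rad_per_s)

# A half-turn-per-second rotation about the z axis, expressed both ways:
print(gyro_to_deg_per_s((0.0, 0.0, math.pi)))
```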
It is not guaranteed that both systems (OpenCV and k4a) use the same stride. The Azure Kinect ROS Driver includes CMake files which will try to locate the Azure Kinect Sensor SDK. The returned image will be of format Custom. This would mean an extra asset import is needed to make it work with Azure Kinect. The playback wrapper contains the following parameters upon construction: video_filename (the Kinect video filename); auto_start (automatically start the playback wrapper; otherwise, call start() to start); realtime_wait (wait for the next frame to be displayed, or skip frames if processing is too slow); rgb (whether to load the RGB image); depth (whether to load the depth image). The default install path is C:\Program Files\Azure Kinect SDK v1.4.1. Azure Kinect DK is not supported for Windows 10 in S mode. It's not a bug in the Azure Kinect Sensor SDK. Question from the issue tracker: is it possible to transform the color image to the depth camera and just assign, for example, an alpha of 0 to those pixels which are occluded, similar to how occlusion is handled when transforming a depth image to the color camera? A code sample showing how to use transformation on a downscaled color image was also requested. The recorder can record the RGB + depth + IR stream into an MKV video file. The undistort example is indeed just an example to showcase how to undistort a depth image (by design). Tested environment: Azure Kinect SDK v1.4.1, current firmware, Unity 2019.
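The constructor parameters listed above can be mirrored in a small sketch; the class name and default values here are illustrative assumptions, not the wrapper's actual API:

```python
from dataclasses import dataclass

@dataclass
class PlaybackConfig:
    """Hypothetical mirror of the playback wrapper's documented
    constructor parameters (name and defaults are assumptions)."""
    video_filename: str          # the Kinect video file to play back
    auto_start: bool = True      # start immediately, or wait for start()
    realtime_wait: bool = True   # wait per frame, or skip frames when slow
    rgb: bool = True             # whether to load the RGB image
    depth: bool = True           # whether to load the depth image

# Play back a recording as fast as possible instead of in real time:
cfg = PlaybackConfig("capture.mkv", realtime_wait=False)
print(cfg.video_filename, cfg.auto_start, cfg.realtime_wait)
```

Grouping the options in one object like this keeps call sites readable when most parameters keep their defaults.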
MATLAB quick start: open compile.m. Take your project further with easy Azure integration. Note: the Azure Kinect has been discontinued, but a nearly equivalent device, the ORBBEC Femto, is now available and uses the same depth-sensing technology, which ORBBEC has licensed from Microsoft. Part of the ideas in this repository are taken from the following repositories: pyk4a, a really nice and clean Python 3 wrapper for the Kinect Azure SDK. A Unity VFX Graph sample project (drumath2237/k4a-vfx) also builds on the sensor. For full file code examples, check out the /examples directory in any library project folder. Install the k4a package with pip install <k4a_wheel_file>, then download and install the latest release of the Azure Kinect Sensor SDK. If the latest version misbehaves, test with earlier versions of Unity and the SDK, such as Unity 2019.
This includes capture of video, infrared, depth, audio, and body-tracking streams from the Azure Kinect. The Body Tracking SDK is an SDK for tracking humans with the Azure Kinect DK (tested with CUDA v10.0 and cuDNN v7). The timeout parameter of the IMU read specifies the time to wait for an IMU sample. Processing modes: CPU uses the CPU-only mode, which runs on machines without a GPU but is much slower; OFFLINE plays a specified file. Quote from the readme: "The undistort example demonstrates how to undistort a depth map." One contributor has written a simple Python example of configuring two Azure Kinect devices to be synced together, where one is master and the other subordinate; the required libraries can be located in C:\Program Files\Azure Kinect SDK v1.4.1. A UE4 project sample, Cris21395/AzureKinectDK, is compatible with both the Azure Kinect Sensor SDK and the Azure Kinect Body Tracking SDK. ORBBEC cameras are supported by their Spectacular AI SDK wrapper. The pointCloud image must be of format Custom. Handles are created with k4a_device_open() and closed with k4a_device_close(). GetImuSample reads an IMU sample from the device. (From a Japanese memo: an introduction covering project settings and SDK installation steps when developing an Azure Kinect system in C#; see also sotanmochi/AzureKinect4Unity on GitHub. When using the HTTPS protocol, the command line will prompt for account and password verification.) A user reported: "I used the Azure Kinect Examples with the Kinect v2 in HDRP, but it seems that there are no more 'Smoothing' and 'Velocity Smoothing' parameters in the KinectManager, like in the Kinect-v2 examples."
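For the master/subordinate setup mentioned above, the usual rule with the Sensor SDK is to start subordinate devices before the master so no one misses the master's sync pulse. A tiny sketch of that ordering (hypothetical helper and data shape, not SDK code):

```python
def start_order(devices):
    """Order devices for startup: subordinates first, master last,
    so subordinates are already waiting when the master starts."""
    subordinates = [d for d in devices if d["sync_mode"] == "subordinate"]
    masters = [d for d in devices if d["sync_mode"] == "master"]
    return subordinates + masters

rig = [{"serial": "A", "sync_mode": "master"},
       {"serial": "B", "sync_mode": "subordinate"}]
print([d["serial"] for d in start_order(rig)])  # prints ['B', 'A']
```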
If the depth map is from the original depth perspective, the camera argument should be Depth. The initial idea of this example is just to help people understand how to undistort. The Sensor SDK is available for the Windows API (Win32) for native C/C++ Windows applications and is not currently available to UWP applications. Azure Kinect DK is a developer kit and PC peripheral that combines AI sensors with SDKs and APIs for building sophisticated applications. As the Orbbec SDK K4A Wrapper directly uses the Azure Kinect Sensor SDK API, users can directly refer to the relevant examples of the Azure Kinect Sensor SDK. In this repository, each example has a readme page that describes it and the steps to set it up. Because a user might have installed the Azure SDK in another location, the Python build cannot assume a fixed SDK path. MATLAB setup: open compile.m, set the include and lib paths of the Azure Kinect SDK (see the provided paths), and add the bin directory containing k4a.dll to the Windows PATH. In Unity a left-handed coordinate system is used, so it is necessary to convert from right-handed to left-handed coordinates. One user confirms: "I have the Azure Kinect SDK and Body Tracking SDK installed, and I verified that they work via the viewer application." This change should have fixed the issue reported in microsoft/Azure-Kinect-Samples#32.
k4a_wait_result_t k4a_device_get_imu_sample(k4a_device_t device_handle, k4a_imu_sample_t *imu_sample, int32_t timeout_in_ms) reads an IMU sample from the device. A simple program showcases the image-transformation functions in the Azure Kinect API; find this integration tool and more on the Unity Asset Store. The Azure Kinect - OpenCV KinectFusion sample shows how to use the Azure Kinect SDK with KinectFusion from opencv_contrib's rgbd module. This documentation describes the API usage for the Azure Kinect Sensor SDK. Bug report: "Azure Kinect stops working after a while. I just run custom code to record videos multiple times a day, and it stops working after a while." Copy the binaries from the SDK's windows-desktop\amd64\release\bin folder to the bin folder generated by CMake. The Azure Kinect Body Tracking Simple3dViewer sample creates a 3D window that visualizes all the information provided by the Body Tracking SDK. (Development environment from a Japanese memo: Visual Studio 2019 with the C# Forms application workload installed beforehand, plus the Azure Kinect SDK.) The principle of computing the undistortion LUT can be shared by any type of image (the concept depends only on the camera intrinsics): it computes the pixel-index mapping between the distorted and undistorted images. The depth engine DLL is required for the SDK to decode frames. The Azure Kinect DK SDK and ROS node are no longer maintained. The playback IMU read returns true if a sample was available, false if there are none left.
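The blocking-until-timeout semantics of reads like the one above can be sketched with a generic helper (hypothetical code, not an SDK function): retry a non-blocking read until a sample arrives or the timeout elapses.

```python
import time

def get_sample_with_timeout(try_read, timeout_ms, poll_s=0.001):
    """Generic sketch of read-with-timeout semantics: call try_read()
    (returns a sample or None) until a sample is available or
    timeout_ms milliseconds have elapsed; None models a timeout."""
    deadline = time.monotonic() + timeout_ms / 1000.0
    while True:
        sample = try_read()
        if sample is not None:
            return sample          # a sample became available
        if time.monotonic() >= deadline:
            return None            # timed out
        time.sleep(poll_s)

# A fake source that yields a sample on the third poll:
reads = iter([None, None, {"gyro": (0.0, 0.0, 0.1)}])
print(get_sample_with_timeout(lambda: next(reads), timeout_ms=100))
```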
The playback wrapper's constructor parameters (video_filename, auto_start, realtime_wait, rgb, depth) are described above. For the purposes of the Unity sample app, a game object represents the Azure Kinect device, with the standard y-axis pointing up, x-axis pointing right, and z-axis pointing forward - the same way as it is now for Femto Bolt and Mega. The SDKs include samples that do the visualization for you. One user reports that the body-tracking data has been successfully applied onto the avatar using the Azure Kinect body-tracking samples (Unity integration SDK). "Azure Kinect Examples for Unity" v1.20.1 is a set of Azure Kinect and Femto Bolt/Mega camera examples that use several major scripts, grouped in one folder. To a frequent question: Azure Kinect and Femto cameras can't work together, unfortunately. There is no API or example for converting the IMU data to device rotation and change in position. Additional prerequisites: Matplotlib (pip install matplotlib), NumPy (pip install numpy), and tkinter (pip install tk; in Linux, another way to install tkinter is sudo apt install python3-tk). To run, open a command terminal. In the playback API, bool k4a::playback::get_next_imu_sample(k4a_imu_sample_t *sample) gets the next IMU sample in the recording and returns the next unread IMU sample; it does not require a Kinect device.
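The right-to-left-handed conversion into the Unity frame described above can be sketched as follows. The axis conventions assumed here (depth camera: x right, y down, z forward, millimeters; Unity: x right, y up, z forward, meters) are the commonly documented ones, stated as assumptions rather than taken from this text:

```python
def k4a_to_unity(point_mm):
    """Convert a point from an assumed Azure Kinect depth-camera frame
    (x right, y down, z forward, millimeters, right-handed) to a
    Unity-style left-handed frame (x right, y up, z forward, meters)
    by negating y and scaling mm -> m."""
    x, y, z = point_mm
    return (x / 1000.0, -y / 1000.0, z / 1000.0)

# A point 100 mm right, 50 mm above the optical axis, 2 m away:
print(k4a_to_unity((100.0, -50.0, 2000.0)))  # prints (0.1, 0.05, 2.0)
```

Negating a single axis is what flips handedness; the unit change is independent of it.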
The bug cannot be directly reproduced. Question from the tracker: "I am debugging the fastpointcloud example from the Azure Kinect SDK in Visual Studio. When the Kinect detects an object and gives back its coordinates, why don't the values (for the height y, for example) change when I change the direction of the camera?" k4a_device_close() closes an Azure Kinect device. After installing, open Azure Kinect Viewer to check that the sensor works as expected. Reported issue: all the body tracking is mapped onto the avatar, but unlike the point body, it does not move in space. Another report: "I opened the .sln with the Azure Kinect DK package directly in Visual Studio 2017 and added some include/lib links in the project properties. Result: the enumerate example works, but I still have errors on the transformation example. Expected behavior: compile the transformation example from the SDK without errors." Developers working with ROS can use this node to connect an Azure Kinect Developer Kit to an existing ROS installation. Quick start: run the basics script first on one Azure Kinect device to make sure you can communicate with it; then connect two Azure Kinect devices and run the example script example\k4a_sync.py. The Azure Kinect DK is a discontinued developer kit and PC peripheral which employs artificial-intelligence sensors for computer vision and speech models and is connected to the Microsoft Azure cloud.
Azure-Kinect-Python is a more complete library using ctypes, as in this repository; however, examples of how to use the library are missing, and the library is harder to use. After setting everything up as suggested in the Unity samples for the Azure Kinect SDK, one user found that body tracking works fine in the editor, but the build throws an exception in the log: "catching exception for background thread result = K4A_WAIT_RESULT_FAILED". The reporter added: "I tried a couple of things to get this working, so I can't tell which solved the issue, but here's the list: use the exact Unity version of the project." The Azure Kinect and Femto Bolt Examples for Unity package from RF Solutions is another option. Microsoft is developing the official Azure SDK for Rust crates and has no plans to update this unofficial crate. One requested use case: "My main goal is to keep track of a gun's replica (G3A3) in order to use it inside a Unity game." In general, body tracking can be performed using either a marker-based or a markerless capturing system. pyk4a is developed at etiennedub/pyk4a on GitHub. The CPU mode uses CPU-only processing.
Contribute to microsoft/Azure-Kinect-Samples on GitHub. Instead of the latest version, test with Unity 2019.2f1 and Azure Kinect SDK v1.4.1 to see if that helps. The Unity3D SDK also includes a more advanced Avateering demo. If you have run the Windows binary installer (see the Azure Kinect DK public documentation for details), you can get a copy of the depth engine from %Program Files%\Azure Kinect SDK\sdk\windows-desktop\amd64\release\bin\depthengine_<major>_<minor>.dll. k4a_wait_result_t k4a_device_get_capture(k4a_device_t device_handle, k4a_capture_t *capture_handle, int32_t timeout_in_ms) reads a sensor capture. Invalid handles are set to 0. "Hi Mathieu, yes, you are right - I have to bring these filters to the K4A asset." GetColorControl(ColorControlCommand command, out ColorControlMode mode) gets the Azure Kinect color sensor control value along with its mode. An environment variable AZUREKINECT_SDK that points to the Azure Kinect SDK root path should be registered; k4a.dll and k4arecord.dll come from the Sensor SDK.
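Resolving the SDK root from the AZUREKINECT_SDK variable, with a default install path as fallback, might look like the sketch below. Only the variable name comes from the text above; the default path and the override value are illustrative assumptions:

```python
import os

# Assumed default install root for illustration:
DEFAULT_SDK_ROOT = r"C:\Program Files\Azure Kinect SDK v1.4.1"

def sdk_root() -> str:
    """Return AZUREKINECT_SDK if registered, else the assumed default."""
    return os.environ.get("AZUREKINECT_SDK", DEFAULT_SDK_ROOT)

os.environ["AZUREKINECT_SDK"] = r"D:\sdks\azure-kinect"  # hypothetical override
print(sdk_root())  # prints D:\sdks\azure-kinect
```

A build script can then derive the bin and include directories from this single root instead of hard-coding paths.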
The official "Office Sample Recordings" will not output any body-tracking data; such data would help our project team decide whether to start developing (a new medical diagnostic device) on top of the Azure Kinect DK. So far the package contains two demo scenes. The Azure Kinect Sensor SDK is primarily a C API.