11. OpenVR Plugin
Before using the OpenVR plugin, register a Steam account and install both the Steam client and SteamVR on your computer.
Double-click OpenVr-NokovDriver.exe to install the driver. For the specific plugin version to use, please contact our technical engineers.
After installation, the plugin's Config Tool opens automatically.
Open SteamVR, go to Settings, enable Advanced Settings, and navigate to Startup/Shutdown -> Manage Add-Ons.
In the Add-Ons list, enable Nokov and Gamepad Support and disable all other add-ons.
If the list matches the image below, the setting has succeeded.
The OpenVR vrpath is set automatically when the plugin is installed. If "nokov" does not appear in the add-ons after opening SteamVR, check whether the OpenVR vrpath is configured correctly.
After installing the OpenVR plugin, our configuration tool, Config Tool.exe, can be found in the C:\Program Files\Nokov\OpenVR_Nokov_Driver_1.4.xxx\ConfigTool directory. The following is an overview of its functions.
The Config Tool supports language switching: click the Language button in the top-left corner to toggle between Chinese and English. The main function of the Manual menu is to open the ...\Steam\config\steamvr.vrsettings file directly. SteamVR reads this file at runtime for its configuration, including our nokov driver settings (please refrain from modifying this configuration by hand).
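Since steamvr.vrsettings is plain JSON, it can be inspected programmatically. The sketch below parses an illustrative snippet; the "driver_nokov" section name and its keys are assumptions for demonstration, not the documented schema (and, as noted above, the real file should not be edited by hand).

```python
import json

# Illustrative snippet only: the "driver_nokov" section name and its keys
# are assumptions, not the documented steamvr.vrsettings schema.
sample_vrsettings = """
{
    "steamvr": {
        "requireHmd": false,
        "activateMultipleDrivers": true
    },
    "driver_nokov": {
        "serverAddress": "10.1.1.198"
    }
}
"""

settings = json.loads(sample_vrsettings)
server = settings["driver_nokov"]["serverAddress"]      # driver section
multi = settings["steamvr"]["activateMultipleDrivers"]  # SteamVR section
```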
Local Network Address: The Local Network Address and Server Network Address are used to stream data from the local computer to the target computer through the streaming SDK of our nokov positioning software (XINGYING or AI_Mocap). The Local Network Address must match the network-card sending address configured in XINGYING or AI_Mocap.
In the Config Tool, you can set three tracking modes:
Standard Tracking Mode: In this mode, Nokov's positional data is used together with the head-mounted display's own rotation for more precise tracking; the headset's offset is overlaid with the positioning system's offset.
Metric Driver Only Mode: In this mode, only the position and rotation reported by Nokov's positioning system are used.
Headset Only Tracking Mode: In this mode, only the position and rotation reported by the head-mounted display are used. This mode requires an external positioning system, such as the HTC Vive trackers.
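The difference between the first two modes can be sketched as a simple pose-selection rule. This is a minimal illustration with hypothetical types, not the plugin's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple  # (x, y, z) in meters
    rotation: tuple  # quaternion (w, x, y, z)

def fuse_standard(mocap: Pose, hmd: Pose) -> Pose:
    """Standard Tracking Mode sketch: position from the optical mocap
    system, orientation from the headset's own sensors."""
    return Pose(position=mocap.position, rotation=hmd.rotation)

def mocap_only(mocap: Pose) -> Pose:
    """Metric Driver Only Mode sketch: mocap supplies both channels."""
    return mocap

# Example values (made up): mocap tracks the body in the room,
# the HMD reports only its own orientation.
mocap = Pose((1.0, 1.7, 0.5), (1.0, 0.0, 0.0, 0.0))
hmd = Pose((0.0, 0.0, 0.0), (0.707, 0.0, 0.707, 0.0))
fused = fuse_standard(mocap, hmd)
```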
Attach reflective markers to the headset and both controllers, and create a rigid body for each of them in XINGYING or AI_Mocap. In the Config Tool, set the correct rigid body IDs for the headset and controllers A/B; these three rigid bodies supply the positional data for the VR headset and controllers.
In the Config Tool, enable the switches for the display and controllers, and select the rigid body IDs carefully: choosing the wrong IDs leads to positional errors.
Controllers A and B correspond to the left- and right-hand devices, so take care not to mix them up.
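A small sketch of the ID check implied above: each device must be driven by a distinct rigid body ID, and a duplicate assignment is exactly the positional-error case to avoid. All names here are hypothetical:

```python
# Hypothetical mapping: one rigid body ID per tracked device.
rigid_body_ids = {
    "hmd": 1,
    "controller_a": 2,  # e.g. left hand
    "controller_b": 3,  # e.g. right hand
}

def validate_ids(ids: dict) -> None:
    # Duplicate IDs would drive two devices from the same rigid body --
    # the positional-error case the text warns about.
    if len(set(ids.values())) != len(ids):
        raise ValueError("each device needs a distinct rigid body ID")

validate_ids(rigid_body_ids)  # passes: all IDs are distinct
```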
Left-Right Hand Swap: Swaps the data driving the left and right controllers.
Virtual Display: An option for running without a headset. If no headset (such as an HTC Vive or Pico) is connected, enabling "Virtual Display" simulates headset input so that visuals can be output to the monitor purely from streamed data. If "Virtual Display" is disabled, a physical headset must be used.
Auto Fetch Bone IDs: An alternative to rigid bodies is to check "Auto Fetch Bone IDs." In XINGYING or AI_Mocap, use a human template whose joints include Head, LeftHand, and RightHand; when motion capture data is received, the system automatically fetches these three joints and binds them to the HMD and the left and right controllers.
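Conceptually, Auto Fetch Bone IDs selects the Head, LeftHand, and RightHand joints out of each mocap frame and binds them to the devices. A minimal sketch (joint names from the text; everything else is illustrative):

```python
# Map the skeleton joints named in the text to the devices they drive.
JOINT_TO_DEVICE = {
    "Head": "hmd",
    "LeftHand": "left_controller",
    "RightHand": "right_controller",
}

def bind_devices(frame_joints: dict) -> dict:
    """Pick the three driving joints out of a full mocap frame."""
    return {JOINT_TO_DEVICE[name]: pose
            for name, pose in frame_joints.items()
            if name in JOINT_TO_DEVICE}

# Made-up frame: positions as (x, y, z); extra joints are ignored.
frame = {"Head": (0.0, 1.7, 0.0), "LeftHand": (-0.3, 1.2, 0.2),
         "RightHand": (0.3, 1.2, 0.2), "Hips": (0.0, 1.0, 0.0)}
devices = bind_devices(frame)  # Hips is not bound to any device
```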
Enable Keyboard Input: Toggles the keyboard input feature, which maps keyboard keys to simulate input from regular VR controllers. We also offer our own Metric controller handles; for details, please contact our technical engineers.
Steam VR Room Setup
Upon entering the scene, begin by setting up the room.
Click on the topmost option for Room Setup.
After setting up the room, you can adjust the room scale. For a small room scale, choose "Standing Only."
Here, because the Virtual Hmd option was enabled in the Config Tool, the headset will show as ready once data is received. If this option is not enabled, you must wear the headset.
When calibrating the floor, since motion capture data is used for calibration, you can simply enter your eye height above the ground as the calibration height.
After completion, if the person in the motion capture area faces the positive Z direction, the forward view in VR aligns with that direction.
Open Steam VR and navigate to Devices -> Controller Settings -> Controller -> Test Controller.
In the bottom left corner of the Test Controller interface, set the button mapping to Left Hand. Test the Left Hand controller's A, B, X, Y, and trigger buttons using the Z, X, C, V, B keys on the keyboard. You can also test the Thumbstick button mapping using the W, A, S, D keys.
In the bottom left corner of the Test Controller interface, set the button mapping to Right Hand. Test the Right Hand controller's A, B, X, Y, and trigger buttons using the "N", "M", ",", ".", "/" keys on the keyboard.
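The key-to-button assignments above can be summarized as a lookup table. Note that pairing the keys with the buttons in listed order (e.g. Z -> A, B -> trigger) is an assumption; consult the Test Controller interface for the authoritative mapping:

```python
# Keyboard-to-controller lookup built from the keys listed above.
# The in-order pairing of keys to buttons is an assumption.
LEFT_HAND = {
    "z": "A", "x": "B", "c": "X", "v": "Y", "b": "trigger",
    "w": "thumbstick_up", "a": "thumbstick_left",
    "s": "thumbstick_down", "d": "thumbstick_right",
}
RIGHT_HAND = {"n": "A", "m": "B", ",": "X", ".": "Y", "/": "trigger"}

def lookup(hand: dict, key: str) -> str:
    """Return the simulated controller button for a keyboard key."""
    return hand.get(key.lower(), "unmapped")
```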
The button mappings for moving the HMD and controllers are shown in the following image; moving the HMD can be understood as moving the VR viewpoint. A saved offset takes effect only while SteamVR is running: restarting SteamVR resets it to 0, and the offset must be saved again. Hold the Ctrl key and use the following keys to control the movement:
Using XINGYING motion capture software (creating rigid bodies for positioning):
Open the XINGYING motion capture software and perform calibration, making sure the calibration up axis is the Y-axis. After calibration, attach markers to the HTC headset and both controllers (at least three markers per rigid body is recommended, placed asymmetrically). In real-time mode, create rigid bodies for the headset and controllers while they face the positive Z-axis. Set the XINGYING frame rate to 90, enable SDK broadcasting, enable 3D smoothing and Kalman smoothing, and start playback.
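The recommended XINGYING settings in this step can be expressed as a checklist to verify before starting SteamVR. The key names below are illustrative, not XINGYING's actual configuration API:

```python
# Checklist of the settings recommended above; key names are illustrative.
required = {
    "calibration_up_axis": "Y",
    "frame_rate": 90,
    "sdk_broadcast": True,
    "smoothing_3d": True,
    "kalman_smoothing": True,
}

def missing_settings(current: dict) -> list:
    """Return the names of settings that are absent or mis-set."""
    return [k for k, v in required.items() if current.get(k) != v]

# Example: frame rate is wrong and Kalman smoothing was never set.
current = {"calibration_up_axis": "Y", "frame_rate": 60,
           "sdk_broadcast": True, "smoothing_3d": True}
problems = missing_settings(current)
```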
Before starting Steam VR, configure the parameters in the XINGYING motion capture software, enable SDK broadcasting, create rigid bodies, and start playback.
After configuring the XINGYING motion capture software, start SteamVR (restart it if it is already running). Once started, the icons for the headset and both controllers should light up, along with three "T" icons, indicating that the headset and controllers are connected correctly. The image below shows an HTC Vive connected through the OpenVR plugin as an example.
If Steam crashes or exits abnormally and the system automatically blocks the loading items, unblock them in the settings: click "Manage loading items," then "Unblock all" to cancel the block, and restart SteamVR.
After restarting and successfully connecting the HTC device, when the corresponding SteamVR icons light up normally and the virtual scene displays correctly with the headset on, the motion capture positioning is active and ready to use. The scene may resemble the one below, which is the SteamVR Home scene; it can be disabled in SteamVR's general settings.
Using XINGYING motion capture software (creating a human body for positioning):
Ensure that the Y-axis points upward in XINGYING, and create a 53-point human body model facing the positive Z-axis. When positioning with a human body, be sure to check the "Auto Fetch Bone IDs" option in the Config Tool's advanced settings.
In AI_Mocap, create a Markerless 22-point human body facing the positive Z-axis. The standing pose should have the hands in a gripping controller position.
When the Virtual HMD data access to SteamVR is successful, the status should be as follows:
The HMD indicator light should stay on, and hand models should appear in the Headset View. If the hand models sit too close to the body, adjust the HMD offset using the key mappings to reposition the view of the hand models.
After downloading the "Fruit Ninja VR" game on Steam, create a 53-point human body or rigid bodies in XINGYING, or a 22-point human body in AI_Mocap, for positioning.
Copy the two configuration files from the plugin installation directory ...\nokov\resources\input into the local file directory of Fruit Ninja, replacing the actions.json file there.
Fruit Ninja otherwise reads its default controller mapping, and you will not be able to see the hand controller models. With the files replaced and the data streaming correctly, models of both the left- and right-hand controllers should be visible when you open Fruit Ninja.
With keyboard input enabled, press the Z or N keys to open or close the menu while the game is paused. Pressing X, C, or the right-hand M key closes the pause menu; use the W and S keys to switch between options.
VR Path Setting: Enter cmd in the directory shown in the image below to open a command-line tool, then use vrpathreg.exe to check whether the vrpath is set (the vrpathreg.exe -help command lists the available parameters). In the command line, use the following command to verify that the vrpath is correctly set to the OpenVR driver version. The reason for setting this item specifically is detailed at
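vrpathreg.exe maintains SteamVR's openvrpaths.vrpath file (normally under %LOCALAPPDATA%\openvr). As an alternative check, that file is plain JSON and can be inspected directly. The sketch below parses an illustrative copy with a made-up driver path; the "external_drivers" key follows SteamVR's file format, but treat the rest as an assumption:

```python
import json

# Illustrative content modeled on %LOCALAPPDATA%\openvr\openvrpaths.vrpath;
# the driver path below is made up, not the real install location.
sample = r"""
{
    "external_drivers": [
        "C:\\Program Files\\Nokov\\OpenVR_Nokov_Driver\\nokov"
    ],
    "jsonid": "vrpathreg",
    "version": 1
}
"""

paths = json.loads(sample)

def nokov_registered(cfg: dict) -> bool:
    """True if any registered external driver path mentions nokov."""
    return any("nokov" in p.lower() for p in cfg.get("external_drivers", []))
```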