VNI Lab is developed by the VLESS team, which aims to enrich and reimagine the human-computer interface. Our team concentrates on body-gesture interaction across a wide variety of topics, including sensing and display hardware, touch and stylus input, spatial and augmented reality, and user modeling.
VNI Lab has developed and manufactured 3D cameras (RGB-D sensors based on time-of-flight (ToF) technology) for various companies and markets. Our team has extensive experience in designing 3D image processing algorithms and software, including advanced tracking and gesture recognition algorithms that redefine how people interact with computers.
Our hardware measures depth using ToF technology, an excellent choice for a wide range of indoor applications. It enables fast and efficient 3D imaging by delivering 2D and 3D information simultaneously for each pixel. By integrating as much functionality as possible into one chip and using a standard CMOS process, depth sensing becomes cost-effective. Systems are highly streamlined and economical because the relevant 3D information is available immediately and the depth-map calculation requires only simple algorithms.
- Image: depth QVGA resolution (224 × 172) & color high resolution (2688 × 1520)
- Frame rate: 5–45 fps
- Field of view (horizontal × vertical): 62° × 45°
- Precision: ≤ 1% of distance (0.5–4 m @ 5 fps); ≤ 2% of distance (0.1–1 m @ 45 fps)
- Laser level: Class 1
Time-of-flight (TOF) technology:
ToF technology can be described as radar operating with light: the time a very short light pulse takes to travel to an object and back is measured, allowing the distance to be calculated with great accuracy. A ToF camera is a range-imaging system that resolves distance based on the known speed of light, measuring the time of flight of a light signal between the camera and the subject for each point of the image. Time-of-flight cameras are a class of scannerless LIDAR, in which the entire scene is captured with each laser or light pulse, as opposed to point by point with a laser beam as in scanning LIDAR systems.
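The distance calculation behind the principle above can be sketched in a few lines. This is an illustrative example only, not the camera's firmware; the helper name and the sample timing value are assumptions for demonstration.

```python
# Illustrative sketch of the ToF principle: distance from the measured
# round-trip time of a light pulse, d = c * t / 2 (halved because the
# pulse travels to the object and back).
C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_s: float) -> float:
    """Distance in meters for a pulse whose round trip took round_trip_s seconds."""
    return C * round_trip_s / 2.0

# A pulse returning after ~6.67 ns corresponds to roughly 1 m of range.
print(tof_distance(6.67e-9))
```

In practice the sensor measures this timing per pixel (often via phase shift of modulated light rather than direct pulse timing), which is why a full depth map is produced in a single capture.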
OpenRGBD allows you to naturally and intuitively view point clouds from multiple sensors in multiple viewports – simultaneously! Point clouds can be displayed using configurable color encodings or using a camera’s RGB channel.
The OpenRGBD SDK includes drivers, rich APIs for raw sensor streams, installation documents, and resource materials, giving developers the OpenRGBD capabilities they need to build cross-platform desktop applications.
- Room size: 2 m × 2 m × 2 m
- Number of units: 4
- Input voltage: AC 85–265 V
- Input frequency: 50–60 Hz
- LED power: 50 W
- Brightness adjustment: 1%–100%
- Color temperature adjustment: 2700–6500 K
- Wireless control: GFSK
Meet the creative team that makes everything possible.