Striped.Blue™ offers a comprehensive suite of protocols, authoring tools, visual libraries, and cloud infrastructure for developing and operating any conceivable Universal Video Telemetry application.
Easily set up endpoints for video and data through our user-friendly web interface, incorporate processing and analytics capabilities, design compositions, and add visual components to display your data. Finally, establish a source point for your viewers to connect to. It's that simple!
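These steps can be pictured as a single project definition. The sketch below is purely illustrative; the interface, field names, and URL schemes are assumptions and do not reflect Striped.Blue™'s actual web interface or API.

```typescript
// Hypothetical project definition, for illustration only.
// Field names and URL schemes are assumptions, not the Striped.Blue API.
interface ProjectConfig {
  videoEndpoints: string[];   // ingest points for camera feeds
  dataEndpoints: string[];    // ingest points for telemetry streams
  processing: string[];       // processing and analytics stages applied after ingest
  composition: string;        // layout chosen from the Composition Library
  augmentations: string[];    // visual components from the Augmentation Library
  sourcePoint: string;        // the address viewers connect to
}

const demoProject: ProjectConfig = {
  videoEndpoints: ["ingest://cam-front", "ingest://cam-rear"],
  dataEndpoints: ["ingest://gps", "ingest://engine-bus"],
  processing: ["align", "resample", "event-detection"],
  composition: "front-with-rear-pip",
  augmentations: ["speed-dial", "map-track"],
  sourcePoint: "play://demo-project",
};
```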
Striped.Blue™ handles the complexities of data harmonization, alignment, resampling, and latency management. You concentrate on perfecting your solution.
Universal Video Telemetry is a platform-independent method of combining multiple video sources with distributed telemetry data to create perfectly synchronized video and data streams.
Video and data are transmitted separately to a set of endpoints where they are automatically analyzed, processed, aligned, and resampled. At this stage, presentation and instrumentation layers are added for client-side rendering.
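Conceptually, the result of this stage is a stream of synchronized points on one shared timeline, carrying every video channel and every telemetry channel together. The shape below is a minimal sketch under that assumption; the type and field names are invented for illustration and are not part of the Striped.Blue™ protocol.

```typescript
// Illustrative shape of one synchronized point on the shared timeline.
// All type and field names are assumptions, not the Striped.Blue protocol.
interface TelemetrySample {
  channel: string;   // e.g. "gps.speed" or "engine.rpm"
  value: number;
  timecode: number;  // milliseconds on the shared timeline
}

interface SynchronizedTick {
  timecode: number;                        // shared timeline position
  videoFrames: Map<string, ImageBitmap>;   // one decoded frame per video channel
  telemetry: TelemetrySample[];            // samples aligned and resampled to this timecode
  augmentations: string[];                 // instrumentation layers to render client-side
}
```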
A custom video player renders the instrumentation and presentation layers on the client side, allowing non-destructive playback with vector-graphics overlays that the user can control. The player lets users step through each data point or video frame to view all conditions present at a specific timecode.
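A minimal sketch of that client-side rendering model follows, assuming overlays are drawn on a canvas layered above the video element. The player class and its methods are hypothetical; only the idea of user-controllable, non-destructive overlays comes from the description above.

```typescript
// Minimal sketch: vector-graphics overlays drawn on a separate canvas,
// never burned into the footage. The API shown here is hypothetical.
interface OverlayLayer {
  id: string;
  visible: boolean;
  render(ctx: CanvasRenderingContext2D, timecode: number): void;
}

class UvtPlayer {
  constructor(
    private video: HTMLVideoElement,        // original footage, left untouched
    private overlayCanvas: HTMLCanvasElement,
    private layers: OverlayLayer[],
  ) {}

  // Let the user toggle an overlay without altering the video.
  setLayerVisible(id: string, visible: boolean): void {
    const layer = this.layers.find((l) => l.id === id);
    if (layer) layer.visible = visible;
  }

  // Redraw only the overlay canvas for the current timecode.
  renderOverlays(timecode: number): void {
    const ctx = this.overlayCanvas.getContext("2d");
    if (!ctx) return;
    ctx.clearRect(0, 0, this.overlayCanvas.width, this.overlayCanvas.height);
    for (const layer of this.layers) {
      if (layer.visible) layer.render(ctx, timecode);
    }
  }
}
```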
With Striped.Blue™, you can transmit multiple synchronized video channels and related telemetry data all at once over the wire. Easily switch between channels without any lag or dropped frames.
All video augmentations can be controlled by the client. There are no permanent burn-ins or watermarks added to the original footage. Your footage will always stay unaltered.
The Composition Library for each project consists of several video sources arranged in a particular layout. Video sources can occupy their own space or overlap with other video sources (e.g. Picture-in-Picture). Compositions can be turned on and off conditionally.
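As an illustration, a composition could be described along the following lines. The interface, the normalized coordinates, and the condition callback are assumptions, not the actual Composition Library format.

```typescript
// Hypothetical composition: several sources in their own regions,
// optionally overlapping (Picture-in-Picture), toggled by a condition.
interface SourceRegion {
  sourceId: string;
  x: number; y: number; width: number; height: number;  // normalized 0..1
  zIndex: number;                                        // higher regions overlap lower ones
}

interface Composition {
  name: string;
  regions: SourceRegion[];
  enabledWhen?: (telemetry: Record<string, number>) => boolean;  // conditional on/off
}

const frontWithRearPip: Composition = {
  name: "front-with-rear-pip",
  regions: [
    { sourceId: "cam-front", x: 0, y: 0, width: 1, height: 1, zIndex: 0 },
    { sourceId: "cam-rear", x: 0.72, y: 0.05, width: 0.25, height: 0.25, zIndex: 1 },
  ],
  // e.g. only show the rear camera while reversing
  enabledWhen: (telemetry) => telemetry["gear"] < 0,
};
```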
The Augmentation Library consists of a range of different data augmentations and instrumentation. Choose from a variety of instruments and visualizations, or build your own extensions.
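A custom extension could look roughly like this. The Augmentation interface is an assumption made for illustration; only the idea of pluggable, telemetry-driven instruments comes from the text above.

```typescript
// Sketch of a user-built augmentation. The interface is hypothetical.
interface Augmentation {
  id: string;
  draw(ctx: CanvasRenderingContext2D, telemetry: Record<string, number>): void;
}

const speedDial: Augmentation = {
  id: "speed-dial",
  draw(ctx, telemetry) {
    // Render the current speed as a simple text instrument.
    const speed = telemetry["gps.speed"] ?? 0;
    ctx.font = "24px monospace";
    ctx.fillStyle = "white";
    ctx.fillText(`${speed.toFixed(1)} km/h`, 20, 40);
  },
};
```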
Use a linear scrubber to step through your video data and see all associated telemetry data for each frame. Jump from event to event, and search within your video or dataset.
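Frame-accurate stepping and event-to-event jumps can be sketched as follows; the event structure and helper functions are assumptions, not the scrubber's actual API.

```typescript
// Illustrative helpers for frame stepping and event navigation.
interface TimelineEvent {
  label: string;
  timecode: number;  // milliseconds on the shared timeline
}

// Step exactly one frame forward or backward from the current timecode.
function stepFrame(current: number, frameDurationMs: number, direction: 1 | -1): number {
  return current + direction * frameDurationMs;
}

// Find the next event after the current position on the timeline.
function nextEvent(events: TimelineEvent[], current: number): TimelineEvent | undefined {
  return events
    .filter((e) => e.timecode > current)
    .sort((a, b) => a.timecode - b.timecode)[0];
}
```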
Striped.Blue™ lets you apply any AI feature-extraction algorithm to a video feed and convert the results directly into timeline data, even if the processing takes longer than is typically feasible for live streaming.
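As an illustration, model output could be folded into the timeline roughly as follows. The detection shape and conversion function are assumptions; the feature-extraction model itself is whatever you choose to run, and it may run slower than real time as noted above.

```typescript
// Sketch: converting feature-extraction results into timeline data.
// All names here are assumptions for illustration only.
interface Detection {
  label: string;          // e.g. "person", "crack", "barcode"
  confidence: number;
  frameTimecode: number;  // timecode of the analyzed frame
}

interface TelemetrySample {
  channel: string;
  value: number;
  timecode: number;  // milliseconds on the shared timeline
}

// Each detection becomes a sample on the shared timeline, so it can be
// aligned and rendered like any other telemetry channel.
function detectionsToTimeline(detections: Detection[]): TelemetrySample[] {
  return detections.map((d) => ({
    channel: `vision.${d.label}`,
    value: d.confidence,
    timecode: d.frameTimecode,
  }));
}
```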
An integrated back channel allows you to implement advanced control functions, such as PTZ camera adjustments or control loops.
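A back-channel message might look roughly like this. The command shape, transport, and function name are assumptions; only the existence of a back channel for control functions comes from the text.

```typescript
// Hypothetical back-channel command for PTZ camera control.
interface PtzCommand {
  cameraId: string;
  pan: number;   // degrees
  tilt: number;  // degrees
  zoom: number;  // zoom factor
}

// A back channel could ride on any bidirectional transport,
// e.g. a WebSocket opened alongside the incoming video and telemetry streams.
function sendPtzCommand(socket: WebSocket, cmd: PtzCommand): void {
  socket.send(JSON.stringify({ type: "ptz", payload: cmd }));
}
```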
Depending on your use case and industry, you can use our AWS®-based hosted and managed solution, run Striped.Blue™ in your own private cloud, or have us ship a containerized version for local processing.
Telemetry data and video channels are treated no differently from one another: on Striped.Blue™, both are simply endpoints for further processing.
Thanks to its architectural pass-through approach, Striped.Blue™ is universally compatible with any stack, platform, or device, no matter what technology you use.
Telemetry is the automatic measurement and transmission of data from remote sources. Sensors and other devices collect the data, communication systems transmit it back to a central location, and the data is then analyzed to monitor and/or control the remote system.
Machine Vision gives computerized equipment the ability to see. Cameras capture visual information from the surrounding environment, and a combination of hardware and software processes the images to, for example, count objects, detect abnormalities, or find certain characteristics. Machine Vision often relies on specialized optics (e.g. multi-spectrum cameras) to acquire images.
Universal Video Telemetry combines Machine Vision and Telemetry intelligently. Using a UVT Framework, engineers and scientists can easily build applications in which camera images are analyzed and mixed with telemetry data to create non-destructive, interactive augmented video streams.