Media Stream Component

Access to video, audio and data streams from an Encoder.

Detailed Description
The features provided by the SDK are divided between those required to build an end-user application for monitoring multiple media streams and those required to administer the Server, Encoders and user access rights. This section details the features of the SDK required to build a monitoring application.
To access a media stream it is first necessary to connect to a Server using the Server object, which gives access to the available Encoders.
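The sketch below illustrates this connection step. It is illustrative only: the header name and the connect(), getEncoders() and name() calls are placeholder assumptions, not documented signatures; refer to the Server object reference for the actual API.

    // Hypothetical sketch: header name and connect()/getEncoders()/name()
    // are illustrative placeholders, not the SDK's documented signatures.
    #include "EdgeVisSDK.h"   // placeholder header
    #include <iostream>

    int main()
    {
        Server server;                                                   // connection point to the Server
        if (!server.connect("server.example.com", "user", "password"))  // placeholder call
        {
            std::cerr << "Failed to connect to the Server" << std::endl;
            return 1;
        }

        // Each Encoder visible to this user can expose a media stream.
        for (const Encoder& encoder : server.getEncoders())             // placeholder call
            std::cout << "Encoder available: " << encoder.name() << std::endl;

        return 0;
    }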
Media Stream Data
An Encoder can deliver a media stream that may include video, stereo audio and metadata. To access a media stream it is necessary to obtain and open the stream from the Encoder object. This request (assuming no other users are already receiving the media stream) instructs the Encoder to provide the media stream to the Server, which in turn forwards the stream to the client. A media stream is only generated if there is at least one recipient of that stream.
Once opened, the Stream data is returned to the client application through callback interfaces, as sketched below. Access to the contents of the stream will depend on the permissions, configuration and deployment of the Encoder; the possible contents are described in the following sections.
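The following sketch shows the callback pattern only; IStreamListener, MediaStream, YCbCrFrame and AudioData are documented classes, but the callback names, the addListener() registration call and the open() request shown here are assumptions.

    // Hypothetical sketch: callback and registration names are placeholders;
    // only the class names come from the SDK documentation.
    #include "EdgeVisDecoderSDK.h"   // placeholder header
    #include <memory>

    class MyStreamListener : public EdgeVisDecoderSDK::IStreamListener
    {
    public:
        // Invoked for every decoded video frame (placeholder name).
        void onVideoFrame(const EdgeVisDecoderSDK::YCbCrFrame& frame)
        {
            // Render or process the YCbCr 4:2:0 frame here.
        }

        // Invoked for every decoded audio frame (placeholder name).
        void onAudioData(const EdgeVisDecoderSDK::AudioData& audio)
        {
            // Forward the 16-bit PCM samples to the audio output.
        }
    };

    void startViewing(EdgeVisDecoderSDK::MediaStream& stream)
    {
        auto listener = std::make_shared<MyStreamListener>();
        stream.addListener(listener);   // placeholder registration call
        stream.open();                  // placeholder: asks the Encoder to start sending
    }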
Video Frames
The decompressed video frames are provided as raw YCbCr 4:2:0 frames; the SDK includes helper functions to convert them to RGB bitmaps and to save JPEG or BMP images.
Each video frame is uniquely identified by a UTC timestamp corresponding to the time of capture. Additional information supplied with each frame includes a flag indicating that the frame has been assigned to the Encoder frame buffer and an aspect ratio modifier that should be applied as part of the frame render process.
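A sketch of using the conversion helpers follows. RGBFrame, JPEGFrame and the Convert method are documented, but the exact signatures and the SaveToFile() call shown here are assumptions.

    // Hypothetical sketch: Convert()/SaveToFile() signatures are placeholders.
    // The documented intent is that RGBFrame yields an RGB bitmap and JPEGFrame
    // an in-memory JPEG from a decoded YCbCr frame.
    #include "EdgeVisDecoderSDK.h"   // placeholder header

    void snapshot(const EdgeVisDecoderSDK::YCbCrFrame& frame)
    {
        // Convert to an RGB bitmap for on-screen rendering.
        EdgeVisDecoderSDK::RGBFrame rgb;
        rgb.Convert(frame);              // placeholder call; an RGBFormat can be chosen

        // Or compress to an in-memory JPEG and write it to disk.
        EdgeVisDecoderSDK::JPEGFrame jpeg;
        jpeg.Convert(frame);             // placeholder call
        jpeg.SaveToFile("snapshot.jpg"); // placeholder call
    }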
Audio Frames
The audio, if enabled, will be delivered as frames of 16-bit PCM samples. Depending on the Encoder configuration, these frames will contain mono or stereo audio. Each audio frame is uniquely identified by a UTC timestamp corresponding to the time of capture.
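The sketch below shows how an application would typically walk interleaved 16-bit PCM; the samples(), sampleCount() and channels() accessors are placeholders for whatever AudioData actually exposes.

    // Hypothetical sketch: the AudioData accessors used here are placeholders.
    #include "EdgeVisDecoderSDK.h"   // placeholder header
    #include <cstdint>

    void handleAudio(const EdgeVisDecoderSDK::AudioData& audio)
    {
        const int16_t* pcm      = audio.samples();      // placeholder accessor
        const size_t   count    = audio.sampleCount();  // placeholder accessor
        const int      channels = audio.channels();     // 1 = mono, 2 = stereo (placeholder)

        // Stereo frames are interleaved: L0 R0 L1 R1 ...
        for (size_t i = 0; i + channels <= count; i += channels)
        {
            const int16_t left  = pcm[i];
            const int16_t right = (channels == 2) ? pcm[i + 1] : pcm[i];
            // Mix, meter or forward the samples to the audio device here.
            (void)left; (void)right;
        }
    }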
GPS Data
Where a GPS device is attached to the Encoder, geospatial metadata is included in the media stream. Geospatial metadata is treated as a special case with respect to its insertion into the media stream in order to ensure that positional information is matched to the corresponding video and audio.
Pass-Thru Data
Where a third-party device is attached to the Encoder and the serial port has been configured to receive data from that device, the data can be broadcast to viewers in the media stream (one-to-many). The data is received as packets of raw byte arrays for interpretation by the application.
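A sketch of consuming a pass-thru packet follows; the packet is assumed to arrive as a pointer and length, and any parsing is entirely application specific.

    // Hypothetical sketch: the SDK forwards the serial data untouched, so the
    // application supplies its own parser for the attached device's protocol.
    #include <cstdint>
    #include <cstddef>
    #include <vector>

    void handlePassThru(const uint8_t* data, std::size_t size)
    {
        // Copy the raw packet and hand it to the device-specific parser
        // (e.g. a radar, GPS mouse or PTZ controller protocol).
        std::vector<uint8_t> packet(data, data + size);
        // parseDevicePacket(packet);   // application-specific, not part of the SDK
    }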
SCT Data
This allows the application developer to obtain the raw compressed media stream. This stream can be saved to file and can later be passed through a decoder that will deliver all the media contained in the stream.
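A sketch of the save-to-file workflow is shown below; the data() and size() accessors on IndexedSCTData are assumptions, while the workflow itself (append raw compressed blocks to a file for later offline decoding) is the documented one.

    // Hypothetical sketch: data() and size() are placeholder accessors on
    // IndexedSCTData.
    #include "EdgeVisDecoderSDK.h"   // placeholder header
    #include <fstream>

    void appendSCTBlock(const EdgeVisDecoderSDK::IndexedSCTData& block, std::ofstream& out)
    {
        out.write(reinterpret_cast<const char*>(block.data()),   // placeholder accessor
                  static_cast<std::streamsize>(block.size()));   // placeholder accessor
    }

    // Usage: open the file once in binary append mode and call appendSCTBlock()
    // for every block delivered in the SCT data callback.
    // std::ofstream out("recording.sct", std::ios::binary | std::ios::app);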
Sample applications
Full sample applications are available that demonstrate the capabilities of the SDK.
Classes

struct AnalyticsActorPoint
    Describes an analytics actor point.
struct AnalyticsActorInfo
    Describes an analytics actor data object being received.
class EdgeVisDecoderSDK::AnalyticsActorInfo
    Encoders equipped with SafeZone 2D analytics will return an actor when an object is detected.
class EdgeVisDecoderSDK::AnalyticsData
    Encoders equipped with SafeZone 2D analytics will return analytics data when an object is detected.
class EdgeVisDecoderSDK::AudioData
    AudioData is delivered for every frame of audio decoded from a media stream.
class EdgeVisDecoderSDK::CameraPositionData
    If the encoder has camera position data it will return this information as part of the stream data.
class EdgeVisDecoderSDK::CommandData
    A command received by the decoder library.
class EdgeVisDecoderSDK::EncoderMetaData
    An Event received from the Encoder by the decoder library.
class EdgeVisDecoderSDK::EventData
    An Event received from the Encoder by the decoder library.
class EdgeVisDecoderSDK::FrameConfig
    This is the base class for the FrameConfig objects.
class EdgeVisDecoderSDK::GPSData
    If the encoder has live GPS data it will return location information as part of the stream data.
class EdgeVisDecoderSDK::H264Frame
    An H264Frame object provides H.264 (Annex B) data from the MediaStream to the application.
class EdgeVisDecoderSDK::H264FrameConfig
    This is the derived class for the H264FrameConfig objects.
class EdgeVisDecoderSDK::IndexedSCTData
    An Indexed SCT Data block.
class EdgeVisDecoderSDK::IServiceListener
    Defines an interface for a class that listens for incoming state changes from a service.
class EdgeVisDecoderSDK::IStreamListener
    Defines an interface for a class that listens for incoming data from a Stream.
class EdgeVisDecoderSDK::JPEGFrame
    A JPEGFrame is designed for transforming YCbCr colour space frames into an in-memory JPEG frame buffer.
class EdgeVisDecoderSDK::JPEGFrameConfig
    This is the derived class for the JPEGFrameConfig objects.
class EdgeVisDecoderSDK::MediaStream
    A MediaStream is a Stream class that provides additional methods and properties specific to the Media Stream.
class EdgeVisDecoderSDK::OnvifMetadata
    Provides analytics data in ONVIF XML format: https://www.onvif.org/specs/srv/analytics/ONVIF-VideoAnalytics-Service-Spec-v210.pdf.
class EdgeVisDecoderSDK::RGBFrame
    An RGBFrame is designed for transforming YCbCr colour space frames into RGB colour space frames.
class EdgeVisDecoderSDK::SnapshotFrame
    A SnapshotFrame is an extension of YCbCrFrame.
class EdgeVisDecoderSDK::Stream
    The Stream interface represents a service that is available on an Encoder.
class EdgeVisDecoderSDK::StreamData
    This is the base class for the StreamData objects.
class EdgeVisDecoderSDK::VideoData
    The VideoData class extends the StreamData class and is a base class for all types of video data provided by the MediaStream.
class EdgeVisDecoderSDK::YCbCrFrame
    A YCbCrFrame is delivered for every frame of video decoded from a media stream.
Typedefs

typedef struct AnalyticsActorPoint AnalyticsActorPoint
    Describes an analytics actor point.
typedef struct AnalyticsActorInfo AnalyticsActorInfo
    Describes an analytics actor data object being received.
typedef enum AudioFormat AudioFormat
    Audio data format.
typedef enum H264FrameType H264FrameType
    H.264 frame types that may be provided by the MediaStream.
typedef enum EdgeVisDecoderSDK::IndexedPacketType EdgeVisDecoderSDK::IndexedPacketType
    Contains the type of IndexedSCTData.
typedef enum RGBFormat RGBFormat
    RGB formats supported by the RGBFrame Convert method.
typedef enum StreamStatus StreamStatus
    Stream status information types.
typedef enum StreamTimestampContext StreamTimestampContext
    Stream timestamp context.
typedef enum EdgeVisDecoderSDK::StreamType EdgeVisDecoderSDK::StreamType
    Defines the type of stream.
typedef enum VideoCodecMode VideoCodecMode
    These are used by the EncoderConfig class to represent the codec modes available.
typedef enum VideoSourceFormat VideoSourceFormat
    The type of video source in use at the encoder.
Enumerations |
Typedef Documentation
typedef struct AnalyticsActorPoint AnalyticsActorPoint
Describes an analytics actor point.
The points represent the track of the actor across the scene and will update as the actor moves.
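As an illustration only (the member names below are assumptions, not the documented fields), an application might redraw the actor's track each time updated points arrive:

    // Hypothetical sketch: pointCount, points, x and y are placeholder members;
    // consult the AnalyticsActorInfo/AnalyticsActorPoint definitions for the
    // real field names.
    void drawActorTrack(const AnalyticsActorInfo& actor)
    {
        for (int i = 0; i < actor.pointCount; ++i)           // placeholder member
        {
            const AnalyticsActorPoint& p = actor.points[i];  // placeholder member
            // drawMarker(p.x, p.y);                         // placeholder members
        }
    }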
typedef struct AnalyticsActorInfo AnalyticsActorInfo
Describes an analytics actor data object being received.
Enumeration Type Documentation
enum H264FrameType
H.264 frame types that may be provided by the MediaStream.

enum EdgeVisDecoderSDK::IndexedPacketType
Contains the type of IndexedSCTData.

Enumerator
    IPT_Corrupt     Frame is corrupt.
    IPT_Frame       Normal frame of data.
    IPT_KeyFrame    A key frame.
enum RGBFormat
RGB formats supported by the RGBFrame Convert method.

enum StreamStatus
Stream status information types.

enum EdgeVisDecoderSDK::StreamType
Defines the type of stream.

enum VideoCodecMode
These are used by the EncoderConfig class to represent the codec modes available.

enum VideoSourceFormat
The type of video source in use at the encoder.