DFX Points Overview

DFX Points are the results produced by the DeepAffex engine after it processes an input such as a video stream or "live" facial blood-flow data extracted from a digital camera. Every individually accessible result on the DeepAffex cloud engine is identified by a DFX Point, which is assigned a unique name/label together with corresponding attributes and characteristics.
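
Conceptually, each DFX Point can be treated as a named record with attributes and a series of values. The sketch below is a hypothetical Python representation for illustration only; the field names, point ID and payload shape are assumptions, not the actual DeepAffex schema.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical illustration of a DFX Point result record; the real
# DeepAffex payload schema and field names may differ.
@dataclass
class DfxPointResult:
    point_id: str                 # unique name/label, e.g. "HR_BPM" (example only)
    group: str                    # functional group, e.g. "Health Analytics"
    units: str                    # units of the reported value(s)
    values: List[float] = field(default_factory=list)       # result time series
    timestamps_ms: List[int] = field(default_factory=list)  # matching timestamps

# Example usage with made-up numbers.
hr = DfxPointResult(point_id="HR_BPM", group="Health Analytics", units="beats/min",
                    values=[64.2, 65.0], timestamps_ms=[0, 1000])
print(hr.point_id, hr.values[-1], hr.units)
```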

A summary of DFX Points (entries), grouped into sections of related functionality, is provided in the attached document.

View DFX Point PDF

DFX Points Terminology and Components

Regions Of Interest (ROI)

An ROI is a programmable, pixel-level geometric outline that is 'virtually' superimposed onto the subject's face to define the coverage area that facial blood-flow (FBF) extraction will target. The DeepAffex SDK extraction library performs simultaneous extractions from multiple facial ROI masks while maintaining synchronization with the video frame timing.

The extracted facial blood-flow (FBF) data is provided to the DeepAffex user under the "Regions Of Interest" (ROI) group of DFX Points. The DeepAffex engine currently supports three categories of ROI masks (a sketch of how a mask can be built from landmark coordinates follows this list):

  • A "default" set of ROI masks generated according to anatomical landmarks of the face
  • User-defined or "configurable" ROI masks, such as combinations of existing ROIs or entirely new geometric shapes
  • An "experimental" set of ROI masks generated systematically from simple patterns such as facial grids

View ROI PDF

Waveforms (Biosignals)

Over the last century, medical researchers have observed and documented modulations of blood flow that appear to carry information relevant to, and correlated with, the subject's physiology. By studying these poorly understood "biosignals" encoded within the human circulatory system, researchers may gain new insights into the subject's underlying physiological state.

One way the DeepAffex engine provides information to the user (researcher) is through the recovery of facial blood-flow (FBF) envelopes. By applying demodulation techniques to these recovered "biosignal" envelopes, the source information may be decoded. The recovered FBF envelopes are provided to the DeepAffex user under the "Waveforms" (Biosignals) group of DFX Points.

Examples of low-frequency (LF) waveforms that may be recovered from FBF data include the Mayer Wave, Traube-Hering Wave and the Thermal Wave.

Examples of cyclical or periodic waveforms that may be routinely recovered from FBF data include the Breathing (respiration cycle) envelope and the Heart Beat envelope.
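
As a rough illustration of how such envelopes can be separated by frequency band, the sketch below band-pass filters a synthetic stand-in for an FBF trace into approximate respiration and cardiac bands; the band edges, frame rate and filter choice are assumptions, not the DeepAffex demodulation method.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass(signal, fs, low_hz, high_hz, order=3):
    """Zero-phase Butterworth band-pass filter."""
    nyq = fs / 2.0
    b, a = butter(order, [low_hz / nyq, high_hz / nyq], btype="band")
    return filtfilt(b, a, signal)

fs = 30.0                                 # assumed camera frame rate (Hz)
t = np.arange(0, 30, 1 / fs)
# Synthetic stand-in for an extracted FBF trace: breathing + heartbeat + noise.
fbf = (0.5 * np.sin(2 * np.pi * 0.25 * t)     # ~15 breaths/min component
       + 0.2 * np.sin(2 * np.pi * 1.2 * t)    # ~72 beats/min component
       + 0.05 * np.random.randn(t.size))

breathing_env = bandpass(fbf, fs, 0.1, 0.5)   # approximate respiration band
cardiac_env   = bandpass(fbf, fs, 0.7, 3.0)   # approximate cardiac band
print(breathing_env.shape, cardiac_env.shape)
```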

Face-Tracking (Subject position)

The DeepAffex SDK extraction library relies on a customer-supplied (third-party) face-tracker (FT) component to locate and dynamically track the subject's facial landmarks and head contour, and it uses the resulting facial point coordinates to position the ROI masks. Note that the FT is not used to recognize or identify a subject.

The face-tracker (FT) data is used anonymously to add value to the extraction process and is provided to the DeepAffex user under the "Face-Tracker" (Subject image tracking) group of DFX Points. For example, the face position (3D coordinates) and simple FT-derived facial expressions are exposed to the user.

We have also implemented powerful machine-trained models that accurately predict the following demographics (a sketch of how BMI follows from height and weight appears after this list):

  • AGE
  • GENDER
  • HEIGHT
  • WEIGHT
  • BMI
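
Of these, BMI is fully determined by height and weight through the standard formula (weight in kilograms divided by height in metres squared). The snippet below simply shows that relationship; it says nothing about how the DeepAffex models estimate the individual quantities.

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body Mass Index = weight (kg) / height (m) squared."""
    return weight_kg / (height_m ** 2)

print(round(bmi(weight_kg=70.0, height_m=1.75), 1))   # 22.9
```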

Health Analytics (Physiology)

The facial blood-flow (FBF) data produced by the SDK extraction is a unique source of data that NuraLogix is harnessing to provide valuable insights and discoveries into human physiology. This growing list of DFX Points represents science-backed implementations that exploit characteristics of the subject's FBF to yield PHYSIOLOGICAL information.

The potential for the extracted FBF data to be utilized for PHYSIOLOGICAL assessment and benefit is provided to the DeepAffex user (researcher) as the "Health Analytics" group of DFX Points. A constantly updated list of processed outputs is sorted by functionality. Examples of current and planned functional groups include:

Signs of Life (SOL) classification:

  • The ability to validate a subject's facial image(s) as having legitimate FBF (Alive) or no FBF (Fake)
  • A spatial representation of the face, rendered as a "heat map", indicating areas of strongest/weakest hemoglobin fluctuations

Blood Pressure (BP) estimation (see the sketch after this list):

  • Systolic BP (mmHg)
  • Diastolic BP (mmHg)
  • Delta Mean-Arterial-Pressure (MAP)
  • Delta Pulse-Pressure (PP)
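
For reference, pulse pressure and mean arterial pressure relate to systolic and diastolic pressure through standard formulas, shown below. This is not the DeepAffex estimation method, and the baseline against which the "Delta" values are reported is not specified here.

```python
def pulse_pressure(sbp_mmHg: float, dbp_mmHg: float) -> float:
    """Pulse pressure: systolic minus diastolic (mmHg)."""
    return sbp_mmHg - dbp_mmHg

def mean_arterial_pressure(sbp_mmHg: float, dbp_mmHg: float) -> float:
    """Common approximation: MAP ≈ DBP + (SBP - DBP) / 3."""
    return dbp_mmHg + (sbp_mmHg - dbp_mmHg) / 3.0

print(pulse_pressure(120, 80))                     # 40 mmHg
print(round(mean_arterial_pressure(120, 80), 1))   # 93.3 mmHg
```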

Heart-Rate-Variability (HRV) scores (see the sketch after this list):

  • 7x Time-Domain (TD) features
  • 8x Frequency-Domain (FD) features
  • 3x Poincaré (PC) features
  • 2x Non-Linear (NL) features
  • 2x Joint Time-Frequency (TF) features
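
The exact feature set is defined by the corresponding DFX Points. As a hedged illustration of what time-domain HRV features typically look like, the sketch below computes two standard ones, SDNN and RMSSD, from a made-up series of RR intervals; it does not reproduce the DeepAffex feature definitions.

```python
import numpy as np

def sdnn(rr_ms: np.ndarray) -> float:
    """Standard deviation of RR (inter-beat) intervals, in ms."""
    return float(np.std(rr_ms, ddof=1))

def rmssd(rr_ms: np.ndarray) -> float:
    """Root mean square of successive RR-interval differences, in ms."""
    return float(np.sqrt(np.mean(np.diff(rr_ms) ** 2)))

rr = np.array([812, 845, 790, 830, 805, 820], dtype=float)  # example RR intervals (ms)
print(round(sdnn(rr), 1), round(rmssd(rr), 1))
```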

Stress Index (SI):

  • RRI (Inter-beat-Interval)
  • Mental Stress Index (MSI)

Heart Beat (HB) (see the sketch after this list):

  • Heart Rate (freq)
  • Heart Rate (count)
  • Heart Band (envelope)
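
To illustrate why heart rate can be reported both as a frequency and as a count, the sketch below estimates it both ways from a synthetic cardiac envelope; the frame rate, peak-detection settings and signal are assumptions, not the DeepAffex implementation.

```python
import numpy as np
from scipy.signal import find_peaks

fs = 30.0                                   # assumed frame rate (Hz)
t = np.arange(0, 20, 1 / fs)
pulse = np.sin(2 * np.pi * 1.1 * t)         # synthetic cardiac envelope, ~66 BPM

# "Freq" estimate: dominant spectral peak of the envelope, converted to BPM.
spectrum = np.abs(np.fft.rfft(pulse))
freqs = np.fft.rfftfreq(pulse.size, d=1 / fs)
hr_freq_bpm = 60.0 * freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin

# "Count" estimate: detected beats divided by the measurement window.
peaks, _ = find_peaks(pulse, distance=fs * 0.4)           # >= 0.4 s between beats
hr_count_bpm = 60.0 * len(peaks) / (t[-1] - t[0])

print(round(hr_freq_bpm, 1), round(hr_count_bpm, 1))
```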

Cardio Risk Index:

  • 10 Year CHD score

Breathing (respiration cycle):

  • Breathing Rate (freq)
  • Breathing Rate (count)

Emotion Analytics (Psychology)

The facial blood-flow (FBF) data produced by the SDK extraction is a unique source of data that NuraLogix is harnessing to provide valuable insights and discoveries into human psychology. This growing list of DFX Points represents science-backed implementations that exploit characteristics of the subject's FBF to yield PSYCHOLOGICAL information.

The potential for the FBF extracted data to be utilized for PSYCHOLOGICAL assessment and benefit is provided to the DeepAffex user (researcher) as the "Emotion Analytics" group of DFX points. A constantly updated list of processed outputs is sorted by functionality. Examples of current and planned functional groups include:

Affinity classification:

  • Indicates whether the subject has a positive or negative affinity to the experimental condition based on facial-blood-flow (FBF) analysis.

Mood classification:

  • Indicates the subject's VALENCE state as POSITIVE, NEUTRAL or NEGATIVE based on FBF analysis.
  • Indicates the subject's AROUSAL level based on FBF analysis.

Emotion classification:

  • Indicates the subject's EMOTIONS (as probability % versus time) based on FBF analysis (see the sketch below).
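
As a hedged sketch of how a downstream application might consume a probability-versus-time output, the snippet below picks the dominant emotion at each timestamp from a hypothetical payload; the labels, timestamps and data shape are illustrative only, not the actual DeepAffex output format.

```python
# Hypothetical emotion-probability time series; made up for illustration.
emotion_probs = {
    0:    {"JOY": 0.55, "NEUTRAL": 0.30, "SADNESS": 0.15},
    1000: {"JOY": 0.20, "NEUTRAL": 0.65, "SADNESS": 0.15},
}

for ts_ms, probs in sorted(emotion_probs.items()):
    label, p = max(probs.items(), key=lambda kv: kv[1])
    print(f"t={ts_ms} ms  dominant emotion: {label} ({p:.0%})")
```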