HED introduction

HED (an acronym for Hierarchical Event Descriptors) is an evolving framework and structured vocabulary for annotating data, particularly the events within data, to enable data search, extraction, and analysis. Specifically, HED aims to allow researchers to annotate what happened during an experiment, including experimental stimuli and other sensory events, participant responses and actions, experimental design, the role of events in the task, and the temporal structure of the experiment.

The resulting annotation is machine-actionable, meaning that it can be used as input to algorithms without manual intervention. HED facilitates detailed comparisons of data across studies and promotes accurate interpretation of what happened as an experiment unfolds.

Brief history of HED

HED was originally proposed by Nima Bigdely-Shamlo in 2010 to support annotation in HeadIT, an early public repository for EEG data hosted by the Swartz Center for Computational Neuroscience, UCSD (Bigdely-Shamlo et al. 2013). HED has undergone several revisions and substantial infrastructure development since that time.

The BIDS (Brain Imaging Data Structure) standards group incorporated HED as an annotation mechanism in 2019. That same year, work began on a rethinking of the HED vocabulary design, resulting in the release of the third generation of HED in August 2021, which brought a dramatic increase in annotation capacity and a significant simplification of the user experience.

New in HED (versions 8.0.0+), released August 2021:

  1. Improved vocabulary structure

  2. Short-form annotation

  3. Library schema

  4. Definitions

  5. Temporal scope

  6. Encoding of experimental design

See the HED Specification and the HED Documentation Summary for additional details.

Goals of HED

Event annotation documents the things happening during data recording, whether or not they are relevant to data analysis and interpretation. Commonly recorded events in electrophysiological data collection include the initiation, termination, or other features of sensory presentations and participant actions. Other events may be unplanned environmental events (for example, the sudden onset of noise and vibration from construction work unrelated to the experiment, or a laboratory device malfunction), events recording changes in experiment control parameters, data feature events, and control mishap events that cause operation to fall outside of normal experiment parameters. The goals of HED are to provide a standardized annotation system and supporting infrastructure.

Goals of HED.

  1. Document the exact nature of events (sensory, behavioral, environmental, and other) that occur during recorded time series data in order to inform data analysis and interpretation.

  2. Describe the design of the experiment including participant task(s).

  3. Relate event occurrences both to the experiment design and to participant tasks and experience.

  4. Provide basic infrastructure for building and using machine-actionable tools to systematically analyze data associated with recorded events in and across data sets, studies, paradigms, and modalities.

Current systems in neuroimaging experiments do not record events beyond simple numerical (e.g., 3) or text (e.g., Event type: Target) labels, whose more complete and precise meanings are known only to the experimenter(s).

A central goal of HED is to enable building of archives of brain imaging data in a form amenable to large-scale analysis, both within and across studies. Such event-related analysis requires that the nature(s) of the recorded events be specified in a common language.

The HED project seeks to formalize the development of this language, to develop and distribute tools that maximize its ease of use, and to inform new and existing researchers of its purpose and value.

A basic HED annotation

HED annotations are comma-separated lists of tags selected from a hierarchically-organized vocabulary.

A simple HED annotation of presentation of a face image stimulus.

Sensory-event, Experimental-stimulus, (Visual-presentation, (Image, Face, Hair)), (Image, Pathname/f032.bmp), Condition-variable/Famous-face, Condition-variable/Immediate-repeat

The annotation above is a very basic annotation of an event marker representing the presentation of a face image with hair. The event marker represents an experimental stimulus with two experimental conditions Famous-face and Immediate-repeat in effect.
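Because a HED annotation is a comma-separated list in which parenthesized groups bind tags together, splitting it into its top-level elements requires tracking parenthesis nesting. The following is a minimal sketch of that split; real applications should use the hedtools library rather than hand-rolled parsing like this.

```python
def split_top_level(annotation):
    """Split a HED annotation string into its top-level tags and
    parenthesized tag groups, respecting nesting depth."""
    items, depth, current = [], 0, []
    for ch in annotation:
        if ch == "," and depth == 0:
            # Top-level comma: finish the current element.
            items.append("".join(current).strip())
            current = []
        else:
            if ch == "(":
                depth += 1
            elif ch == ")":
                depth -= 1
            current.append(ch)
    if current:
        items.append("".join(current).strip())
    return items

annotation = ("Sensory-event, Experimental-stimulus, "
              "(Visual-presentation, (Image, Face, Hair)), "
              "(Image, Pathname/f032.bmp), "
              "Condition-variable/Famous-face, "
              "Condition-variable/Immediate-repeat")
print(split_top_level(annotation))
```

Note that the two grouped elements, such as `(Image, Pathname/f032.bmp)`, survive as single units, preserving the binding of the path to the image rather than to the event as a whole.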

Because HED has a structured vocabulary, other researchers use the same terms, making it easier to compare experiments. Further, the HED infrastructure supports associating these annotation strings with the actual event markers during processing, allowing tools to locate event markers using experiment-independent strategies.

The annotation in the example uses the most basic strategy for annotating condition variables — just naming the different conditions. However, even this simple strategy allows tools to distinguish among events taken under different task conditions. HED also provides more advanced strategies that allow downstream tools to automatically extract dataset-independent design matrices.
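Even the simple naming strategy lets tools select events by condition. The sketch below filters a list of annotated event markers by a condition-variable name; the event list and its onsets are invented for illustration, and the annotations are kept flat (no groups) so a simple comma split suffices.

```python
# Hypothetical event markers with flat HED annotations (invented data).
events = [
    {"onset": 1.5, "hed": "Sensory-event, Condition-variable/Famous-face"},
    {"onset": 2.0, "hed": "Sensory-event, Condition-variable/Unfamiliar-face"},
    {"onset": 3.2, "hed": "Agent-action, Condition-variable/Famous-face"},
]

def events_with_condition(events, condition):
    """Return the events whose annotation carries the given
    Condition-variable tag."""
    tag = f"Condition-variable/{condition}"
    selected = []
    for e in events:
        # Tokenize on commas; strip spaces and any stray parentheses.
        tokens = [t.strip(" ()") for t in e["hed"].split(",")]
        if tag in tokens:
            selected.append(e)
    return selected

print([e["onset"] for e in events_with_condition(events, "Famous-face")])
```

Matching whole tokens rather than substrings avoids false hits such as `Famous-face` matching inside a longer condition name.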

Every term in the HED structured vocabulary (the HED schema) must be unique, allowing users to annotate with a single word for each tag. Tools can expand each tag into its full path within the HED schema, allowing them to leverage hierarchical relationships during searching.
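Short-to-long expansion can be sketched as a lookup from a tag's first element to its full schema path. The mapping below contains just two entries for illustration (real tools such as hedtools derive the complete mapping from the HED schema itself), and the value-handling convention shown is an assumption of this sketch.

```python
# Toy short-form to long-form mapping -- two entries for illustration only.
# Real tools build this mapping from the full HED schema.
SHORT_TO_LONG = {
    "Sensory-event": "Event/Sensory-event",
    "Visual-presentation":
        "Property/Sensory-property/Sensory-presentation/Visual-presentation",
}

def expand(tag):
    """Expand a short-form tag to its full schema path, keeping any
    trailing value part (e.g. Pathname/f032.bmp) intact."""
    head, sep, value = tag.partition("/")
    long_head = SHORT_TO_LONG.get(head, head)  # unknown tags pass through
    return long_head + sep + value

print(expand("Sensory-event"))
print(expand("Visual-presentation"))
```

Because every schema term is unique, the first path element alone determines the expansion, which is what makes single-word (short-form) annotation unambiguous.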

An equivalent long-form HED annotation of face image stimulus from above.


HED is also extensible, in that most nodes can be extended with more specific terms. HED also permits library schemas, which are specialized vocabularies. HED tools support annotations that seamlessly combine terms from the base schema with terms from specialized, discipline-specific library schemas.

How to get started

The HED annotation quickstart provides a simple step-by-step guide to doing basic HED annotation, while the BIDS annotation quickstart introduces the various types of annotation that should be included in a BIDS (Brain Imaging Data Structure) dataset. The latter tutorial also includes instructions for using the online tools to start the annotation process.