EP3160347B1 - System and methods for the synchronization of a non-real time operating system PC to a remote real-time data collecting microcontroller - Google Patents
System and methods for the synchronization of a non-real time operating system PC to a remote real-time data collecting microcontroller
- Publication number
- EP3160347B1 EP3160347B1 EP15814892.4A EP15814892A EP3160347B1 EP 3160347 B1 EP3160347 B1 EP 3160347B1 EP 15814892 A EP15814892 A EP 15814892A EP 3160347 B1 EP3160347 B1 EP 3160347B1
- Authority
- EP
- European Patent Office
- Prior art keywords
- subject
- receiver
- transceiver
- time
- microcontroller
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Not-in-force
Links
- 238000000034 method Methods 0.000 title description 6
- 230000004044 response Effects 0.000 claims description 45
- 230000035484 reaction time Effects 0.000 claims description 30
- 230000000295 complement effect Effects 0.000 claims description 10
- 230000005540 biological transmission Effects 0.000 claims description 9
- 230000001747 exhibiting effect Effects 0.000 claims description 8
- 230000005236 sound signal Effects 0.000 claims description 4
- 230000000007 visual effect Effects 0.000 description 25
- 239000000523 sample Substances 0.000 description 20
- 238000013480 data collection Methods 0.000 description 16
- 230000001360 synchronised effect Effects 0.000 description 11
- 238000012360 testing method Methods 0.000 description 11
- 230000006870 function Effects 0.000 description 9
- 230000001953 sensory effect Effects 0.000 description 8
- 230000001149 cognitive effect Effects 0.000 description 7
- 210000003128 head Anatomy 0.000 description 7
- 230000002093 peripheral effect Effects 0.000 description 7
- 230000002123 temporal effect Effects 0.000 description 7
- 230000033001 locomotion Effects 0.000 description 6
- 238000012544 monitoring process Methods 0.000 description 6
- 230000000638 stimulation Effects 0.000 description 6
- 238000010586 diagram Methods 0.000 description 5
- 238000005259 measurement Methods 0.000 description 5
- 238000012545 processing Methods 0.000 description 5
- 241001422033 Thestylus Species 0.000 description 4
- 230000008901 benefit Effects 0.000 description 4
- 239000000090 biomarker Substances 0.000 description 4
- 210000004556 brain Anatomy 0.000 description 4
- 230000036995 brain health Effects 0.000 description 4
- 230000000694 effects Effects 0.000 description 4
- 239000003550 marker Substances 0.000 description 4
- 210000004761 scalp Anatomy 0.000 description 4
- 230000009471 action Effects 0.000 description 3
- 230000003925 brain function Effects 0.000 description 3
- 238000001514 detection method Methods 0.000 description 3
- 238000003745 diagnosis Methods 0.000 description 3
- 230000003993 interaction Effects 0.000 description 3
- 238000002560 therapeutic procedure Methods 0.000 description 3
- XUIMIQQOPSSXEZ-UHFFFAOYSA-N Silicon Chemical compound [Si] XUIMIQQOPSSXEZ-UHFFFAOYSA-N 0.000 description 2
- 238000004458 analytical method Methods 0.000 description 2
- 238000013459 approach Methods 0.000 description 2
- QVGXLLKOCUKJST-UHFFFAOYSA-N atomic oxygen Chemical compound [O] QVGXLLKOCUKJST-UHFFFAOYSA-N 0.000 description 2
- 210000000133 brain stem Anatomy 0.000 description 2
- 239000000872 buffer Substances 0.000 description 2
- 238000004891 communication Methods 0.000 description 2
- 150000001875 compounds Chemical class 0.000 description 2
- 201000010099 disease Diseases 0.000 description 2
- 208000037265 diseases, disorders, signs and symptoms Diseases 0.000 description 2
- 230000000193 eyeblink Effects 0.000 description 2
- 230000036541 health Effects 0.000 description 2
- 238000004519 manufacturing process Methods 0.000 description 2
- 210000000653 nervous system Anatomy 0.000 description 2
- 230000003287 optical effect Effects 0.000 description 2
- 229910052760 oxygen Inorganic materials 0.000 description 2
- 239000001301 oxygen Substances 0.000 description 2
- 239000008177 pharmaceutical agent Substances 0.000 description 2
- 230000008569 process Effects 0.000 description 2
- 238000003908 quality control method Methods 0.000 description 2
- 230000009467 reduction Effects 0.000 description 2
- 239000004065 semiconductor Substances 0.000 description 2
- 230000011664 signaling Effects 0.000 description 2
- 229910052710 silicon Inorganic materials 0.000 description 2
- 239000010703 silicon Substances 0.000 description 2
- 230000003595 spectral effect Effects 0.000 description 2
- 238000001228 spectrum Methods 0.000 description 2
- 238000012546 transfer Methods 0.000 description 2
- 241000282412 Homo Species 0.000 description 1
- 208000027418 Wounds and injury Diseases 0.000 description 1
- 239000008186 active pharmaceutical agent Substances 0.000 description 1
- 230000032683 aging Effects 0.000 description 1
- 210000003926 auditory cortex Anatomy 0.000 description 1
- 230000006399 behavior Effects 0.000 description 1
- 230000008827 biological function Effects 0.000 description 1
- 229960000074 biopharmaceutical Drugs 0.000 description 1
- 230000005978 brain dysfunction Effects 0.000 description 1
- 208000029028 brain injury Diseases 0.000 description 1
- 210000003169 central nervous system Anatomy 0.000 description 1
- 235000019506 cigar Nutrition 0.000 description 1
- 230000019771 cognition Effects 0.000 description 1
- 230000036992 cognitive tasks Effects 0.000 description 1
- 230000006378 damage Effects 0.000 description 1
- 230000001419 dependent effect Effects 0.000 description 1
- 230000000994 depressogenic effect Effects 0.000 description 1
- 230000008034 disappearance Effects 0.000 description 1
- 238000002651 drug therapy Methods 0.000 description 1
- 230000002526 effect on cardiovascular system Effects 0.000 description 1
- 238000005516 engineering process Methods 0.000 description 1
- 238000011156 evaluation Methods 0.000 description 1
- 230000000763 evoking effect Effects 0.000 description 1
- 230000004424 eye movement Effects 0.000 description 1
- 238000002329 infrared spectrum Methods 0.000 description 1
- 230000000977 initiatory effect Effects 0.000 description 1
- 208000014674 injury Diseases 0.000 description 1
- 238000002955 isolation Methods 0.000 description 1
- 210000003127 knee Anatomy 0.000 description 1
- 238000013160 medical therapy Methods 0.000 description 1
- 230000007721 medicinal effect Effects 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 230000009022 nonlinear effect Effects 0.000 description 1
- 238000005457 optimization Methods 0.000 description 1
- 230000005693 optoelectronics Effects 0.000 description 1
- 230000037368 penetrate the skin Effects 0.000 description 1
- 230000000737 periodic effect Effects 0.000 description 1
- 230000035790 physiological processes and functions Effects 0.000 description 1
- 230000006461 physiological response Effects 0.000 description 1
- 238000002106 pulse oximetry Methods 0.000 description 1
- 238000011084 recovery Methods 0.000 description 1
- 230000011514 reflex Effects 0.000 description 1
- 230000001105 regulatory effect Effects 0.000 description 1
- 230000000241 respiratory effect Effects 0.000 description 1
- 210000003625 skull Anatomy 0.000 description 1
- 238000003860 storage Methods 0.000 description 1
- 239000012536 storage buffer Substances 0.000 description 1
- 239000000126 substance Substances 0.000 description 1
- 210000001519 tissue Anatomy 0.000 description 1
- 230000009466 transformation Effects 0.000 description 1
- 238000002604 ultrasonography Methods 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/04—Generating or distributing clock signals or signals derived directly therefrom
- G06F1/14—Time supervision arrangements, e.g. real time clock
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09C—CIPHERING OR DECIPHERING APPARATUS FOR CRYPTOGRAPHIC OR OTHER PURPOSES INVOLVING THE NEED FOR SECRECY
- G09C1/00—Apparatus or methods whereby a given sequence of signs, e.g. an intelligible text, is transformed into an unintelligible sequence of signs by transposing the signs or groups of signs or by replacing them by others according to a predetermined system
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0002—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
- A61B5/0004—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by the type of physiological signal transmitted
- A61B5/0006—ECG or EEG signals
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0002—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
- A61B5/0015—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by features of the telemetry system
- A61B5/0017—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by features of the telemetry system transmitting optical signals
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0002—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
- A61B5/0015—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by features of the telemetry system
- A61B5/002—Monitoring the patient using a local or closed circuit, e.g. in a room or building
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/024—Detecting, measuring or recording pulse rate or heart rate
- A61B5/02405—Determining heart rate variability
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/05—Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
- A61B5/053—Measuring electrical impedance or conductance of a portion of the body
- A61B5/0531—Measuring skin impedance
- A61B5/0533—Measuring galvanic skin response
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/117—Identification of persons
- A61B5/1171—Identification of persons based on the shapes or appearances of their bodies or parts thereof
- A61B5/1176—Recognition of faces
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/12—Audiometering
- A61B5/121—Audiometering evaluating hearing capacity
- A61B5/125—Audiometering evaluating hearing capacity objective methods
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
- A61B5/162—Testing reaction times
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
- A61B5/163—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state by tracking eye movement, gaze, or pupil change
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
- A61B5/167—Personality evaluation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
- A61B5/168—Evaluating attention deficit, hyperactivity
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/25—Bioelectric electrodes therefor
- A61B5/279—Bioelectric electrodes therefor specially adapted for particular uses
- A61B5/291—Bioelectric electrodes therefor specially adapted for particular uses for electroencephalography [EEG]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/316—Modalities, i.e. specific diagnostic methods
- A61B5/369—Electroencephalography [EEG]
- A61B5/375—Electroencephalography [EEG] using biofeedback
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/316—Modalities, i.e. specific diagnostic methods
- A61B5/369—Electroencephalography [EEG]
- A61B5/377—Electroencephalography [EEG] using evoked responses
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4803—Speech analysis specially adapted for diagnostic purposes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4884—Other medical applications inducing physiological or psychological stress, e.g. applications for stress testing
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6802—Sensor mounted on worn items
- A61B5/6803—Head-worn items, e.g. helmets, masks, headphones or goggles
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/7405—Details of notification to user or communication with user or patient ; user input means using sound
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/7475—User input or interface means, e.g. keyboard, pointing device, joystick
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/20—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/12—Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/70—Services for machine-to-machine communication [M2M] or machine type communication [MTC]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/316—Modalities, i.e. specific diagnostic methods
- A61B5/318—Heart-related electrical modalities, e.g. electrocardiography [ECG]
- A61B5/339—Displays specially adapted therefor
Definitions
- the invention relates to hardware and systems to synchronize the presentation of stimuli and probes on a commercial PC running a standard but non-real-time operating system such as Windows, Linux, Android, or iOS with the precision clock of a real-time data acquisition embedded microcontroller.
- the function of a data collection engine is to collect data from each of the biosensors and human interface devices (HID), including keystrokes and mouse/touchpad/touch screen events, and to provide synchronous time-stamping of the data streams.
- the data collection engine will also communicate with the REM to receive data packet stream (e.g. via Bluetooth or Wi-Fi) and send configuration commands and stimulus data (e.g. audio files).
- the data collection engine will also provide timestamps for all output stimuli (video display). This module must have the highest priority of any program running on the device to ensure precise and accurate time-stamping of all incoming and outgoing data.
- the interval between a mouse click, a keyboard switch contact, or a touchpad event and the software-assigned timestamp must be the same for each event of that type.
- the software-generated timestamp of a mouse click must align precisely with the EEG data-sample timestamps.
- the timestamp for a stimulus (such as display of an object on the video screen) must be accurately synchronized to input data streams. This allows identification of the exact EEG data sample that occurred during another input event (e.g. a mouse click) or a stimulus event (e.g. a video display event).
- a computer server that collects biosensor data provides a ring buffer for client access functions as well as a storage buffer in volatile memory or encrypted flat files.
- a data collection engine leverages the security/data encryption module to store data locally or retain it in volatile memory (RAM), and transmits encrypted data to a cloud API using a cloud communications module.
- the I/O module will also allow other client software components access to the data stream.
- the server will also allow external devices to connect to the ring buffer for read only access.
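As an illustration of the ring-buffer idea described above, the following C sketch shows one way a timestamped sample store with read-only client access might be organized. The record layout, capacity, and function names are assumptions made for illustration only; the patent does not specify an implementation.

```c
#include <stdint.h>

/* Hypothetical sample record: one biosensor or HID event plus its timestamp. */
typedef struct {
    uint64_t timestamp_us;   /* timestamp assigned by the data collection engine */
    uint16_t stream_id;      /* e.g. EEG channel, mouse, keyboard, audio stimulus */
    int32_t  value;          /* raw sample or event code */
} sample_t;

#define RING_CAPACITY 4096u  /* illustrative size only */

typedef struct {
    sample_t buf[RING_CAPACITY];
    uint32_t head;           /* next write position */
    uint32_t count;          /* number of valid samples, saturates at capacity */
} ring_buffer_t;

/* Writer side: the data collection engine pushes every incoming sample. */
void ring_push(ring_buffer_t *rb, const sample_t *s)
{
    rb->buf[rb->head] = *s;
    rb->head = (rb->head + 1u) % RING_CAPACITY;
    if (rb->count < RING_CAPACITY)
        rb->count++;
}

/* Reader side: clients and external devices copy out the newest n samples
 * without modifying the buffer (read-only access). */
uint32_t ring_read_latest(const ring_buffer_t *rb, sample_t *out, uint32_t n)
{
    if (n > rb->count)
        n = rb->count;
    for (uint32_t i = 0; i < n; i++) {
        uint32_t idx = (rb->head + RING_CAPACITY - n + i) % RING_CAPACITY;
        out[i] = rb->buf[idx];
    }
    return n;
}
```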
- Level 1: Timestamp precision of better than 10 milliseconds. Most if not all functions will be at this level. For example, the time-stamping precision between keyboard events and EEG data-sample timestamps needs to be less than 10 milliseconds. Specifically, a keyboard event will be time-stamped within 10 milliseconds of the EEG data sample with which it occurred.
- Level 2 Timestamp precision within or less than one millisecond. This level of precision is reserved for two fundamental tasks: an audio evoked response potential (ERP) task and two reaction time tasks.
- a data sample timestamp must be synchronized across input streams within 1 millisecond.
- in the audio ERP task, the start of an audio tone must occur and be time-stamped within 1 millisecond of the EEG data sample with which it co-occurred.
- a second example is the reaction time task.
- a mouse button press, keyboard press, or touch screen event must be time-stamped with a precision of 1 millisecond relative to the onset of an image on the screen. This level of precision is critical because human reaction times are measured on the millisecond or sub-millisecond scale.
- the invention includes systems that synchronize the presentation of stimuli and probes on a commercial PC with a standard but non-real-time operating system such as Windows, Linux, Android, or iOS with the precision clock in a real-time data acquisition embedded microcontroller.
- Paired emitter/receivers and encoding software implemented on the processing device may also be provided to precisely synchronize in time an operating system of the processing device with a real-time environment set by a real-time clock of the processing device receiving biosensor outputs at inputs thereof.
- An exemplary embodiment of the system synchronizes a PC exhibiting latency of operations to a biosensor enabled microcontroller with real-time clock by providing an encoding scheme that captures the subject's absolute reaction time and transmits the subject's reaction time from the PC exhibiting latency to the microcontroller with real-time clock.
- the system includes a transmitter that transmits a stimulus signal from the PC exhibiting latency, an input device indicating the subject's response to the stimulus signal, an encoding circuit adapted to encode a difference in time between the stimulus signal and the subject's response to the stimulus signal, an emitter adapted to transmit the encoded difference signal representing the subject's reaction time, and a complementary receiver adapted to detect the encoded difference signal.
- the receiver includes a decoding circuit that decodes the encoded difference signal to determine the subject's reaction time, and the receiver provides the subject's reaction time to the microcontroller with real-time clock for synchronization with received biosensor data such as EEG data.
- the emitter comprises a visible LED, an ultrasonic transducer, an infrared (IR) LED, an audible speaker, an audible transducer, a Bluetooth transmitter/transceiver, a Wi-Fi transmitter/transceiver, a ZigBee transmitter/transceiver, or an AM or FM transmitter/transceiver.
- the complementary receiver comprises a visible photodiode, a visible phototransistor, an ultrasonic receiver/microphone, an infrared (IR) photodiode, an infrared phototransistor, an audible microphone, a Bluetooth receiver/transceiver, a Wi-Fi receiver/transceiver, a ZigBee receiver/transceiver, an AM receiver/transceiver, or an FM receiver/transceiver.
- the input device may be a mouse that provides an input to the encoding circuit.
- the encoding circuit is also responsive to the stimulus signal from the transmitter and encodes the difference signal as an on/off keyed wireless signal that is provided to the emitter for wireless transmission to the complementary receiver.
- the encoding circuit also may encode a left mouse click as one keyed pulse and a right mouse click as two keyed pulses.
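A minimal C sketch of the encoding behavior just described, as it might run on the encoding circuit's microcontroller: the gate line goes high at the stimulus fiducial, low at the subject's response (so the pulse width carries the reaction time), and a right click is distinguished by extra keyed marker pulses. All function names, the pin number, and the pulse timings are assumptions, not details taken from the patent.

```c
#include <stdint.h>
#include <stdbool.h>

/* Hypothetical hardware-access helpers; the actual registers depend on the
 * microcontroller chosen (e.g. an MSP430 or ARM Cortex part). */
extern void gpio_write(int pin, bool level);    /* drives the OOK gate line        */
extern bool comparator_triggered(void);         /* light sensor above threshold    */
extern bool left_button_pressed(void);
extern bool right_button_pressed(void);
extern void delay_us(uint32_t us);

#define OOK_GATE_PIN 1   /* illustrative pin number */

/* One stimulus/response cycle: the pulse width on OOK_GATE_PIN encodes the
 * reaction time; a right click is marked by two short extra pulses afterwards
 * (one possible arrangement of the marker pulses). */
void encode_reaction_time_once(void)
{
    while (!comparator_triggered())     /* wait for the visual stimulus fiducial */
        ;
    gpio_write(OOK_GATE_PIN, true);     /* leading edge = stimulus onset */

    bool right = false;
    for (;;) {                          /* wait for the subject's response */
        if (left_button_pressed())  { right = false; break; }
        if (right_button_pressed()) { right = true;  break; }
    }
    gpio_write(OOK_GATE_PIN, false);    /* trailing edge = response time */

    if (right) {                        /* append marker pulses meaning "right click" */
        for (int i = 0; i < 2; i++) {
            delay_us(500);
            gpio_write(OOK_GATE_PIN, true);
            delay_us(200);
            gpio_write(OOK_GATE_PIN, false);
        }
    }
}
```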
- the complementary receiver is located at an EEG headset of the subject.
- the EEG headset includes the decoding circuit and the microcontroller for synchronizing the subject's reaction time to EEG data collected by the EEG headset.
- the emitter may comprise an audible speaker and the receiver may comprise an earbud of the subject, where the earbud provides received sound signals to the decoding circuit.
- the EEG headset also may be adapted to include a finger tap input as the input device.
- the input device may be a stylus and the PC exhibiting latency may be a tablet PC.
- the systems and methods of the invention comprise multiple transducers to both stimulate and record the physiological response of the brain and the body in order to assess its health and function.
- Central to the system is the ability to directly record brainwave activity from an electrode placed non-invasively on or near the scalp.
- additional information on brain health and function can be derived from transducers that measure position and motion, temperature, cardiovascular properties like heart rate, heart rate variability, and arterial oxygen, as well as cognitive information, speech, eye movement, and surface skin conductance to name a few non-limiting additional biological signal measurement data stream examples. It is often necessary to bring the system to the human subject, getting out of the hospital or doctor's office and enabling data collection in the home or sports field or combat theater, thus providing accessibility to the brain health and function assessment from a lightweight and portable form factor. Moreover, it would be advantageous to have a minimal cost associated with the system so that it can be used around the globe to help those in need of brain health and function assessments.
- a solution to these problems includes the creation of a system of body worn or body proximal electronic modules (EMs) or reusable electronic modules (REMs) with the ability to both record biological signal measurement data streams (biosensor data) as well as present stimuli to the human subject in the form of various sensory and cognitive challenges and tasks.
- one such electronic module (EM) or reusable electronic module (REM) can be placed in the vicinity of the head and be either reused over and over if it does not touch the human body or disposed of if it comes in direct contact with the human body.
- a peripheral computer system (typically laptop or tablet PC but includes smartphone and other intermediate form factors) is used to administer a human subject's brain health assessment including various biosensors to record physiological parameters and streams of biosensor data. It can also include temporal measures of a subject's reaction time when presented with a sensory stimulus (i.e. video, acoustic) and records the subject's response via the computer or other input device (i.e. mouse, stylus, keyboard, microphone, accelerometer, etc.) as it measures the reaction time between stimulus presentation and the subject's response via the peripheral computer's clock.
- the precision and repeatability of the reaction time measurement is typically dependent on a commercial multi-tasking operating system which can introduce timing errors due to software latency and timing jitter (e.g. Microsoft Windows, Linux, Google Android, or Apple iOS).
- the present invention describes a low-cost and simple to implement electronic hardware solution which can attach to the peripheral computer system.
- the invention provides real-time time-stamping between the patient's stimulus, the patient's response and other biosensor data that is streaming from the human subject to an embedded microcontroller with a real-time clock, capable of synchronizing the various data packets at a much higher rate (less than 1 millisecond and perhaps as fast as 0.1 microsecond) and greater temporal precision than a commercial multi-tasking OS not designed as a real-time operating system (RTOS).
- the real-time hardware system in an exemplary embodiment includes a sensor that detects when a stimulus is generated by the computer system or presented to the subject, a second sensor to sense the subject's response, and a microcontroller to precisely record the response time with microsecond resolution.
- the invention provides a synchronization signal to the biosensor measurement system enabling time-locking of the biosensor data to the stimulus/response event in the peripheral PC.
- the present invention is a solution which can temporally synchronize the probes and stimuli on the laptop PC with the biosensor system using various encoding schemes.
- One embodiment of the invention is illustrated in Figure 1.
- the real-time clock of the microcontroller could keep microsecond precision (or better) between the various biosensor data streams.
- popular microcontrollers such as the Texas Instruments MSP430 series or the ARM Cortex series could be utilized. What is difficult to do is to synchronize the probe presentation on the commercial PC with the real-time clock of the embedded microcontroller.
- In FIG. 1, a laptop or tablet PC with visual and audio display probes 420 is connected via Bluetooth module 1 (434) to Bluetooth module 2 (432).
- the various sensors 422 are combined onto one I2C, or alternatively SPI, digital bus 426 to provide the biosensor data to the microcontroller with real-time clock 430.
- the biosensor enabled microcontroller with real time clock 430 is interfaced to Bluetooth module 2 (432) via a UART interface, which completes the loop to Bluetooth radio 1 (434), a part of the PC or tablet 420.
- the PC sound card could emit a short 20 millisecond burst of sine waves of various frequencies.
- the first burst at the first second could be 1010 Hz so that 20 periods could be broadcast in the short 20 millisecond transmission.
- at the next second, the frequency could be 1020 Hz; the third emission could be 1030 Hz, etc.
- a microphone sensor attached to the embedded microprocessor could be used to precisely adjust for latency in the variable PC operating system.
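To make the audio fiducial scheme concrete, the sketch below (C, assuming a 44.1 kHz sound-card sample rate and 16-bit PCM, neither of which is specified in the patent) fills a buffer with one 20 ms burst whose frequency steps from 1010 Hz upward by 10 Hz per burst, as in the example above.

```c
#include <math.h>
#include <stdint.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

#define SAMPLE_RATE   44100          /* typical PC sound-card rate (assumption) */
#define BURST_MS      20
#define BURST_SAMPLES (SAMPLE_RATE * BURST_MS / 1000)

/* Fill 'out' with a 20 ms sine burst whose frequency encodes the burst index:
 * burst 0 -> 1010 Hz, burst 1 -> 1020 Hz, burst 2 -> 1030 Hz, and so on. */
void make_sync_burst(int16_t out[BURST_SAMPLES], unsigned burst_index)
{
    double freq_hz = 1010.0 + 10.0 * burst_index;
    for (int n = 0; n < BURST_SAMPLES; n++) {
        double t = (double)n / SAMPLE_RATE;
        out[n] = (int16_t)(32767.0 * 0.5 * sin(2.0 * M_PI * freq_hz * t));
    }
}
```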
- information transfer from the peripheral PC to the microcontroller will require energy to be transmitted from the PC to the microcontroller.
- This energy can be in the form of an electrical signal if hardwired, or alternately in the form of light (photons), sound waves, or radio transmission (RF).
- Other forms of energy can be contemplated as well.
- Sub-forms can further be segmented as well, such as photons that can be visible or UV or infrared. In the case of sound waves, they can be audible to a human or ultrasonic.
- An alternate embodiment can be seen in Figure 2, where one could use any one of the emitters 522 attached to PC 520 to emit energy which would be detected by sensor 526 directly connected to the biosensor-enabled microprocessor/microcontroller 530 with a real-time clock and operating system.
- an LED attached to the PC 520 acts as the emitter and a photodiode or other light receiver is attached to the embedded microcontroller 530.
- the light based encoding scheme could modulate fiducial signals from the PC to a specially included sensor connected to the inputs of the microprocessor 530, thereby independently and precisely measuring the relative position in time of the probe/stimuli on the PC 520 with the recorded biosensor signals. As light travels much faster than sound, this would be even more precise but is not necessarily required.
- Alternate embodiments are self-apparent and include use of ultrasound waves via an ultrasonic transducer in the 20-50 KHz range which is inaudible to humans but easily broadcast and measured with modern emitters and receivers. Small emitter devices could be plugged into the USB, headphone output and other analog and digital outputs of the PC 520 which are then coupled to particular biosensors included in the microprocessor 530 in order to temporally synchronize the PC with the recording biosensor array.
- Alternate embodiments include the use of an infra-red LED (wavelengths shorter than the eye can see) with an appropriate IR photodiode to receive the transmitted light, audible sounds and an audible microphone, or even a first Bluetooth radio 1 and a second Bluetooth radio 2.
- this paired emitter-receiver approach can be generalized by the inclusion of Transceiver 1 (532) attached to the PC 520 with latency and Transceiver 2 (534) attached to the real-time embedded microprocessor 530.
- the hardware for a visual stimulus and mouse click response computer system includes a light sensor 566 (i.e. photodiode, photo transistor, photocell, photo-resistor) which attaches to the computer's screen 570 (i.e. LCD, touchscreen).
- the light sensor 566 is positioned in the corner of the screen such that it does not block the main view of the computer's screen.
- the light sensor 566 could be attached by means of a suction-cup, clip, tape or other means.
- the light sensor 566 would be shrouded to prevent stray room light from shining on the detector to ensure the sensor is only detecting light from the computer display.
- a small dot 568 is also simultaneously illuminated on the computer display 570 under the light sensor 566.
- the light sensor signal 580 would be amplified by amplifier 582 ( Figure 6 ) and pulse shaped by a voltage comparator 584.
- a reference voltage 583 sets the light intensity threshold. This reference could be automatically adjusted via an analog output from the microcontroller 586.
- the light sensor 580 and voltage comparator 584 circuitry will present a precision trigger signal to the microcontroller 586 thereby signaling the start-time of the visual stimulus.
- the microcontroller would immediately activate a digital output 588 logic high signaling the start of the stimulus.
- Wireless transmitter 590 then provides the digital output 588 to antenna 592 for transmission.
- a customized mouse 572 ( Figure 4 ) is used.
- the microcontroller would monitor the switch contact 600 ( Figure 7 ) of the computer mouse. When the subject responds to the stimulus by clicking the mouse button, the microcontroller would reset the digital output 588 to logic low to signal the precise time the patient's reaction response occurred.
- the pulse width of the digital output 610 ( Figure 8 ) would represent the subject's reaction time and the leading edge 612 ( Figure 8 ) would indicate the start time of the visual stimulus.
- the gated digital on/off pulse 588 would be used to turn on/off a wireless transmitter 559 which would transmit an On/Off Keyed (OOK) modulated carrier to a matching receiver 578 ( Figure 9 ).
- the receiver 578 would amplify 632 and demodulate 630 the carrier thereby presenting a digital replicate 616 ( Figure 8 ) of the reaction time pulses to the biosensor data acquisition microcontroller.
- the biosensor data acquisition microcontroller would time-stamp the data with the biosensor data 646 ( Figure 10 ).
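On the biosensor data-acquisition side, the demodulated reaction-time pulse can be captured with an edge interrupt and stamped against the real-time clock, roughly as in this C sketch (the clock-access function and its microsecond resolution are assumptions for illustration).

```c
#include <stdint.h>
#include <stdbool.h>

/* Hypothetical access to the free-running real-time clock of the biosensor
 * data-acquisition microcontroller (1 us resolution assumed). */
extern uint64_t rtc_now_us(void);

static volatile uint64_t stimulus_start_us;  /* leading edge of the decoded pulse   */
static volatile uint64_t reaction_time_us;   /* pulse width = subject reaction time */
static volatile bool     reaction_ready;

/* Edge interrupt on the demodulated OOK line from the receiver: the rising
 * edge marks the stimulus onset, the falling edge marks the response. */
void ook_line_edge_isr(bool rising_edge)
{
    uint64_t now = rtc_now_us();
    if (rising_edge) {
        stimulus_start_us = now;
    } else {
        reaction_time_us = now - stimulus_start_us;
        reaction_ready   = true;   /* main loop stamps this against the EEG stream */
    }
}
```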
- the wireless transmitter/receiver pair could be a radio transmitter operating in the RF spectrum, a pulsed infrared light operating in the infrared spectrum, or an ultrasonic pulse operating in the ultrasonic sound spectrum as follows.
- the hardware for an RF wireless link may include the LINX Inc., TXM-433-LR RF Transmitter and RXM-433-LR Receiver pair that operates in the 433 MHz RF band.
- the small sized integrated circuit makes a simple low-parts-count solution for an RF link. http://www.linxtechnologies.com/resources/data-guides/txm-xxx-lr.pdf. Alternate choices include a Silicon Labs SI4010 paired with a ST4313 or a RF Solutions AM110C1-315 operable in the 868 MHz and 915 MHz band as well. Even a 2.4 GHz radio transceiver like the Nordic nRF24L01+ can be used in a pair. Bluetooth transceivers, ZigBee, ANT and others are also embodiments of the present invention. Even Wi-Fi modules such as the ESP8266 could be employed in a pair.
- the IR transmitter hardware for an infrared wireless link may include a Vishay Semiconductors TSAL4400 infrared (IR) light emitting diode (LED) operating at an IR wavelength of 940nm and pulsed on/off via the microcontroller at a carrier frequency of 36 kHz. http://www.vishay.com/docs/81006/tsal4400.pdf.
- the IR receiver hardware could include, as non-limiting examples, a Vishay Semiconductor TSOP6236 which has a 36 kHz band-pass filter to eliminate background noise. http://www.vishay.com/docs/82463/tsop62.pdf. Also possible would be VS1838 TL and TL1838 or VS1838B universal receiving head.
- the ultrasonic transmitter hardware for an ultrasonic link may include a Murata Electronics MA40S4S ultrasonic transmitter operating at an acoustic frequency of 40 kHz, using the microcontroller to generate the 40 kHz carrier. http://www.murata.com/~/media/webrenewal/support/library/catalog/products/k70e.ashx.
- the ultrasonic receiver hardware could include a Murata Electronics MA40S4R receiver. Alternate pairs of transmitter/receiver in the ultrasonic space include the HC-SR04 transmitter with a US-015 receiver, as well as a TCT40-16T transmitter with a TCT40-16R receiver.
- the transmission link could also be a hardwired connection between the biosensor headset and the stimulus/response circuitry.
- optical isolation circuitry would be used to ensure patient safety.
- a preamble transmission prior to start of the patient's test could be easily sent by using two or more light sensors attached to the computer screen 570. By lighting an appropriate array of dots in a binary format, the microcontroller could identify which test is about to occur and transmit this information to the EEG data acquisition microcontroller.
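A small C sketch of the binary preamble idea: the PC lights a pattern of dots encoding a test identifier, and the headset microcontroller reads its light sensors back into the same identifier. The number of dots and the bit ordering are illustrative assumptions.

```c
#include <stdint.h>
#include <stdbool.h>

#define NUM_PREAMBLE_DOTS 4   /* illustrative: 4 dots = up to 16 test identifiers */

/* PC side: decide which on-screen dots to illuminate so that the pattern
 * spells the test identifier in binary (dot i carries bit i). */
void preamble_dots_for_test(uint8_t test_id, bool dot_on[NUM_PREAMBLE_DOTS])
{
    for (int i = 0; i < NUM_PREAMBLE_DOTS; i++)
        dot_on[i] = (test_id >> i) & 1u;
}

/* Headset side: reconstruct the test identifier from the light sensors that
 * each sit over one of the preamble dots. */
uint8_t test_id_from_sensors(const bool sensor_lit[NUM_PREAMBLE_DOTS])
{
    uint8_t id = 0;
    for (int i = 0; i < NUM_PREAMBLE_DOTS; i++)
        if (sensor_lit[i])
            id |= (uint8_t)(1u << i);
    return id;
}
```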
- a system is provided in which a jittery PC with latency 560 with visual display 570 is synchronized with the head-worn electronics module 576 on a subject, which includes an embedded microcontroller.
- the emitter is a particular region 568 of the visual display 570 which emits light that is received by a photodiode or phototransistor 566 which is connected to an electronic circuit 564 which includes an encoding scheme 562 which wirelessly transmits encoded data to a receiving circuit 578 in the electronics module 576 worn by the subject.
- precise timing of mouse clicks in mouse 572 is directly wired via connection 574 as an input to the system through electronic circuit 564.
- PC 560 with visual screen 570 has a particular region of interest 568 which is responsible for generating the fiducial light signal which is well synchronized with the other objects 571 displayed and presented as probes to the subject on the PC visual display 570.
- Figure 6 shows a detail of photodiode or photoresistor 580, which is wired into electronic circuit 564. The circuit consists of amplifier 582, whose output is connected to a voltage comparator 584 that compares it against a threshold reference 583; the comparator output drives microcontroller 586, which has a gated on/off output 588 that goes to wireless transmitter 590 to be output via the antenna 592.
- Figure 7 shows a detail of computer mouse 572 which has a high-speed precision switch 600 included in its manufacture such that the left and right mouse buttons 602 can be encoded via a direct link 574 to either a wired or wireless interface to the subject-worn embedded microcontroller.
- Figure 8 shows a detailed view of the encoding scheme 562, whereby in the upper trace one sees a graphical representation of amplitude as a function of time along the x-axis. To the far left of the upper trace one can see that the visual stimulus 612 starts the interval, which ends at mouse click response time 614, thereby creating test one reaction time 610. In the lower trace, one can see the emitter's output in the form of an on/off keyed wireless signal of relatively high frequency to enable precise timing in determining the start of the visual stimulus 612 and the end of the subject's mouse click response 614, thereby encoding test one reaction time 610 by the presence and duration of the keyed high-frequency wireless signal.
- To the right in the upper trace, one can see a second, test two reaction time 618 and below it its corresponding keyed high-frequency wireless signal 620, which encodes the duration of test two reaction time 618 by the initiation and termination of the keyed high-frequency wireless signal 620.
- Figure 9 provides a more detailed look at an EEG headset or other electronic module 576 worn on the head with a wireless receiver used to capture the wireless reaction time signal for synchronization with EEG data, accelerometer data, pulse oximetry data, and any other head-worn biosensor based signals.
- the receiver circuit 578 comprises a receiving antenna 634 attached to amplifier 632, whose output is demodulated by demodulator 630 within the electronic module 576 worn on the subject.
- Figure 10 illustrates how the subject's reaction time information can be precisely synchronized with a biosensor based signal such as an EEG signal in the microcontroller with real time clock.
- This scheme 580 shows in the upper trace an amplitude on the y-axis as a function of time on the x-axis for the digital signal which is arriving from the subject's interaction with the jittery PC with latency.
- the presentation of something such as a visual stimulus would start at 642, and the subject would have an event or response, here shown as a non-limiting mouse click response 644, thus creating a time difference between the visual stimulation 642 and the response 644 which constitutes test one reaction time 640.
- This digital signal would then be in tight synchronization with the ongoing biosensor based signal 646, shown in the present example as a non-limiting EEG signal.
- To the right of Figure 10, in the upper trace, one can see a repeat with a longer test 2 reaction time 648, along with the synchronized and corresponding lower right trace of the biosensor signal 650, in this case an EEG signal.
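The alignment of a stimulus or response event with the EEG stream reduces to mapping a real-time-clock timestamp onto the nearest EEG sample index, for example as in this C sketch (the 250 Hz rate mentioned in the comment is an assumption, not a figure from the patent).

```c
#include <stdint.h>

/* Map an event timestamp from the microcontroller's real-time clock onto the
 * index of the EEG sample acquired closest to that instant. Assumes the EEG
 * stream started at 'eeg_start_us' and runs at 'sample_rate_hz' (e.g. 250 Hz). */
uint64_t eeg_sample_index_for_event(uint64_t event_us,
                                    uint64_t eeg_start_us,
                                    uint32_t sample_rate_hz)
{
    uint64_t elapsed_us = event_us - eeg_start_us;
    /* Round to the nearest sample rather than truncating. */
    return (elapsed_us * sample_rate_hz + 500000u) / 1000000u;
}
```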
- Figure 11 shows a light sensor 662 and lens 660 mounted inside the EEG headset or subject worn electronics module capturing light directly from the computer LCD or visual display.
- the light from the jittery PC with latency emitted from the visual display is directly collected via lens 660 into photodiode or phototransistor 662.
- the photodiode or phototransistor signal is coupled to amplifier 664 which is connected to voltage comparator 668 along with the threshold reference 666 such that microcontroller 670 can thus determine when the collected light from the jittery PC with latency rises above a preset threshold indicative of the presentation of a probe or stimulus to the subject for their response.
- This embodiment has the advantage that it does not require additional hardware in the vicinity of the jittery PC with latency but has the further requirement that the light which is emitted by the visual display of the jittery PC with latency is being received and can be collected by lens 660.
- Figure 12 illustrates an alternate embodiment of the present invention which consists not of a computer mouse but instead a special stylus 684 in this case connected with a wire 686, whereby the stylus 684 has a high-speed switch 682 embedded in the tip of the stylus so that each time it is pressed onto the tablet PC or visual display 680 the precise timing is encoded in the opening and closing of the high-speed switch 682.
- this human interface device is different than a mouse and can in some instances provide a more natural interaction and thereby better assessment of the relative timing of the subject's response to the stimulation probe on the visual display or auditory stimulation via the soundcard to the subject's auditory cortex.
- the present invention can also be used with other types of stimuli such as an acoustic stimulus.
- the light-sensor would be replaced with a miniature microphone 694 ( Figure 13 ) attached to the computer's speaker grill which would signal the exact time the acoustic stimulus was presented to the patient, again independent of the computer's operating system timing jitter.
- If earbuds are employed rather than a speaker, one can insert a dongle between the sound-generating sound card and the earbuds. As shown in Figure 13, audible sound is used as the energy transfer for synchronization.
- an audible microphone 694 is included as the receiver in the electronics module worn by the subject or positioned locally on the peripheral PC as the sound is captured from human audible sound generated from the PC speaker 690 functioning as the emitter. More precisely, the computer speaker 690 emits sound waves as the energy source 692 which is received on the human subject by microphone 694 whose output is connected to electronic circuit 708 which includes amplifier 696 which is coupled to voltage comparator 700 with an input threshold reference 698. The output of the voltage comparator 700 then goes to microcontroller 702 which has an on/off gate output which is received by wireless transmitter 704 as an input whose output is generated through antenna 706.
- the advantage of this embodiment is that the sound generated by the PC speaker 690 is uniformly emitted so that it is easily detected by microphone 694 worn by the subject. A potential problem with this approach is the effect of the sound on the central nervous system of the subject.
- In Figure 14, an alternate embodiment is shown for the detection of audible sound delivered via earbuds.
- the wired sound going to earbuds 712 is intercepted at a junction box 714 which splits off a second copy of the electrical sound signal via connection 710, which enters an electronic circuit 708 of a similar type.
- the detected signal would be above the threshold, and this will gate the wireless transmitter on and off in synchrony, with high precision and short relative latency, to enable a synchronized wireless transmission between the audible stimulus probe of the earbuds 712 and the rest of the biosensor based data being collected by the microcontroller with real-time clock.
- the present invention can also be used with other patient response devices besides a computer mouse.
- For a tablet PC 680, the invention could be incorporated into a stylus 684 that has a switch 682 to detect when the stylus touches the tablet PC's touchscreen.
- the stylus could have two ends which are colored differently to uniquely label each end. Depending on the stimulus presented, the end colored/labeled with an A could be pressed, activating the micro-switch contained within. For another task, the alternate end labeled B could be pressed, thus differentiating the subject's choices.
- a cigar-shaped form factor could alternatively be employed with a tilt sensor or accelerometer within to measure the subject's response. Angular rotation could be picked up via a gyrometer to make a 6-axis rather than just a 3-linear-axis motion analysis for more subtle refinement of the subject's motion.
- the response to the stimulus could also be an eye-blink of the Subject.
- an EEG signal at positions Fp1 and/or Fp2 would easily pick up the eye-blink response directly.
- the response to the stimulus could also be a tilt of the patient's head, as embodied in Figure 15, if an accelerometer is incorporated into the biosensor headset electronics module 720 or is placed adjacent to it. Gesture detection could also be leveraged, whereby a motion processing unit (accelerometer + specialized microcontroller 722) is employed as an event marker such that one physical tap encodes one type of event while two physical taps in succession encode a second type of event. More concretely, tap once for right, tap twice for left instead of using a mouse click.
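A rough C sketch of the tap-based event marker described above: acceleration samples above a threshold are counted, and one tap within the window maps to one event type while two taps map to the other. The threshold, window, and function names are assumptions, and a practical version would also debounce consecutive over-threshold samples.

```c
#include <stdint.h>

typedef enum { TAP_NONE, TAP_RIGHT, TAP_LEFT } tap_event_t;

#define TAP_THRESHOLD_MG     1500u  /* illustrative acceleration threshold (milli-g)  */
#define DOUBLE_TAP_WINDOW_MS  400u  /* illustrative: two taps within 400 ms = "left"  */

/* Call once per accelerometer sample; returns a decoded event when the
 * double-tap window has expired. No debouncing is shown, for brevity. */
tap_event_t classify_taps(uint16_t accel_magnitude_mg, uint32_t now_ms)
{
    static uint32_t first_tap_ms = 0;
    static int      tap_count    = 0;

    if (accel_magnitude_mg > TAP_THRESHOLD_MG) {
        if (tap_count == 0)
            first_tap_ms = now_ms;
        tap_count++;
    }

    if (tap_count > 0 && (now_ms - first_tap_ms) > DOUBLE_TAP_WINDOW_MS) {
        tap_event_t ev = (tap_count >= 2) ? TAP_LEFT : TAP_RIGHT;
        tap_count = 0;
        return ev;
    }
    return TAP_NONE;
}
```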
- Figure 16 illustrates an alternate embodiment that includes an alternate means of encoding left or right mouse click events or right or left touch screen events into an on/off keyed wireless signal whereby a left mouse click is encoded by a single pulse of output whereas a right mouse click is encoded as a double pulse of output.
- In this scheme one can see an upper trace showing amplitude as a function of time for the digital signal, and just below it, in precise synchronization, the amplitude output as a function of time of the on/off keyed wireless signal.
- When a visual or auditory stimulus presentation is started at 732, it is perceived by the subject, who responds, in this case with mouse click response 734, and the difference in time is shown as test one reaction time 730.
- the reaction time is encoded in the on/off keyed wireless signal 746 with a leading and a trailing edge indicating the start of the visual stimulation and the mouse click response, but in this case it is preceded by two pulses of energy 744, thus encoding the fact that it was a right mouse button depression, not a left.
- the additional information is encoded either in the frequency of the additional pulses of energy, in their amplitude, or, as in this case, in the actual number of discrete pulses of energy.
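On the receiving side, the number-of-pulses variant can be decoded by classifying pulse widths: short pulses are treated as left/right marker pulses and the long pulse carries the reaction time. The width threshold below is an illustrative assumption.

```c
#include <stdint.h>
#include <stddef.h>

#define MARKER_PULSE_MAX_US 1000u   /* illustrative: marker pulses are short, the
                                       reaction-time pulse is much longer */

/* Given the widths (in microseconds) of the pulses received for one trial,
 * count the short marker pulses and pick out the long reaction-time pulse;
 * the marker count then distinguishes left from right clicks per the scheme
 * described above. */
void decode_trial(const uint32_t pulse_widths_us[], size_t n_pulses,
                  uint32_t *reaction_time_us, int *num_markers)
{
    *reaction_time_us = 0;
    *num_markers = 0;
    for (size_t i = 0; i < n_pulses; i++) {
        if (pulse_widths_us[i] <= MARKER_PULSE_MAX_US)
            (*num_markers)++;
        else
            *reaction_time_us = pulse_widths_us[i];
    }
}
```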
- the invention may be applied to other applications and may be modified without departing from the scope of the invention.
- the jittery PC with latency described herein may be used in an industrial application and the remote sensors and embedded microcontroller could be on a server, in the cloud, in the electronics module, or on a local PC, tablet PC, smartphone, or custom hand held device. Accordingly, the scope of the invention is not intended to be limited to the exemplary embodiments described above, but only by the appended claims.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Medical Informatics (AREA)
- General Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- Public Health (AREA)
- Pathology (AREA)
- Biophysics (AREA)
- Heart & Thoracic Surgery (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Veterinary Medicine (AREA)
- Psychiatry (AREA)
- Psychology (AREA)
- Developmental Disabilities (AREA)
- Computer Networks & Wireless Communication (AREA)
- Child & Adolescent Psychology (AREA)
- Hospice & Palliative Care (AREA)
- Social Psychology (AREA)
- Educational Technology (AREA)
- Theoretical Computer Science (AREA)
- Signal Processing (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Cardiology (AREA)
- Physiology (AREA)
- General Physics & Mathematics (AREA)
- Dermatology (AREA)
- Computing Systems (AREA)
- Databases & Information Systems (AREA)
- Primary Health Care (AREA)
- Epidemiology (AREA)
- Otolaryngology (AREA)
- Acoustics & Sound (AREA)
- Multimedia (AREA)
- Data Mining & Analysis (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
Claims (10)
- A system for synchronizing a PC exhibiting latency (420) of operations and a biosensor-enabled microcontroller with real-time clock (430), comprising: a transmitter (570, 680, 690, 712) that transmits a stimulus signal from the PC exhibiting latency; an input device (422) indicating the subject's response to the stimulus signal; an encoding circuit (564, 584, 708) adapted to encode a difference in time between the stimulus signal and the subject's response to the stimulus signal; an emitter (434, 522, 534) adapted to transmit the encoded difference signal representing the subject's reaction time; and a complementary receiver (432, 526, 534) adapted to detect the encoded difference signal and including a decoding circuit that decodes the encoded difference signal to determine the subject's reaction time, wherein the complementary receiver provides the subject's reaction time to the biosensor-enabled microcontroller with real-time clock for synchronization of the PC exhibiting latency of operations with the biosensor-enabled microcontroller with real-time clock.
- The system of claim 1, wherein the emitter comprises a visible LED, an ultrasonic transducer, an infrared (IR) LED, an audible speaker, an audible transducer, a Bluetooth transmitter/transceiver, a Wi-Fi transmitter/transceiver, a ZigBee transmitter/transceiver, or an AM or FM transmitter/transceiver.
- The system of claim 2, wherein the complementary receiver comprises a visible photodiode (662), a visible phototransistor (662), an ultrasonic receiver/microphone (694), an infrared (IR) photodiode, an infrared phototransistor, an audible microphone, a Bluetooth receiver/transceiver, a Wi-Fi receiver/transceiver, a ZigBee receiver/transceiver, an AM receiver/transceiver, or an FM receiver/transceiver (534).
- The system of claim 1, wherein the input device is a mouse (572) and the encoding circuit is responsive to the stimulus signal from the transmitter and to an input signal from the mouse in response to the stimulus signal.
- The system of claim 4, wherein the encoding circuit encodes the difference signal as an on/off keyed wireless signal and provides the encoded difference signal to the emitter for wireless transmission to the complementary receiver.
- The system of claim 5, wherein the encoding circuit encodes a left mouse click as one keyed pulse and a right mouse click as two keyed pulses.
- The system of claim 1, wherein the complementary receiver is located at an EEG headset (576) of the subject, the EEG headset including the decoding circuit and the microcontroller for synchronizing the subject's reaction time with the EEG data collected by the EEG headset.
- The system of claim 7, wherein the emitter comprises an audible speaker (690) and the complementary receiver comprises an earbud of the subject, the earbud providing received sound signals to the decoding circuit.
- The system of claim 7, wherein the EEG headset includes a finger tap input (720) as the input device.
- The system of claim 1, wherein the input device is a stylus (684) and the PC exhibiting latency is a tablet PC.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201462019291P | 2014-06-30 | 2014-06-30 | |
PCT/US2015/038673 WO2016004111A1 (en) | 2014-06-30 | 2015-06-30 | System and methods for the synchronization of a non-real time operating system pc to a remote real-time data collecting microcontroller |
Publications (3)
Publication Number | Publication Date |
---|---|
EP3160347A1 EP3160347A1 (de) | 2017-05-03 |
EP3160347A4 EP3160347A4 (de) | 2018-02-28 |
EP3160347B1 true EP3160347B1 (de) | 2021-03-03 |
Family
ID=55019937
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP15814892.4A Not-in-force EP3160347B1 (de) | 2014-06-30 | 2015-06-30 | System und verfahren zur synchronisierung eines pc mit nicht-echtzeit-betriebssystems mit einem entfernten echtzeit-datenerfassungsmikrosteuergerät |
Country Status (4)
Country | Link |
---|---|
US (2) | US20180184964A1 (de) |
EP (1) | EP3160347B1 (de) |
CN (1) | CN107847194B (de) |
WO (2) | WO2016004111A1 (de) |
Families Citing this family (45)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180184964A1 (en) | 2014-06-30 | 2018-07-05 | Cerora, Inc. | System and signatures for a multi-modal physiological periodic biomarker assessment |
EP3335126A4 (de) | 2015-08-11 | 2019-05-01 | Cognoa, Inc. | Method and apparatus for determining developmental progress using artificial intelligence and user input |
US11064881B2 (en) * | 2015-11-13 | 2021-07-20 | Hennepin Healthcare System, Inc | Method for predicting convergence disorders caused by concussion or other neuropathology |
US11972336B2 (en) | 2015-12-18 | 2024-04-30 | Cognoa, Inc. | Machine learning platform and system for data analysis |
WO2017106770A1 (en) * | 2015-12-18 | 2017-06-22 | Cognoa, Inc. | Platform and system for digital personalized medicine |
US10169560B2 (en) * | 2016-02-04 | 2019-01-01 | Lenovo Enterprise Solutions (Singapore) Pte. Ltd. | Stimuli-based authentication |
US11504038B2 (en) * | 2016-02-12 | 2022-11-22 | Newton Howard | Early detection of neurodegenerative disease |
US20170258390A1 (en) * | 2016-02-12 | 2017-09-14 | Newton Howard | Early Detection Of Neurodegenerative Disease |
US11164596B2 (en) * | 2016-02-25 | 2021-11-02 | Samsung Electronics Co., Ltd. | Sensor assisted evaluation of health and rehabilitation |
JP6196402B2 (ja) * | 2016-02-29 | 2017-09-13 | ダイキン工業株式会社 | Determination result output device, determination result providing device, and determination result output system |
US10682096B2 (en) * | 2016-03-31 | 2020-06-16 | Wave Neuroscience, Inc. | System for diagnosing mental disorders using neurometrics |
EP4105921A1 (de) * | 2016-06-20 | 2022-12-21 | Magic Leap, Inc. | Augmented reality display system for evaluating and modifying neurological conditions, including visual processing and perception conditions |
CN106361322A (zh) * | 2016-08-31 | 2017-02-01 | 博睿康科技(常州)股份有限公司 | Method and device for automatic detection of cumulative deviation in an EEG device |
CN110622179A (zh) | 2017-02-09 | 2019-12-27 | 科格诺亚公司 | 用于数字个性化医疗的平台和系统 |
RU2670668C9 (ru) * | 2017-07-07 | 2018-12-12 | Федеральное государственное бюджетное учреждение "Национальный медицинский исследовательский центр психиатрии и наркологии имени В.П. Сербского" Министерства здравоохранения Российской Федерации (ФГБУ "НМИЦ ПН им. В.П. Сербского" Минздрава России) | Method for detecting schizophrenia and differentially diagnosing it from personality disorders |
EP3454339A1 (de) | 2017-09-07 | 2019-03-13 | Koninklijke Philips N.V. | Verfahren und vorrichtung zur verbesserung der messung des timings von berührungen eines berührungsbildschirms |
WO2019060298A1 (en) | 2017-09-19 | 2019-03-28 | Neuroenhancement Lab, LLC | METHOD AND APPARATUS FOR NEURO-ACTIVATION |
JP6337362B1 (ja) * | 2017-11-02 | 2018-06-06 | パナソニックIpマネジメント株式会社 | Cognitive function evaluation device and cognitive function evaluation system |
US11717686B2 (en) | 2017-12-04 | 2023-08-08 | Neuroenhancement Lab, LLC | Method and apparatus for neuroenhancement to facilitate learning and performance |
US20210366602A1 (en) * | 2017-12-28 | 2021-11-25 | Nec Corporation | Signal-processing device, analysis system, signal-processing method, and signal-processing program |
WO2019133997A1 (en) | 2017-12-31 | 2019-07-04 | Neuroenhancement Lab, LLC | System and method for neuroenhancement to enhance emotional response |
US11364361B2 (en) | 2018-04-20 | 2022-06-21 | Neuroenhancement Lab, LLC | System and method for inducing sleep by transplanting mental states |
US10799169B2 (en) * | 2018-06-08 | 2020-10-13 | Timothy J. Wahlberg | Apparatus, system and method for detecting onset Autism Spectrum Disorder via a portable device |
WO2020056418A1 (en) | 2018-09-14 | 2020-03-19 | Neuroenhancement Lab, LLC | System and method of improving sleep |
AU2019376682A1 (en) * | 2018-11-09 | 2021-05-27 | Akili Interactive Labs, Inc, | Facial expression detection for screening and treatment of affective disorders |
CN109545293A (zh) * | 2018-12-04 | 2019-03-29 | 北京大学 | App-based screening system for infants at high risk of autism |
CN111326253A (zh) * | 2018-12-14 | 2020-06-23 | 深圳先进技术研究院 | Method for assessing the multimodal emotional and cognitive abilities of patients with autism spectrum disorder |
CN109784287A (zh) * | 2019-01-22 | 2019-05-21 | 中国科学院自动化研究所 | Information processing method, system and device based on a prefrontal-cortex-like network driven by contextual signals |
BR112021018770A2 (pt) | 2019-03-22 | 2022-02-15 | Cognoa Inc | Methods and devices for personalized digital therapy |
CN110141231B (zh) * | 2019-05-17 | 2022-08-16 | 天津大学 | 一种无线脑电采集中的事件时点同步记录方法 |
US11786694B2 (en) | 2019-05-24 | 2023-10-17 | NeuroLight, Inc. | Device, method, and app for facilitating sleep |
US20220218268A1 (en) * | 2020-07-22 | 2022-07-14 | Actibrain Bio, Inc. | Ai (artificial intelligence) based method for providing brain information |
RU2744353C1 (ru) * | 2020-07-23 | 2021-03-05 | Федеральное государственное бюджетное учреждение "Национальный медицинский исследовательский центр психиатрии и наркологии имени В.П. Сербского" Министерства здравоохранения Российской Федерации (ФГБУ "НМИЦ ПН им. В.П. Сербского" Минздрава России) | Method for diagnosing a predisposition to impulsive aggressive behavior in mentally healthy individuals |
WO2022046740A1 (en) * | 2020-08-26 | 2022-03-03 | Brain Scientific, Inc. | Integrated brain machine interface platform with graphene based electrodes |
EP3967223A1 (de) * | 2020-09-09 | 2022-03-16 | Beats Medical Limited | System and method for providing tailored therapy to a user |
AU2021232775A1 (en) * | 2020-09-22 | 2022-04-07 | Health Tech Connex Inc. | Methods and apparatus for triggering a stimulus for evoked brain response analysis |
MX2023007230A (es) * | 2020-12-22 | 2023-06-27 | Regeneron Pharma | Systems and methods for signal-based feature analysis to determine clinical outcomes |
CN112773331B (zh) * | 2021-01-14 | 2022-07-12 | 成都福瑞至脑健康科技有限公司 | Method and system for monitoring and evaluating brain health status |
FR3121834A1 (fr) | 2021-04-20 | 2022-10-21 | Scale-1 Portal | System for treating neurovisual or vestibular disorders and method for controlling such a system |
BR112023021898A2 (pt) * | 2021-04-28 | 2023-12-19 | Cumulus Neuroscience Ltd | Technique for obtaining and processing a biosignal measurement |
SE545345C2 (en) | 2021-06-30 | 2023-07-11 | Tobii Ab | Method and system for alignment of data |
WO2023003672A1 (en) * | 2021-07-22 | 2023-01-26 | Justin Ryan | Learning system that automatically converts entertainment screen time into learning time |
CN113724863B (zh) * | 2021-09-08 | 2022-10-25 | 山东建筑大学 | Automatic discrimination system, storage medium and device for autism spectrum disorder |
CN115062328B (zh) * | 2022-07-12 | 2023-03-10 | 中国科学院大学 | Intelligent information analysis method based on cross-modal data fusion |
HUP2200356A1 (hu) * | 2022-09-06 | 2024-03-28 | Kiserleti Orvostudomanyi Ki | System and methods for the real-time synchronization of biological data with millisecond temporal accuracy |
Family Cites Families (41)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7565905B2 (en) * | 1998-06-03 | 2009-07-28 | Scott Laboratories, Inc. | Apparatuses and methods for automatically assessing and monitoring a patient's responsiveness |
US6366805B1 (en) | 1999-05-26 | 2002-04-02 | Viasys Healthcare Inc. | Time frame synchronization of medical monitoring signals |
WO2001088836A1 (en) | 2000-05-18 | 2001-11-22 | Commwell, Inc. | Method and apparatus for remote medical monitoring incorporating video processing and system of motor tasks |
US7261690B2 (en) | 2000-06-16 | 2007-08-28 | Bodymedia, Inc. | Apparatus for monitoring health, wellness and fitness |
US7460903B2 (en) | 2002-07-25 | 2008-12-02 | Pineda Jaime A | Method and system for a real time adaptive system for effecting changes in cognitive-emotive profiles |
CN1323637C (zh) * | 2003-01-27 | 2007-07-04 | 周曙 | General-purpose neurobehavioral test system |
US20050085744A1 (en) * | 2003-10-20 | 2005-04-21 | Stmicroelectronics S.R.I. | Man-machine interfaces system and method, for instance applications in the area of rehabilitation |
CN1247148C (zh) * | 2003-12-18 | 2006-03-29 | 常州市第一人民医院 | Test system for children's cognitive function development |
US20050261559A1 (en) | 2004-05-18 | 2005-11-24 | Mumford John R | Wireless physiological monitoring system |
EP1781165B1 (de) * | 2004-06-18 | 2017-11-08 | Neuronetrix Solutions, LLC | Testsystem für evozierte potentiale für neurologische erkrankungen |
CN1778272A (zh) * | 2004-11-22 | 2006-05-31 | 中国科学院心理研究所 | Portable event-related potential instrument |
US20090024050A1 (en) * | 2007-03-30 | 2009-01-22 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Computational user-health testing |
CN100571612C (zh) * | 2007-07-13 | 2009-12-23 | 深圳迪美泰数字医学技术有限公司 | All-digital medical amplifier for clinical or non-clinical biosignal recording |
US8393734B2 (en) | 2007-09-14 | 2013-03-12 | Neuroptics, Inc. | Pupilary screening method and system |
GB2462101B (en) * | 2008-07-24 | 2012-08-08 | Lifelines Ltd | A system for monitoring a patient's EEG output |
US8494857B2 (en) * | 2009-01-06 | 2013-07-23 | Regents Of The University Of Minnesota | Automatic measurement of speech fluency |
US8521439B2 (en) | 2009-05-08 | 2013-08-27 | Pulsar Informatics, Inc. | Method of using a calibration system to generate a latency value |
US9492344B2 (en) * | 2009-08-03 | 2016-11-15 | Nike, Inc. | Unified vision testing and/or training |
CN101695448A (zh) * | 2009-10-22 | 2010-04-21 | 安徽医科大学 | Measuring device for measuring reaction delay |
US20110224571A1 (en) * | 2009-11-16 | 2011-09-15 | Alvaro Pascual-Leone | Non-invasive methods for evaluating cortical plasticity impairments |
US8744875B2 (en) | 2009-12-23 | 2014-06-03 | Mindray Ds Usa, Inc. | Systems and methods for synchronizing data of a patient monitor and a portable sensor module |
US8655428B2 (en) | 2010-05-12 | 2014-02-18 | The Nielsen Company (Us), Llc | Neuro-response data synchronization |
US8948861B2 (en) * | 2011-03-31 | 2015-02-03 | Toyota Motor Engineering & Manufacturing North America, Inc. | Methods and systems for determining optimum wake time |
US20130035579A1 (en) * | 2011-08-02 | 2013-02-07 | Tan Le | Methods for modeling neurological development and diagnosing a neurological impairment of a patient |
CN103211652B (zh) * | 2012-01-20 | 2015-07-22 | 有医科技股份有限公司 | Method for measuring biological stimulation signals |
US20170020406A1 (en) * | 2012-04-13 | 2017-01-26 | UE Technology | Method for measuring biological stimulus signal |
US20130274626A1 (en) * | 2012-04-13 | 2013-10-17 | UE Technology | Measuring method for synchronizing bio-signals with stimulations |
US20140012509A1 (en) * | 2012-07-06 | 2014-01-09 | Daniel Barber | Methods and systems for synchronization and distribution of multiple physiological and performance measures |
US20140128735A1 (en) | 2012-11-02 | 2014-05-08 | Cardiac Science Corporation | Wireless real-time electrocardiogram and medical image integration |
US10085688B2 (en) * | 2012-11-20 | 2018-10-02 | El-Mar Inc. | Method of identifying an individual with a disorder or efficacy of a treatment of a disorder |
AU2013352294A1 (en) | 2012-11-28 | 2015-07-09 | Neuren Pharmaceuticals Limited | Treatment of Autism Spectrum Disorders using glycyl-l-2-methylprolyl-l-glutamic acid |
US20140180060A1 (en) * | 2012-12-17 | 2014-06-26 | Todd Parrish | Methods and Systems for Automated Functional MRI in Clinical Applications |
JP6142354B2 (ja) * | 2013-02-27 | 2017-06-07 | 国立研究開発法人理化学研究所 | EEG signal processing device, EEG signal processing method, program, and recording medium |
US9463132B2 (en) | 2013-03-15 | 2016-10-11 | John Castle Simmons | Vision-based diagnosis and treatment |
US20160029965A1 (en) | 2013-03-15 | 2016-02-04 | Adam J. Simon | Artifact as a feature in neuro diagnostics |
CA2906652A1 (en) * | 2013-03-15 | 2014-09-18 | Adam J. Simon | System and signatures for the multi-modal physiological stimulation and assessment of brain health |
EP3048955A2 (de) * | 2013-09-25 | 2016-08-03 | MindMaze SA | Physiologische parametermessung und rückkopplungssystem |
US20150261936A1 (en) * | 2014-03-13 | 2015-09-17 | Hong Kong Baptist University | Method for Separating and Analyzing Overlapping Data Components with Variable Delays in Single Trials |
US20180184964A1 (en) | 2014-06-30 | 2018-07-05 | Cerora, Inc. | System and signatures for a multi-modal physiological periodic biomarker assessment |
US20160029962A1 (en) | 2014-07-29 | 2016-02-04 | Elwha Llc | Medically active toys |
CN106156480A (zh) * | 2015-07-01 | 2016-11-23 | 安徽华米信息科技有限公司 | Data statistics method and device |
- 2015
- 2015-06-30 US US15/323,238 patent/US20180184964A1/en not_active Abandoned
- 2015-06-30 US US15/323,249 patent/US10254785B2/en not_active Expired - Fee Related
- 2015-06-30 CN CN201580045886.4A patent/CN107847194B/zh not_active Expired - Fee Related
- 2015-06-30 EP EP15814892.4A patent/EP3160347B1/de not_active Not-in-force
- 2015-06-30 WO PCT/US2015/038673 patent/WO2016004111A1/en active Application Filing
- 2015-06-30 WO PCT/US2015/038684 patent/WO2016004117A1/en active Application Filing
Non-Patent Citations (1)
Title |
---|
None * |
Also Published As
Publication number | Publication date |
---|---|
US10254785B2 (en) | 2019-04-09 |
EP3160347A4 (de) | 2018-02-28 |
WO2016004111A1 (en) | 2016-01-07 |
CN107847194A (zh) | 2018-03-27 |
WO2016004117A1 (en) | 2016-01-07 |
US20180184964A1 (en) | 2018-07-05 |
CN107847194B (zh) | 2020-11-24 |
US20170177023A1 (en) | 2017-06-22 |
EP3160347A1 (de) | 2017-05-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3160347B1 (de) | System and method for synchronizing a PC with a non-real-time operating system to a remote real-time data-acquisition microcontroller | |
US20230088533A1 (en) | Detecting and Using Body Tissue Electrical Signals | |
KR102215442B1 (ko) | Wearable mobile device, and method for selectively utilizing biosignals of a wearable mobile device | |
US20180368755A1 (en) | Sensory stimuli to increase accuracy of sleep staging | |
US20160015289A1 (en) | Form factors for the multi-modal physiological assessment of brain health | |
US20200305792A1 (en) | Determining an orientation of a wearable device | |
KR101238192B1 (ko) | Ear-attached sensor set and method of operating the same | |
KR101218203B1 (ko) | Body-mounted sensor set and method of operating the same | |
KR102173725B1 (ko) | Method and apparatus for measuring biosignals | |
TWI786338B (zh) | Parkinson's disease diagnosis system | |
KR20190078929A (ko) | Smart desk for improving concentration | |
JP2015188649A (ja) | Device and program for supporting analysis of multiple physiological indices and gaze | |
WO2019200362A1 (en) | Wearable multi-modal bio-sensing system | |
US20150272508A1 (en) | Signal processing system providing marking of living creature physiological signal at a specific time | |
JP2002102190A (ja) | Biological function measurement system | |
WO2020133426A1 (zh) | Mobile monitoring device, monitoring equipment, monitoring system, and patient status monitoring method | |
WO2024214498A1 (ja) | Information processing system, information processing device, and information processing method | |
KR20180128159A (ko) | High-sensitivity multi-biosignal acquisition device and health management method using the same | |
WO2023115558A1 (en) | A system and a method of health monitoring | |
JP2005278706A (ja) | Portable electromyograph and body-movement measuring instrument | |
RADHAKRISHNAN et al. | Wearables for in-situ monitoring of cognitive states: Challenges and opportunities (2023) | |
WO2021190720A1 (en) | System for eye movement detection with contact lens | |
Li | Sleep-related fall monitoring among elderly using non-invasive wireless bio-sensors | |
CN115770013A (zh) | Eye-movement test method, apparatus, device and medium for assisting vulnerable groups | |
Bülow | Infant Research–The Complete Pocket Guide |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20170130 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
A4 | Supplementary search report drawn up and despatched |
Effective date: 20180130 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: H04W 4/00 20180101ALI20180124BHEP Ipc: A61B 5/0482 20060101ALI20180124BHEP Ipc: A61B 5/16 20060101AFI20180124BHEP Ipc: A61B 5/0478 20060101ALI20180124BHEP Ipc: A61B 5/0484 20060101ALI20180124BHEP Ipc: H04L 29/08 20060101ALI20180124BHEP Ipc: A61B 5/044 20060101ALI20180124BHEP Ipc: G06F 1/14 20060101ALI20180124BHEP |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: H04W 4/70 20180101ALI20200330BHEP Ipc: A61B 5/16 20060101AFI20200330BHEP Ipc: G09C 1/00 20060101ALI20200330BHEP Ipc: H04W 4/00 20180101ALI20200330BHEP Ipc: A61B 5/044 20060101ALI20200330BHEP Ipc: H04L 29/08 20060101ALI20200330BHEP Ipc: A61B 5/0478 20060101ALI20200330BHEP Ipc: A61B 5/0482 20060101ALI20200330BHEP Ipc: A61B 5/0484 20060101ALI20200330BHEP Ipc: G06F 1/14 20060101ALI20200330BHEP |
|
INTG | Intention to grant announced |
Effective date: 20200417 |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: SIMON, ADAM J. Owner name: CERORA, INC. Owner name: KATH, GARY S. |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE PATENT HAS BEEN GRANTED |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: CERORA, INC. |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: REF Ref document number: 1366327 Country of ref document: AT Kind code of ref document: T Effective date: 20210315 Ref country code: CH Ref legal event code: EP |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 602015066402 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: LT Ref legal event code: MG9D |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210303 Ref country code: NO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210603 Ref country code: BG Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210603 Ref country code: HR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210303 Ref country code: FI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210303 Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210604 |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: MP Effective date: 20210303 |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: MK05 Ref document number: 1366327 Country of ref document: AT Kind code of ref document: T Effective date: 20210303 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210303 Ref country code: RS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210303 Ref country code: PL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210303 Ref country code: LV Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210303 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: NL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210303 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: CZ Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210303 Ref country code: EE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210303 Ref country code: SM Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210303 Ref country code: AT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210303 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210303 Ref country code: RO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210303 Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210705 Ref country code: ES Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210303 Ref country code: IS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210703 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 Ref document number: 602015066402 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R119 Ref document number: 602015066402 Country of ref document: DE |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MC Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210303 Ref country code: AL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210303 Ref country code: DK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210303 |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: PL |
|
26N | No opposition filed |
Effective date: 20211206 |
|
GBPC | Gb: european patent ceased through non-payment of renewal fee |
Effective date: 20210630 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210303 |
|
REG | Reference to a national code |
Ref country code: BE Ref legal event code: MM Effective date: 20210630 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LU Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210630 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LI Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210630 Ref country code: IT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210303 Ref country code: IE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210630 Ref country code: GB Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210630 Ref country code: DE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20220101 Ref country code: CH Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210630 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210703 Ref country code: FR Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210630 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: BE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210630 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: HU Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO Effective date: 20150630 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: CY Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210303 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210303 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: TR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210303 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210303 |