
WO2021090331A1 - A system and method of diagnosing or predicting the levels of autism spectrum disorders (asd) using xr-ai platform - Google Patents

A system and method of diagnosing or predicting the levels of autism spectrum disorders (asd) using xr-ai platform

Info

Publication number
WO2021090331A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
rating score
autism
functions
autism rating
Prior art date
Application number
PCT/IN2020/050928
Other languages
French (fr)
Inventor
Sathyanarayanan A R
Bobin CHANDRA
Rema Bai
Joseph Bose H H
Original Assignee
Embright Infotech Private Limited
Priority date
Filing date
Publication date
Application filed by Embright Infotech Private Limited filed Critical Embright Infotech Private Limited
Publication of WO2021090331A1

Classifications

    • G16H 50/30: ICT specially adapted for medical diagnosis, medical simulation or medical data mining, for calculating health indices; for individual health risk assessment
    • A61B 5/163: Devices for psychotechnics; testing reaction times; devices for evaluating the psychological state by tracking eye movement, gaze, or pupil change
    • A61B 5/168: Evaluating attention deficit, hyperactivity
    • A61B 5/378: Electroencephalography [EEG] using evoked responses; visual stimuli
    • A61B 5/4088: Diagnosing or monitoring cognitive diseases, e.g. Alzheimer, prion diseases or dementia
    • A61B 5/6803: Sensors mounted on worn items; head-worn items, e.g. helmets, masks, headphones or goggles
    • A61B 5/7275: Determining trends in physiological measurement data; predicting development of a medical condition based on physiological measurements, e.g. determining a risk factor
    • G16H 20/30: ICT specially adapted for therapies or health-improving plans relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
    • G16H 20/70: ICT specially adapted for therapies or health-improving plans relating to mental therapies, e.g. psychological therapy or autogenous training
    • G16H 50/20: ICT specially adapted for medical diagnosis, for computer-aided diagnosis, e.g. based on medical expert systems
    • A61B 5/0075: Measuring for diagnostic purposes using light, by spectroscopy, e.g. Raman spectroscopy, infrared absorption spectroscopy

Definitions

  • the present invention mainly relates to a system and method of diagnosing or predicting the levels of autism spectrum disorders (ASD) in individuals and/or related emotional and physiological disorders using assistive intervention/therapy and scoring on an XR-AI Platform.
  • ASD autism spectrum disorders
  • Autism spectrum disorder is well known in the art as a range of neurodevelopmental disorders. It includes Childhood Disintegrative Disorder, Pervasive Developmental Disorder, Autism and Asperger syndrome. Autism spectrum disorder relates to a variety of brain developmental disorders. These conditions, commonly recognized as autism, are characterized by problems with social skills, verbal and nonverbal communication, repetitive and stereotyped behaviors, delayed childhood development, and other distinctive strengths and challenges.
  • ASD autism spectrum disorder
  • ABA Applied Behavior Analysis
  • Virtual reality is a computer-based technology developed to create artificial simulation placing the user in an immersive experience.
  • Virtual reality treatment is the use of virtual reality technology for different rehabilitation process.
  • Several examples of virtual reality treatment include iBloom VR, Neuro-Rehab VR, VAST Rehab, Verapy and Looxid VR.
  • One of the prior arts uses virtual reality to provide a supplementary method of teaching social and communication skills for individuals with Autism Spectrum Disorder.
  • Another prior art uses virtual reality as a base-level tool in support of other assistive technology interventions, while VHAB is another product that uses personalized on-screen simulated environments that children with autism or cerebral palsy can explore through motion sensors as they carry out their physiotherapy regimen. Because this is an on-screen therapy module, autistic kids may have lower concentration levels, as they are distracted by other objects and sound stimuli in the room, which reduces their learning curve.
  • An aspect of the present invention is to address at least the above- mentioned problems and/or disadvantages and to provide at least the advantages described below.
  • the present invention relates to a method for predicting a level of autism spectrum disorders (ASD) using an extended reality platform (1200), the method comprising: displaying, by a Virtual Reality Display (VRD) / Head Mounted Display (HMD), one or more interactive animations to a user (1210), capturing, by a pair of eye tracking devices and a combination of one or more BCI, EEG and fNIR (Functional near infrared spectroscopy) sensors of the VRD/HMD, the user's eye gazing coordinates displayed on a screen (111) to predict the attention span during one or more specific interactive animation incidents occurring within said interactive animations and capturing, by the BCI, EEG and fNIR sensors, the user's cognitive functions, behavioral functions and linguistic functions during said specific interactive animation incidents occurring within said interactive animations (1220), computing, by a processing unit of the VRD, an Autism Rating score of the user based on the captured eye gazing coordinates, cognitive functions, behavioral functions and linguistic functions (1230), and validating, by the processing unit of the VRD/HMD, the computed Autism Rating score of the user with a pre-stored Autism Rating score and predicting the level of autism spectrum disorders (ASD) of the user based on the validated Autism Rating score (1240).
  • Figure 1 shows an outline of the ABA system using XR-AI Platform according to an exemplary implementation of the present invention.
  • Figure 2 shows a hardware perspective of the system using XR-AI Platform according to an exemplary implementation of the present invention.
  • Figure 3 shows an example working of the system according to an exemplary implementation of the present invention.
  • Figure 4 shows an example results of color identification and learning according to an exemplary implementation of the present invention.
  • Figure 5 shows an example login process of XR-CARE System Requirement Flow according to an exemplary implementation of the present invention.
  • Figure 6 shows an example choosing diagnosis or intervention option of the system according to an exemplary implementation of the present invention.
  • Figure 7 shows an example diagnosis process according to an exemplary implementation of the present invention.
  • Figure 8 shows an example intervention process according to an exemplary implementation of the present invention.
  • Figure 9 shows an example graph of color identification score card according to an exemplary implementation of the present invention.
  • Figure 10 shows an ABA system according to an exemplary implementation of the present invention.
  • Figure 11 shows an example step by step process of the system according to an exemplary implementation of the present invention.
  • Figure 12 shows a method for predicting a level of autism spectrum disorders (ASD) using extended reality platform according to an exemplary implementation of the present invention.
  • ASD autism spectrum disorders
  • Figure 13 shows an example level-1 diagnosis - Inter-Pupillary Distance and Eye Alignments according to an exemplary implementation of the present invention.
  • Figure 14 shows an example level-2 diagnosis - ASD & Non ASD Comparison according to an exemplary implementation of the present invention.
  • Figure 15 shows an example level-3 ASD Content specific Diagnosis according to an exemplary implementation of the present invention.
  • the various embodiments of the present invention describe an applied behavior analysis (ABA) system which helps physicians/therapists/Special Education Trainers/doctors to perform diagnosis and intervention/therapy/skill development for the patients/users.
  • ABA applied behavior analysis
  • the present invention system helps the doctors with precise and live feedback of the diagnosis and intervention/therapy/skill development to make informed decisions about patients.
  • the present invention provides a revolutionary product based on Virtual Reality to provide proper diagnosis and treatment for such patients.
  • the present invention product trains and improves the responses and skills of high-functioning autistic people.
  • the patients/users/children who have trouble concentrating on a task can use the Virtual Reality headgear, where they get to see realistic images in front of them in three dimensions. This gives a trigger to the brain, making them believe that they are in a real world.
  • a Virtual Reality environment provides a system where children diagnosed with high functional autism can practice cognitive, social, communication and self-care interactions in a safe and controlled environment. By the repetition and analysis of these virtual interactions, children can improve their cognitive, social, communication and self-care skills, etc.
  • the present invention is designed and implemented with various scenarios; a virtual reality (VR) environment provides the data for Doctors to analyze the autistic children.
  • the present invention system integrates sensors such as eye tracker, i.e. a pair of eye tracking devices and a combination of BCI, EEG and fNIR (Functional near infrared spectroscopy) sensors, in order to get more informative data and using VR-based assistive technology to help users with neurodiverse backgrounds to practice the interactive and self-care skills.
  • the present invention is an autism diagnosis and intervention/therapy/skill development module which is an immersive experience designed to help children with high functional autism practice cognitive, social and self-care skills.
  • the present invention system is based on building an AI (Artificial Intelligence) engine algorithm which helps to monitor and analyze the patterns of behavior shown by children with high functional autism and other neurodevelopmental disorders.
  • Al Artificial intelligence
  • the scenarios include various activities which improve the cognitive, social, motor, self-care and creative skills of the autistic and other disabled children through the ABA intervention/therapy/skill development and virtual reality treatment system.
  • the VR therapy system is more fun and engaging than traditional therapy. It also yields better results than traditional methods. The child is encouraged and rewarded accordingly.
  • the scenarios will help to understand the particular shapes, colors, objects and certain situations that trigger negative and aggressive behavior of the child.
  • HMD Virtual Reality Head Mounted Display
  • a pupil detector helps in identifying the attention span of the child. This tracks the attentiveness of the child throughout the playing scenario and lets the doctor know whether the child is interested in the current scenario. This pupil detector also generates log data of the child's attention span, which is further used by the doctors for training in the future. A Vive tracker and Leap Motion are used to make the scenarios more interactive by capturing the hand motions in real time. With the help of EEG and fNIR, doctors will get real-time live feed data about the emotional responses happening in the brain.
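As a rough illustration only (the patent does not disclose an implementation), the following Python sketch shows one way a pupil detector's gaze samples could be turned into the attention-span log described above; the names, coordinates and on-target radius are all hypothetical:

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    t: float  # timestamp in seconds
    x: float  # gaze x-coordinate on the display (normalized 0..1, assumed)
    y: float  # gaze y-coordinate on the display (normalized 0..1, assumed)

def attention_span(samples, target, radius=0.1):
    """Sum the time the gaze stays within `radius` of the scenario's
    focus point `target`; also return a per-sample log for the doctor.
    The radius and coordinate scale are illustrative assumptions."""
    span, log = 0.0, []
    for prev, cur in zip(samples, samples[1:]):
        on_target = (cur.x - target[0]) ** 2 + (cur.y - target[1]) ** 2 <= radius ** 2
        if on_target:
            span += cur.t - prev.t
        log.append((cur.t, on_target))
    return span, log

# Four samples at 0.5 s intervals; the gaze drifts off-target once.
samples = [GazeSample(0.0, 0.50, 0.50), GazeSample(0.5, 0.52, 0.49),
           GazeSample(1.0, 0.90, 0.10), GazeSample(1.5, 0.51, 0.50)]
span, log = attention_span(samples, target=(0.5, 0.5))
print(f"attentive for {span:.1f} s of {samples[-1].t:.1f} s")
```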
  • the technology used in the present invention is Virtual Reality.
  • the scenarios are played with the help of a high-end laptop, and leap motion sensors transform real-world hand gestures into Virtual Reality.
  • Virtual Reality is an interactive experience in a simulated environment containing both auditory and visual contents. It presents a near-human experience, with an immersive look around realistic images triggering the brain into believing it is in a real world, and thus has the potential to stimulate and assess the person.
  • VR projections are not limited to entertainment; they have extended to social science, psychology, healthcare, clinical therapies, education, training, fine arts, tourism, engineering, safety, health, marketing, etc.
  • the CARS (Childhood Autism Rating Scale) score method works by rating a child's behavior, characteristics and abilities.
  • ADOS-2 Autism Diagnostic Observation Schedule
  • the present invention product is an observation assistive tool that helps to maintain the CARS score. These observations are the inputs, and by using Applied Behavior Analysis (ABA), psychologists, occupational therapists and speech-language pathologists validate the CARS score and IQ.
  • ABA Applied Behavior Analysis
  • the input is virtual observation through a machine instead of manual observation.
  • scoring is based on virtual reality scenarios.
  • the present invention has a categorized intervention/therapy module for observing attention span, gross motor skills, haptic feedback and social interactions through sensors such as the eye tracker, fNIR and EEG, processed on a BCI platform that is integrated with the Head Mounted Display.
  • the present invention system further has clinical validation, based on which one will get outcomes, i.e., success/failure counts.
  • the present invention product is HIPAA compliant.
  • the ABA system may be referred to as the Auticare system.
  • Figure 1 shows an outline of the ABA diagnosis system using XR-AI Platform according to an exemplary implementation of the present invention.
  • Figure 1a shows an example ABA diagnosis system using XR-AI Platform.
  • the present invention system has created a set of immersive experiences that are aimed at improving the cognitive, social and self-care skills of a patient.
  • the present invention system acquires input: observing the character of the autistic children through VR (Immersive Technology) and provides output: success & failure count.
  • VR Virtual Reality (Immersive Technology)
  • These exercises are arranged in such a way that the patients are introduced to these immersive experiences through simple steps, becoming more specific as the patient shows signs of improvement.
  • the next phase of development will incorporate eye tracking and BCI sensors that will give valuable feedback to doctors, for them to analyze the patient's difficulty in detail. These details will be uploaded to the cloud and can be accessed by doctors throughout the patient's treatment to note changes in behavior.
  • the present invention is not a hospital management system.
  • the present invention system is an ABA treatment system which helps the physicians/Special Education Trainer/Doctors to do diagnosis and intervention for the patients.
  • the present invention system helps the doctors with precise and live feedback of the diagnosis and intervention to make informed decisions about patients.
  • the first version of the system will have the Color Learning Scenarios for cognitive development.
  • the present invention system does the diagnosis and intervention for the patients by engaging them in playing with colors. This scenario checks the gross motor skills of the patient, who touches the colors with their hands, and improves these skills through further intervention.
  • the scenario that asks the patient to hold the brush to paint checks fine motor skills.
  • Visual ability, i.e., the ability to focus the eyes to take in visual information, is fundamental to using the present invention system. Healthy eyes and good vision play an important role even before children learn to reach and grab with their hands, which are unique offerings.
  • the present invention system has an interactive character (i.e., an avatar) which assists the patients with instructions and helps the Doctors understand whether the patient can take an instruction, process it and execute it, which also helps the Doctors identify hearing disabilities (Hearing Impairment).
  • interactive character i.e. avatar
  • the present invention relates to a system and method of diagnosing or predicting the levels of autism spectrum disorders (ASD) in individuals and/or related emotional and physiological disorders using assistive intervention/therapy and scoring on an XR-AI Platform.
  • the sensors, such as the eye tracker, BCI, EEG and fNIR, are integrated with the Head Mounted Display (HMD) / Virtual Reality Display (VRD) unit, which includes the wireless edge computing device.
  • the captured data from the edge computing device is transferred to the back end, i.e., the Unity platform. The obtained data are validated and labeled by Doctors, and these labeled data come back to the Unity platform.
  • the process runs continuously and is integrated with the AI system for prediction (ASD behavior logging and therapy prediction AI dashboard) (as shown in Figure 1(a)), as sketched below.
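The label-and-retrain loop could be realized with any off-the-shelf classifier; the sketch below uses scikit-learn purely as an example, with entirely hypothetical feature names and toy data, since the patent does not disclose the AI engine's internals:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical per-session feature vectors derived from the HMD sensors:
# [attention_span, gaze_deviation, eeg_engagement, fnir_activation]
X_labeled = np.array([[0.8, 0.05, 0.7, 0.6],
                      [0.3, 0.40, 0.2, 0.3],
                      [0.7, 0.10, 0.6, 0.5],
                      [0.2, 0.55, 0.3, 0.2]])
y_labeled = np.array([0, 2, 1, 2])  # doctor-validated ASD-level labels (assumed scale)

# Retrain whenever newly doctor-labeled sessions come back from the Unity platform.
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_labeled, y_labeled)

new_session = np.array([[0.5, 0.20, 0.5, 0.4]])
print("predicted ASD level:", model.predict(new_session)[0])
```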
  • the Virtual Reality Display (VRD) may be a Head Mounted Display (HMD) or any other three dimensional display device.
  • the present invention relates to a system for predicting a level of autism spectrum disorders (ASD) using an extended reality platform, the system comprising: a Virtual Reality Display (VRD) / Head Mounted Display (HMD) (110) configured to display one or more interactive animations to a user, a pair of eye tracking devices (112) (not shown) and a combination of one or more BCI, EEG and fNIR (Functional near infrared spectroscopy) sensors (114) (not shown) configured to capture the user's eye gazing coordinates displayed on a screen (111) to predict the attention span during one or more specific interactive animation incidents occurring within said interactive animations and configured to capture the user's cognitive functions, behavioral functions and linguistic functions during said specific interactive animation incidents occurring within said interactive animations, and one or more processing units (116) (not shown) of the VRD/HMD configured to compute an Autism Rating score of the user based on the captured eye gazing coordinates, cognitive functions, behavioral functions and linguistic functions, and further configured to validate the computed Autism Rating score of the user with a pre-stored Autism Rating score and predict the level of autism spectrum disorders (ASD) of the user based on the validated Autism Rating score.
  • the cognitive functions include attention, memory, reasoning, problem solving, decision making, executive function, perception and emotion regulation; the behavioral functions include gross motor skill, sensory stimulation, escape, access to attention and access to tangibles; and the linguistic functions include speech, language, initiating social interaction, expressing personal feelings, describing aspects of the world, requesting information and pretending. Further, the sensors also measure gross motor skill, functional response, anxiety and relaxation, emotions, haptic feedback (tactile, proprioception), reaction time, hemodynamics, audio response, and the success score of each sub-task in the virtual simulated environments during specific animation incidents of the user.
  • the processing unit of VRD/HMD is further configured to monitor and analyze patterns of behaviors of the user based on eye gazing coordinates, categorize, based on a reference point, the eyes of the user as normal eyes or abnormal eyes and calculate the attention span of the user during said specific interactive animation incidents occurring within said interactive animations and further configured to monitor and analyze patterns of behaviors of the user based on reference point of each of captured cognitive functions, behavioral functions and linguistic functions and calculate attention span, attention state, engagement level, mental functioning, emotions, haptic feedback, reaction time, success score of each subtask of the user during specific interactive animation incidents occurring within the interactive animation.
  • the processing unit of the VRD is configured to compute the Autism Rating score of the user, the computation comprising: computing a difference between the captured user's eye gazing coordinates and pre-stored eye gazing coordinates, computing the Autism Rating score if the computed difference is greater than a threshold level, and validating the computed Autism Rating score of the user based on the computed difference value.
  • the eye gazing coordinate parameters include computing the inter-pupillary distance, crossed eye/lazy eye, the visual axis of the eye, eye focus, etc., as sketched below.
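A minimal sketch of the two gaze computations just described, assuming normalized screen coordinates and an illustrative threshold (the patent publishes neither units nor reference values):

```python
import math  # math.dist requires Python 3.8+

def interpupillary_distance(left_pupil, right_pupil):
    """Euclidean distance between the two pupil centers (tracker units)."""
    return math.dist(left_pupil, right_pupil)

def gaze_score_contribution(captured, prestored, threshold=0.15):
    """Mean deviation of captured gaze coordinates from pre-stored ones;
    it contributes to the Autism Rating score only above `threshold`."""
    deviation = sum(math.dist(c, p) for c, p in zip(captured, prestored)) / len(captured)
    return deviation if deviation > threshold else 0.0

captured  = [(0.48, 0.52), (0.90, 0.15), (0.47, 0.55)]
prestored = [(0.50, 0.50), (0.50, 0.50), (0.50, 0.50)]
print("IPD:", interpupillary_distance((0.45, 0.5), (0.55, 0.5)))
print("gaze contribution:", round(gaze_score_contribution(captured, prestored), 3))
```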
  • the processing unit of the VRD being configured to compute the Autism Rating score of the user further includes: displaying and initiating the user to identify and perform an activity during said specific interactive animation incidents occurring within said interactive animations, capturing, by the BCI (Brain Computer Interface), EEG (Electroencephalogram) and fNIR (Functional near-infrared spectroscopy) sensors of the VRD/HMD, the user's cognitive function scores, behavioral function scores and linguistic function scores while the user is identifying and performing the activity, computing a difference between the captured cognitive function scores, behavioral function scores and linguistic function scores and pre-stored cognitive function scores, behavioral function scores and linguistic function scores while the user is identifying and performing the activity, computing the Autism Rating score if the computed difference is greater than a threshold level, and validating the computed Autism Rating score of the user based on the computed difference value.
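The same difference-and-threshold pattern extends to the three function domains; a hedged sketch with hypothetical score keys, scales, threshold and combination rule:

```python
def domain_contributions(captured, prestored, threshold=0.2):
    """Deviation of captured function scores from pre-stored reference
    scores, per domain; only deviations above `threshold` contribute."""
    diffs = {k: abs(captured[k] - prestored[k]) for k in prestored}
    return {k: d for k, d in diffs.items() if d > threshold}

captured  = {"cognitive": 0.45, "behavioral": 0.70, "linguistic": 0.30}
prestored = {"cognitive": 0.75, "behavioral": 0.78, "linguistic": 0.72}

contributions = domain_contributions(captured, prestored)
autism_rating = sum(contributions.values())  # simple additive combination (assumed)
print(contributions, "->", round(autism_rating, 2))
```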
  • BCI Brain Computer Interfaces
  • EEG Electroencephalogram
  • fNIR Functional near-infrared spectroscopy
  • the processing unit of the VRD is configured to compute the Autism Rating score of the user in a few steps: capturing a total number of attempts performed by the user in identifying and performing the activity during said specific interactive animation incidents occurring within said interactive animations, capturing an autism rating score based on a specific threshold if the user correctly identifies and performs the activity, capturing an autism rating score based on a specific threshold if the user fails to identify and perform the activity, and validating the Autism Rating score of the user based on the computed difference values, the captured total number of attempts, and the autism rating score based on a specific threshold.
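To make the attempt-based step concrete, here is an illustrative sketch (the pass threshold and score values are hypothetical) that captures the total number of attempts and maps success or failure onto a rating contribution:

```python
def attempt_contribution(attempts, successes, pass_threshold=0.6,
                         success_score=0.0, failure_score=1.0):
    """Rating contribution from the activity: low when the user reliably
    identifies and performs it, high otherwise. Values are illustrative."""
    if attempts == 0:
        return failure_score
    return success_score if successes / attempts >= pass_threshold else failure_score

# e.g. 7 color-identification attempts
print(attempt_contribution(attempts=7, successes=3))  # 1.0: failure contribution
print(attempt_contribution(attempts=7, successes=6))  # 0.0: success contribution
```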
  • the system further comprising a voice communication device configured to provide voice communication to the user to identify and perform the activity in specific animation incidents occurring within the interactive animation.
  • Figure 2 shows a hardware perspective of the system according to an exemplary implementation of the present invention.
  • The figure shows an example hardware perspective of the system.
  • the present invention provides diagnosis and intervention for ASD children and adults with a pair of eye tracking devices (112) (not shown) and a combination of BCI, EEG and fNIR (Functional near infrared spectroscopy) sensors (114) fitted in an HMD, through high-quality 3D graphical contents.
  • Diagnosis: 1. Develop an ASD-specific diagnostic solution for ASD children and adults with eye tracking, sensors such as a Brain-computer interface (BCI) (such as EEG) and fNIR, and HMD-integrated hardware and software.
  • BCI Brain-computer interface
  • the present invention system and method is an Affordable Autism Intervention VR Kit for High Functional ASD which subjects the child to experience different virtual environments/scenarios.
  • Such immersive and interactive medium helps them in differentiating and adapting to distinct situations.
  • the contents in the scenario give an immersive look around realistic images, triggering the brain into believing it is in a real world.
  • the developed contents are based on high-intensity and task-specific practice in the comfort of their homes, giving them more exposure to the outside world.
  • intervention analysis can be done, and these data logs can be stored for further training in the future.
  • Different VR environments thus enrich the child with awareness of the freedom available in their movements, ensuring their engagement in a social and meaningful way.
  • the processing unit (116) (not shown) of the VRD/HMD configured to compute an Autism Rating score of the user based on the captured eye gazing coordinates, cognitive functions, behavioral functions and linguistic functions and further configured to validate the computed Autism Rating score of the user with a pre-stored Autism Rating score and predicting the level of autism spectrum disorders (ASD) of the user based on the validated Autism Rating score.
  • the Virtual Reality Display (VRD) / Head Mounted Display (HMD) is integrated with combination of sensors and processing unit.
  • the processing unit of VRD/HMD is further configured to monitor and analyze patterns of behaviors of the user based on eye gazing coordinates, categorize, based on a reference point, the eyes of the user as normal eyes or abnormal eyes and calculate the attention span of the user during said specific interactive animation incidents occurring within said interactive animations and further configured to monitor and analyze patterns of behaviors of the user based on reference point of each of captured cognitive functions, behavioral functions and linguistic functions and calculate attention span, attention state, engagement level, mental functioning, emotions, haptic feedback, reaction time, success score of each subtask of the user during specific interactive animation incidents occurring within the interactive animation.
  • the processing unit of the VRD is configured to compute eye gazing coordinate parameter scores based on a specific threshold; cognitive, behavioral and linguistic function parameter scores based on a specific threshold; performance scores of the user/patient/children based on a specific threshold; the total number of attempts; and success and failure scores based on a specific threshold, and it processes these and predicts the level of autism spectrum disorders (ASD) of the user based on the validated Autism Rating score.
  • ASD autism spectrum disorders
  • the Autism Rating score may be based on:
  • CARS 1 Childhood Autism Rating Scale
  • CARS 2 Childhood Autism Rating Scale
  • CARS 3 Childhood Autism Rating Scale
  • ISAA Indian Scale for Assessment of Autism
  • COPM Canadian Occupational Performance Measure
  • HMD Virtual Reality Head Mounted Display
  • a pupil detector helps in identifying the attention span of the child. This tracks the attentiveness of the child throughout the playing scenario and lets the doctor know whether the child is interested in the current scenario.
  • This pupil detector also generates log data of the child's attention span, which is further used by the doctors for training in the future. A Vive tracker and Leap Motion are used to make the scenarios more interactive by capturing the hand motions in real time. With the help of BCI, doctors will get real-time live feed data about the emotional responses happening in the brain.
  • the present invention system uses a game engine platform which is used for the development of 2D and 3D contents.
  • the interactive contents in virtual environment/ scenarios are created according to the requirement in Unity Engine.
  • Figure 3 shows an example working of the system according to an exemplary implementation of the present invention.
  • the figure shows an example embodiment working of the system.
  • the figure shows capturing live data of the diagnosis and intervention:
  • Figure 4 shows an example results of color identification and learning according to an exemplary implementation of the present invention.
  • the results should be specific to the attempts based on the colors; the graph and text should specify the colors.
  • Figure 5 shows an example login process of XR-CARE system requirement flow according to an exemplary implementation of the present invention.
  • the figure shows an example embodiment login process of XR-CARE System Requirement Flow.
  • The Physician/Special Education Trainer has to log in to both the Auticare Reporting System (Front-end/Dashboard) and the Auticare Diagnosis/Intervention System (UNITY).
  • Both systems should have one common Authentication Database to register the Special Education Trainers/doctors with their login and password credentials and other related information of the Special Education Trainer, hospital address, etc.
  • The Auticare Diagnosis/Intervention System will display the details based on categories, and VR intervention will be assigned to identified patients (using retina scan and a two-step verification login).
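One common authentication database serving both front ends could be as simple as the following sketch; the schema, hashing choice and field names are assumptions, not the patent's specification:

```python
import hashlib

# Hypothetical shared table used by both the Auticare Reporting System
# and the Auticare Diagnosis/Intervention System (Unity front end).
AUTH_DB = {
    "trainer01": {
        "pw_hash": hashlib.sha256(b"s3cret").hexdigest(),
        "institution": "Clinic A",  # used to scope data to one institution
    }
}

def login(username, password):
    """Validate credentials against the common authentication database."""
    record = AUTH_DB.get(username)
    return (record is not None and
            hashlib.sha256(password.encode()).hexdigest() == record["pw_hash"])

print(login("trainer01", "s3cret"))  # True
print(login("trainer01", "wrong"))   # False
```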
  • Figure 6 shows an example choosing diagnosis / intervention option of the system according to an exemplary implementation of the present invention.
  • the figure shows an example choosing diagnosis / intervention option of the system, which has various steps.
  • Auticare diagnosis/intervention system will have the list of patients registered for diagnosis/intervention in queue for the day. These patients should be listed or can be searched from the patient database.
  • Figure 7 shows an example diagnosis process according to an exemplary implementation of the present invention.
  • the figure shows an example embodiment diagnosis process which has various steps.
  • Figure 8 shows an example intervention process according to an exemplary implementation of the present invention.
  • The figure shows an example embodiment intervention process which has various steps.
  • Figure 9 shows an example graph of color identification score card according to an exemplary implementation of the present invention.
  • the figure shows an example graph of color identification score card.
  • a set of various scenarios (e.g., 13) has been developed for the patients, with focus given to social, self-care and cognitive intervention.
  • Figure 10 shows an ABA system (Auticare Diagnosis/Intervention System) according to an exemplary implementation of the present invention.
  • the figure shows an ABA system which has various steps.
  • Special Education Trainer login from different locations may not be permitted. Data of one institution is accessible only within a closed loop, and at most to institutions within that network. Searching for old records of a student is to be based on name.
  • Body animation rigger to be identified and procured.
  • Road Crossing Score pattern (see the sketch below): (a) reaction time (first step) to the signal change (Red to Green) is recorded; (b) time taken to reach from one side to the other is noted; (c) time elapsed standing in the middle of the road is noted; (d) percentage of distance covered is noted (how far the patient went).
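A compact record for the four measurements (a)-(d) above; the field names and units are illustrative only:

```python
from dataclasses import dataclass

@dataclass
class RoadCrossingScore:
    reaction_time_s: float   # (a) first step after the Red-to-Green change
    crossing_time_s: float   # (b) time from one side of the road to the other
    mid_road_pause_s: float  # (c) time spent standing in the middle of the road
    distance_pct: float      # (d) percentage of the crossing covered

    def summary(self):
        return (f"reacted in {self.reaction_time_s:.1f} s, crossed in "
                f"{self.crossing_time_s:.1f} s (paused {self.mid_road_pause_s:.1f} s), "
                f"covered {self.distance_pct:.0f}%")

print(RoadCrossingScore(1.8, 12.4, 3.0, 100.0).summary())
```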
  • the patient is instructed by the doctor with the assistance of a portable interface such as a tablet, and the patient is asked to wear the head mounted display. If a patient is not willing to put it on at first, a projected environment is displayed on a screen in front of the patient and the experience is given. The patient is logged in through retina scan, and the data generated through the VR experience is sent to the cloud network. The doctor will be able to monitor a patient based on live data output from the eye tracker and the BCI (Brain Computer Interface). For the first time, doctors will be able to monitor a patient through these feedbacks, which will give them a clear picture about the patient. These feedbacks will be recorded every time a patient is logged in, so continuous monitoring will be possible. The doctors can thus understand the disability in depth and will be able to prescribe treatment methods for a patient at the individual level. This will help these autism patients to improve their communicative and interactive skills to a great extent.
  • a portable interface such as a tablet
  • Figure 11 shows an example step by step process of the system according to an exemplary implementation of the present invention.
  • The figure shows an example step-by-step process of the system.
  • Step 1 Profile Management
  • Step 2 First Level of Diagnosis with Eye Tracker: Eye Alignment, Pupil Detection
  • Step 3 Second Level of Diagnosis with Eye Tracker: Datasets Comparison
  • Step 4 Intervention: High Quality Graphical Content Based (Unity).
  • Figure 12 shows a method for predicting a level of autism spectrum disorders (ASD) using extended reality platform according to an exemplary implementation of the present invention.
  • ASD autism spectrum disorders
  • the figure shows a method for predicting a level of autism spectrum disorders (ASD) using extended reality platform according to an exemplary implementation of the present invention.
  • the method for predicting a level of autism spectrum disorders (ASD) using an extended reality platform comprises: displaying, by a Virtual Reality Display (VRD) / Head Mounted Display (HMD), one or more interactive animations to a user (1210), capturing, by a pair of eye tracking devices and a combination of one or more BCI (Brain Computer Interface), EEG (Electroencephalogram) and fNIR (Functional near infrared spectroscopy) sensors of the VRD/HMD, the user's eye gazing coordinates displayed on a screen (111) to predict the attention span during one or more specific interactive animation incidents occurring within said interactive animations and capturing, by the BCI, EEG and fNIR sensors, the user's cognitive functions, behavioral functions and linguistic functions during said specific interactive animation incidents occurring within said interactive animations (1220), computing, by a processing unit of the VRD, an Autism Rating score of the user based on the captured eye gazing coordinates, cognitive functions, behavioral functions and linguistic functions (1230), and validating, by the processing unit of the VRD/HMD, the computed Autism Rating score of the user with a pre-stored Autism Rating score and predicting the level of autism spectrum disorders (ASD) of the user based on the validated Autism Rating score (1240).
  • the cognitive function includes gross motor skill, attention, memory, reasoning, problem solving, decision making, executive function, perception and emotion regulation; the behavioral function includes sensory stimulation, escape, access to attention and access to tangibles; and the linguistic function includes speech, language, initiating social interaction, expressing personal feelings, describing aspects of the world, requesting information and pretending. Further, the sensors also measure gross motor skill, functional response, anxiety and relaxation, emotions, haptic feedback (tactile, proprioception), reaction time, hemodynamics, audio response, and the success score of each sub-task in the virtual simulated environments during specific animation incidents of the user.
  • the method further comprising, monitoring and analyzing patterns of behaviors of the user based on eye gazing coordinates, categorizing, based on a reference point, the eyes of the user as normal eyes or abnormal eyes and calculating the attention span of the user during said specific interactive animation incidents occurring within said interactive animations.
  • the method further comprising, monitoring and analyzing patterns of behaviors of the user based on reference point of each of captured cognitive functions, behavioral functions and linguistic functions, calculating attention span, attention state, engagement level, mental functioning, emotions, haptic feedback, reaction time, success score of each subtask of the user during specific interactive animation incidents occurring within the interactive animation.
  • the method of computing the Autism Rating score of the user comprises computing a difference between the captured user’s eye gazing coordinates and pre-stored eye gazing coordinates, computing the Autism Rating score, if the computed difference is greater than a threshold level and validating the computed Autism Rating score of the user based on the computed difference value.
  • the eye gazing coordinate parameters include computing the inter-pupillary distance, crossed eye/lazy eye, the visual axis of the eye, eye focus, etc.
  • the method of computing Autism Rating score of the user further includes: displaying and initiating the user to identify and perform an activity during said specific interactive animation incidents occurring within said interactive animations, capturing, by the BCI, EEG and fNIR sensors of the VRD/HMD, the user’s cognitive function scores, behavioral function scores and linguistic function scores while the user is identifying and performing the activity, computing a difference between the captured cognitive function scores, behavioral function scores and linguistic function scores and pre-stored cognitive function scores, behavioral function scores and linguistic function scores while the user is identifying and performing the activity, computing the Autism Rating score, if the computed difference is greater than a threshold level and validating the computed Autism Rating score of the user based on the computed difference value.
  • the method of computing Autism Rating score of the user further comprises: capturing a total number of attempts performed by the user in identifying and performing the activity during said specific interactive animation incidents occurring within said interactive animations, capturing autism rating score based on a specific threshold, if the user correctly identifies and performs the activity, capturing autism rating score based on a specific threshold, if the user fails to identify and perform the activity and validating the Autism Rating score of the user based on computed difference values, captured total number of attempts, autism rating score based on a specific threshold.
  • the method further comprising, providing a voice communication to the user to identify and perform the activity in specific animation incidents occurring within the interactive animation.
  • Figure 13 shows an example level-1 diagnosis - Inter-Pupillary Distance and Eye Alignments according to an exemplary implementation of the present invention.
  • the figure shows an example level-1 diagnosis - Inter-Pupillary Distance and Eye Alignments, which has various steps.
  • Step 1 Measure the distance between the eyes with the Eye Tracker integrated with Head Mounted Display (HMD).
  • HMD Head Mounted Display
  • Step 2 Check if the eyes are crossed eye/lazy eye, OR if the visual axis of one eye is higher than that of the other eye, OR if the eyes are outward turning. The database looks for a match and allows the sign-in.
  • Step 3 Check if the Eye Tracker can adjust and scan these pupil positions
  • Step 4 Check if the eyes are centered within the frame by comparing with reference point
  • Step 5 Check if the eyes are out of focus by comparing with reference point
  • Step 6 Check if the Eyes can verge and accommodate to a focal point by comparing with reference point
  • Step 7 If the above diagnosis is successful, then the next step is Diagnosis Level-2
  • Step 8 If the above diagnosis fails, then the VR intervention cannot be done (a sketch of this gating check follows the list).
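A sketch of that gating logic, collapsing steps 1-8 into a single check; the IPD range and axis tolerance are invented for illustration, as the patent gives no numbers:

```python
def level1_eye_diagnosis(ipd_mm, crossed_or_lazy, axis_offset_deg,
                         scannable, centered, in_focus, can_verge):
    """Every eye-tracker check in steps 1-6 must pass before moving on
    to Diagnosis Level-2; otherwise VR intervention cannot be done."""
    checks = [
        55.0 <= ipd_mm <= 75.0,      # step 1: plausible inter-pupillary distance
        not crossed_or_lazy,         # step 2: no crossed eye / lazy eye
        abs(axis_offset_deg) < 2.0,  # step 2: visual axes roughly level
        scannable,                   # step 3: tracker can adjust and scan the pupils
        centered,                    # step 4: eyes centered within the frame
        in_focus,                    # step 5: eyes not out of focus
        can_verge,                   # step 6: eyes verge/accommodate to a focal point
    ]
    return "Diagnosis Level-2" if all(checks) else "VR intervention cannot be done"

print(level1_eye_diagnosis(63.0, False, 0.5, True, True, True, True))
```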
  • Figure 14 shows an example level-2 diagnosis - ASD & Non ASD Comparison according to an exemplary implementation of the present invention.
  • Each session will be conducted both for a non-ASD group of a similar age and for the ASD group, and the test data are taken.
  • Figure 15 shows an example level-3 ASD Content-specific Diagnosis according to an exemplary implementation of the present invention.
  • Step 1 Create High Quality contents for Diagnosis in UNITY.
  • Step 2 The eye tracking team and the AR/VR team come together to test the content through the eye-tracking-integrated HMD and to record the feedback for the diagnostic contents into a dataset database in the cloud.
  • Step 3 Create the eye tracking records for the diagnostic contents and create the needed relations.
  • the present invention method and system provides outstanding visualization and immersive sessions that aren’t possible in traditional forms of ABA intervention.
  • the present invention method and system provides an engaging and entertaining form of treatment intervention.
  • the data is collected in the dashboard which can be used by the Special Education Trainer as and when required, which is in turn used to predict the next intervention needed.
  • the present invention provides BCI diagnosis and Al based Behavior Performance data on cloud.
  • This scenario aims to improve the child's ability to follow instructions, visual attention and memory, and also helps the patient learn and get a much clearer understanding of colours, helping them familiarize themselves with and discriminate between colours.
  • This intervention also teaches the patient to be calm and helps to understand which colours the patient prefers over other colours.
  • the doctors could investigate the results in the reporting system. The doctor can either go for the next diagnosis or keep doing the exercise to improve the patient's learning.
  • the patient could draw through the 3D space in this scenario.
  • The patient is taught to draw in the 3D space using his/her fingers and can select any color of his/her choice. The attempt to draw and the color chosen are updated in the database. This is a learning process.
  • the patient is allowed to stand at the side of the road. He/she is taught to understand the traffic colors.
  • the patient's attempts to cross the road according to the traffic signals are updated in the database.
  • the patient will be at the zebra crossing with lights showing instructions and vehicles obeying traffic rules.
  • The patient scores points for walking on the zebra crossing based on the signal light.
  • The time elapsed for the patient to walk from the start to the end of the zebra crossing is also noted.
  • the database is updated according to the attempt.
  • Intervention:
  • the patient will have to identify the correct item and put it in the basket, which involves improving motor skills, attention skills, memory skills, visual and auditory attention, etc.
  • the items are then to be taken to the billing section, where the patient places the taken items on the billing counter, which again involves improving motor skills.
  • the patient has to pick up the correct amount from the displayed notes, which improves memory power and knowledge about handling money. If the amount paid is higher, the balance is given back; this helps the patient understand the concept of money and transactions.
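The money-handling check reduces to a small change computation; a sketch with hypothetical note denominations and return format:

```python
def pay_for_items(bill_total, notes_given):
    """Check the payment step: the notes picked must cover the bill, and
    any balance is returned, reinforcing the concept of transactions."""
    paid = sum(notes_given)
    if paid < bill_total:
        return {"success": False, "balance": 0}
    return {"success": True, "balance": paid - bill_total}

# e.g. a bill of 70 paid with one 50 and two 20s: a balance of 20 is returned
print(pay_for_items(70, [50, 20, 20]))  # {'success': True, 'balance': 20}
```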
  • the present invention method and system of diagnosing or predicting the levels of autism spectrum disorders (ASD) using XR-AI platform disclosed herein are not limited to a particular computer system platform, controller, processor, operating system, or network.
  • the method and the system disclosed herein are not limited to be executable on any particular system or group of systems, and are not limited to any particular distributed architecture, network, or communication protocol.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Pathology (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Psychiatry (AREA)
  • Veterinary Medicine (AREA)
  • Developmental Disabilities (AREA)
  • Psychology (AREA)
  • Child & Adolescent Psychology (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Hospice & Palliative Care (AREA)
  • Social Psychology (AREA)
  • Neurology (AREA)
  • Physiology (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Educational Technology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Signal Processing (AREA)
  • Neurosurgery (AREA)
  • Artificial Intelligence (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The present invention relates to a method for predicting a level of autism spectrum disorders (ASD) using an extended reality platform (1200). In one embodiment, the method comprises: displaying, by a Virtual Reality Display (VRD) / Head Mounted Display (HMD), one or more interactive animations to a user (1210); capturing, by a pair of eye tracking devices and a combination of BCI, EEG and fNIR (Functional near infrared spectroscopy) sensors of the VRD/HMD, the user's eye gazing coordinates displayed on a screen (111) to predict the attention span, and the user's cognitive functions, behavioral functions and linguistic functions, during one or more specific interactive animation incidents occurring within said interactive animations (1220); computing, by a processing unit of the VRD, an Autism Rating score of the user based on the captured eye gazing coordinates, cognitive functions, behavioral functions and linguistic functions (1230); and validating, by the processing unit of the VRD/HMD, the computed Autism Rating score of the user with a pre-stored Autism Rating score and predicting the level of autism spectrum disorders (ASD) of the user based on the validated Autism Rating score (1240).

Description

A SYSTEM AND METHOD OF DIAGNOSING OR PREDICTING THE LEVELS
OF AUTISM SPECTRUM DISORDERS (ASD) USING XR-AI PLATFORM
TECHNICAL FIELD
[0001] The present invention mainly relates to a system and method of diagnosing or predicting the levels of autism spectrum disorders (ASD) in individuals and/or related emotional and physiological disorders using assistive intervention/therapy and scoring on an XR-AI Platform.
BACKGROUND
[0002] Autism spectrum disorder (ASD) is well known in the art as a range of neurodevelopmental disorders. It includes Childhood Disintegrative Disorder, Pervasive Developmental Disorder, Autism and Asperger syndrome. Autism spectrum disorder relates to a variety of brain developmental disorders. These conditions, commonly recognized as autism, are characterized by problems with social skills, verbal and nonverbal communication, repetitive and stereotyped behaviors, delayed childhood development, and other distinctive strengths and challenges.
[0003] While there are distinct kinds of ASD, prevalent experiences among individuals with the disorder include social disability and repetitive behaviors. Some kids with autism may appear to have symptoms from birth, while others may develop more apparent signs as they age. Individuals on the spectrum often experience difficulties with social communication and interaction, and restricted and repetitive patterns of behavior, interests, or activities. Symptoms are typically recognized between one and two years of age. Long-term problems may include difficulties in performing daily tasks, creating and keeping relationships, and maintaining a job. Further, according to estimates from the Centers for Disease Control (CDC) Autism and Developmental Disabilities Monitoring Network, 1 in every 59 children is identified with autism spectrum disorder (ASD).
[0004] Over the years, one of the major techniques that has proved effective and useful for treatment is Applied Behavior Analysis (ABA). In ABA, the therapist/Special Education Trainer initially attempts to learn about the specific behaviors of the individual with ASD. They also need to learn how the individual's setting impacts this conduct and how the individual learns. ABA seeks to boost desirable behaviors by using positive reinforcement and to decrease harmful or isolating behaviors. ABA can also help enhance communication, memory, attention, concentration and efficiency at the academic level.
[0005] Research indicates that many people with autism spectrum disorder have impairments related to their cognitive and social abilities in their daily living activities. Adaptive behavior encompasses daily activities important to functional independence, including communication, social, and daily living skills. Recently, to address the lack of social and cognitive skills among autistic children, several experts and therapists/Special Education Trainers have adopted virtual reality technology to provide social and cognitive training.
[0006] Virtual reality is a computer-based technology developed to create an artificial simulation placing the user in an immersive experience. Virtual reality treatment is the use of virtual reality technology for different rehabilitation processes. Several examples of virtual reality treatment include iBloom VR, Neuro-Rehab VR, VAST Rehab, Verapy and Looxid VR. One of the prior arts uses virtual reality to provide a supplementary method of teaching social and communication skills for individuals with Autism Spectrum Disorder. Another prior art uses virtual reality as a base-level tool in support of other assistive technology interventions, while VHAB is another product that uses personalized on-screen simulated environments that children with autism or cerebral palsy can explore through motion sensors as they carry out their physiotherapy regimen. Because this is an on-screen therapy module, autistic kids may have lower concentration levels, as they are distracted by other objects and sound stimuli in the room, which reduces their learning curve.
[0007] The above-mentioned prior arts have several disadvantages: they require complex hardware and software, are time consuming, and lack efficiency, a design to analyze improvements, and flexibility in terms of treatment scenarios and observations, which leads to error-prone and ineffective results.
[0008] In addition, currently doctors who are dealing with neurodevelopmental disorders have to go through extensive research and counselling to understand and determine the level of treatment required for each patient. This has resulted in delayed or generalized treatment methods for different levels of neurodevelopmental disorders.
[0009] Therefore, there is a need in the art for a system and method of diagnosing or predicting the levels of autism spectrum disorders (ASD) that solves the above-mentioned limitations.
SUMMARY OF THE INVENTION
[0010] An aspect of the present invention is to address at least the above- mentioned problems and/or disadvantages and to provide at least the advantages described below.
[0011] Accordingly, one aspect of the present invention relates to a method for predicting a level of autism spectrum disorders (ASD) using an extended reality platform (1200), the method comprising: displaying, by a Virtual Reality Display (VRD) / Head Mounted Display (HMD), one or more interactive animations to a user (1210), capturing, by a pair of eye tracking devices and a combination of one or more BCI, EEG and fNIR (Functional near infrared spectroscopy) sensors of the VRD/HMD, the user's eye gazing coordinates displayed on a screen (111) to predict the attention span during one or more specific interactive animation incidents occurring within said interactive animations and capturing, by the BCI, EEG and fNIR sensors, the user's cognitive functions, behavioral functions and linguistic functions during said specific interactive animation incidents occurring within said interactive animations (1220), computing, by a processing unit of the VRD, an Autism Rating score of the user based on the captured eye gazing coordinates, cognitive functions, behavioral functions and linguistic functions (1230), and validating, by the processing unit of the VRD/HMD, the computed Autism Rating score of the user with a pre-stored Autism Rating score and predicting the level of autism spectrum disorders (ASD) of the user based on the validated Autism Rating score (1240).
[0012] Another aspect of the present invention relates to a system for predicting a level of autism spectrum disorders (ASD) using an extended reality platform, the system comprising: a Virtual Reality Display (VRD) / Head Mounted Display (HMD) (110) configured to display one or more interactive animations to a user, a pair of eye tracking devices (112) and a combination of BCI, EEG and fNIR (Functional near infrared spectroscopy) sensors (114) configured to capture the user's eye gazing coordinates displayed on a screen (111) to predict the attention span during one or more specific interactive animation incidents occurring within said interactive animations and configured to capture the user's cognitive functions, behavioral functions and linguistic functions during said specific interactive animation incidents occurring within said interactive animations, and a processing unit (116) of the VRD/HMD configured to compute an Autism Rating score of the user based on the captured eye gazing coordinates, cognitive functions, behavioral functions and linguistic functions, and further configured to validate the computed Autism Rating score of the user with a pre-stored Autism Rating score and predict the level of autism spectrum disorders (ASD) of the user based on the validated Autism Rating score.
[0013] Other aspects, advantages, and salient features of the invention will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses exemplary embodiments of the invention.
BRIEF DESCRIPTION OF ACCOMPANYING DRAWINGS
[0014] The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to reference like features and modules.
[0015] Figure 1 shows an outline of the ABA system using XR-AI Platform according to an exemplary implementation of the present invention.
[0016] Figure 2 shows a hardware perspective of the system using XR-AI Platform according to an exemplary implementation of the present invention.
[0017] Figure 3 shows an example working of the system according to an exemplary implementation of the present invention.
[0018] Figure 4 shows an example results of color identification and learning according to an exemplary implementation of the present invention.
[0019] Figure 5 shows an example login process of XR-CARE System Requirement Flow according to an exemplary implementation of the present invention.
[0020] Figure 6 shows an example choosing diagnosis or intervention option of the system according to an exemplary implementation of the present invention.
[0021] Figure 7 shows an example diagnosis process according to an exemplary implementation of the present invention.
[0022] Figure 8 shows an example intervention process according to an exemplary implementation of the present invention.
[0023] Figure 9 shows an example graph of color identification score card according to an exemplary implementation of the present invention.
[0024] Figure 10 shows an ABA system according to an exemplary implementation of the present invention.
[0025] Figure 11 shows an example step by step process of the system according to an exemplary implementation of the present invention.
[0026] Figure 12 shows a method for predicting a level of autism spectrum disorders (ASD) using extended reality platform according to an exemplary implementation of the present invention.
[0027] Figure 13 shows an example level-1 diagnosis - Inter-Pupillary Distance and Eye Alignments according to an exemplary implementation of the present invention.
[0028] Figure 14 shows an example level-2 diagnosis - ASD & Non ASD Comparison according to an exemplary implementation of the present invention.
[0029] Figure 15 shows an example level-3 ASD Content specific Diagnosis according to an exemplary implementation of the present invention.
[0030] It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative methods embodying the principles of the present invention. Similarly, it will be appreciated that any flow charts, flow diagrams, and the like represent various processes which may be substantially represented in computer readable medium and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.
DETAILED DESCRIPTION
[0031] The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of exemplary embodiments of the invention as defined by the claims and their equivalents. It includes various specific details to assist in that understanding but these are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the invention. In addition, descriptions of well-known functions and constructions are omitted for clarity and conciseness.
[0032] The terms and words used in the following description and claims are not limited to the bibliographical meanings, but are merely used by the inventor to enable a clear and consistent understanding of the invention. Accordingly, it should be apparent to those skilled in the art that the following description of exemplary embodiments of the present invention is provided for illustration purposes only and not for the purpose of limiting the invention as defined by the appended claims and their equivalents.
[0033] It is to be understood that the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise. Thus, for example, reference to “a component surface” includes reference to one or more of such surfaces.
[0034] By the term “substantially” it is meant that the recited characteristic, parameter, or value need not be achieved exactly, but that deviations or variations, including for example, tolerances, measurement error, measurement accuracy limitations and other factors known to those of skill in the art, may occur in amounts that do not preclude the effect the characteristic was intended to provide.
[0035] Figures discussed below, and the various embodiments used to describe the principles of the present invention in this patent document, are by way of illustration only and should not be construed in any way that would limit the scope of the invention. Those skilled in the art will understand that the principles of the present invention may be implemented in any suitably arranged communications system. The terms used to describe various embodiments are exemplary. It should be understood that these are provided merely to aid the understanding of the description, and that their use and definitions in no way limit the scope of the invention. Terms such as first, second, and the like are used to differentiate between objects having the same terminology and are in no way intended to represent a chronological order, unless explicitly stated otherwise. A set is defined as a non-empty set including at least one element.
[0036] In the following description, for purpose of explanation, specific details are set forth in order to provide an understanding of the present disclosure/ invention. It will be apparent, however, to one skilled in the art that the present disclosure/ invention may be practiced without these details. One skilled in the art will recognize that embodiments of the present invention, some of which are described below, may be incorporated into a number of systems.
[0037] However, the systems and methods are not limited to the specific embodiments described herein. Further, structures and devices shown in the figures are illustrative of exemplary embodiments of the present invention and are meant to avoid obscuring the present disclosure/invention.
[0038] The various embodiments of the present invention describe an applied behavior analysis (ABA) system which helps physicians/therapists/Special Education Trainers/doctors to perform diagnosis and intervention/therapy/skill development for patients/users. The present invention system helps the doctors with precise and live feedback on the diagnosis and intervention/therapy/skill development to take knowledgeable decisions on patients.
[0039] The present invention provides a revolutionized product which is based on Virtual Reality to provide proper diagnosis and treatment for such patients. The present invention product trains and improves the responses and skills of high functional autistic people. Patients/users/children who have trouble concentrating on a task can use the Virtual Reality headgear, where they get to see realistic images in front of them in three dimensions. This gives a trigger to the brain, making them believe that they are in a real world.
[0040] A Virtual Reality environment provides a system where children diagnosed with high functional autism can practice cognitive, social, communication and self-care interactions in a safe and controlled environment. Through the repetition and analysis of these virtual interactions, children can improve their cognitive, social, communication and self-care skills. The present invention is designed and implemented with various scenarios; a virtual reality (VR) environment provides the data for doctors to analyze the autistic children. The present invention system integrates sensors such as an eye tracker, i.e. a pair of eye tracking devices, and a combination of BCI, EEG and fNIR (Functional near infrared spectroscopy) sensors in order to get more informative data, and uses VR-based assistive technology to help users with neurodiverse backgrounds to practice interactive and self-care skills. An interactive character, i.e. an avatar, interacts with the children during the session, which increases the chances of interaction with humans. The present invention is an autism diagnosis and intervention/therapy/skill development module which is an immersive experience designed to help children with high functional autism practice cognitive, social and self-care skills.
[0041] The present invention system is based on building an AI (Artificial Intelligence) engine algorithm which helps to monitor and analyze the patterns of behavior shown by high functional autistic children and children with other neurodevelopmental disorders. Through the AI engine method steps, the patient’s responses to various scenarios are analyzed. The scenarios include various activities which improve the cognitive, social, motor, self-care and creative skills of autistic and other disabled children through the ABA intervention/therapy/skill development and virtual reality treatment system. The VR therapy system is more fun and engaging than traditional therapy. It also yields more results than the traditional ones. The child is encouraged and rewarded accordingly. The scenarios will help to understand the particular shapes, colors, objects and certain situations that trigger negative and aggressive behavior in the child. Finally, the results of the diagnosis are entered in the database cloud, and from this large set of values the AI engine will populate the most specific patterns in behavior that cause hesitancy or aggressiveness in patients. The doctors will be able to provide improved treatment, rather than uncomfortable and stressful environments, from such exercises, which will help the children to develop their cognitive, social and self-care skills faster.
[0042] A Virtual Reality Head Mounted Display (HMD) is provided to experience the 3-dimensional contents of different scenarios, in which a pupil detector is equipped which helps in identifying the attention span of the child. This tracks the attentiveness of the child throughout the playing scenario and allows the doctor to know whether the child is interested in the current scenario. This pupil detector also generates log data of the attention span of the child, which is further used by the doctors for training in the future. A Vive tracker and leap motion are used to make the scenarios more interactive by capturing the hand motions in real time. With the help of EEG and fNIR, doctors will get real-time live feed data about the emotional responses happening in the brain.
[0043] The technology used in the present invention is Virtual Reality. The scenarios are played with the help of a high-end laptop, and leap motion sensors transform real-world hand gestures into Virtual Reality.
[0044] Virtual Reality is an interactive experience in a simulated environment containing both auditory and visual contents. It presents a near-human experience with an immersive look around realistic images, giving a trigger to the brain and making it believe it is in a real world, thus having the potential to stimulate and assess the person. VR projections are not limited to entertainment, but have also extended to social science, psychology, healthcare, clinical therapies, education, training, fine arts, tourism, engineering, safety, health, marketing, etc.
[0045] The CARS (Childhood Autism Rating Scale) score method works by rating a child’s behavior, characteristics and abilities. Also, ADOS-2 (Autism Diagnostic Observation Schedule) is an instrument used to evaluate communication and socialization characteristics of autistic children. These are the manual methods, which give attention span, motor skills and haptics.
[0046] The present invention product is an observation assistive tool that helps to maintain the CARS score. These observations are the inputs, and by using Applied Behavior Analysis (ABA), psychologists, occupational therapists and speech-language pathologists validate the CARS score and IQ.
[0047] In the present invention system and method, the input is the virtual observation through a machine instead of manual observation. Here scoring is based on virtual reality scenarios. The present invention has categorized intervention/therapy modules for observing attention span, gross motor skills, haptic feedback and social interactions through sensors such as an eye tracker, fNIR and EEG that are processed on a BCI platform integrated with the Head Mounted Display. The present invention system further has a clinical validation, and based on that one will get outcomes, i.e., a success/failure count. The present invention product is HIPAA compliant.
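For orientation, the arithmetic of a CARS-style score can be sketched in a few lines. This is a minimal sketch in Python assuming the commonly published form of the scale (15 items, each rated from 1 to 4, totals from 15 to 60, with cutoffs near 30 and 37); the patent does not state which cutoffs its clinical validation applies, so the band boundaries below are assumptions about the published instrument, not details of the invention.

    # Minimal sketch of CARS-style totalling and banding.
    # The cutoffs (30.0, 37.0) follow commonly published CARS
    # guidance and are assumptions, not values from this patent.
    def cars_total(item_ratings):
        assert len(item_ratings) == 15                 # CARS rates 15 items
        assert all(1.0 <= r <= 4.0 for r in item_ratings)
        return sum(item_ratings)                       # total range: 15-60

    def cars_band(total):
        if total < 30.0:
            return "non-autistic range"
        if total < 37.0:
            return "mild-to-moderate symptoms"
        return "severe symptoms"

    print(cars_band(cars_total([2.5] * 15)))           # 37.5 -> "severe symptoms"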
[0048] In an example, the ABA system may be referred to as the Auticare system.
[0049] Figure 1 shows an outline of the ABA diagnosis system using XR-AI Platform according to an exemplary implementation of the present invention.
[0050] Figure 1a shows an example ABA diagnosis system using XR-AI Platform according to an exemplary implementation of the present invention.
[0051] The present invention system has created a set of immersive experiences that are aimed at improving the cognitive, social and self-care skills of a patient. The present invention system acquires input by observing the character of the autistic children through VR (immersive technology) and provides output as a success & failure count. These exercises are arranged in such a way that the patients are introduced into these immersive experiences through simple steps, going more specific as the patient shows signs of improvement. The next phase of development will incorporate eye tracking and BCI sensors that will give valuable feedback to doctors for them to analyze the patient’s difficulty in detail. These details will be uploaded to the cloud and can be accessed by doctors throughout the patient’s treatment to note changes in behavior.
[0052] The present invention is not a hospital management system. The present invention system is an ABA treatment system which helps physicians/Special Education Trainers/doctors to do diagnosis and intervention for the patients. The present invention system helps the doctors with precise and live feedback on the diagnosis and intervention to take knowledgeable decisions on patients. The first version of the system will have the Color Learning Scenarios for cognitive development. The present invention system does the diagnosis and intervention for the patients by engaging the patients in playing with colors. This scenario checks the gross motor skills of the patient, who touches the colors with their hands, and improves these skills by further intervention. The scenario that helps the patient to hold the brush to paint will check the fine motor skills. Visual ability, i.e. the ability to focus the eyes on visual information, is fundamental for using the present invention system. Healthy eyes and good vision play an important role even before children learn to reach and grab with their hands, which are unique offerings. This exercise will help the physician to understand the visual and hearing disabilities and to take a decision on whether to go with the next set of scenarios. Without passing this scenario, one cannot do the rest of the scenarios. This is an XR-based system with high quality visual and audio effects and an optimized color experience for the patients. The present invention system has naming, identifying and matching of color scenarios. Incorrectly identifying all colors all the time can be overcome by learning; it is a developmental issue. Incorrectly identifying red as yellow is color blindness.
[0053] The present invention system has an interactive character (i.e. avatar) which assists the patients with instructions. This helps the doctors to understand if the patient can take the instruction, process it and execute it, which also helps the doctors to understand hearing disabilities (hearing impairment).
[0054] Other cognitive development scenarios include Ball Picking (Count), Basket Interaction (Count), Fruit Grouping (Count), Object Identification (Count), Painting.
[0055] Social scenarios include Road Crossing Teaching, Road Crossing, Super Market, etc.
[0056] Self-care scenarios include European Restroom, Indian Restroom, etc.
[0057] Initially, the content/scenario data is integrated through the cloud. Then the live diagnosis and intervention data is captured, combined and reported through the cloud. Through this data capture, every action of hundreds of patients is obtained with high accuracy, and the reports of these actions are presented in a high-quality visual user interface. This helps the doctors to see the data with high accuracy to take wise decisions. From these data, an accurate result of the patient’s activities is obtained, reducing the errors by 100%, and comparison reports of each visit and session are obtained. These can be used to find out the progression of each patient.
[0058] The present invention relates to a system and method of diagnosing or predicting the levels of autism spectrum disorders (ASD) in individuals and/or related emotional and physiological disorders using assistive intervention/therapy & scoring on the XR-AI Platform. In the present invention system, sensors such as the eye tracker, BCI, EEG and fNIR are integrated with the Head Mounted Display (HMD) / Virtual Reality Display (VRD) unit, which includes the wireless edge computing device. The captured data from the edge computing device is transferred to the back end, i.e. the Unity platform. The obtained data is validated and labeled by doctors. The labeled data comes back to the Unity platform. The process goes on continuously, integrating with the AI system for prediction (ASD behavior logging and therapy prediction AI dashboard) (as shown in Figure 1(a)). The Virtual Reality Display (VRD) may be a Head Mounted Display (HMD) or any other three dimensional display device.
[0059] In one embodiment, the present invention relates to a system for predicting a level of autism spectrum disorders (ASD) using an extended reality platform, the system comprising: a Virtual Reality Display (VRD) / Head Mounted Display (HMD) (110) configured to display one or more interactive animations to a user; a pair of eye tracking devices (112) (not shown) and a combination of one or more BCI, EEG and fNIR (Functional near infrared spectroscopy) sensors (114) (not shown) configured to capture the user’s eye gazing coordinates displayed on a screen (111) to predict the attention span during one or more specific interactive animation incidents occurring within said interactive animations, and configured to capture the user’s cognitive functions, behavioral functions and linguistic functions during said specific interactive animation incidents occurring within said interactive animations; and one or more processing units (116) (not shown) of the VRD/HMD configured to compute an Autism Rating score of the user based on the captured eye gazing coordinates, cognitive functions, behavioral functions and linguistic functions, and further configured to validate the computed Autism Rating score of the user with a pre-stored Autism Rating score and predict the level of autism spectrum disorders (ASD) of the user based on the validated Autism Rating score. The Virtual Reality Display (VRD) / Head Mounted Display (HMD) is integrated with the combination of one or more sensors and the processing unit.
[0060] The cognitive functions include attention, memory, reasoning, problem solving, decision making, executive function, perception and emotion regulation; the behavioral functions include gross motor skill, sensory stimulation, escape, access to attention and access to tangibles; and the linguistic functions include speech, language, initiating social interaction, expressing personal feelings, describing aspects of the world, requesting information and pretending. Further, the sensors also measure gross motor skill, functional response, anxiety and relaxation, emotions, haptic feedback (tactile, proprioception), reaction time, hemodynamics, audio response and the success score of each sub-task in the virtual simulated environments during specific animation incidents of the user.
[0061] The processing unit of the VRD/HMD is further configured to monitor and analyze patterns of behavior of the user based on the eye gazing coordinates, categorize, based on a reference point, the eyes of the user as normal eyes or abnormal eyes, and calculate the attention span of the user during said specific interactive animation incidents occurring within said interactive animations. It is further configured to monitor and analyze patterns of behavior of the user based on a reference point for each of the captured cognitive functions, behavioral functions and linguistic functions, and to calculate the attention span, attention state, engagement level, mental functioning, emotions, haptic feedback, reaction time and success score of each subtask of the user during specific interactive animation incidents occurring within the interactive animation.
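As an illustration of the attention span calculation described above, the following minimal sketch estimates time-on-target from timestamped gaze samples. The sample format, the 50 Hz sampling interval and the rectangular region of interest are hypothetical; the patent does not specify how the processing unit aggregates the gaze samples.

    # Minimal sketch of an attention-span estimate from gaze samples.
    # `samples` is assumed to be (timestamp_s, x, y) tuples from the eye
    # tracker; `roi` is the on-screen region of the animation incident.
    def attention_span(samples, roi, dt=0.02):         # dt: assumed 50 Hz spacing
        x0, y0, x1, y1 = roi
        on_target = sum(1 for _, x, y in samples
                        if x0 <= x <= x1 and y0 <= y <= y1)
        return on_target * dt                          # seconds gazing at the incident

    samples = [(i * 0.02, 0.5, 0.5) for i in range(250)]       # 5 s of centred gaze
    print(attention_span(samples, roi=(0.4, 0.4, 0.6, 0.6)))   # -> 5.0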
[0062] The processing unit of the VRD is configured to compute the Autism Rating score of the user by: computing a difference between the captured user’s eye gazing coordinates and pre-stored eye gazing coordinates, computing the Autism Rating score if the computed difference is greater than a threshold level, and validating the computed Autism Rating score of the user based on the computed difference value.
[0063] The eye gazing coordinate parameters include the inter-pupillary distance, crossed eye/lazy eye, visual axis of the eye, eye focus, etc.
[0064] The computation of the Autism Rating score of the user by the processing unit of the VRD further includes: displaying and initiating the user to identify and perform an activity during said specific interactive animation incidents occurring within said interactive animations; capturing, by the BCI (Brain Computer Interface), EEG (Electroencephalogram) and fNIR (Functional near-infrared spectroscopy) sensors of the VRD/HMD, the user’s cognitive function scores, behavioral function scores and linguistic function scores while the user is identifying and performing the activity; computing a difference between the captured cognitive function scores, behavioral function scores and linguistic function scores and pre-stored cognitive function scores, behavioral function scores and linguistic function scores while the user is identifying and performing the activity; computing the Autism Rating score if the computed difference is greater than a threshold level; and validating the computed Autism Rating score of the user based on the computed difference value.
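A minimal sketch of the difference-and-threshold logic of paragraphs [0062] to [0064] follows. The feature names, the pre-stored reference values and the 0.15 threshold are hypothetical placeholders; the patent states only that a score is computed when the deviation of the captured values from the pre-stored values exceeds a threshold level and is then validated against the computed difference.

    # Minimal sketch of the deviation-vs-threshold computation.
    # Reference values and the threshold are hypothetical placeholders.
    PRESTORED = {"gaze_x": 0.50, "gaze_y": 0.50,
                 "cognitive": 0.80, "behavioral": 0.75, "linguistic": 0.70}

    def autism_rating(captured, threshold=0.15):
        # Deviation of each captured value from its pre-stored reference.
        diffs = {k: abs(captured[k] - PRESTORED[k]) for k in PRESTORED}
        mean_diff = sum(diffs.values()) / len(diffs)
        if mean_diff <= threshold:
            return None                          # within normal variation
        return round(mean_diff * 100, 1)         # larger deviation -> higher score

    print(autism_rating({"gaze_x": 0.20, "gaze_y": 0.90, "cognitive": 0.40,
                         "behavioral": 0.50, "linguistic": 0.30}))   # -> 35.0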
[0065] The computation of the Autism Rating score of the user by the processing unit of the VRD further comprises a few steps: capturing a total number of attempts performed by the user in identifying and performing the activity during said specific interactive animation incidents occurring within said interactive animations; capturing an autism rating score based on a specific threshold if the user correctly identifies and performs the activity; capturing an autism rating score based on a specific threshold if the user fails to identify and perform the activity; and validating the Autism Rating score of the user based on the computed difference values, the captured total number of attempts and the autism rating score based on a specific threshold.
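The attempt-based steps of paragraph [0065] could then corroborate that deviation score against task performance, as in the sketch below; the 60% pass rate is an assumed placeholder, not a value given in the patent.

    # Minimal sketch of attempt-based validation of a deviation rating.
    def validate_rating(diff_rating, attempts, successes, pass_rate=0.6):
        success_rate = successes / attempts if attempts else 0.0
        # A low success rate corroborates a high deviation rating.
        corroborated = diff_rating is not None and success_rate < pass_rate
        return {"rating": diff_rating,
                "success_rate": round(success_rate, 2),
                "validated": corroborated}

    print(validate_rating(diff_rating=35.0, attempts=5, successes=1))
    # -> {'rating': 35.0, 'success_rate': 0.2, 'validated': True}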
[0066] The system further comprises a voice communication device (118) (not shown), via an interactive character i.e. Avatar, which provides voice communication to the user to identify and perform the activity in specific animation incidents occurring within the interactive animation. The computed score is validated with a standardized measure, i.e. CFCS (Communication Function Classification System).
[0067] Figure 2 shows a hardware perspective of the system according to an exemplary implementation of the present invention.
[0068] The figure shows an example hardware perspective of the system. The present invention provides diagnosis and intervention for ASD children and adults with a pair of eye tracking devices (112) (not shown) and a combination of BCI, EEG and fNIR (Functional near infrared spectroscopy) sensors (114) fitted in an HMD, through high quality 3D graphical contents.
Diagnosis:
1. Develop an ASD specific diagnostic solution for ASD children and adults with eye tracking, sensors such as a Brain-computer interface (BCI) (such as EEG) and fNIR, and HMD integrated hardware and software.
2. Develop an ASD specific diagnostic solution for ASD children and adults through virtual reality contents.
Intervention:
Develop a therapeutic solution for ASD children and adults by creating ASD specific high-quality graphical contents in Virtual Reality.
[0069] The present invention system and method is an Affordable Autism Intervention VR Kit for High Functional ASD (Autism), which subjects the child to experience different virtual environments/scenarios. Such an immersive and interactive medium helps them in differentiating and adapting to distinct situations. The contents in the scenario give an immersive look around realistic images, giving a trigger to the brain and making it believe it is in a real world. The developed contents are based on high intensity and task-specific practices at the comfort of their homes, giving them more exposure to the outside world. By tracking the attention span of the child through pupil detection, intervention analysis can be done, and these data logs can be stored for further training in the future. Different VR environments thus enrich the child with awareness of the freedom available in their movements, ensuring their engagement in a social and meaningful way.
[0070] In the present invention system, the pair of eye tracking devices (112) (not shown) and the combination of BCI, EEG and fNIR (Functional near infrared spectroscopy) sensors (114) (not shown) are configured to capture the user’s eye gazing coordinates displayed on a screen (111) to predict the attention span during one or more specific interactive animation incidents occurring within said interactive animations, and are configured to capture the user’s cognitive functions, behavioral functions and linguistic functions during said specific interactive animation incidents occurring within said interactive animations. The processing unit (116) (not shown) of the VRD/HMD is configured to compute an Autism Rating score of the user based on the captured eye gazing coordinates, cognitive functions, behavioral functions and linguistic functions, and is further configured to validate the computed Autism Rating score of the user with a pre-stored Autism Rating score and predict the level of autism spectrum disorders (ASD) of the user based on the validated Autism Rating score. The Virtual Reality Display (VRD) / Head Mounted Display (HMD) is integrated with the combination of sensors and the processing unit. The processing unit of the VRD/HMD is further configured to monitor and analyze patterns of behavior of the user based on the eye gazing coordinates, categorize, based on a reference point, the eyes of the user as normal eyes or abnormal eyes, and calculate the attention span of the user during said specific interactive animation incidents occurring within said interactive animations; and it is further configured to monitor and analyze patterns of behavior of the user based on a reference point for each of the captured cognitive functions, behavioral functions and linguistic functions, and to calculate the attention span, attention state, engagement level, mental functioning, emotions, haptic feedback, reaction time and success score of each subtask of the user during specific interactive animation incidents occurring within the interactive animation.
[0071] In one embodiment, the processing unit of the VRD computes eye gazing coordinate parameter scores based on a specific threshold; cognitive function, behavioral function and linguistic function parameter scores based on a specific threshold; performance scores of the user/patient/child based on a specific threshold; the total number of attempts; and success and failure scores based on a specific threshold, and processes these to predict the level of autism spectrum disorders (ASD) of the user based on the validated Autism Rating score.
[0072] In one embodiment, the Autism Rating score may be based on the CARS 1 (Childhood Autism Rating Scale), CARS 2, CARS 3 or ISAA (Indian Scale for Assessment of Autism) score. Further, scales like the COPM (Canadian Occupational Performance Measure) may be used.
[0073] A Virtual Reality Head Mounted Display (HMD) is provided to experience the 3-dimensional contents of different scenarios, in which a pupil detector is equipped which helps in identifying the attention span of the child. This tracks the attentiveness of the child throughout the playing scenario and lets the doctor know if the child is interested in the current scenario. This pupil detector also generates log data of the attention span of the child, which is further used by the doctors for training in the future. A Vive tracker and leap motion are used to make the scenarios more interactive by capturing the hand motions in real time. With the help of BCI, doctors will get real-time live feed data about the emotional responses happening in the brain.
[0074] The present invention system uses a game engine platform which is used for the development of 2D and 3D contents. The interactive contents in virtual environments/scenarios are created according to the requirements in the Unity Engine.
[0075] Figure 3 shows an example working of the system according to an exemplary implementation of the present invention.
[0076] The figure shows an example working of the system. In an example, the figure shows capturing live data of the diagnosis and intervention:
Once the doctor/Special Education Trainer logs in to the Diagnosis/Therapy (Intervention) System (DTS) for the treatment, the data is captured live to the scenario database. For the color identification and learning scenario:
Diagnosis:
1) If the patient/user/child is asked (voice over by the interactive character i.e. Avatar) to touch the Red color 5 times, the database should get updated with 5 attempts.
2) If the patient touches the Red color once, the database should get updated with “success” 1 time.
3) Further, the database should be updated with “fail” 4 times.
4) When the physician completes the diagnosis and hits finish or complete or closes the session, the database should get updated with the diagnosis details.
5) Now the doctor investigates the Reporting System on the diagnosis results.
6) He will be able to see the results in a graphical and textual format.
Intervention:
1) The patient is asked (voice over by the interactive character i.e. Avatar) to touch a color, and based on the color chosen, the details should get updated to the system. If the patient chooses the Yellow color 5 times and the Red color 4 times, the database should get updated with 5 attempts of Yellow color learning and 4 attempts of Red color learning. Whenever a color is touched, the Avatar should voice over the color touched.
2) When the physician completes the intervention and hits finish or complete or closes the session, the database should get updated with the intervention details.
3) Now the doctor will investigate the reporting system on the intervention results.
4) He will be able to see the results in a graphical and textual format.
5) Then the doctor can go for another diagnosis to check the improvement.
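A minimal sketch of the per-color session record implied by the diagnosis and intervention flows above is given below. The schema and function names are hypothetical; a real implementation would persist these counters to the scenario database in the cloud when the physician closes the session.

    # Minimal sketch of a per-color session record (hypothetical schema).
    from collections import defaultdict

    def new_session():
        return {"attempts": defaultdict(int),
                "success": defaultdict(int),
                "fail": defaultdict(int)}

    def record_touch(session, asked_color, touched_color):
        session["attempts"][asked_color] += 1
        outcome = "success" if touched_color == asked_color else "fail"
        session[outcome][asked_color] += 1

    s = new_session()
    for touched in ["red", "yellow", "yellow", "blue", "red"]:
        record_touch(s, asked_color="red", touched_color=touched)
    print(dict(s["attempts"]), dict(s["success"]), dict(s["fail"]))
    # -> {'red': 5} {'red': 2} {'red': 3}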
[0077] Figure 4 shows example results of color identification and learning according to an exemplary implementation of the present invention. For color identification and learning, the results should be specific to the attempts based on the colors; the graph and text should specify the colors.
[0078] Figure 5 shows an example login process of XR-CARE system requirement flow according to an exemplary implementation of the present invention.
[0079] The figure shows an example login process of the XR-CARE system requirement flow, which has various steps.
1) The physician/Special Education Trainer has to log in to both the Auticare Reporting System (Front-end/Dashboard) and the Auticare Diagnosis/Intervention System (UNITY).
2) Both systems should have one common Authentication Database to register the Special Education Trainers/doctors with their login and password credentials and other related information about the Special Education Trainer, hospital address, etc.
3) If there are individual databases for each system to store the physicians’/Special Education Trainers’ credentials, there should be a mechanism to synchronize these database details.
4) Before doing any diagnosis or intervention, or before getting into the Auticare Diagnosis/Intervention System (UNITY), the physician/Special Education Trainer has to log in to the Auticare Reporting System (Front-end/Dashboard).
5) For new registration - all the relevant details related to the new patient need to be updated in the patient database through the Auticare Reporting system.
6) Once the details are available in the patient database, the Auticare Diagnosis/Intervention System will display the details based on categories, and VR intervention will be assigned to identified patients (using retina scan and two-step verification login).
[0080] Figure 6 shows an example choosing diagnosis/intervention option of the system according to an exemplary implementation of the present invention.
[0081] The figure shows an example choosing diagnosis/intervention option of the system, which has various steps.
1) After the physician logs in to the Auticare Reporting System, he enters the details of the patient, based on the new or existing patient details, and chooses the option of diagnosis or intervention. a. If the physician/Special Education Trainer chooses diagnosis, and if it is a new patient, all the details pertaining to the patient, Special Education Trainer and hospital need to be updated into the patient database. b. If the physician/Special Education Trainer chooses intervention, and if it is an existing patient, all the details pertaining to the patient, Special Education Trainer and hospital need to be updated into the patient database.
2) Once the physician/Special Education Trainer chooses either one of these options (Diagnosis or Intervention), the patient is ready for the Diagnosis or Intervention.
3) Now the physician/Special Education Trainer has to log in to the Auticare diagnosis/intervention system to proceed with the option that is chosen. a. The Auticare diagnosis/intervention system will have the list of patients registered for diagnosis/intervention in queue for the day. These patients should be listed or can be searched from the patient database.
[0082] Figure 7 shows an example diagnosis process according to an exemplary implementation of the present invention. The figure shows an example diagnosis process which has various steps.
1. Ask the patient to identify the colors. a) Ask the patient - voice over - “Please touch the Red color” - in English/Malayalam/any other language. b) If the patient touches it - successful - update the database with the successful count. c) If the patient did not touch the correct color - unsuccessful - update the database with the unsuccessful count. d) Repeat this process with various colors for a number of counts as per the physician’s discretion. e) Complete the diagnosis and hit the Done/Submission button.
2. Appreciation voice over / animations for each successful intervention (Reward System). a. In an example, stars bursting, gold coins, clap sound. b. In an example, voice over “Congratulations!”
3. Every voice over should have an English and Malayalam / any other language version.
4. Session count should auto-generate.
5. Start date and time of each session.
[0083] Figure 8 shows an example intervention process according to an exemplary implementation of the present invention.
[0084] The figure shows an example intervention process which has various steps.
As per the doctor’s discretion, intervention sessions are conducted. In an example embodiment:
“Let us see if you can identify these colors! Come on, let’s have some fun with colors! Are you ready? Here we go!”
I. Ask the patient - “Please choose a color!”. On choosing a color - if the patient has chosen the Red color - “It’s a Red color” - repeat as many times as desired.
II. Ask the patient - “Please choose a color!”. On choosing a color - if the patient has chosen the Yellow color - “It’s a Yellow color”.
III. Repeat I and II and go for Diagnosis to check the learning.
[0085] Figure 9 shows an example graph of color identification score card according to an exemplary implementation of the present invention.
[0086] The figure shows an example graph of color identification score card.
A set of various (e.g. 13) scenarios has been developed for the patients, with focus given to social, self-care and cognitive intervention.
[0087] Figure 10 shows an ABA system (Auticare Diagnosis/Intervention System) using XR-AI Platform according to an exemplary implementation of the present invention.
[0088] The figure shows an ABA system which has various steps.
1. All scenarios to have AI audio that gives instructions at the start and invites users to engage in the intervention scenario.
2. Video Based Login Page for Special Education Trainer & patients
3. New build to be created with room setup as per client side and avatar animation repositioned to be visible to users.
4. Data from all scenarios are taken after each session, so the database needs to be arranged accordingly.
5. Screenshot option in 3D view.
6. Ball throwing target to light up in green when the ball hits.
7. Audio of repeated instructions to be routed from a set of 5 choices (both English and Malayalam/any other language) looped at random.
8. Tone of voice to be pleasant and encouraging
9. Audio Language choice (English or Malayalam or any other language) to be added at the Exercise Choosing Page
10. Special Education Trainer login from different locations may not be permitted. Data of one institution is accessible only within a closed loop, and at maximum permitted to institutions within that network.
11. Searching for old records of a student to be based on name.
12. When a session ends, the page will have a congratulations message and a few interactive balloons for the patient to play with.
13. Animation to be developed based on Audio and Rigged for hand and Facial interaction.
14. Body animation rigger to be identified and procured.
15. Timeline based animation needs to be created for better interaction.
16. Screenshot option to be available for all the exercises with date and time stamp. Document saved in a folder with the patient’s name.
17. Every interaction to have game-based start (3... 2....1... Start)
18. Road Crossing Score pattern (see the sketch after this list): a. Reaction time (first step) to signal change (Red to Green) is recorded b. Time taken to reach from one side to the other side is noted c. Time elapsed standing in the middle of the road is noted d. Percentage of distance covered is noted (how far the patient went)
19. Road Crossing with one car non-stopping: a. Reaction to the car’s approach (did the car hit or miss) b. Time taken for the patient to take evasive action c. Time taken to reach from one side to the other side is noted d. Percentage of distance covered is noted (how far the patient went)
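The item 18 score pattern reduces to simple timestamp arithmetic. A minimal sketch, with hypothetical field names, timestamps in seconds from scenario start and distances in meters:

    # Minimal sketch of the road-crossing metrics of item 18.
    def road_crossing_scores(signal_change_t, first_step_t, reach_other_side_t,
                             time_in_middle_s, distance_covered_m, road_width_m):
        return {
            "reaction_time_s": first_step_t - signal_change_t,           # a
            "crossing_time_s": reach_other_side_t - first_step_t,        # b
            "time_in_middle_s": time_in_middle_s,                        # c
            "distance_pct": 100.0 * distance_covered_m / road_width_m,   # d
        }

    print(road_crossing_scores(signal_change_t=2.0, first_step_t=3.5,
                               reach_other_side_t=12.5, time_in_middle_s=4.0,
                               distance_covered_m=6.0, road_width_m=8.0))
    # -> reaction 1.5 s, crossing 9.0 s, middle 4.0 s, 75.0% covered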
[0089] The patient is instructed by the doctor with the assistance of a portable interface such as a tablet, and the patient is asked to wear the head mounted display. If a patient is not willing to put it on at first, a projected environment will be displayed on a screen in front of the patient and the experience is given. The patient is logged in through retina scan, and the data generated through the VR experience is sent to the cloud network. The doctor will be able to monitor a patient based on live data output from the eye tracker and the BCI (Brain Computer Interface). For the first time, doctors will be able to monitor a patient through these feedbacks, which will give them a clear picture about the patient. These feedbacks will be recorded every time a patient is logged in, so continuous monitoring will be possible. The doctors thus can understand the disability in depth and will be able to prescribe treatment methods for a patient at the individual level. This will help these autism patients to improve their communicative and interactive skills to a great level.
[0090] Figure 11 shows an example step by step process of the system according to an exemplary implementation of the present invention.
[0091] The figure shows an example step by step process of the system.
Step 1: Profile Management
• Registration & Sign in
• With Biometric iris recognition technology
• Using Eye Tracking - HMD
Step 2: First Level of Diagnosis with Eye Tracker - Eye Alignment and Pupil Detection
• Check if the eyes are centered within the camera frame
• Check if the camera is able to detect the pupil
Step 3: Second Level of Diagnosis with Eye Tracker - Datasets Comparison (a sketch of one such comparison follows this step’s checks)
• Check Gaze origin
• Check Gaze direction
• Analyze the Heatmap
• Check the pupil position
• Check the pupil size
• Fear Factor
• Lights
• Sounds
• Emotions
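One way the Step 3 datasets comparison could work is by reducing each session to a gaze heatmap and measuring its distance from a reference heatmap built from comparable non-ASD sessions, as sketched below. The grid resolution and the mean absolute difference metric are assumptions; the patent does not specify how the compared datasets are scored.

    # Minimal sketch of a gaze heatmap comparison (hypothetical metric).
    def heatmap(points, grid=8):
        h = [[0] * grid for _ in range(grid)]
        for x, y in points:                      # x, y normalised to [0, 1)
            h[int(y * grid)][int(x * grid)] += 1
        total = max(1, len(points))
        return [[c / total for c in row] for row in h]

    def heatmap_distance(h1, h2):
        cells = len(h1) * len(h1[0])
        return sum(abs(a - b) for r1, r2 in zip(h1, h2)
                   for a, b in zip(r1, r2)) / cells

    patient = heatmap([(0.1, 0.1)] * 10)         # gaze stuck in one corner
    reference = heatmap([(0.5, 0.5)] * 10)       # reference fixates the centre
    print(round(heatmap_distance(patient, reference), 4))   # -> 0.0312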
Step 4: Intervention - High Quality Graphical Content Based (Unity).
[0092] Figure 12 shows a method for predicting a level of autism spectrum disorders (ASD) using extended reality platform according to an exemplary implementation of the present invention.
[0093] The figure shows a method for predicting a level of autism spectrum disorders (ASD) using an extended reality platform according to an exemplary implementation of the present invention. In one embodiment, the method for predicting a level of autism spectrum disorders (ASD) using an extended reality platform comprises: displaying, by a Virtual Reality Display (VRD) / Head Mounted Display (HMD), one or more interactive animations to a user (1210); capturing, by a pair of eye tracking devices and a combination of one or more BCI (Brain Computer Interface), EEG (Electroencephalogram) and fNIR (Functional near infrared spectroscopy) sensors of the VRD/HMD, the user’s eye gazing coordinates displayed on a screen (111) to predict the attention span during one or more specific interactive animation incidents occurring within said interactive animations, and capturing, by the BCI, EEG and fNIR sensors, the user’s cognitive functions, behavioral functions and linguistic functions during said specific interactive animation incidents occurring within said interactive animations (1220); computing, by a processing unit of the VRD/HMD, an Autism Rating score of the user based on the captured eye gazing coordinates, cognitive functions, behavioral functions and linguistic functions (1230); and validating, by the processing unit of the VRD/HMD, the computed Autism Rating score of the user with a pre-stored Autism Rating score and predicting the level of autism spectrum disorders (ASD) of the user based on the validated Autism Rating score (1240).
[0094] The cognitive functions include gross motor skill, attention, memory, reasoning, problem solving, decision making, executive function, perception and emotion regulation; the behavioral functions include sensory stimulation, escape, access to attention and access to tangibles; and the linguistic functions include speech, language, initiating social interaction, expressing personal feelings, describing aspects of the world, requesting information and pretending. Further, the sensors also measure gross motor skill, functional response, anxiety and relaxation, emotions, haptic feedback (tactile, proprioception), reaction time, hemodynamics, audio response and the success score of each sub-task in the virtual simulated environments during specific animation incidents of the user.
[0095] The method further comprises monitoring and analyzing patterns of behavior of the user based on the eye gazing coordinates, categorizing, based on a reference point, the eyes of the user as normal eyes or abnormal eyes, and calculating the attention span of the user during said specific interactive animation incidents occurring within said interactive animations.
[0096] The method further comprises monitoring and analyzing patterns of behavior of the user based on a reference point for each of the captured cognitive functions, behavioral functions and linguistic functions, and calculating the attention span, attention state, engagement level, mental functioning, emotions, haptic feedback, reaction time and success score of each subtask of the user during specific interactive animation incidents occurring within the interactive animation.
[0097] The method of computing the Autism Rating score of the user comprises computing a difference between the captured user’s eye gazing coordinates and pre-stored eye gazing coordinates, computing the Autism Rating score if the computed difference is greater than a threshold level, and validating the computed Autism Rating score of the user based on the computed difference value. The eye gazing coordinate parameters include the inter-pupillary distance, crossed eye/lazy eye, visual axis of the eye, eye focus, etc.
[0098] The method of computing the Autism Rating score of the user further includes: displaying and initiating the user to identify and perform an activity during said specific interactive animation incidents occurring within said interactive animations; capturing, by the BCI, EEG and fNIR sensors of the VRD/HMD, the user’s cognitive function scores, behavioral function scores and linguistic function scores while the user is identifying and performing the activity; computing a difference between the captured cognitive function scores, behavioral function scores and linguistic function scores and pre-stored cognitive function scores, behavioral function scores and linguistic function scores while the user is identifying and performing the activity; computing the Autism Rating score if the computed difference is greater than a threshold level; and validating the computed Autism Rating score of the user based on the computed difference value.
[0099] The method of computing the Autism Rating score of the user further comprises: capturing a total number of attempts performed by the user in identifying and performing the activity during said specific interactive animation incidents occurring within said interactive animations; capturing an autism rating score based on a specific threshold if the user correctly identifies and performs the activity; capturing an autism rating score based on a specific threshold if the user fails to identify and perform the activity; and validating the Autism Rating score of the user based on the computed difference values, the captured total number of attempts and the autism rating score based on a specific threshold.
[00100] The method further comprises providing voice communication to the user to identify and perform the activity in specific animation incidents occurring within the interactive animation.
[00101] Figure 13 shows an example level-1 diagnosis - Inter-Pupillary Distance and Eye Alignments according to an exemplary implementation of the present invention.
[00102] The figure shows an example level-1 diagnosis - Inter-Pupillary Distance and Eye Alignments - which has various steps.
Step 1: Measure the distance between the eyes with the eye tracker integrated with the Head Mounted Display (HMD).
Step 2: Check if the eyes are crossed eye/lazy eye, OR if the visual axis of one eye is higher than that of the other eye, OR if the eyes are outward turning. The database looks for a match and allows the sign-in.
Step 3: Check if the eye tracker can adjust and scan these pupil positions.
Step 4: Check if the eyes are centered within the frame by comparing with a reference point.
Step 5: Check if the eyes are out of focus by comparing with a reference point.
Step 6: Check if the eyes can verge and accommodate to a focal point by comparing with a reference point.
Step 7: If the above diagnosis is successful, then the next step is Diagnosis Level-2.
Step 8: If the above diagnosis fails, then the VR intervention cannot be done.
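A minimal sketch of the level-1 screening gate formed by Steps 1 to 8 is shown below. The acceptable inter-pupillary distance band and the centering tolerance are hypothetical placeholders; the patent states only that reference-point comparisons decide whether the VR intervention can proceed.

    # Minimal sketch of the level-1 pass/fail gate (hypothetical limits).
    def level1_pass(ipd_mm, left_center, right_center,
                    frame_center=(0.5, 0.5), tol=0.1):
        ipd_ok = 50.0 <= ipd_mm <= 75.0          # assumed acceptable IPD band
        def centred(c):
            return (abs(c[0] - frame_center[0]) <= tol and
                    abs(c[1] - frame_center[1]) <= tol)
        aligned = centred(left_center) and centred(right_center)
        return ipd_ok and aligned                # True -> proceed to level 2

    print(level1_pass(63.0, (0.48, 0.52), (0.55, 0.50)))     # -> True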
[00103] Figure 14 shows an example level-2 diagnosis - ASD & Non ASD Comparison according to an exemplary implementation of the present invention.
[00104] Each session will be conducted for a non-ASD group of a similar age as well as the ASD group, and the test data is taken.
[00105] Figure 15 shows an example level-3 ASD Content specific Diagnosis according to an exemplary implementation of the present invention.
[00106] The figure shows an example level-3 ASD Content specific Diagnosis of the system, which involves various steps.
Step 1: Create high quality contents for diagnosis in UNITY.
Scenarios like
• Checking for fear - fire, heights, hot, cold, thunder, lightning, etc., based on threshold level.
Sounds
• Sound of thunder and lightning
• Sound of vehicles
• Sound of rain, wind
Lights
• Various colors
• Various intensities
Events
• Reaction to various events
Emotions with interactive characters (Avatars)
• Reaction to emotions - happy, sad, etc.
Step 2: The Eye tracking team and the AR/VR Team come together for testing the content through the Eye Tracking integrated HMD to record the feedback for the Diagnostic Contents into a dataset database cloud.
Step 3: Create the eye tracking records for the diagnostic contents and create the relations.
[00107] Advantages of the present invention:
The present invention method and system provides outstanding visualization and immersive sessions that are not possible in traditional forms of ABA intervention.
The present invention method and system provides an engaging and entertaining form of treatment intervention.
Increases the client’s engagement.
Eliminates the language barrier.
Improves the quality of teaching.
Helps to modify the child’s social and emotional behavior.
Improves efficiency in areas like social, cognitive and self-care skills.
The data is collected in the dashboard, which can be used by the Special Education Trainer as and when required, and which is in turn used to predict the next intervention needed.
The present invention provides BCI diagnosis and AI based Behavior Performance data on the cloud.
[00108] Example embodiments of skill development scenarios
1) BALL PICKING AND TARGET PRACTICE SCENARIO
Procedures involved in Diagnosis:
1) The patient is asked (voice over by the interactive character i.e. avatar) to pick up each ball with their hands from the table and throw them towards the target on the wall in front of them; the database will get updated based on the attempts taken by the patient to pick the ball. 2) If the patient hits the target on the wall in front of them once, the database will get updated with success 1 time. 3) The database will also get updated with failed attempts. 4) When the physician completes the diagnosis and hits finish or complete or closes the session, the database should get updated with the diagnosis details. 5) Now the doctor will investigate the reporting system on the diagnosis results. 6) He/she will be able to see the results in graphical and textual format.
Intervention:
Autistic children often find it difficult, and lack interest, to engage in physical exercises and sensory and social interaction. This scenario aims to improve the patient’s physical activity, sensory stimulation, visual attention, fine motor control and balance, involving the grasping, palmar grasp, attention, concentration and manipulative skills of the patient in a more fun and engaging way. At the end of the therapeutic session the doctors can investigate the reporting system on the intervention results. The doctor can either go for the next diagnosis or keep doing the exercise to improve the patient’s learning.
2) BALL IN THE BASKET GAME
Procedures involved in Diagnosis:
1) The patient is asked (voice over by the interactive character i.e. avatar) to pick up each ball with their hands from the table and put them in a basket placed next to the table; the database will get updated based on the attempts taken by the patient to drop the ball in the basket. 2) The patient gets points when the ball is correctly dropped into the basket. The database will get updated based on the attempts taken by the patient to grab the ball, the successful attempts and also the failed attempts. 3) When the physician completes the diagnosis and hits finish or complete or closes the session, the database should get updated with the diagnosis details. 4) Now the doctor will investigate the reporting system on the diagnosis results. 5) He/she will be able to see the results in graphical and textual format.
Intervention:
Autistic children are often found to lack interest in engaging in physical exercises and sensory and social interaction. This scenario aims to improve the patient’s motor activity, visual attention, fine motor control and balance, involving the grasping, palmar grasp and attention skills of the patient in a more fun and engaging way. At the end of the therapeutic session the doctors can investigate the reporting system on the intervention results. The doctor can either go for the next diagnosis or keep doing the exercise to improve the patient’s skill learning.
3) COLOR MEMORIZING SCENARIO
Procedures involved in Diagnosis:
1) The patient is encouraged to touch toys placed on the shelf; as soon as the patient touches a toy, the colour of the toy will be announced by the avatar. 2) The patient is frequently reminded by the avatar of the colour of each object every time he/she touches any object. The database will get updated based on the number of times the patient touches the object. 3) When the physician completes the session and hits finish or complete or closes the session, the database should get updated with the necessary details. 4) Now the doctor will investigate the reporting system on the diagnosis results. 5) He/she will be able to see the results in graphical and textual format.
Intervention:
Autistic children are often found to have difficulty in naming colours and understanding different colours. Most often they fail when they are requested to name the colours. This scenario aims to improve the child's ability to follow instructions, visual attention and memory, and also helps the patient learn about colours, which would help them to discriminate between colours in a more engaging and entertaining manner. At the end of the therapeutic session the doctors can investigate the reporting system on the results. The doctor can either go for the next diagnosis or keep doing the exercise to improve the patient’s learning.
4) COLOR MATCHING SCENARIO
Procedures involved in Diagnosis:
1) The patient/user/child is shown different coloured objects and is encouraged to group same coloured objects. 2) The patient gets points when same coloured toys are picked up and grouped together, which will be announced by the avatar, i.e. the interactive character. 3) The database will get updated based on the successful attempts made by the patient to pick up the right same coloured object and place it on the shelf. 4) When the physician completes the diagnosis and hits finish or complete or closes the session, the database should get updated with the necessary details. 5) Now the doctor will investigate the reporting system on the results. 6) He/she will be able to see the results in graphical and textual format.
Intervention:
This scenario aims to improve the child's ability to follow instructions, visual attention and memory, and also helps the patient learn and get a much clearer understanding about colours, which would help them to familiarize themselves with and discriminate between colours. This intervention also teaches the patient to be calm and also reveals which colours the patient prefers more compared to other colours. At the end of the therapeutic session the doctors can investigate the reporting system on the results. The doctor can either go for the next diagnosis or keep doing the exercise to improve the patient’s learning.
5) OBJECT NAME LEARNING SCENARIO
Procedures involved in Diagnosis:
1) The patient is encouraged to touch toys placed on the shelf; as soon as the patient touches any toy, the name of the toy will be announced by the avatar. 2) The database will get updated based on the count of the number of toys touched by the patient each time. 3) When the physician completes the session and hits finish or complete or closes the session, the database should get updated with the necessary details. 4) Now the doctor will investigate the reporting system on the diagnosis results. 5) He/she will be able to see the results in graphical and textual format.
Intervention:
Most often autistic children are found to be afraid of exploring a new object or environment. This scenario aims to improve the child's ability to follow instructions, and to improve adaptive skills, play skills, visual attention, manipulation skills, sensory interaction and memory, helping the patient gain a much clearer understanding of different objects. The session also helps them become familiar with, and discriminate between, objects. At the end of the therapeutic session the doctor can review the results in the reporting system, and can either proceed to the next diagnosis or continue the exercise to improve the patient's learning.
6) 3D PAINTING
Procedures involved in Diagnosis:
In this scenario the patient can draw through 3D space. The patient is taught to draw in the 3D space using his/her fingers and can select any colour of his/her choice. Each attempt to draw and each colour selection is updated in the database. This is a learning process; a sketch of how such a free-hand stroke might be recorded appears below.
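As an illustration only, a free-hand 3D stroke could be captured as a list of fingertip samples tagged with the chosen colour. The tracker input is simulated here, and all names (Stroke, add_sample) are assumptions rather than the disclosed implementation.

```python
# Illustrative sketch of recording a free-hand 3D stroke; the fingertip
# samples below are synthetic stand-ins for a hand-tracking feed.
from typing import List, Tuple

Point3D = Tuple[float, float, float]

class Stroke:
    def __init__(self, colour: str):
        self.colour = colour              # colour chosen by the patient
        self.points: List[Point3D] = []   # one fingertip sample per frame

    def add_sample(self, p: Point3D) -> None:
        self.points.append(p)

# Example: a short synthetic stroke in the chosen colour.
stroke = Stroke("blue")
for t in range(5):
    stroke.add_sample((0.1 * t, 0.2 * t, 0.0))
print(len(stroke.points), "samples recorded in", stroke.colour)
```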
Intervention:
This scenario helps reveal the patient's creative and imaginative skills. Autistic children may struggle with fine motor skills, and the simple act of guiding strokes through 3D space can bring a marked improvement. Beyond honing motor skills, drawing allows autistic children to communicate thoughts and feelings they may otherwise struggle to express, and the system and method provide complete freedom of expression. Viewing a child's drawing opens a window into interests, preoccupations and emotions that might otherwise go unnoticed in a child with ASD.
7) COLOR IDENTIFICATION AND LEARNING (VIBGYOR) SCENARIO
Procedures involved in Diagnosis:
1) If the patient is asked (via voice-over by the avatar) to touch the red colour 5 times, the database is updated with 5 attempts. 2) If the patient touches the red colour once, the database is updated with 1 success. 3) The database is then updated with 4 fails. 4) When the physician completes the diagnosis and hits Finish or Complete, or closes the session, the database is updated with the diagnosis details. 5) The doctor then reviews the diagnosis results in the reporting system. 6) He/she is able to see the results in graphical and textual format. A minimal sketch of this bookkeeping appears below.
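The worked example above (5 prompts, 1 success, 4 fails) can be reproduced with a minimal tally function. This is a sketch only; the function and field names are illustrative assumptions, not the patent's schema.

```python
# A minimal sketch of the attempt/success/fail bookkeeping in steps 1)-3).
def run_colour_trials(target: str, touches: list, prompts: int = 5) -> dict:
    # Count how many of the prompted touches matched the target colour.
    successes = sum(1 for colour in touches[:prompts] if colour == target)
    return {
        "target_colour": target,
        "attempts": prompts,
        "success": successes,
        "fail": prompts - successes,
    }

# The worked example from the text: one red touch out of five prompts.
print(run_colour_trials("red", ["red", "blue", "green", "yellow", "blue"]))
# -> {'target_colour': 'red', 'attempts': 5, 'success': 1, 'fail': 4}
```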
Intervention:
1) This is a useful game for eliciting colours and giving the child immediate feedback. First, differently coloured toys are shown to the child, and the avatar asks the child to touch a particular colour. When the child touches the correct colour, it is recorded as a successful attempt, showing that the patient has recognized the colour correctly; this helps the child improve attention, memory, visual and motor skills. When an attempt fails, it means the patient has a problem identifying the colour or an aversion to that colour; the system can thus reveal which colours trigger the patient, and therapies can be given accordingly. Many children are motivated to get the answer right, and the repeated exposure and trials reinforce the colour concepts.
2) For example, if the child already knows pink but struggles with green, pink should still be incorporated; knowing they have got one right also boosts the child's self-esteem.
[00109] Example embodiments of social skill development scenarios
1) ZEBRA CROSS TRAFFIC SIGN LEARNING
Procedures involved in Diagnosis:
The patient is allowed to stand at the side of the road and is taught to understand the traffic-light colours. The patient's attempts to cross the road according to the traffic signals are updated in the database.
Intervention:
This is a social exercise in which the patient learns the traffic rules. It can be a great teaching aid for the child to learn the name and purpose of each colour, and it is a simple, fun way to teach STOP and GO, i.e. when the child has to stop and when to move. This allows the child to move independently. The focus is on the child learning to control his or her behavior: staying put on the red pedestrian signal and moving only on the green signal, which indicates that it is safe to move.
2) ZEBRA CROSS LEARNING WITH VEHICLE INTERACTION
Procedures involved in Diagnosis:
The patient stands at the zebra crossing, with lights showing instructions and vehicles obeying the traffic rules. The patient scores points for walking across the zebra crossing according to the signal light, and the time elapsed for the patient to walk from the start to the end of the crossing is also noted. The database is updated according to each attempt; a sketch of this scoring step appears below.
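A minimal sketch of this scoring step follows, assuming a green-signal check and a simple elapsed-time measurement. The point value and all names are invented for illustration, not disclosed parameters.

```python
# Illustrative crossing score: points only when the walk starts on a green
# pedestrian signal, with the elapsed time recorded alongside.
import time

def score_crossing(signal_at_start: str, start: float, end: float) -> dict:
    crossed_on_green = signal_at_start == "green"
    return {
        "points": 10 if crossed_on_green else 0,   # reward lawful crossing
        "elapsed_seconds": round(end - start, 2),  # time from kerb to kerb
    }

t0 = time.time()
t1 = t0 + 7.4  # simulated walk duration
print(score_crossing("green", t0, t1))
# -> {'points': 10, 'elapsed_seconds': 7.4}
```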
Intervention:
This scenario is a social exercise. Many children with autism spectrum disorder (ASD) are not independent in street crossing. This skill is particularly crucial because it involves exposure to potentially dangerous situations and is an important step in the development of independence.
3) SUPERMARKET
Procedures involved in Diagnosis:
1) The patient is asked to pick up items from shelves placed at different locations in the supermarket and put them into the basket. 2) If the patient picks up the correct object, the database records a successful attempt; if not, it records an unsuccessful one. 3) After picking up the items, the patient goes to the counter, takes the items from the basket and places them on the billing table. 4) After billing, the total amount of the purchase is shown on the display, and the patient has to pick up the correct note from those placed on the counter in front of them; doing so is recorded as a successful attempt in the database. A sketch of this billing step appears below.
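By way of illustration only, the billing step could total the basket, check the note the patient picks, and compute the change. The prices and note values below are invented examples, and all identifiers are assumptions rather than the disclosed implementation.

```python
# Illustrative billing step for the supermarket scenario.
PRICES = {"bread": 30, "milk": 25, "soap": 18}  # invented example prices

def bill(basket: list, note_value: int) -> dict:
    total = sum(PRICES[item] for item in basket)
    enough = note_value >= total
    return {
        "total": total,
        "paid": note_value,
        "successful_attempt": enough,  # what the database would record
        "change": note_value - total if enough else 0,
    }

print(bill(["bread", "milk"], 100))
# -> {'total': 55, 'paid': 100, 'successful_attempt': True, 'change': 45}
```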
Intervention:
The patient has to identify the correct item and put it in the basket, which exercises motor skills, attention, memory, and visual and auditory attention. The items are then taken to the billing section, where the patient places them on the billing counter, again exercising motor skills. When the total amount is displayed, the patient has to pick up the correct amount from the displayed notes, which improves memory and the ability to handle money. If the amount paid is higher, the balance is given back, which helps the patient understand the concept of money and transactions.
[00110] Example embodiments of selfcare skill development scenarios
1) INDIAN RESTROOM INTERACTION (MALE AND FEMALE SIGN)
Procedures involved in Diagnosis:
1) The patient is encouraged by the avatar to enter the scene and is taught the gender signs. The patient is then encouraged to enter the Indian restroom corresponding to his/her gender sign. 2) The patient gets points for rightly choosing the restroom based on his/her gender. 3) The database is updated with the number of times the patient chooses the right restroom based on his/her gender, each counted as a successful attempt to enter the restroom. 4) When the physician completes the session and hits Finish or Complete, or closes the session, the database is updated with the necessary details. 5) The doctor then reviews the results in the reporting system. 6) He/she is able to see the results in graphical and textual format.
Intervention:
Autistic children are often fearful by nature. This makes them hesitant to enter closed spaces and be by themselves, which often hinders the development of the skills needed to perform daily-life activities independently. This scenario aims to develop the patient's interaction skills, help the patient discriminate gender sign boards, familiarize the patient with the Indian closet, and help the patient overcome the fear of closed spaces. As self-care is an unavoidable part of life, it is necessary that the patient be able to understand his/her own gender and learn to discriminate between the male and female signs. This scenario also teaches and trains the patient to remain calm and composed when entering a closed space alone.
2) EUROPEAN RESTROOM INTERACTION (MALE AND FEMALE SIGN)
Procedures involved in Diagnosis:
1) The patient is encouraged by the avatar to enter the scene and is taught the gender signs. The patient is then encouraged to enter the European restroom corresponding to his/her gender sign. 2) The patient gets points for rightly choosing the restroom based on his/her gender. 3) The database is updated with the number of times the patient chooses the right restroom based on his/her gender, each counted as a successful attempt to enter the restroom. 4) When the physician completes the session and hits Finish or Complete, or closes the session, the database is updated with the necessary details. 5) The doctor then reviews the results in the reporting system. 6) He/she is able to see the results in graphical and textual format.
Intervention:
Autistic children are often fearful by nature. This makes them hesitant to enter closed spaces and be by themselves, which often hinders the development of the skills needed to perform daily-life activities independently. This scenario aims to develop the patient's interaction skills, help the patient discriminate gender sign boards, familiarize the patient with the European closet, and help the patient overcome the fear of closed spaces. As self-care is an unavoidable part of life, it is necessary that the patient be able to understand his/her own gender and learn to discriminate between the male and female signs. This scenario also teaches and trains the patient to remain calm and composed when entering a closed space alone.
[00111] The method and system of diagnosing or predicting the levels of autism spectrum disorders (ASD) using the XR-AI platform disclosed herein are not limited to a particular computer system platform, controller, processor, operating system, or network. The method and the system disclosed herein are not limited to execution on any particular system or group of systems, and are not limited to any particular distributed architecture, network, or communication protocol.
[00112] Figures are merely representational and are not drawn to scale. Certain portions thereof may be exaggerated, while others may be minimized. Figures illustrate various embodiments of the invention that can be understood and appropriately carried out by those of ordinary skill in the art.
[00113] In the foregoing detailed description of embodiments of the invention, various features are grouped together in a single embodiment for the purpose of streamlining the invention. This method of invention is not to be interpreted as reflecting an intention that the claimed embodiments of the invention require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus, the following claims are hereby incorporated into the detailed description of embodiments of the invention, with each claim standing on its own as a separate embodiment.
[00114] It is understood that the above description is intended to be illustrative, and not restrictive. It is intended to cover all alternatives, modifications and equivalents as may be included within the spirit and scope of the invention as defined in the appended claims. Many other embodiments will be apparent to those of skill in the art upon reviewing the above description. The scope of the invention should, therefore, be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled. In the appended claims, the terms “including” and “in which” are used as the plain-English equivalents of the respective terms “comprising” and “wherein,” respectively.
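By way of illustration only, the difference-and-threshold score computation recited in the claims below could be sketched as follows. The threshold value, the averaging rule and all identifiers are assumptions rather than disclosed parameters.

```python
# A minimal, non-authoritative sketch: compare captured gaze coordinates
# with pre-stored reference coordinates and emit a rating only when the
# mean deviation exceeds an assumed threshold.
from math import dist

def autism_rating(captured: list, reference: list, threshold: float = 0.15):
    # Mean Euclidean deviation between corresponding gaze samples.
    deviation = sum(dist(c, r) for c, r in zip(captured, reference)) / len(captured)
    score = deviation if deviation > threshold else 0.0
    return {"deviation": round(deviation, 3), "autism_rating_score": score}

captured = [(0.42, 0.51), (0.40, 0.55), (0.70, 0.20)]
reference = [(0.45, 0.50), (0.44, 0.52), (0.46, 0.49)]
print(autism_rating(captured, reference))
# deviation is about 0.153, just above the assumed threshold
```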

Claims:
1. A method for predicting a level of autism spectrum disorders (ASD) using an extended reality platform (1200), the method comprising: displaying, by a Virtual Reality Display (VRD) / Head Mounted Display (HMD), one or more interactive animations to a user (1210); capturing, by a pair of eye tracking devices and a combination of BCI, EEG and fNIR (functional near-infrared spectroscopy) sensors of the VRD/HMD, the user's eye gazing coordinates displayed on a screen (111) to predict the attention span during one or more specific interactive animation incidents occurring within said interactive animations, and capturing, by the BCI, EEG and fNIR sensors, the user's cognitive functions, behavioral functions and linguistic functions during said specific interactive animation incidents occurring within said interactive animations (1220); computing, by a processing unit of the VRD, an Autism Rating score of the user based on the captured eye gazing coordinates, cognitive functions, behavioral functions and linguistic functions (1230); and validating, by the processing unit of the VRD/HMD, the computed Autism Rating score of the user with a pre-stored Autism Rating score and predicting the level of autism spectrum disorders (ASD) of the user based on the validated Autism Rating score (1240).
2. The method as claimed in claim 1, wherein the cognitive functions include attention, memory, reasoning, problem solving, decision making, executive function, perception and emotion regulation; the behavioral functions include gross motor skill, sensory stimulation, escape, access to attention and access to tangibles; and the linguistic functions include speech, language, initiating social interaction, expressing personal feelings, describing aspects of the world, requesting information and pretending.
3. The method as claimed in claim 1, further comprising: monitoring and analyzing patterns of behaviors of the user based on the eye gazing coordinates; categorizing, based on a reference point, the eyes of the user as normal eyes or abnormal eyes; and calculating the attention span of the user during said specific interactive animation incidents occurring within said interactive animations.
4. The method as claimed in claim 1, further comprising: monitoring and analyzing patterns of behaviors of the user based on a reference point for each of the captured cognitive functions, behavioral functions and linguistic functions; and calculating the attention span, attention state, engagement level, mental functioning, emotions, haptic feedback, reaction time and success score of each subtask of the user during said specific interactive animation incidents occurring within the interactive animation.
5. The method as claimed in claim 1, wherein computing the Autism Rating score of the user comprises: computing a difference between the captured user's eye gazing coordinates and pre-stored eye gazing coordinates; computing the Autism Rating score if the computed difference is greater than a threshold level; and validating the computed Autism Rating score of the user based on the computed difference value.
6. The method as claimed in claim 1, wherein the eye gazing coordinate parameters include the inter-pupillary distance, crossed eye/lazy eye, the visual axis of the eye, eye focus, etc.
7. The method as claimed in claim 1, wherein computing the Autism Rating score of the user further includes: displaying and initiating the user to identify and perform an activity during said specific interactive animation incidents occurring within said interactive animations; capturing, by the BCI, EEG and fNIR sensors of the VRD/HMD, the user's cognitive function scores, behavioral function scores and linguistic function scores while the user is identifying and performing the activity; computing a difference between the captured cognitive function scores, behavioral function scores and linguistic function scores and pre-stored cognitive function scores, behavioral function scores and linguistic function scores while the user is identifying and performing the activity; computing the Autism Rating score if the computed difference is greater than a threshold level; and validating the computed Autism Rating score of the user based on the computed difference value.
8. The method as claimed in any one of claims 1 to 7, wherein computing the Autism Rating score of the user further comprises: capturing a total number of attempts performed by the user in identifying and performing the activity during said specific interactive animation incidents occurring within said interactive animations; capturing an autism rating score based on a specific threshold if the user correctly identifies and performs the activity; capturing an autism rating score based on a specific threshold if the user fails to identify and perform the activity; and validating the Autism Rating score of the user based on the computed difference values, the captured total number of attempts, and the autism rating score based on a specific threshold.
9. The method as claimed in claim 1, further comprising providing voice communication to the user to identify and perform the activity in said specific animation incidents occurring within the interactive animation.
10. The method as claimed in claim 1, wherein the computed score is validated against a standardized measure, i.e. the CFCS (Communication Function Classification System).
11. The method as claimed in claim 1, wherein the specific animation incidents are associated with social, cognitive, communication, and selfcare skill development animations.
12. The method as claimed in claim 1, further comprising providing the level of autism spectrum disorders (ASD) in real time in graphical and textual formats.
13. A system for predicting a level of autism spectrum disorders (ASD) using an extended reality platform, the system comprising: a Virtual Reality Display (VRD) / Head Mounted Display (HMD) (110) configured to display one or more interactive animations to a user; a pair of eye tracking devices (112) and a combination of BCI, EEG and fNIR (functional near-infrared spectroscopy) sensors (114) configured to capture the user's eye gazing coordinates displayed on a screen (111) to predict the attention span during one or more specific interactive animation incidents occurring within said interactive animations, and configured to capture the user's cognitive functions, behavioral functions and linguistic functions during said specific interactive animation incidents occurring within said interactive animations; and a processing unit (116) of the VRD/HMD configured to compute an Autism Rating score of the user based on the captured eye gazing coordinates, cognitive functions, behavioral functions and linguistic functions, and further configured to validate the computed Autism Rating score of the user with a pre-stored Autism Rating score and predict the level of autism spectrum disorders (ASD) of the user based on the validated Autism Rating score.
14. The system as claimed in claim 13, wherein the cognitive functions include attention, memory, reasoning, problem solving, decision making, executive function, perception and emotion regulation; the behavioral functions include gross motor skill, sensory stimulation, escape, access to attention and access to tangibles; and the linguistic functions include speech, language, initiating social interaction, expressing personal feelings, describing aspects of the world, requesting information and pretending.
15. The system as claimed in claim 13, wherein the processing unit of the VRD/HMD is further configured to monitor and analyze patterns of behaviors of the user based on the eye gazing coordinates, categorize, based on a reference point, the eyes of the user as normal eyes or abnormal eyes, and calculate the attention span of the user during said specific interactive animation incidents occurring within said interactive animations; and is further configured to monitor and analyze patterns of behaviors of the user based on a reference point for each of the captured cognitive functions, behavioral functions and linguistic functions, and calculate the attention span, attention state, engagement level, mental functioning, emotions, haptic feedback, reaction time and success score of each subtask of the user during said specific interactive animation incidents occurring within the interactive animation.
16. The system as claimed in claim 13, wherein the processing unit of the VRD is configured to compute the Autism Rating score of the user by: computing a difference between the captured user's eye gazing coordinates and pre-stored eye gazing coordinates; computing the Autism Rating score if the computed difference is greater than a threshold level; and validating the computed Autism Rating score of the user based on the computed difference value.
17. The system as claimed in claim 13, wherein the eye gazing coordinate parameters include the inter-pupillary distance, crossed eye/lazy eye, the visual axis of the eye, eye focus, etc.
18. The system as claimed in claim 13, wherein the processing unit of the VRD is further configured to compute the Autism Rating score of the user by: displaying and initiating the user to identify and perform an activity during said specific interactive animation incidents occurring within said interactive animations; capturing, by the BCI, EEG and fNIR sensors of the VRD/HMD, the user's cognitive function scores, behavioral function scores and linguistic function scores while the user is identifying and performing the activity; computing a difference between the captured cognitive function scores, behavioral function scores and linguistic function scores and pre-stored cognitive function scores, behavioral function scores and linguistic function scores while the user is identifying and performing the activity; computing the Autism Rating score if the computed difference is greater than a threshold level; and validating the computed Autism Rating score of the user based on the computed difference value.
19. The system as claimed in any one of claims 13 to 18, wherein the processing unit of the VRD is configured to compute the Autism Rating score of the user by: capturing a total number of attempts performed by the user in identifying and performing the activity during said specific interactive animation incidents occurring within said interactive animations; capturing an autism rating score based on a specific threshold if the user correctly identifies and performs the activity; capturing an autism rating score based on a specific threshold if the user fails to identify and perform the activity; and validating the Autism Rating score of the user based on the computed difference values, the captured total number of attempts, and the autism rating score based on a specific threshold.
20. The system as claimed in claim 13, further comprising a voice communication device (118) which provides voice communication to the user to identify and perform the activity in said specific animation incidents occurring within the interactive animation.
21. The system as claimed in claim 13, wherein the specific animation incidents are associated with social, cognitive, communication and selfcare skill development animations.
22. The system as claimed in claim 13, wherein the Virtual Reality Display (VRD) / Head Mounted Display (HMD) is integrated with the combination of sensors and the processing unit.